LocalAI is a drop-in replacement REST API compatible with the OpenAI API specification for local inferencing. It lets you run LLMs (and more) locally or on-prem on consumer-grade hardware, supporting multiple model families compatible with the ggml format.
To use LocalAI Embeddings within Flowise, follow the steps below:
```bash
git clone https://github.com/go-skynet/LocalAI
cd LocalAI
```
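Before installing a model, the LocalAI server needs to be running. A typical way to start it is with Docker Compose, assuming Docker is installed and using the `docker-compose.yml` shipped in the repository (see the LocalAI README for other run options):

```shell
# Pull the latest image and start the LocalAI API server
# (listens on port 8080 by default)
docker compose up -d --pull always
```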
LocalAI provides an API endpoint to download and install models. In this example, we are going to use the BERT Embeddings model:
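A sketch of that install request, using LocalAI's `/models/apply` endpoint and its model gallery (the gallery id and the `localhost:8080` address are assumptions; adjust them to your instance):

```shell
# Ask LocalAI to download and install the BERT embeddings model
# from its model gallery
curl http://localhost:8080/models/apply \
  -H "Content-Type: application/json" \
  -d '{ "id": "model-gallery@bert-embeddings" }'
```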
Once the download completes, you should see the model in the /models folder:
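For example (the exact filename depends on which model version was fetched, so treat the name below as illustrative):

```shell
# List the downloaded model files
ls models
# e.g. bert-MiniLM-L6-v2q4_0.bin
```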
You can now test the embeddings:
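Because LocalAI exposes an OpenAI-compatible API, the standard `/v1/embeddings` request shape applies. A minimal sketch (the model name assumes the gallery install above):

```shell
# Request an embedding vector for a sample string
curl http://localhost:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{
    "input": "Your text string goes here",
    "model": "bert-embeddings"
  }'
```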
The response should look like:
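The body follows the OpenAI embeddings response schema; the values below are illustrative and truncated for brevity:

```json
{
  "object": "list",
  "data": [
    {
      "object": "embedding",
      "index": 0,
      "embedding": [0.0123, -0.0456, 0.0789]
    }
  ],
  "model": "bert-embeddings"
}
```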
Flowise Setup
Drag and drop a new LocalAIEmbeddings component onto the canvas: