# LocalAI Embeddings

## LocalAI Setup

[**LocalAI**](https://github.com/go-skynet/LocalAI) is a drop-in replacement REST API compatible with the OpenAI API specification for local inference. It lets you run LLMs (and other models) locally or on-prem on consumer-grade hardware, supporting multiple model families that are compatible with the ggml format.

To use LocalAI Embeddings within Flowise, follow the steps below:

1. ```bash
   git clone https://github.com/go-skynet/LocalAI
   ```
2. ```bash
   cd LocalAI
   ```
3. LocalAI provides an [API endpoint](https://localai.io/api-endpoints/index.html#applying-a-model---modelsapply) to download and install models. In this example, we are going to use the BERT embeddings model:

<figure><img src="/files/G8msiSbPdKFlVgxC90V6" alt=""><figcaption></figcaption></figure>
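As a sketch, the install request might look like the following. The gallery URL and the `name` override (so the model can be queried as `text-embedding-ada-002`) are assumptions; check the LocalAI model gallery for the exact values:

```bash
# Hypothetical model-install request against LocalAI's /models/apply endpoint.
# The gallery URL and "name" override below are assumptions; adjust them to
# match your LocalAI model gallery.
PAYLOAD='{
  "url": "github:go-skynet/model-gallery/bert-embeddings.yaml",
  "name": "text-embedding-ada-002"
}'

curl -s http://localhost:8080/models/apply \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || true  # non-fatal if LocalAI is not running yet
```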

4. In the `/models` folder, you should see the downloaded model:

<figure><img src="/files/bhZkXqO3LkLXDk5scPzQ" alt=""><figcaption></figcaption></figure>

5. You can now test the embeddings:

```bash
curl http://localhost:8080/v1/embeddings -H "Content-Type: application/json" -d '{
    "input": "Test",
    "model": "text-embedding-ada-002"
  }'
```

6. The response should look like:

<figure><img src="/files/gUoe2njzc7Ri6Gku8F4E" alt="" width="375"><figcaption></figcaption></figure>
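For reference, LocalAI follows the OpenAI embeddings response schema. The snippet below is an illustrative (assumed) response shape with the vector truncated for brevity, plus a one-liner to extract the embedding:

```bash
# Illustrative (assumed) response, following the OpenAI embeddings schema.
# A real embedding vector has hundreds of dimensions; it is truncated here.
RESPONSE='{
  "object": "list",
  "data": [
    { "object": "embedding", "index": 0, "embedding": [0.0123, -0.0456, 0.0789] }
  ],
  "model": "text-embedding-ada-002",
  "usage": { "prompt_tokens": 0, "total_tokens": 0 }
}'

# Pull the embedding vector out of the response (requires python3):
echo "$RESPONSE" | python3 -c 'import json,sys; print(json.load(sys.stdin)["data"][0]["embedding"])'
```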

## Flowise Setup

Drag and drop a new LocalAIEmbeddings component onto the canvas:

<figure><img src="/files/90zApX0ODjl6gj7W5AYy" alt=""><figcaption></figcaption></figure>

Fill in the fields:

* **Base Path**: The base URL of your LocalAI instance, such as <http://localhost:8080/v1>
* **Model Name**: The model you want to use. Note that it must be present in the `/models` folder of the LocalAI directory. For instance: `text-embedding-ada-002`
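As a quick sanity check before wiring the component into a flow (assuming LocalAI exposes the standard OpenAI-compatible model listing), you can confirm the Base Path is reachable:

```bash
# The Base Path value you will enter in Flowise.
BASE_PATH="http://localhost:8080/v1"

# Should list the models LocalAI has loaded, including your embeddings model.
curl -s "$BASE_PATH/models" || true  # non-fatal if the server isn't up yet
```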

That's it! For more information, refer to LocalAI [docs](https://localai.io/models/index.html#embeddings-bert).


