# ChatOllama

## Prerequisites

1. Download [Ollama](https://github.com/ollama/ollama) or run it with [Docker](https://hub.docker.com/r/ollama/ollama).
2. For example, the following commands spin up a Docker instance and run the `llama3` model:

   ```bash
   docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
   docker exec -it ollama ollama run llama3
   ```
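
To confirm the container is serving requests, you can query Ollama's REST API directly. The `/api/tags` and `/api/generate` endpoints are part of the standard Ollama API, and the host and port match the `docker run` command above:

```shell
# List the models currently pulled on this Ollama instance
curl http://localhost:11434/api/tags

# Send a one-off, non-streaming prompt to the llama3 model
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

If both calls return JSON, Ollama is reachable and the model is ready for Flowise to use.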

## Setup

1. **Chat Models** > drag **ChatOllama** node

<figure><img src="https://823733684-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F00tYLwhz5RyR7fJEhrWy%2Fuploads%2Fgit-blob-bde86b4973bdb31361b80f54fd01f1ad7abf77c5%2Fimage%20(139).png?alt=media" alt="" width="563"><figcaption></figcaption></figure>

2. Fill in the model that is running on Ollama. For example: `llama3`. You can also configure additional parameters:

<figure><img src="https://823733684-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F00tYLwhz5RyR7fJEhrWy%2Fuploads%2Fgit-blob-d3bec65d2769f5e12a2497efa76d687e886abc9d%2Fimage%20(140).png?alt=media" alt=""><figcaption></figcaption></figure>

3. Voilà [🎉](https://emojipedia.org/party-popper/), you can now use the **ChatOllama** node in Flowise

<figure><img src="https://823733684-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F00tYLwhz5RyR7fJEhrWy%2Fuploads%2Fgit-blob-004a46b3c6799bfc5875e1885e7ac1d558d58192%2Fimage%20(141).png?alt=media" alt=""><figcaption></figcaption></figure>
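
Once the flow is saved, you can also call it programmatically through Flowise's Prediction API. A minimal sketch, assuming Flowise runs on `localhost:3000`; `<chatflow-id>` is a placeholder for your own chatflow's ID:

```shell
# Send a question to the saved chatflow and receive the model's answer
curl http://localhost:3000/api/v1/prediction/<chatflow-id> \
  -H "Content-Type: application/json" \
  -d '{"question": "Why is the sky blue?"}'
```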

### Running on Docker

If you are running both Flowise and Ollama on Docker, you'll have to change the Base URL of the ChatOllama node.

For Windows and macOS, specify [http://host.docker.internal:11434](http://host.docker.internal:11434/). For Linux-based systems, use the default Docker gateway instead, since `host.docker.internal` is not available: [http://172.17.0.1:11434](http://172.17.0.1:11434/)

<figure><img src="https://823733684-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F00tYLwhz5RyR7fJEhrWy%2Fuploads%2Fgit-blob-d814dc411850b1ea40c90ef6bec0fbe188aee449%2Fimage%20(142).png?alt=media" alt="" width="292"><figcaption></figcaption></figure>
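
An alternative that avoids hard-coding gateway IPs is to put both containers on a user-defined Docker network, where container names resolve via DNS. A sketch, assuming the `flowiseai/flowise` image and the container names shown; adjust to your setup:

```shell
# Create a shared network and attach both containers to it
docker network create flowise-net

docker run -d -v ollama:/root/.ollama -p 11434:11434 \
  --network flowise-net --name ollama ollama/ollama

docker run -d -p 3000:3000 \
  --network flowise-net --name flowise flowiseai/flowise
```

With this setup, the Base URL in ChatOllama becomes `http://ollama:11434` on any host OS, because the container name resolves inside the shared network.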

## Ollama Cloud

1. Create an [API key](https://ollama.com/settings/keys) on **ollama.com**.
2. In Flowise, click **Create Credential**, select **Ollama API**, and enter your API key.

<figure><img src="https://823733684-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F00tYLwhz5RyR7fJEhrWy%2Fuploads%2FiX6xV5nZtSHA8Qrpe9v0%2Fimage.png?alt=media&#x26;token=5af24e71-ac98-4924-9244-5ce0ad33aa90" alt="" width="435"><figcaption></figcaption></figure>

3. Then, set the **Base URL** to `https://ollama.com`.
4. Enter a model that is available on Ollama Cloud.

<figure><img src="https://823733684-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F00tYLwhz5RyR7fJEhrWy%2Fuploads%2FIrnW1sBbky79Vzd8fEfF%2Fimage.png?alt=media&#x26;token=5780ebd8-480d-48a1-a194-fd7aa75a363e" alt="" width="394"><figcaption></figcaption></figure>
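
To verify the key before wiring it into Flowise, you can call the cloud endpoint directly. A sketch, assuming Ollama Cloud accepts the standard Ollama chat API with a Bearer token; `<cloud-model-name>` is a placeholder for a model listed on Ollama Cloud:

```shell
# OLLAMA_API_KEY holds the key created on ollama.com
curl https://ollama.com/api/chat \
  -H "Authorization: Bearer $OLLAMA_API_KEY" \
  -d '{
    "model": "<cloud-model-name>",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": false
  }'
```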

## Resources

* [LangchainJS ChatOllama](https://js.langchain.com/docs/integrations/chat/ollama)
* [Ollama](https://github.com/ollama/ollama)
