# ChatOllama

## Prerequisite
Download and run Ollama locally, or run it with Docker. For example, you can use the following command to spin up a Docker instance with llama3:
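The standard commands from Ollama's official Docker image should work here (the container name and port mapping below are Ollama's defaults; adjust them to match your setup):

```bash
# Start the Ollama server in a container, persisting models in a named volume
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and run the llama3 model inside that container
docker exec -it ollama ollama run llama3
```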
## Setup
1. Chat Models > drag the **ChatOllama** node
![](https://docs.flowiseai.com/~gitbook/image?url=https%3A%2F%2F1820151947-files.gitbook.io%2F%7E%2Ffiles%2Fv0%2Fb%2Fgitbook-x-prod.appspot.com%2Fo%2Fspaces%252Fy8ifwt9BYklr92KDdr48%252Fuploads%252FEJhgB4xsHLtIveTCBf3P%252Fimage.png%3Falt%3Dmedia%26token%3D4dd64310-cd71-43e3-b395-b2fea2d2c5d7&width=768&dpr=4&quality=100&sign=8c50408d&sv=1)
2. Fill in the name of the model that is running on Ollama. For example: `llama3`. You can also set additional parameters:
![](https://docs.flowiseai.com/~gitbook/image?url=https%3A%2F%2F1820151947-files.gitbook.io%2F%7E%2Ffiles%2Fv0%2Fb%2Fgitbook-x-prod.appspot.com%2Fo%2Fspaces%252Fy8ifwt9BYklr92KDdr48%252Fuploads%252Fkyp6OampL8upiJ4NQ9Lg%252Fimage.png%3Falt%3Dmedia%26token%3D120d89b4-6a82-45ac-8f35-4e5e0ed1c994&width=768&dpr=4&quality=100&sign=2de7c9f6&sv=1)
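If the node doesn't respond as expected, it can help to test the model and sampling parameters directly against Ollama's chat API before wiring it into a flow (a quick sanity check, assuming Ollama is listening on its default port 11434):

```bash
# Single (non-streaming) response; temperature/top_k/top_p mirror the
# node's additional parameters shown above
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{ "role": "user", "content": "Why is the sky blue?" }],
  "options": { "temperature": 0.9, "top_k": 40, "top_p": 0.9 },
  "stream": false
}'
```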
Voila 🎉, you can now use the ChatOllama node in Flowise.
![](https://docs.flowiseai.com/~gitbook/image?url=https%3A%2F%2F1820151947-files.gitbook.io%2F%7E%2Ffiles%2Fv0%2Fb%2Fgitbook-x-prod.appspot.com%2Fo%2Fspaces%252Fy8ifwt9BYklr92KDdr48%252Fuploads%252F1P5K6WzharofTWaKYVBZ%252Fimage.png%3Falt%3Dmedia%26token%3D8897d9ce-633f-45a5-9ba3-53e3f77d35ac&width=768&dpr=4&quality=100&sign=e0d61b03&sv=1)
## Additional
If you are running both Flowise and Ollama in Docker, you'll have to change the Base URL for ChatOllama.
For Windows and macOS, specify http://host.docker.internal:8000. For Linux-based systems, use the default Docker gateway address instead, since host.docker.internal is not available: http://172.17.0.1:8000
![](https://docs.flowiseai.com/~gitbook/image?url=https%3A%2F%2F1820151947-files.gitbook.io%2F%7E%2Ffiles%2Fv0%2Fb%2Fgitbook-x-prod.appspot.com%2Fo%2Fspaces%252Fy8ifwt9BYklr92KDdr48%252Fuploads%252FEdlhZ76XqNYIZVF3ziHA%252Fimage.png%3Falt%3Dmedia%26token%3De7acbd38-a3e2-40f4-8d25-415b671babce&width=768&dpr=4&quality=100&sign=e5494706&sv=1)
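To confirm that Flowise can actually reach Ollama across the container boundary, you can run a quick connectivity check from inside the Flowise container (the container name `flowise` is an assumption, and curl must be available in the image; substitute the host and port that match your setup):

```bash
# List the models Ollama serves, as seen from the Flowise container
docker exec -it flowise curl http://host.docker.internal:8000/api/tags
```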
## Resources