Zep Memory
Last updated
Zep is a long-term memory store for LLM applications. It stores, summarizes, embeds, indexes, and enriches LLM app / chatbot histories, and exposes them via simple, low-latency APIs.
You can easily deploy Zep to cloud services like Render or Fly.io. If you prefer to test it locally, you can also spin up a Docker container by following their quick start guide.
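For the local route, the quick start boils down to cloning the repo and bringing up the Compose stack. This is a sketch based on the getzep/zep repository layout; check their quick start guide for the current steps:

```shell
# Clone Zep and start it locally with Docker Compose
# (repo URL and compose setup assumed from the getzep/zep repo)
git clone https://github.com/getzep/zep.git
cd zep
# Add your OpenAI API key to .env before starting (see .env.example)
docker compose up
```

Once the containers are up, the Zep API is served on port 8000 by default.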
In this example, we are going to deploy to Render.
Head over to the Zep repo and click Deploy to Render
This will bring you to Render's Blueprint page; simply click Create New Resources
When the deployment is done, you should see 3 applications created on your dashboard
Simply click the first one called zep and copy the deployed URL
Clone the Repo
Add your OpenAI API key in .env
Allow firewall access to port 8000
If you are using DigitalOcean's separate firewall from the dashboard, make sure port 8000 is added there too
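The steps above can be sketched as follows. The environment variable name and the use of `ufw` are assumptions (Ubuntu droplets ship with `ufw`; verify the key name against Zep's `.env.example`):

```shell
# Add your OpenAI API key to Zep's .env file
# (ZEP_OPENAI_API_KEY is the assumed variable name; confirm in .env.example)
echo 'ZEP_OPENAI_API_KEY=sk-...' >> .env

# Open Zep's API port on the host firewall (ufw assumed)
sudo ufw allow 8000/tcp
```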
Back in the Flowise application, create a new canvas or use one of the templates from the marketplace. In this example, we are going to use the Simple Conversational Chain
Replace Buffer Memory with Zep Memory, then replace the Base URL with the Zep URL you copied above
Save the chatflow and test it out to see if conversations are remembered.
Now try clearing the chat history; you should see that it is no longer able to remember the previous conversations.
Zep allows you to secure your instance using JWT authentication. We'll be using the zepcli command-line utility here.
After downloading the ZepCLI:
On Linux or MacOS
On Windows
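The commands for both platforms look roughly like this; the `-i` flag is assumed from Zep's authentication docs to generate credentials interactively:

```shell
# On Linux or macOS (assumes zepcli is in the current directory)
./zepcli -i

# On Windows
zepcli.exe -i
```

The tool prints both the secret token and the JWT token described below.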
You will first get your secret token:
Then you will get your JWT token:
Set the following environment variables in your Zep server environment:
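Based on Zep's authentication docs, the variables are likely the following (verify the names against your Zep version):

```shell
# Enable JWT authentication on the Zep server
export ZEP_AUTH_REQUIRED=true
# The secret token generated by zepcli (placeholder value)
export ZEP_AUTH_SECRET=<your-secret-token>
```

With these set, Zep rejects API requests that don't carry a valid JWT.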
Add a new credential for Zep, and put in the JWT Token in the API Key field:
In the Zep node's Connect Credential, select the credential you have just created. And that's it!
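If you want to sanity-check the secured instance outside Flowise, a JWT-authenticated request just carries the token in an Authorization bearer header. A minimal sketch in Python, assuming a local deployment URL and a placeholder token (the session endpoint path is taken from Zep's v1 API, but verify against your version):

```python
import urllib.request

ZEP_API_URL = "http://localhost:8000"  # assumed deployment URL
JWT_TOKEN = "<your-jwt-token>"         # token generated by zepcli (placeholder)

def authed_request(path: str) -> urllib.request.Request:
    """Build a request carrying the bearer token a secured Zep expects."""
    return urllib.request.Request(
        f"{ZEP_API_URL}{path}",
        headers={"Authorization": f"Bearer {JWT_TOKEN}"},
    )

# Example: fetch memory for a session named "demo"
req = authed_request("/api/v1/sessions/demo/memory")
print(req.get_header("Authorization"))
```

Sending `req` with `urllib.request.urlopen` against your instance should succeed with the credential set and fail with a 401 without it.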