Memory
LangChain Memory Nodes
Memory allows you to chat with the AI as if it remembers previous conversations.
Human: hi i am bob
AI: Hello Bob! It's nice to meet you. How can I assist you today?
Human: what's my name?
AI: Your name is Bob, as you mentioned earlier.
Under the hood, these conversations are stored in arrays or databases and provided as context to the LLM. For example:
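Conceptually, the stored history is replayed into the prompt on every turn. Here is a minimal sketch of that idea; the field names and serialization are illustrative, not Flowise's actual storage schema:

```typescript
// Illustrative shape of stored chat history (not Flowise's real schema)
type StoredMessage = { role: 'human' | 'ai'; content: string }

const history: StoredMessage[] = [
  { role: 'human', content: 'hi i am bob' },
  { role: 'ai', content: "Hello Bob! It's nice to meet you. How can I assist you today?" },
  { role: 'human', content: "what's my name?" }
]

// On each turn, the whole history is serialized into the prompt,
// so the LLM can "remember" earlier messages in the conversation.
const context = history
  .map((m) => `${m.role === 'human' ? 'Human' : 'AI'}: ${m.content}`)
  .join('\n')
```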
Flowise provides a variety of memory nodes for storing these conversations.
Separate conversations for multiple users
UI & Embedded Chat
By default, the UI and Embedded Chat automatically separate conversations for different users. This is done by generating a unique chatId for each new interaction; the logic is handled under the hood by Flowise.
Prediction API
You can separate conversations for multiple users by specifying a unique sessionId for each user. Every memory node has a Session ID input parameter.
In the /api/v1/prediction/{your-chatflowid} POST request body, specify the sessionId in overrideConfig.
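For example, a client can pass a per-user sessionId so each user's conversation stays isolated. A sketch using fetch; the base URL is a placeholder for your own deployment:

```typescript
// POST to the Prediction API with a per-user sessionId in overrideConfig.
// Replace the base URL and chatflow ID with your deployment's values.
async function query(question: string, sessionId: string) {
  const response = await fetch(
    'http://localhost:3000/api/v1/prediction/{your-chatflowid}',
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        question,
        overrideConfig: { sessionId }
      })
    }
  )
  return response.json()
}

async function demo() {
  // Two users, two isolated conversations:
  await query('hi i am bob', 'user-bob-123')
  await query("what's my name?", 'user-bob-123')   // remembers Bob
  await query("what's my name?", 'user-alice-456') // fresh session, no memory of Bob
}
```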
Message API
GET /api/v1/chatmessage/{your-chatflowid}
DELETE /api/v1/chatmessage/{your-chatflowid}
| Query Param | Type | Value |
| --- | --- | --- |
| sessionId | string | |
| sort | enum | ASC or DESC |
| startDate | string | |
| endDate | string | |
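The same endpoint supports listing and clearing a session's messages via these query params. A sketch, again assuming a local deployment URL:

```typescript
// List one session's messages, then clear them.
// Base URL and chatflow ID are placeholders for your deployment.
async function manageMessages() {
  const base = 'http://localhost:3000/api/v1/chatmessage/{your-chatflowid}'

  // GET: retrieve messages for one session, newest first
  const params = new URLSearchParams({ sessionId: 'user-bob-123', sort: 'DESC' })
  const messages = await fetch(`${base}?${params}`).then((r) => r.json())
  console.log(messages)

  // DELETE: remove that session's conversation history
  await fetch(`${base}?sessionId=user-bob-123`, { method: 'DELETE' })
}
```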
All conversations can also be visualized and managed from the UI:
For the OpenAI Assistant, Threads are used to store conversations.