What is persistent ChatMemory?
Persistent ChatMemory stores conversation history outside application memory, allowing conversations to survive restarts.
Why persistence is important
In stateless or distributed deployments, conversation memory must be externalized so that any instance can serve the next request. Persistent memory also enables long-lived user sessions that outlast a single process.
Typical use cases
- User chat history retention
- Multi-instance deployments
- Session recovery
Understanding ChatMemoryStore
In LangChain4j, the ChatMemoryStore interface represents the persistence layer for your conversation history. While ChatMemory acts as the manager that decides which messages to keep (based on logic like a sliding window), the Store is the actual physical or virtual location where those messages are saved and retrieved. This separation of concerns allows you to swap out how data is stored without changing your application's logic.
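To make this separation concrete, below is a minimal sketch of what a custom persistent store could look like. The class name FileChatMemoryStore, the one-JSON-file-per-conversation layout, and the storage directory are illustrative assumptions rather than anything provided by LangChain4j; the sketch only relies on the ChatMemoryStore interface and on LangChain4j's ChatMessageSerializer and ChatMessageDeserializer helpers to convert the message list to and from JSON.

package com.logicbig.example;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.ChatMessageDeserializer;
import dev.langchain4j.data.message.ChatMessageSerializer;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Hypothetical file-backed store: one JSON file per memory id.
public class FileChatMemoryStore implements ChatMemoryStore {

    private final Path directory;

    public FileChatMemoryStore(Path directory) {
        this.directory = directory;
    }

    @Override
    public List<ChatMessage> getMessages(Object memoryId) {
        Path file = fileFor(memoryId);
        if (!Files.exists(file)) {
            return List.of(); // no history yet for this conversation
        }
        try {
            // restores the messages previously written by updateMessages()
            return ChatMessageDeserializer.messagesFromJson(Files.readString(file));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public void updateMessages(Object memoryId, List<ChatMessage> messages) {
        try {
            Files.createDirectories(directory);
            // serializes the whole message list to JSON and overwrites the file
            Files.writeString(fileFor(memoryId), ChatMessageSerializer.messagesToJson(messages));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public void deleteMessages(Object memoryId) {
        try {
            Files.deleteIfExists(fileFor(memoryId));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    private Path fileFor(Object memoryId) {
        return directory.resolve(memoryId + ".json");
    }
}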
InMemoryChatMemoryStore
The core LangChain4j module ships a single public implementation of ChatMemoryStore: InMemoryChatMemoryStore, which keeps messages in memory inside the JVM. It is transient and does not persist data across application restarts; for true persistence you supply your own ChatMemoryStore implementation backed by a database, key-value store, or the file system (as sketched above).
Example
package com.logicbig.example;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore;

public class PersistentChatMemoryExample {

    public static void main(String[] args) {
        ChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("phi3:mini-128k")
                .numCtx(4096)
                .temperature(0.7)
                .build();

        // The store is the persistence layer; InMemoryChatMemoryStore keeps messages in the JVM heap.
        InMemoryChatMemoryStore store = new InMemoryChatMemoryStore();

        // MessageWindowChatMemory manages the sliding window (max 5 messages)
        // and delegates the actual storage to the configured store.
        ChatMemory memory = MessageWindowChatMemory.builder()
                .id("user-1")
                .maxMessages(5)
                .chatMemoryStore(store)
                .build();

        memory.add(UserMessage.from("Hi there!"));
        AiMessage aiMessage = model.chat(memory.messages()).aiMessage();
        memory.add(aiMessage);

        // Messages added via ChatMemory can be read back directly from the store by memory id.
        System.out.println("-- retrieving messages from the store --");
        for (ChatMessage message : store.getMessages(memory.id())) {
            System.out.println(message);
        }
    }
}
Output:
-- retrieving messages from the store --
UserMessage { name = null, contents = [TextContent { text = "Hi there!" }], attributes = {} }
AiMessage { text = "Hello! How can I assist you today?", thinking = null, toolExecutionRequests = [], attributes = {} }
Conclusion
The example shows that ChatMemory delegates its messages to a ChatMemoryStore, from which they can be read back independently by memory id. Because InMemoryChatMemoryStore is transient, reusing the same conversation state across executions requires plugging a persistent ChatMemoryStore implementation into the same builder call.
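As an illustration of that last point, here is a rough sketch of a "second run" that assumes the hypothetical FileChatMemoryStore from the earlier sketch, the same storage directory, and the same memory id ("user-1") as the example above; on a fresh JVM it would reload whatever the previous run persisted.

package com.logicbig.example;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;

import java.nio.file.Path;

public class ReloadConversationExample {

    public static void main(String[] args) {
        // Hypothetical: same directory and memory id as the previous run,
        // so the same history file is picked up again.
        ChatMemoryStore store = new FileChatMemoryStore(Path.of("chat-memory"));

        ChatMemory memory = MessageWindowChatMemory.builder()
                .id("user-1")
                .maxMessages(5)
                .chatMemoryStore(store)
                .build();

        // On a fresh JVM this already contains the messages persisted by the earlier run,
        // so the conversation can simply continue from where it left off.
        memory.messages().forEach(System.out::println);
    }
}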
Example Project
Dependencies and Technologies Used:
- langchain4j 1.10.0 (Build LLM-powered applications in Java: chatbots, agents, RAG, and much more)
- langchain4j-ollama 1.10.0 (LangChain4j :: Integration :: Ollama)
- JDK 17
- Maven 3.9.11