What is ChatMemory?
ChatMemory represents conversational state in LangChain4j. It stores past user and AI messages and ensures that each new model invocation receives the relevant conversation context.
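In code, ChatMemory is a small interface: messages are appended with add(), the retained history is read back with messages(), and clear() discards it. Below is a minimal sketch of that contract using MessageWindowChatMemory, the simplest built-in implementation (the class name and the sample messages are illustrative only):

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;

public class ChatMemoryBasics {

    public static void main(String[] args) {
        // Window-based memory: keeps only the most recent messages
        ChatMemory memory = MessageWindowChatMemory.withMaxMessages(10);

        // Messages are appended as the conversation progresses
        memory.add(UserMessage.from("Hi, my name is Alice."));
        memory.add(AiMessage.from("Hello Alice, how can I help?"));

        // messages() returns the retained history; this is what gets
        // passed to the model on the next call
        memory.messages().forEach(m -> System.out.println(m.type()));

        // clear() discards the stored conversation
        memory.clear();
    }
}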
Why ChatMemory is needed
Without memory, each interaction with the model is stateless: the model sees only the messages included in the current request. ChatMemory allows the model to recall facts introduced earlier, maintain continuity, and produce coherent multi-turn conversations. Check out the basic LLM memory tutorial for an introduction.
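For contrast, here is a rough sketch of the stateless case, reusing the same local Ollama model as the full example below and the convenience chat(String) method of ChatModel. Each call carries only the latest question, so the second call has no way to know what x and y are:

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.ollama.OllamaChatModel;

public class StatelessExample {

    public static void main(String[] args) {
        ChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("phi3:mini-128k")
                .build();

        // First call: the model sees the definitions of x and y
        System.out.println(model.chat("Given x=3 and y=4, what is x+y?"));

        // Second call: a brand-new request with no history attached,
        // so the model cannot know what x and y refer to
        System.out.println(model.chat("What is x*y?"));
    }
}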
Common use cases
- Multi-step user conversations
- Chatbots requiring context awareness
- Interactive assistants with session-level state (see the per-session sketch after this list)
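For the session-level case, each user or conversation usually gets its own ChatMemory instance, keyed by an id. The sketch below assumes the id(...) and maxMessages(...) methods of the MessageWindowChatMemory builder; the map-based registry and the session ids are illustrative, not part of the library:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;

public class SessionMemoryRegistry {

    // Hypothetical registry: one ChatMemory per session id
    private final Map<String, ChatMemory> memories = new ConcurrentHashMap<>();

    ChatMemory memoryFor(String sessionId) {
        return memories.computeIfAbsent(sessionId, id ->
                MessageWindowChatMemory.builder()
                        .id(id)            // memory id, useful when a ChatMemoryStore is attached
                        .maxMessages(20)
                        .build());
    }
}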
Example
package com.logicbig.example;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.ollama.OllamaChatModel;

public class ChatMemoryExample {

    public static void main(String[] args) {
        // 1. Create the model
        ChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("phi3:mini-128k")
                .numCtx(4096)
                .temperature(0.0)
                .build();

        // 2. Create chat memory (keeps the 20 most recent messages)
        ChatMemory memory = MessageWindowChatMemory.withMaxMessages(20);
        memory.add(SystemMessage.from("You are a calculator "
                + "please return only "
                + "the result of the calculation,"
                + "no wordings"));

        // ---- Turn 1 ----
        UserMessage user1 = UserMessage.from("Given x=3 and y=4, "
                + "what is x+y?");
        memory.add(user1);
        AiMessage ai1 = model.chat(memory.messages()).aiMessage();
        memory.add(ai1);
        printConversation(user1, ai1);

        // ---- Turn 2 ----
        UserMessage user2 = UserMessage.from("What is x*y?");
        memory.add(user2);
        AiMessage ai2 = model.chat(memory.messages()).aiMessage();
        memory.add(ai2);
        printConversation(user2, ai2);

        // ---- Turn 3 ----
        UserMessage user3 = UserMessage.from("what is the sum of "
                + "all previous calculations?");
        memory.add(user3);
        AiMessage ai3 = model.chat(memory.messages()).aiMessage();
        memory.add(ai3);
        printConversation(user3, ai3);
    }

    private static void printConversation(UserMessage userMessage,
                                          AiMessage aiResponse) {
        System.out.println("-------");
        System.out.println("User message: " + userMessage.singleText());
        System.out.println("AI response: " + aiResponse.text());
    }
}
Output
-------
User message: Given x=3 and y=4, what is x+y?
AI response: 7
-------
User message: What is x*y?
AI response: 12
-------
User message: what is the sum of all previous calculations?
AI response: The sum of all previous calculations (x + y and x * y) would be: 7 + 12 = 19.
Conclusion
The output confirms that the model correctly recalls information introduced earlier in the conversation. This behavior is possible only because previous messages were retained and supplied through ChatMemory.
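Note that MessageWindowChatMemory holds messages in memory, so the conversation is lost when the JVM stops; when the window overflows, the oldest messages are evicted first while the SystemMessage is retained. To keep a conversation across restarts, a ChatMemoryStore can be attached. The following is a minimal sketch, not taken from the example above, using the built-in InMemoryChatMemoryStore as a stand-in for a custom, database-backed ChatMemoryStore implementation (the memory id "user-42" is illustrative):

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;
import dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore;

public class PersistentMemoryExample {

    public static void main(String[] args) {
        // Swap InMemoryChatMemoryStore for your own ChatMemoryStore
        // (e.g. backed by a database) to persist conversations
        ChatMemoryStore store = new InMemoryChatMemoryStore();

        ChatMemory memory = MessageWindowChatMemory.builder()
                .id("user-42")          // messages are saved and loaded under this id
                .maxMessages(20)
                .chatMemoryStore(store)
                .build();

        // memory.add(...) now writes through to the store,
        // and messages() reads the history back from it
    }
}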
Example Project
Dependencies and Technologies Used:
- langchain4j 1.10.0 (Build LLM-powered applications in Java: chatbots, agents, RAG, and much more)
- langchain4j-ollama 1.10.0 (LangChain4j :: Integration :: Ollama)
- JDK 17
- Maven 3.9.11