We have seen tool calling with LangChain4j's low-level API in earlier tutorials. The following tutorial shows how to use tools with AiServices.
Extending LLM Capabilities with Tools
Tools (function calling) allow LLMs to execute Java methods, enabling them to perform actions beyond text generation. AI Services automatically expose annotated methods as tools that the LLM can call when appropriate. This is useful for:
- Performing calculations or data processing
- Querying databases or external APIs
- Executing business logic or workflows
- Accessing real-time information
@Tool Annotation
We annotate a Java method with the @Tool annotation to tell LangChain4j that the method is an AI tool.
@Tool("adds two numbers")
int add(int a, int b) { return a + b; }
@Tool#value allows the language model to understand the tool's purpose and intended use.
How Tool Calling Works
- The AI Service registers all @Tool methods with the LLM
- When a user query requires tool execution, the LLM requests the specific tool
- The AI Service executes the tool with the provided parameters
- The tool result is sent back to the LLM for final response generation
- All tool executions are recorded in the response metadata
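Conceptually, the registration and execution steps can be sketched in plain Java with reflection. The @Tool annotation below is a local stand-in (so the sketch compiles without LangChain4j on the classpath); the real AiServices proxy additionally builds a JSON schema for each tool and parses the model's tool-call arguments:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class ToolFlowSketch {
    // Local stand-in for dev.langchain4j.agent.tool.Tool so this
    // sketch compiles without LangChain4j on the classpath.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface Tool { String value() default ""; }

    public static class Calc {
        @Tool("adds two numbers")
        public int add(int a, int b) { return a + b; }
    }

    // Step 1: scan for @Tool methods (registration); steps 3-4: invoke the
    // requested tool with the model-supplied arguments and return the result.
    public static Object executeTool(Object tools, String name, Object... args)
            throws Exception {
        for (Method m : tools.getClass().getDeclaredMethods()) {
            if (m.isAnnotationPresent(Tool.class) && m.getName().equals(name)) {
                return m.invoke(tools, args);
            }
        }
        throw new IllegalArgumentException("no such tool: " + name);
    }

    public static void main(String[] args) throws Exception {
        // As if the LLM requested: {"name": "add", "arguments": {"a": 15, "b": 27}}
        System.out.println(executeTool(new Calc(), "add", 15, 27)); // prints 42
    }
}
```

This is only an illustration of the flow; AiServices does the scanning, argument parsing, and invocation for you.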
Tool Parameter Descriptions
We can add descriptions to tools using the @P annotation for parameters:
Definition of P (version 1.10.0):

package dev.langchain4j.agent.tool;

@Retention(RUNTIME)
@Target({ PARAMETER })
public @interface P {
    String value();
    boolean required() default true;
}
Quick Example
@Tool
int add(@P("first number") int a, @P("second number") int b) {
return a + b;
}
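At runtime, AI Services reads these parameter annotations reflectively to build the tool schema sent to the model. The sketch below uses a local stand-in for @P (so it compiles without LangChain4j) to show how value() and required() can be read from method parameters:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;

public class ParamAnnotationSketch {
    // Local stand-in mirroring dev.langchain4j.agent.tool.P,
    // so the sketch compiles without LangChain4j.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.PARAMETER)
    public @interface P {
        String value();
        boolean required() default true;
    }

    public static class Calc {
        public int add(@P("first number") int a,
                       @P(value = "second number", required = false) int b) {
            return a + b;
        }
    }

    public static void main(String[] args) throws Exception {
        Method add = Calc.class.getDeclaredMethod("add", int.class, int.class);
        // Read each parameter's description and required flag, as a
        // framework would when building the tool schema for the model.
        for (Parameter param : add.getParameters()) {
            P p = param.getAnnotation(P.class);
            System.out.println(p.value() + " required=" + p.required());
        }
    }
}
```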
Example
The following example uses Ollama with a small local model (llama3.2), which is good for demos and learning but not for production-grade applications, because small models have limited reasoning capabilities and accuracy for complex tasks.
In the following example we use custom tools with made-up names 'glip' and 'zorp', so the LLM relies strictly on the tools' output instead of its internal "intuition".
package com.logicbig.example;

import dev.langchain4j.agent.tool.P;
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;

public class ToolsExample {

    static class Calc {
        @Tool("first integer glip second integer")
        int glip(@P("First integer") int a,
                 @P("Second integer") int b) {
            System.out.printf("tools called 'glip' with params: %s, %s%n", a, b);
            return a + b;
        }

        @Tool("zorp two integers")
        int zorp(@P("First integer") Integer a,
                 @P("Second integer") Integer b) {
            System.out.printf("tools called 'zorp' with params: %s, %s%n", a, b);
            return a * b;
        }
    }

    interface Assistant {
        @SystemMessage("Only use tool results, don't use your own knowledge "
                + "or assume anything yourself. Return short answer. "
                + "The operations 'glip' and 'zorp' are custom operations.")
        @UserMessage("{{it}}")
        String chat(String message);
    }

    public static void main(String[] args) {
        OllamaChatModel model =
                OllamaChatModel.builder()
                               .baseUrl("http://localhost:11434")
                               .modelName("llama3.2:latest")
                               .numCtx(4096)
                               // Lower temp for more reliable tool calling
                               .temperature(0.0)
                               .build();

        Calc tools = new Calc();
        Assistant assistant =
                AiServices.builder(Assistant.class)
                          .chatModel(model)
                          .tools(tools)
                          .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
                          .build();

        String message = "What is '15 glip 27' and '8 zorp 9'?";
        System.out.println("user Msg: " + message);
        String response = assistant.chat(message);
        System.out.println("Response:\n" + response);

        System.out.println("--------------");

        String message2 = "What is glip of above two results?";
        System.out.println("user Msg: " + message2);
        String response2 = assistant.chat(message2);
        System.out.println("Response:\n" + response2);
    }
}
Output
user Msg: What is '15 glip 27' and '8 zorp 9'?
tools called 'glip' with params: 15, 27
tools called 'zorp' with params: 8, 9
Response:
The result of '15 glip 27' is 42. The result of '8 zorp 9' is 72.
--------------
user Msg: What is glip of above two results?
tools called 'glip' with params: 42, 72
Response:
The result of 'glip' of 42 and 72 is 114.
Another Example
package com.logicbig.example;

import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.SystemMessage;

import java.time.Instant;

public class ToolsExample2 {

    static class SystemTools {
        @Tool("Returns the current system time in milliseconds")
        public long systemMillis() {
            System.out.println("tool called systemMillis");
            long epoch = System.currentTimeMillis();
            System.out.println("tool systemMillis return value: " + epoch);
            return epoch;
        }

        @Tool("Converts epoch milliseconds in long to a UTC timestamp")
        public String utcFormat(long millis) {
            System.out.println("tool called utcFormat param: " + millis);
            String string = Instant.ofEpochMilli(millis).toString();
            System.out.println("tool utcFormat return value: " + string);
            return string;
        }
    }

    interface TimeAgent {
        @SystemMessage("""
                You are a helpful assistant with access to tools.
                You may call multiple tools in sequence.
                Use tool outputs as inputs to subsequent tools when needed.
                Return short answers.
                """)
        String chat(String userMessage);
    }

    public static void main(String[] args) {
        ChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.2:3b")
                .temperature(0.0)
                .numCtx(4096)
                .build();

        ChatMemory memory = MessageWindowChatMemory.withMaxMessages(10);

        TimeAgent agent = AiServices.builder(TimeAgent.class)
                .chatModel(model)
                .tools(new SystemTools())
                .chatMemory(memory)
                .build();

        String response1 = agent.chat("What is the current system time?");
        System.out.println("response1: '" + response1 + "'");

        String response2 = agent.chat(
                "Now convert the system time into a human-readable UTC format."
        );
        System.out.println("response2 " + response2);
    }
}
Output
tool called systemMillis
tool systemMillis return value: 1769583987389
response1: 'The current system time in seconds since the Unix epoch (January 1, 1970) is 1769583987.'
tool called utcFormat param: 1769583987
tool utcFormat return value: 1970-01-21T11:33:03.987Z
response2 The current system time in a human-readable UTC format is January 21, 1970 11:33:03 UTC.
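Note a unit bug in the run above: the model converted the time to seconds on its own and then passed that seconds value (1769583987) to utcFormat, which expects milliseconds, so the formatted date lands in January 1970. Instant.ofEpochMilli interprets its argument strictly as milliseconds, as this self-contained demo shows:

```java
import java.time.Instant;

public class EpochUnits {
    public static void main(String[] args) {
        long millis = 1769583987389L; // what systemMillis actually returned
        long seconds = millis / 1000; // what the model passed to utcFormat

        // Wrong unit: seconds where millis are expected -> a 1970 date
        System.out.println(Instant.ofEpochMilli(seconds)); // 1970-01-21T11:33:03.987Z

        // Correct pairings
        System.out.println(Instant.ofEpochMilli(millis));   // the real (2026) timestamp
        System.out.println(Instant.ofEpochSecond(seconds)); // right call for seconds
    }
}
```

Guarding against this in the tool itself, for example by rejecting implausibly small values or stating the expected unit prominently in the @Tool description, makes small models less likely to mis-convert.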
Conclusion
The output shows how the LLM recognizes when tool execution is needed, requests the appropriate tool, and incorporates the tool results into its final response. The AI Service proxy handles tool registration, parameter parsing, execution, and result integration automatically, making tool calling as simple as annotating methods with @Tool. As the second example's output shows, however, small models can still pass subtly wrong parameters (seconds instead of milliseconds), so tool inputs should be validated in production use.
Example Project
Dependencies and Technologies Used:
- langchain4j 1.10.0 (Build LLM-powered applications in Java: chatbots, agents, RAG, and much more)
- langchain4j-ollama 1.10.0 (LangChain4j :: Integration :: Ollama)
- slf4j-simple 2.0.9 (SLF4J Simple Provider)
- JDK 17
- Maven 3.9.11