How to integrate Spring AI LLM streaming in Langgraph4j
In [1]:
var userHomeDir = System.getProperty("user.home");
var localRespoUrl = "file://" + userHomeDir + "/.m2/repository/";
var springaiVersion = "1.1.0";
var langgraph4jVersion = "1.8-SNAPSHOT";
Remove installed packages from the Jupyter cache
In [2]:
%%bash
rm -rf \{userHomeDir}/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/bsc/
In [3]:
%dependency /add-repo local \{localRespoUrl} release|never snapshot|always
// %dependency /list-repos
%dependency /add org.slf4j:slf4j-jdk14:2.0.9
%dependency /add org.bsc.langgraph4j:langgraph4j-spring-ai:\{langgraph4jVersion}
%dependency /add org.bsc.langgraph4j:langgraph4j-springai-agentexecutor:\{langgraph4jVersion}
%dependency /add org.springframework.ai:spring-ai-commons:\{springaiVersion}
%dependency /add org.springframework.ai:spring-ai-model:\{springaiVersion}
%dependency /add org.springframework.ai:spring-ai-client-chat:\{springaiVersion}
%dependency /add org.springframework.ai:spring-ai-ollama:\{springaiVersion}
%dependency /add org.springframework.ai:spring-ai-openai:\{springaiVersion}
%dependency /resolve
Repository local url: file:///Users/bsorrentino/.m2/repository/ added.
Adding dependency org.slf4j:slf4j-jdk14:2.0.9
Adding dependency org.bsc.langgraph4j:langgraph4j-spring-ai:1.7-SNAPSHOT
Adding dependency org.bsc.langgraph4j:langgraph4j-springai-agentexecutor:1.7-SNAPSHOT
Adding dependency org.springframework.ai:spring-ai-commons:1.1.0
Adding dependency org.springframework.ai:spring-ai-model:1.1.0
Adding dependency org.springframework.ai:spring-ai-client-chat:1.1.0
Adding dependency org.springframework.ai:spring-ai-ollama:1.1.0
Adding dependency org.springframework.ai:spring-ai-openai:1.1.0
Solving dependencies
Resolved artifacts count: 53
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/slf4j/slf4j-jdk14/2.0.9/slf4j-jdk14-2.0.9.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/slf4j/slf4j-api/2.0.9/slf4j-api-2.0.9.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/bsc/langgraph4j/langgraph4j-spring-ai/1.7-SNAPSHOT/langgraph4j-spring-ai-1.7-SNAPSHOT.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/bsc/langgraph4j/langgraph4j-core/1.7-SNAPSHOT/langgraph4j-core-1.7-SNAPSHOT.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/bsc/async/async-generator/4.0.0-beta2/async-generator-4.0.0-beta2.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/bsc/langgraph4j/langgraph4j-springai-agentexecutor/1.7-SNAPSHOT/langgraph4j-springai-agentexecutor-1.7-SNAPSHOT.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/ai/spring-ai-commons/1.1.0/spring-ai-commons-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-context/6.2.12/spring-context-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-aop/6.2.12/spring-aop-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-beans/6.2.12/spring-beans-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-core/6.2.12/spring-core-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-jcl/6.2.12/spring-jcl-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-expression/6.2.12/spring-expression-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/micrometer/micrometer-core/1.15.5/micrometer-core-1.15.5.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/micrometer/micrometer-commons/1.15.5/micrometer-commons-1.15.5.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/hdrhistogram/HdrHistogram/2.2.2/HdrHistogram-2.2.2.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/latencyutils/LatencyUtils/2.0.3/LatencyUtils-2.0.3.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/micrometer/context-propagation/1.1.3/context-propagation-1.1.3.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/fasterxml/jackson/module/jackson-module-jsonSchema/2.19.2/jackson-module-jsonSchema-2.19.2.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/javax/validation/validation-api/1.1.0.Final/validation-api-1.1.0.Final.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/fasterxml/jackson/core/jackson-annotations/2.19.2/jackson-annotations-2.19.2.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/fasterxml/jackson/core/jackson-core/2.19.2/jackson-core-2.19.2.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/fasterxml/jackson/core/jackson-databind/2.19.2/jackson-databind-2.19.2.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/knuddels/jtokkit/1.1.0/jtokkit-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/ai/spring-ai-model/1.1.0/spring-ai-model-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/ai/spring-ai-template-st/1.1.0/spring-ai-template-st-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/antlr/ST4/4.3.4/ST4-4.3.4.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/antlr/antlr-runtime/3.5.3/antlr-runtime-3.5.3.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/micrometer/micrometer-observation/1.15.5/micrometer-observation-1.15.5.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-messaging/6.2.12/spring-messaging-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/projectreactor/reactor-core/3.7.12/reactor-core-3.7.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/reactivestreams/reactive-streams/1.0.4/reactive-streams-1.0.4.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/antlr/antlr4-runtime/4.13.1/antlr4-runtime-4.13.1.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/github/victools/jsonschema-generator/4.38.0/jsonschema-generator-4.38.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/fasterxml/classmate/1.7.0/classmate-1.7.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/github/victools/jsonschema-module-jackson/4.38.0/jsonschema-module-jackson-4.38.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/fasterxml/jackson/datatype/jackson-datatype-jsr310/2.19.2/jackson-datatype-jsr310-2.19.2.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/github/victools/jsonschema-module-swagger-2/4.38.0/jsonschema-module-swagger-2-4.38.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/swagger/core/v3/swagger-annotations-jakarta/2.2.30/swagger-annotations-jakarta-2.2.30.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/ai/spring-ai-client-chat/1.1.0/spring-ai-client-chat-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/modelcontextprotocol/sdk/mcp-json-jackson2/0.16.0/mcp-json-jackson2-0.16.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/io/modelcontextprotocol/sdk/mcp-json/0.16.0/mcp-json-0.16.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/networknt/json-schema-validator/2.0.0/json-schema-validator-2.0.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/ethlo/time/itu/1.14.0/itu-1.14.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/com/fasterxml/jackson/dataformat/jackson-dataformat-yaml/2.18.3/jackson-dataformat-yaml-2.18.3.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/yaml/snakeyaml/2.3/snakeyaml-2.3.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/ai/spring-ai-ollama/1.1.0/spring-ai-ollama-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/ai/spring-ai-retry/1.1.0/spring-ai-retry-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/retry/spring-retry/2.0.12/spring-retry-2.0.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-web/6.2.12/spring-web-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-webflux/6.2.12/spring-webflux-6.2.12.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/ai/spring-ai-openai/1.1.0/spring-ai-openai-1.1.0.jar
Add to classpath: /Users/bsorrentino/Library/Jupyter/kernels/rapaio-jupyter-kernel/mima_cache/org/springframework/spring-context-support/6.2.12/spring-context-support-6.2.12.jar
Initialize Logger
In [4]:
try( var file = new java.io.FileInputStream("./logging.properties")) {
    java.util.logging.LogManager.getLogManager().readConfiguration( file );
}

var log = org.slf4j.LoggerFactory.getLogger("llm-streaming");
How to use StreamingChatGenerator
In [5]:
import org.bsc.async.AsyncGenerator;
import org.bsc.async.FlowGenerator;
import org.bsc.langgraph4j.NodeOutput;
import org.bsc.langgraph4j.StateGraph;
import org.bsc.langgraph4j.action.AsyncNodeAction;
import org.bsc.langgraph4j.action.EdgeAction;
import org.bsc.langgraph4j.action.NodeAction;
import org.bsc.langgraph4j.prebuilt.MessagesState;
import org.bsc.langgraph4j.serializer.std.ObjectStreamStateSerializer;
import org.bsc.langgraph4j.spring.ai.generators.StreamingChatGenerator;
import org.bsc.langgraph4j.spring.ai.serializer.std.SpringAIStateSerializer;
import org.bsc.langgraph4j.spring.ai.tool.SpringAIToolService;
import org.bsc.langgraph4j.streaming.StreamingOutput;
import org.bsc.langgraph4j.utils.EdgeMappings;
import org.reactivestreams.FlowAdapters;
import org.reactivestreams.Publisher;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.messages.AssistantMessage;
import org.springframework.ai.chat.messages.Message;
import org.springframework.ai.chat.messages.MessageType;
import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.model.tool.ToolCallingChatOptions;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaChatOptions;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.ai.tool.annotation.Tool;
import org.springframework.ai.tool.annotation.ToolParam;
import org.springframework.ai.tool.function.FunctionToolCallback;
import reactor.core.publisher.Flux;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Flow;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;
import java.util.function.UnaryOperator;
import static java.util.Optional.ofNullable;
import static org.bsc.langgraph4j.StateGraph.END;
import static org.bsc.langgraph4j.StateGraph.START;
import static org.bsc.langgraph4j.action.AsyncEdgeAction.edge_async;
import static org.bsc.langgraph4j.action.AsyncNodeAction.node_async;
enum AiModel {
    OPENAI_GPT_4O_MINI(
            OpenAiChatModel.builder()
                    .openAiApi(OpenAiApi.builder()
                            .baseUrl("https://api.openai.com")
                            .apiKey(System.getenv("OPENAI_API_KEY"))
                            .build())
                    .defaultOptions(OpenAiChatOptions.builder()
                            .model("gpt-4o-mini")
                            .logprobs(false)
                            .temperature(0.1)
                            .build())
                    .build()),
    OLLAMA_QWEN2_5_7B(
            OllamaChatModel.builder()
                    .ollamaApi( OllamaApi.builder().baseUrl("http://localhost:11434").build() )
                    .defaultOptions(OllamaChatOptions.builder()
                            .model("qwen2.5:7b")
                            .temperature(0.1)
                            .build())
                    .build());

    public final ChatModel model;

    AiModel( ChatModel model ) {
        this.model = model;
    }
}

var chatClient = ChatClient.builder(AiModel.OLLAMA_QWEN2_5_7B.model)
        .defaultOptions(ToolCallingChatOptions.builder()
                .internalToolExecutionEnabled(false) // Disable automatic tool execution
                .build())
        .defaultSystem("You are a helpful AI Assistant answering questions." )
        .build();

var flux = chatClient.prompt()
        .messages( new UserMessage("tell me a joke"))
        .stream()
        .chatResponse();

var generator = StreamingChatGenerator.builder()
        .startingNode("agent")
        .mapResult( response -> Map.of( "messages", response.getResult().getOutput()))
        .build(flux);

for( var item : generator ) {
    System.out.println("Received: " + item );
}
Received: StreamingOutput{node=agent, state=null, chunk=Sure}
Received: StreamingOutput{node=agent, state=null, chunk=,}
Received: StreamingOutput{node=agent, state=null, chunk= here}
Received: StreamingOutput{node=agent, state=null, chunk='s}
Received: StreamingOutput{node=agent, state=null, chunk= a}
Received: StreamingOutput{node=agent, state=null, chunk= light}
Received: StreamingOutput{node=agent, state=null, chunk= joke}
Received: StreamingOutput{node=agent, state=null, chunk= for}
Received: StreamingOutput{node=agent, state=null, chunk= you}
Received: StreamingOutput{node=agent, state=null, chunk=:
}
Received: StreamingOutput{node=agent, state=null, chunk=Why}
Received: StreamingOutput{node=agent, state=null, chunk= don}
Received: StreamingOutput{node=agent, state=null, chunk='t}
Received: StreamingOutput{node=agent, state=null, chunk= scientists}
Received: StreamingOutput{node=agent, state=null, chunk= trust}
Received: StreamingOutput{node=agent, state=null, chunk= atoms}
Received: StreamingOutput{node=agent, state=null, chunk=?
}
Received: StreamingOutput{node=agent, state=null, chunk=Because}
Received: StreamingOutput{node=agent, state=null, chunk= they}
Received: StreamingOutput{node=agent, state=null, chunk= make}
Received: StreamingOutput{node=agent, state=null, chunk= up}
Received: StreamingOutput{node=agent, state=null, chunk= everything}
Received: StreamingOutput{node=agent, state=null, chunk=!}
Received: StreamingOutput{node=agent, state=null, chunk=}
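Each `StreamingOutput` printed above carries a single token-level chunk; the final answer is simply the concatenation of all chunks. The accumulation pattern can be sketched with JDK-only types (the `Chunk` record below is a hypothetical stand-in for `StreamingOutput`, so the sketch runs without Spring AI on the classpath):

```java
import java.util.List;
import java.util.stream.Collectors;

public class ChunkAccumulator {

    // Hypothetical stand-in for StreamingOutput: a node name plus one text chunk
    record Chunk(String node, String chunk) {}

    // Join the streamed chunks, in order, into the complete message text
    static String accumulate(List<Chunk> chunks) {
        return chunks.stream().map(Chunk::chunk).collect(Collectors.joining());
    }

    public static void main(String[] args) {
        var joke = accumulate(List.of(
                new Chunk("agent", "Why don't scientists trust atoms?"),
                new Chunk("agent", " Because they make up everything!")));
        System.out.println(joke); // prints the reassembled message
    }
}
```

The `for` loop over the generator consumes chunks as they arrive, so a real accumulator would append inside the loop body rather than collect a pre-built list.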
Use StreamingChatGenerator in Agent Executor
Set up the agent's tools
In [6]:
public class WeatherTool {

    @Tool( description = "Get the weather in location")
    public String execQuery(@ToolParam( description = "The query to use in your search.") String query) {
        // This is a placeholder for the actual implementation
        return "Cold, with a low of 13 degrees";
    }
}
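`execQuery` is an ordinary Java method; the `@Tool` and `@ToolParam` annotations only supply the metadata Spring AI uses to expose it to the model. Stripped of the annotations, the tool can be exercised directly, which is a quick sanity check when writing tools (the `WeatherToolCheck` class below is a hypothetical harness, not part of the notebook):

```java
public class WeatherToolCheck {

    // Same body as WeatherTool.execQuery, minus the Spring AI annotations
    static String execQuery(String query) {
        // Placeholder implementation: ignores the query and returns a canned answer
        return "Cold, with a low of 13 degrees";
    }

    public static void main(String[] args) {
        System.out.println(execQuery("Weather in Napoli ?"));
        // prints: Cold, with a low of 13 degrees
    }
}
```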
Create the Agent Executor
In [9]:
import org.bsc.langgraph4j.spring.ai.agentexecutor.AgentExecutor;
import org.bsc.langgraph4j.NodeOutput;
import org.bsc.langgraph4j.streaming.StreamingOutput;
import org.springframework.ai.chat.messages.AssistantMessage;
import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.tool.ToolCallback;
var agent = AgentExecutor.builder()
        .chatModel(AiModel.OPENAI_GPT_4O_MINI.model, true)
        .toolsFromObject( new WeatherTool() )
        .build()
        .compile();

var result = agent.stream( Map.of( "messages", new UserMessage("Weather in Napoli ?") ));

var state = result.stream()
        .peek( s -> {
            if( s instanceof StreamingOutput<?> sout ) {
                System.out.printf( "%s: (%s)\n", sout.node(), sout.chunk());
            }
            else {
                System.out.println(s.node());
            }
        })
        .reduce((a, b) -> b)
        .map( NodeOutput::state)
        .orElseThrow();

log.info( "result: {}", state.lastMessage()
        .map(AssistantMessage.class::cast)
        .map(AssistantMessage::getText)
        .orElseThrow() );
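The `reduce((a, b) -> b)` call above is a compact idiom: it drains the whole output stream (so `peek` prints every intermediate node output and chunk) while keeping only the last element, which is the final graph state. The idiom in isolation, with plain JDK streams:

```java
import java.util.List;
import java.util.Optional;
import java.util.stream.Stream;

public class LastState {

    // reduce((a, b) -> b) discards the accumulator and keeps the newest element,
    // so the result is the last element of the stream (empty Optional if none)
    static <T> Optional<T> lastOf(Stream<T> outputs) {
        return outputs.reduce((a, b) -> b);
    }

    public static void main(String[] args) {
        var nodes = List.of("__START__", "agent", "executeTools", "agent", "__END__");
        System.out.println(lastOf(nodes.stream()).orElseThrow()); // prints: __END__
    }
}
```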
START
__START__
agent: ()
executeTools
agent action
agent: ()
agent: (The)
agent: ( weather)
agent: ( in)
agent: ( Napoli)
agent: ( is)
agent: ( currently)
agent: ( cold)
agent: (,)
agent: ( with)
agent: ( a)
agent: ( low)
agent: ( of)
agent: ( )
agent: (13)
agent: ( degrees)
agent: ( Celsius)
agent: (.)
agent: (null)
executeTools
agent action
__END__
result: The weather in Napoli is currently cold, with a low of 13 degrees Celsius.
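A closing note on interop: the `FlowGenerator` and `FlowAdapters` imports earlier in the notebook hint that langgraph4j can bridge Reactive Streams publishers and the JDK's `java.util.concurrent.Flow` API. The publish/subscribe model that such adapters sit on can be sketched with the JDK alone (`FlowChunks` below is an illustrative class, not part of langgraph4j):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowChunks {

    // Publish each chunk on a Flow.Publisher and collect what the subscriber receives,
    // blocking until the publisher signals completion
    static List<String> collect(SubmissionPublisher<String> publisher, List<String> chunks) {
        var received = new ArrayList<String>();
        var done = new CountDownLatch(1);
        publisher.subscribe(new Flow.Subscriber<String>() {
            public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            public void onNext(String item)              { received.add(item); }
            public void onError(Throwable t)             { done.countDown(); }
            public void onComplete()                     { done.countDown(); }
        });
        chunks.forEach(publisher::submit);
        publisher.close(); // triggers onComplete once all items are delivered
        try {
            done.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return received;
    }

    public static void main(String[] args) {
        var out = collect(new SubmissionPublisher<>(), List.of("Hello", ", ", "world"));
        System.out.println(String.join("", out)); // prints: Hello, world
    }
}
```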