add streamText to aisdk llm client #969
Greptile Summary
This review covers only the changes made since the last review (commit 111ec3b), not the entire PR.
The most recent changes focus on implementing streaming text functionality in the AI SDK LLM client. The developer added three new optional properties to the `ChatCompletionOptions` interface: `aiSDKTools` (of type `ToolSet`), `maxSteps` (number), and `stream` (boolean). In the `AISdkClient` implementation, a new conditional branch was added that checks `options.stream` and uses the `streamText` function from the AI SDK when streaming is enabled.
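Based on the description above, the extended options interface might look roughly like this. This is a hypothetical sketch, not the PR's actual code: the field list beyond the three named properties is assumed, and `ToolSet` (which really comes from the `ai` package) is mocked as a plain record type so the snippet stands alone.

```typescript
// Hypothetical sketch of the extended ChatCompletionOptions interface.
// ToolSet is mocked here; in the real client it is imported from the "ai" package.
type ToolSet = Record<string, unknown>;

interface ChatCompletionOptions {
  messages: { role: string; content: string }[]; // assumed pre-existing field
  temperature?: number;                          // assumed pre-existing field
  // New optional properties added in this PR:
  aiSDKTools?: ToolSet; // tools passed through to the AI SDK
  maxSteps?: number;    // maximum tool-call steps for streamText
  stream?: boolean;     // when true, return a stream instead of a full response
}

const opts: ChatCompletionOptions = {
  messages: [{ role: "user", content: "hi" }],
  stream: true,
  maxSteps: 3,
};
console.log(opts.stream, opts.maxSteps);
```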
When `stream: true` is passed, the client now calls `streamText` with the formatted messages, temperature, AI SDK tools, and max steps parameters, then returns the stream directly, cast as `T`. The implementation also updates tool parameter references from `options.tools` to `options.aiSDKTools` throughout the client to maintain consistency with the interface changes.
This change enables real-time text generation through the existing `createChatCompletion` interface, allowing users to receive partial responses as they're generated rather than waiting for complete responses. The streaming functionality bypasses the existing caching mechanism entirely, which is necessary since streams cannot be cached or processed synchronously like regular responses.
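The branching logic described above can be sketched as follows. This is a minimal mock, not the PR's implementation: `streamText` and `generateText` are stand-ins for the AI SDK functions, caching is omitted, and the cast goes through `unknown` so the sketch type-checks under strict settings.

```typescript
// Minimal sketch of the streaming branch, with the "ai" package mocked.
type ToolSet = Record<string, unknown>;

interface ChatCompletionOptions {
  messages: { role: string; content: string }[];
  temperature?: number;
  aiSDKTools?: ToolSet;
  maxSteps?: number;
  stream?: boolean;
}

// Mocks standing in for the AI SDK's streamText / generateText.
function streamText(args: { messages: unknown[]; temperature?: number }) {
  return { kind: "stream" as const, args };
}
function generateText(args: { messages: unknown[] }) {
  return { kind: "full" as const, text: "done", args };
}

function createChatCompletion<T>(options: ChatCompletionOptions): T {
  if (options.stream) {
    // Streaming path: skips the cache entirely and returns the stream cast as T.
    return streamText({
      messages: options.messages,
      temperature: options.temperature,
    }) as unknown as T;
  }
  // Non-streaming path (cache lookup omitted in this sketch).
  return generateText({ messages: options.messages }) as unknown as T;
}

const res = createChatCompletion<{ kind: string }>({
  messages: [{ role: "user", content: "hi" }],
  stream: true,
});
console.log(res.kind); // "stream"
```

Note the `as unknown as T` cast: this is exactly the kind of unchecked conversion the review flags below, since the compiler can no longer verify that callers requesting `T` actually receive a stream-shaped value.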
Confidence score: 3/5
- This PR introduces useful streaming functionality but has implementation concerns that could cause issues in production
- The score reflects the type-safety risk of casting the stream to T, and the fact that streaming bypasses the caching mechanism
- Pay close attention to the type cast in the streaming path and ensure consumers can handle a stream response appropriately
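Because the return value is cast rather than typed, a consumer cannot rely on the compiler to tell a stream apart from a full response. One defensive pattern, sketched here with hypothetical names (`consume`, `fakeStream` are illustrations, not part of the PR), is a runtime check for async-iterability before deciding how to read the result:

```typescript
// Illustrative consumer-side guard for a value that may be a stream or a
// plain response. All names here are hypothetical.
function isAsyncIterable(value: unknown): value is AsyncIterable<string> {
  return (
    typeof value === "object" &&
    value !== null &&
    Symbol.asyncIterator in (value as object)
  );
}

// Stand-in for a text stream returned by the streaming path.
async function* fakeStream() {
  yield "partial ";
  yield "response";
}

async function consume(result: unknown): Promise<string> {
  if (isAsyncIterable(result)) {
    // Streaming case: accumulate chunks as they arrive.
    let out = "";
    for await (const chunk of result) out += chunk;
    return out;
  }
  // Non-streaming case: treat the result as a complete response.
  return String(result);
}

consume(fakeStream()).then((s) => console.log(s)); // "partial response"
```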
2 files reviewed, 1 comment
why
what changed
test plan