feat: wire up tool calling execution in streaming responses #34
@anurag629 I chose to keep tool execution inside the provider loop via …
anurag629 left a comment:
This fixes the biggest gap in the project: tool calling was completely broken, meaning lead capture (a headline feature) never actually worked. Good implementation across all 3 providers with solid test coverage.

A couple of minor things I noticed, but nothing that should block this:

- The module-level `toolCallSequence` counters in `gemini.ts`/`openai.ts` will grow forever in a long-running server. Fine in practice, but worth cleaning up later.
- No timeout on `tool.execute()`: if a tool hangs, the stream blocks. I'll open a follow-up issue for this.

Nice work @Ashok161.
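For the timeout concern above, one minimal sketch is to race the tool call against a timer. This is not the PR's code; `Tool` and `executeWithTimeout` are hypothetical names, and the default timeout is an assumption:

```typescript
// Hypothetical helper, not part of this PR: bounds how long a tool may run
// so a hung tool.execute() cannot block the response stream indefinitely.
type Tool = { execute: (args: unknown) => Promise<string> };

async function executeWithTimeout(
  tool: Tool,
  args: unknown,
  timeoutMs = 10_000, // assumed default; tune per deployment
): Promise<string> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`tool timed out after ${timeoutMs}ms`)),
      timeoutMs,
    );
  });
  try {
    // Whichever settles first wins; a hung execute() loses to the timer.
    return await Promise.race([tool.execute(args), timeout]);
  } finally {
    clearTimeout(timer); // avoid leaking the timer when the tool finishes first
  }
}
```

The caller would then catch the timeout error and feed an error result back to the model instead of stalling the stream.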
Implements #16 by wiring up the full tool-calling loop for streaming and sync chat responses across all supported providers. Tool calls are now detected, executed server-side, fed back into the provider in the
provider-specific format, and the final assistant response continues streaming to the widget.
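The loop described above can be sketched roughly as follows. All names here (`Provider`, `runToolLoop`, the message shapes) are illustrative assumptions, not this PR's actual API; real providers each use their own wire format for tool results:

```typescript
// Illustrative sketch of a detect -> execute -> feed-back tool loop.
type ToolCall = { id: string; name: string; args: unknown };

type Message =
  | { role: "user" | "assistant"; content: string }
  | { role: "tool"; toolCallId: string; content: string };

interface Provider {
  // Returns the assistant text plus any tool calls the model requested.
  send(messages: Message[]): Promise<{ text: string; toolCalls: ToolCall[] }>;
}

async function runToolLoop(
  provider: Provider,
  tools: Record<string, (args: unknown) => Promise<string>>,
  messages: Message[],
): Promise<string> {
  for (;;) {
    const { text, toolCalls } = await provider.send(messages);
    if (toolCalls.length === 0) return text; // no tool calls: final answer
    messages.push({ role: "assistant", content: text });
    // Execute each requested tool server-side and append the result
    // in the shape the provider expects on the next turn.
    for (const call of toolCalls) {
      const result = await tools[call.name](call.args);
      messages.push({ role: "tool", toolCallId: call.id, content: result });
    }
  }
}
```

In a streaming setup the same loop applies, except the final `text` is forwarded to the widget chunk by chunk rather than returned whole.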
What’s included
Files changed
Behavior changes
Verification
Both passed locally.
Closes