fix: emit content_part.done and populate output_text.done.text per Responses API spec #49
Open
mikemolinet wants to merge 1 commit into KochC:dev from
Conversation
…sponses API spec

The /v1/responses streaming handler violates the OpenAI Responses API SSE lifecycle spec in two ways:

1. `response.content_part.done` is never emitted. Per the spec (https://platform.openai.com/docs/api-reference/responses-streaming), the event sequence for a text content part should be: `content_part.added` -> `output_text.delta`* -> `output_text.done` -> `content_part.done` -> `output_item.done`.
2. `response.output_text.done` is emitted with `text: ""` instead of the accumulated output text. The spec requires the final content.

Accumulate delta tokens in a local variable at the streaming call site, emit the missing `response.content_part.done` event with the accumulated text in `part.text`, and populate `output_text.done.text` with the same accumulated content. Gate the new `content_part.done` event on at least one delta having been received, keeping the content-part added/done lifecycle symmetric.

Adds one regression test in index.test.js that asserts:

- `output_text.done.text` equals the accumulated deltas
- a `content_part.done` event is present with `part.text` populated
- correct ordering (`output_text.done` < `content_part.done` < `output_item.done`)

Closes KochC#48
Summary
The `/v1/responses` streaming handler violates the OpenAI Responses API SSE lifecycle spec in two ways:

1. `response.content_part.done` is not emitted between `response.output_text.done` and `response.output_item.done`. Per https://platform.openai.com/docs/api-reference/responses-streaming the event sequence for a text content part should be: `content_part.added` → `output_text.delta`* → `output_text.done` → `content_part.done` → `output_item.done`.
2. `response.output_text.done` is sent with `text: ""` instead of the accumulated output text; the spec requires the final accumulated content.

Accumulate the delta text at the streaming call site, emit `content_part.done` with the accumulated text, and populate `output_text.done.text` correctly.

Closes #48
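The corrected end-of-stream ordering can be sketched as follows. This is a minimal sketch, not the code in index.js: `send`, `accumulatedText`, and `sawDelta` are hypothetical names standing in for the SSE emit helper, the accumulated delta text, and the "at least one delta arrived" gate.

```javascript
// Sketch of the corrected end-of-stream emission order for /v1/responses.
// `send(event, data)` is a hypothetical SSE emit helper.
function emitStreamClose(send, accumulatedText, sawDelta) {
  // output_text.done must carry the full accumulated text, not "".
  send('response.output_text.done', { text: accumulatedText });

  // content_part.done is only emitted if a matching content_part.added was,
  // keeping the added/done lifecycle symmetric.
  if (sawDelta) {
    send('response.content_part.done', {
      part: { type: 'output_text', text: accumulatedText },
    });
  }

  send('response.output_item.done', {});
}
```

The key point is the ordering: `output_text.done` before `content_part.done` before `output_item.done`, with the same accumulated text on the first two.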
Changes
- index.js: in the `/v1/responses` streaming handler, accumulate `delta` tokens in a local `accumulatedText` variable via `accumulatedText += delta` in the `onChunk` callback. On stream completion: (a) set `response.output_text.done.text` to `accumulatedText` (was `""`), (b) emit a new `response.content_part.done` SSE event between `output_text.done` and `output_item.done`, with `part.text` set to the accumulated content. The new event is gated on `partIndex > 0` (i.e. at least one delta arrived), keeping the content-part added/done lifecycle symmetric.
- index.test.js: add a local `parseSseStream` helper and a regression test asserting the new event, the accumulated-text content on both `output_text.done` and `content_part.done`, and the event ordering.

Testing
- `npm test` — 113 passed (112 existing + 1 new)
- `npm run lint` — clean

Notes
OpenAI Responses API streaming spec reference: https://platform.openai.com/docs/api-reference/responses-streaming.
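A test helper along the lines of the `parseSseStream` added in index.test.js can be sketched roughly as below; the actual helper in the PR may differ. It splits a raw SSE body on blank lines and extracts the `event:` and `data:` fields of each block:

```javascript
// Minimal SSE parser sketch: one entry per event block, with the JSON
// payload of the data field parsed. Assumes "\n" line endings and a
// single `event:` field per block, as emitted by the handler under test.
function parseSseStream(raw) {
  return raw
    .split('\n\n')
    .filter((block) => block.trim() !== '')
    .map((block) => {
      const lines = block.split('\n');
      const event = lines.find((l) => l.startsWith('event: '))?.slice(7);
      const data = lines
        .filter((l) => l.startsWith('data: '))
        .map((l) => l.slice(6))
        .join('\n');
      return { event, data: data ? JSON.parse(data) : null };
    });
}
```

With a parsed list like this, the ordering assertion reduces to comparing the indices of `response.output_text.done`, `response.content_part.done`, and `response.output_item.done` in the returned array.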