/v1/responses streaming omits response.content_part.done and sends empty output_text.done.text #48

@mikemolinet

Description

What happened?

POST /v1/responses streaming violates the OpenAI Responses API SSE lifecycle spec (https://platform.openai.com/docs/api-reference/responses-streaming) in two ways:

  1. response.content_part.done is never emitted. Per the spec, the event sequence for a text content part is content_part.added → output_text.delta* → output_text.done → content_part.done → output_item.done. The current implementation skips content_part.done.
  2. response.output_text.done.text is always "". The spec requires this event to carry the final accumulated content, but index.js:956-964 hardcodes text: "".

Both bugs live in the same SSE emitter at index.js:955-988.

Steps to reproduce

  1. Run the proxy (npm start).
  2. Send a streaming Responses API request: POST /v1/responses with stream: true.
  3. Collect the emitted SSE events.
  4. Observe: (a) no response.content_part.done event appears, and (b) response.output_text.done has text: "" regardless of actual output.
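The two symptoms can be checked mechanically over a captured SSE transcript. A minimal sketch (the `checkTranscript` helper and the abbreviated event payloads below are illustrative, not part of the proxy):

```javascript
// Parse a captured SSE transcript and report the two symptoms:
// (a) whether response.content_part.done ever appears, and
// (b) what response.output_text.done carries in its text field.
function checkTranscript(sse) {
  const events = sse
    .split("\n\n")
    .filter(Boolean)
    .map((block) => {
      const dataLine = block.split("\n").find((l) => l.startsWith("data: "));
      return dataLine ? JSON.parse(dataLine.slice(6)) : null;
    })
    .filter(Boolean);

  const hasContentPartDone = events.some(
    (e) => e.type === "response.content_part.done"
  );
  const textDone = events.find((e) => e.type === "response.output_text.done");
  return { hasContentPartDone, doneText: textDone ? textDone.text : undefined };
}

// Transcript as currently emitted by the proxy (abbreviated payloads):
const buggy = [
  'event: response.output_text.delta\ndata: {"type":"response.output_text.delta","delta":"Hello"}',
  'event: response.output_text.done\ndata: {"type":"response.output_text.done","text":""}',
  'event: response.output_item.done\ndata: {"type":"response.output_item.done"}',
].join("\n\n");

const result = checkTranscript(buggy);
// result.hasContentPartDone is false and result.doneText is "" — both bugs.
```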

Expected behaviour

The SSE stream should match the spec: a response.content_part.done event is emitted between output_text.done and output_item.done, with part.text equal to the accumulated output text. response.output_text.done.text should also be the accumulated content.

Request / response (if applicable)

curl -N -X POST http://127.0.0.1:4010/v1/responses \
  -H 'content-type: application/json' \
  -d '{"model":"gpt-4o","stream":true,"input":"Say hello."}'

Expected in the stream (abbreviated):

event: response.output_text.delta
data: {"type":"response.output_text.delta", ..., "delta":"Hello"}

event: response.output_text.done
data: {"type":"response.output_text.done", ..., "text":"Hello"}            # currently text:""

event: response.content_part.done                                          # currently MISSING
data: {"type":"response.content_part.done", ..., "part":{"type":"output_text","text":"Hello","annotations":[]}}

event: response.output_item.done
data: {"type":"response.output_item.done", ...}

opencode-llm-proxy version

1.6.1

Runtime and OS

Reproduces on Node.js >= 20 and Bun >= 1.0, on any OS; the bug is pure JS logic (SSE event sequencing), with no runtime or platform dependency.

Provider / model

Any. Bug is at the proxy's SSE emission layer, independent of downstream provider.


I have a fix ready — accumulate delta tokens at the streaming call site, populate output_text.done.text, and emit the missing content_part.done event with the accumulated text. Happy to open a PR once the approach is confirmed.
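For discussion of the approach, here is a sketch of the proposed emission order (function and variable names are illustrative, not the actual index.js:955-988 identifiers): deltas are accumulated as they are forwarded, and the terminal events then carry the accumulated text.

```javascript
// Sketch: emit the spec-compliant lifecycle for one text content part.
// `deltas` is the stream of token strings; `emit` sends one SSE event.
function emitTextLifecycle(deltas, emit) {
  let accumulated = "";
  emit("response.content_part.added", {
    part: { type: "output_text", text: "", annotations: [] },
  });
  for (const delta of deltas) {
    accumulated += delta; // accumulate at the streaming call site
    emit("response.output_text.delta", { delta });
  }
  // Terminal sequence per the spec, with the accumulated text populated:
  emit("response.output_text.done", { text: accumulated });
  emit("response.content_part.done", {
    part: { type: "output_text", text: accumulated, annotations: [] },
  });
  emit("response.output_item.done", {});
}

// Collect emitted events to show the ordering:
const events = [];
emitTextLifecycle(["Hel", "lo"], (type, payload) =>
  events.push({ type, ...payload })
);
// events now lists content_part.added, two deltas, output_text.done
// (text: "Hello"), content_part.done (part.text: "Hello"), output_item.done.
```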
