Anthropic /v1/messages silently drops system field when passed as content-block array #46

@mikemolinet

Description

What happened?

POST /v1/messages silently drops the system field when it is passed as an array of content blocks. The Anthropic Messages API spec (https://docs.anthropic.com/en/api/messages) allows system as either a string or an array of content blocks (currently only {type: "text", text: "..."} blocks are defined). Clients that follow the spec — or SDKs that serialize system as an array — lose their system prompt at the proxy boundary without any error.

The relevant code at index.js:1044-1047:

const allMessages =
  typeof body.system === "string" && body.system.trim()
    ? [{ role: "system", content: body.system.trim() }, ...messages]
    : messages

typeof body.system === "string" is false for arrays, so the system content never reaches buildSystemPrompt and the request proceeds without a system prompt.
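The failure mode is easy to demonstrate in isolation. Here is a minimal sketch of the guard's behavior (the `body` object below is a hypothetical request body, mirroring the names in the snippet above):

```javascript
// Hypothetical request body using the spec's array-of-content-blocks form.
const body = {
  system: [{ type: "text", text: "You are a pirate." }],
}

// The proxy's condition: true only for a non-empty string.
// `typeof []` is "object", so the array form short-circuits to false.
const isStringSystem = typeof body.system === "string" && body.system.trim()

console.log(Boolean(isStringSystem)) // false — the system prompt is dropped
```

Because the guard evaluates to false, the ternary falls through to the bare `messages` array and the system content never reaches `buildSystemPrompt`.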

Steps to reproduce

  1. Start the proxy with npm start (or via opencode with the plugin loaded).
  2. Send a request to POST /v1/messages with system as an array of text content blocks.
  3. Observe that the model's response ignores the system prompt, whereas the same content sent as a string works correctly.

Expected behaviour

The system field, whether sent as a string OR as an array of {type: "text", text: "..."} content blocks, should reach the underlying model as a system prompt.

Request / response (if applicable)

Array form (currently dropped):

curl -X POST http://127.0.0.1:4010/v1/messages \
  -H 'content-type: application/json' \
  -d '{
    "model": "anthropic/claude-3-5-sonnet",
    "system": [
      { "type": "text", "text": "You are a pirate. Every reply must start with Arr." }
    ],
    "messages": [
      { "role": "user", "content": "Hello." }
    ]
  }'
# → 200 OK, reply has no pirate voice (system prompt silently ignored)

String form (works today — same intent):

curl -X POST http://127.0.0.1:4010/v1/messages \
  -H 'content-type: application/json' \
  -d '{
    "model": "anthropic/claude-3-5-sonnet",
    "system": "You are a pirate. Every reply must start with Arr.",
    "messages": [
      { "role": "user", "content": "Hello." }
    ]
  }'
# → 200 OK, reply starts with "Arr"

opencode-llm-proxy version

1.6.1

Runtime and OS

Reproduces on Node.js >= 20 and Bun >= 1.0, any OS — it's a pure JS logic bug.

Provider / model

Any Anthropic-model provider. Verified against anthropic/claude-3-5-sonnet conceptually; the bug is at the proxy boundary and is independent of the downstream provider.


I have a fix ready — add an exported normalizeAnthropicSystem helper that handles both forms, mirroring the existing normalizeAnthropicMessages array-of-content-blocks pattern (filter to text blocks, trim, join with \n\n). Happy to open a PR once you confirm the approach.
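A sketch of the proposed helper (this is my suggested shape, not code that exists in the repo yet; the text-block filtering and `\n\n` join mirror the pattern described above):

```javascript
// Proposed helper (sketch): normalize Anthropic's `system` field, which may
// be a plain string or an array of { type: "text", text: "..." } blocks,
// into a single trimmed string. Returns "" for absent/unrecognized input.
function normalizeAnthropicSystem(system) {
  if (typeof system === "string") return system.trim()
  if (Array.isArray(system)) {
    return system
      .filter((b) => b && b.type === "text" && typeof b.text === "string")
      .map((b) => b.text.trim())
      .filter(Boolean)
      .join("\n\n")
  }
  return ""
}
```

The call site would then become a single check on the normalized value, e.g. `const systemText = normalizeAnthropicSystem(body.system)` followed by prepending `{ role: "system", content: systemText }` only when `systemText` is non-empty.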
