fix(llm): add GLM compatibility for reasoning_content field #55
jknightai wants to merge 1 commit into Narcooo:master from
Conversation
Adding my local verification results: on GLM I can reproduce the case where `content` is empty and the actual text is in `reasoning_content`, so before this fix the call fails with `LLM returned empty response`. This change adds a fallback on both the sync and streaming paths, using a type assertion to handle the non-standard field; the standard OpenAI `content` return path is unaffected.
Thanks for the PR! Reviewed the code — clean, precise change that covers both streaming and sync paths without affecting the standard OpenAI/Anthropic flow. Will merge.
Summary

Add compatibility for Zhipu GLM responses where text may appear in `reasoning_content` when `content` is empty.

Changes

- `packages/core/src/llm/provider.ts`
  - Use `delta.reasoning_content` as an output fallback in the streaming path.
  - Use `message.reasoning_content` when `message.content` is empty.

Why

Some GLM models can return:

- `content: ""`
- `reasoning_content: "actual text"`

Without this fallback, InkOS may throw `LLM returned empty response`.

Validation

- `pnpm -r build` passes after changes.
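The fallback described above can be sketched roughly as follows. This is a minimal illustration, not the actual `provider.ts` code; the `ChatMessage` shape and `extractText` helper are hypothetical names, and real code working against OpenAI SDK types would need the type assertion mentioned in the comment, since `reasoning_content` is not part of the standard response type.

```typescript
// Hypothetical message shape; the non-standard reasoning_content field
// is declared here for illustration. Against the real OpenAI SDK types
// you would instead assert: (message as { reasoning_content?: string }).
interface ChatMessage {
  content?: string | null;
  reasoning_content?: string; // returned by some Zhipu GLM models
}

// Prefer the standard content field; fall back to reasoning_content
// when content is empty, and only then treat the response as empty.
function extractText(message: ChatMessage): string {
  if (message.content && message.content.length > 0) {
    return message.content;
  }
  const reasoning = message.reasoning_content;
  if (reasoning && reasoning.length > 0) {
    return reasoning;
  }
  throw new Error("LLM returned empty response");
}
```

The streaming path would apply the same check per chunk, reading `delta.reasoning_content` when `delta.content` is empty, so standard OpenAI responses flow through the first branch untouched.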