
feat(ops): configure litellm prompt logging#1361

Open
casey-brooks wants to merge 1 commit into main from noa/issue-1360

Conversation

@casey-brooks
Contributor

Summary

  • add LiteLLM proxy_config.yaml with prompt/model storage defaults
  • mount the config into the litellm service and start it with --config
  • drop the redundant STORE_MODEL_IN_DB env so YAML is the source of truth
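
As a concrete illustration of the first and third bullets, a minimal sketch of what the config could look like. The key names below are assumptions based on LiteLLM's documented `general_settings` layout, not the actual file from this PR:

```yaml
# proxy_config.yaml (sketch — key names assumed from LiteLLM docs)
general_settings:
  # Replaces the STORE_MODEL_IN_DB env var so YAML is the single source of truth
  store_model_in_db: true
  # Persist request/response content so prompts show up in spend logs / UI Logs
  store_prompts_in_spend_logs: true
```

In the compose file, the litellm service would then mount this file (e.g. as a read-only volume) and start with `--config /path/to/proxy_config.yaml`, making the YAML the only place these defaults are defined.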

Testing

  • docker compose up -d litellm-db litellm
  • curl -sS http://127.0.0.1:4000/chat/completions -H 'Authorization: Bearer sk-dev-master-1234' -H 'Content-Type: application/json' -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"log-test: say hello"}]}'
  • curl -sS http://127.0.0.1:4000/spend/logs -H 'Authorization: Bearer sk-dev-master-1234' (shows request content)
  • /nix/var/nix/profiles/per-user/root/profile/bin/pnpm lint (fails: existing @typescript-eslint/no-unsafe-assignment errors in packages/platform-server/src/infra/container/runnerGrpc.client.ts)

Local LiteLLM verification: ✅ request/response content now appears in the spend logs (mirroring the UI Logs view)
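
The spend-logs check above can be scripted. The snippet below uses a hypothetical log shape (the `messages` field layout is an assumption, not LiteLLM's documented schema) so the grep is self-contained; in the real flow the sample file would be the output of the `/spend/logs` curl above:

```shell
#!/bin/sh
# Hypothetical /spend/logs payload shape — stand-in for:
#   curl -sS http://127.0.0.1:4000/spend/logs -H 'Authorization: Bearer sk-dev-master-1234'
cat > /tmp/spend_logs.json <<'EOF'
[{"messages":[{"role":"user","content":"log-test: say hello"}]}]
EOF

# Confirm the prompt text made it into the stored logs
if grep -q 'log-test: say hello' /tmp/spend_logs.json; then
  echo 'prompt content present in spend logs'
else
  echo 'prompt content MISSING from spend logs' >&2
  exit 1
fi
```

With prompt storage disabled, the same grep would fail, which makes this a cheap regression check for the config change.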

Resolves #1360

@casey-brooks casey-brooks requested a review from a team as a code owner March 2, 2026 15:06

Contributor

@noa-lucent noa-lucent left a comment


Looks good. This matches the linked issue: config is now provided via proxy_config.yaml, mounted into the LiteLLM container, and loaded with --config; prompt/model storage defaults are defined in YAML and the redundant STORE_MODEL_IN_DB env var was removed.

@rowan-stein rowan-stein enabled auto-merge March 2, 2026 15:18


Development

Successfully merging this pull request may close these issues.

Enable LiteLLM prompt storage and request/response logging
