
Document PlanExe system prompts and update coding standards#122

Open
82deutschmark wants to merge 44 commits into PlanExeOrg:main from VoynichLabs:feature/domain-profiles

Conversation

@82deutschmark
Contributor

  • Updated CODING_STANDARDS.md with Egon/Linux-friendly phrasing and clarified non-negotiables, plan requirements, and documentation expectations
  • Added system_prompts.jsonl (snapshot of 115 system prompts)
  • Added docs/system-prompts-review.md with review notes

EgonBot and others added 30 commits February 26, 2026 21:39
…uted execution, explain API, semantic search
… creating plans. I have a Gradio UI for creating plans, but it's far from what I have in mind.
- Add TokenMetrics database model to store per-call token usage metrics
- Implement token extraction from multiple LLM providers (OpenAI, OpenRouter, Anthropic, Ollama, etc.)
- Add TokenMetricsStore for database operations with lazy initialization
- Add token instrumentation module for pipeline integration
- Add API endpoints to retrieve token metrics (aggregated and detailed)
- Initialize token tracking for each plan execution
- Import TokenMetrics in Flask app for automatic table creation
- Add comprehensive documentation with usage examples and a troubleshooting guide
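The storage side of the list above can be sketched with a plain SQLite table. This is a hypothetical illustration only: the PR's actual `TokenMetrics` is a database model in the app's ORM, and the column names and `record_call` helper here are assumptions inferred from the commit message, not the real schema.

```python
import sqlite3

# Hypothetical sketch of the TokenMetrics table; column names are
# assumptions based on the commit message, not the PR's actual model.
SCHEMA = """
CREATE TABLE IF NOT EXISTS token_metrics (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    run_id TEXT NOT NULL,          -- plan execution this call belongs to
    task_name TEXT,                -- pipeline task that issued the call
    llm_model TEXT,                -- model identifier
    input_tokens INTEGER DEFAULT 0,
    output_tokens INTEGER DEFAULT 0,
    thinking_tokens INTEGER DEFAULT 0,
    duration_seconds REAL,
    success INTEGER NOT NULL,      -- 1 = success, 0 = failure
    raw_usage TEXT                 -- provider-specific usage payload, for debugging
)
"""

def record_call(conn, run_id, task_name, llm_model, usage, duration, success):
    """Insert one per-LLM-call metrics row (hypothetical helper)."""
    conn.execute(
        "INSERT INTO token_metrics (run_id, task_name, llm_model, input_tokens,"
        " output_tokens, thinking_tokens, duration_seconds, success, raw_usage)"
        " VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
        (run_id, task_name, llm_model,
         usage.get("input_tokens", 0), usage.get("output_tokens", 0),
         usage.get("thinking_tokens", 0), duration, int(success), str(usage)),
    )

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
record_call(conn, "run-1", "draft_plan", "openai/gpt-4o",
            {"input_tokens": 1200, "output_tokens": 350}, 2.4, True)
row = conn.execute(
    "SELECT input_tokens, output_tokens FROM token_metrics"
).fetchone()
print(row)  # (1200, 350)
```

Keeping the raw provider payload alongside the normalized columns is what makes the "raw provider-specific usage data for debugging" bullet cheap to satisfy.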

Metrics tracked per LLM invocation:
- Input tokens, output tokens, thinking tokens
- Duration and success/failure status
- LLM model and task name
- Raw provider-specific usage data for debugging
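Extracting those metrics across providers mostly means reconciling field names: OpenAI-style responses report `prompt_tokens`/`completion_tokens`, Anthropic reports `input_tokens`/`output_tokens`, and Ollama reports `prompt_eval_count`/`eval_count`. A minimal normalizer might look like this; the unified output keys are this sketch's own choice, not the PR's actual instrumentation code.

```python
def normalize_usage(raw: dict) -> dict:
    """Map provider-specific token-usage fields onto one schema.

    The input field names follow real provider conventions (OpenAI,
    Anthropic, Ollama); the output schema is a hypothetical choice.
    """
    input_keys = ("input_tokens", "prompt_tokens", "prompt_eval_count")
    output_keys = ("output_tokens", "completion_tokens", "eval_count")

    def first_present(keys):
        # Take the first field a given provider actually populated.
        return next((raw[k] for k in keys if k in raw), 0)

    return {
        "input_tokens": first_present(input_keys),
        "output_tokens": first_present(output_keys),
        "thinking_tokens": raw.get("thinking_tokens", 0),
        "raw": raw,  # keep the original payload for debugging
    }

openai_usage = {"prompt_tokens": 900, "completion_tokens": 210}
anthropic_usage = {"input_tokens": 900, "output_tokens": 210}
print(normalize_usage(openai_usage)["input_tokens"])   # 900
print(normalize_usage(anthropic_usage)["output_tokens"])  # 210
```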

API endpoints:
- GET /runs/{run_id}/token-metrics - Aggregated summary
- GET /runs/{run_id}/token-metrics/detailed - Per-call details
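The aggregated endpoint can be understood as a fold over the per-call rows that the detailed endpoint returns. The reduction below sketches that relationship; the response shape (`totals`, `by_model`) is an assumption for illustration, not the PR's actual JSON contract.

```python
from collections import defaultdict

def aggregate(calls):
    """Fold per-call metric rows into an aggregated summary.

    `calls` mimics rows the detailed endpoint would return; the
    summary's key names are hypothetical.
    """
    totals = defaultdict(int)
    per_model = defaultdict(lambda: defaultdict(int))
    for call in calls:
        for key in ("input_tokens", "output_tokens", "thinking_tokens"):
            amount = call.get(key, 0)
            totals[key] += amount
            per_model[call["llm_model"]][key] += amount
        totals["calls"] += 1
        totals["failures"] += 0 if call["success"] else 1
    return {
        "totals": dict(totals),
        "by_model": {model: dict(v) for model, v in per_model.items()},
    }

calls = [
    {"llm_model": "gpt-4o", "input_tokens": 100, "output_tokens": 40, "success": True},
    {"llm_model": "gpt-4o", "input_tokens": 60, "output_tokens": 20, "success": False},
]
summary = aggregate(calls)
print(summary["totals"]["input_tokens"])  # 160
print(summary["totals"]["failures"])      # 1
```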

Supports automatic metric recording from LLM responses across all provider types.