Open-source LLM FinOps proxy — track OpenAI, Anthropic (Claude), and Google Gemini costs by feature, team, and customer. Zero code changes. pip install burnlens.
Updated May 10, 2026 - Python
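The "zero code changes" claim follows the usual pattern for LLM proxies: the provider SDK reads its base URL from an environment variable, so the application can be pointed at the proxy without touching call sites. A minimal sketch of that pattern, assuming a local proxy on port 8787; the port and the attribution header names are illustrative, not taken from the project's docs:

```python
import os

# The official OpenAI Python SDK reads OPENAI_BASE_URL at client
# construction time, so setting it redirects traffic to the proxy
# with no application code changes.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8787/v1"

# Attribution tags can ride along as extra request headers so the proxy
# can bucket spend by feature/team/customer. These header names are
# hypothetical placeholders — check the proxy's docs for the real ones.
attribution_headers = {
    "X-Feature": "checkout-assistant",
    "X-Team": "payments",
    "X-Customer": "acme-corp",
}
```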
Production-tested AI/ML architecture patterns for regulated enterprises. 18 patterns covering AI gateways, multi-agent pipelines, compliance-aware routing, governance-as-architecture, and more. Each pattern includes architecture diagrams, tradeoffs, NIST AI RMF mapping, and quantitative governance metrics. Built from 18+ years in healthcare IT.
AI Governance: The Foundation for Organized AI in Production — A Field Guide by Fabio Bastos. Published under CC BY 4.0.
Token-Light, Code-Intensive (TLCI) — A design philosophy for AI agent automation. Use AI only where it's needed. 80-97% cost reduction.
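The TLCI idea of using AI only where it's needed can be sketched as a router that tries a cheap deterministic code path first and falls back to a model call only when code can't resolve the input. The task (order-ID extraction), the regex, and the routing rule below are illustrative assumptions, not the project's actual design:

```python
import re

def extract_order_id(text):
    """Deterministic step handled in plain code — no tokens spent."""
    m = re.search(r"\bORD-\d{6}\b", text)
    return m.group(0) if m else None

def handle(text, llm=None):
    """TLCI-style routing sketch: cheap code first, model only on miss.

    `llm` stands in for an expensive model call and is only invoked
    when the deterministic extractor fails.
    """
    order_id = extract_order_id(text)
    if order_id:
        return {"order_id": order_id, "used_llm": False}
    return {"order_id": llm(text) if llm else None, "used_llm": True}
```

Requests the regex can answer never reach the model, which is where the claimed cost reduction would come from.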
VP / Head of AI Engineering portfolio — 14 production AI demos, governance-first architecture. prasadkavuri.com
FinOps governance layer for enterprise AI spend. Token-level cost attribution, multi-provider price comparison, budget burn-down, anomaly detection, monthly forecasting with confidence intervals, and department chargeback rollups.
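The arithmetic behind token-level cost attribution and department chargeback rollups is straightforward: tokens divided by one million, times the per-million price, summed per department. A minimal sketch; the prices, model names, and record fields are illustrative assumptions, not this project's actual schema or a current price table:

```python
from collections import defaultdict

# Illustrative per-1M-token prices in USD — real price tables change often
# and should be loaded from the provider, not hard-coded.
PRICES = {
    "gpt-4o": {"input": 2.50, "output": 10.00},
    "claude-sonnet": {"input": 3.00, "output": 15.00},
}

def request_cost(model, input_tokens, output_tokens):
    """Token-level cost: per-direction tokens / 1e6 * per-million price."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

def chargeback(requests):
    """Roll up per-request costs by department for monthly chargeback."""
    totals = defaultdict(float)
    for r in requests:
        totals[r["department"]] += request_cost(
            r["model"], r["input_tokens"], r["output_tokens"]
        )
    return dict(totals)
```

For example, a department that consumed 100k input and 10k output tokens on the gpt-4o line above would be charged (100000 × 2.50 + 10000 × 10.00) / 1,000,000 = $0.35.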