Open-source Prompt Firewall — deflect up to 95% of redundant LLM traffic before it leaves your infrastructure. Documents: https://isartor-ai.github.io/Isartor/index.html
Updated Mar 29, 2026 - Rust
A high-performance, multi-agent observability engine designed for the Model Context Protocol (MCP). It provides a non-blocking, transparent proxy layer that implements deterministic token attribution, real-time context-window alerting, and heuristic-driven static analysis to reduce LLM metadata overhead at scale.
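To illustrate the context-window alerting idea, here is a minimal, hypothetical Rust sketch. The function names and the rough chars-per-token heuristic are illustrative assumptions, not Isartor's actual API:

```rust
// Hypothetical sketch of context-window alerting.
// The ~4-characters-per-token estimate is a common rough heuristic,
// not the project's actual tokenizer.

/// Rough token estimate: roughly 4 characters per token for English text.
fn estimate_tokens(prompt: &str) -> usize {
    (prompt.chars().count() + 3) / 4
}

/// Returns Some(usage ratio) when estimated usage meets the alert threshold.
fn context_alert(prompt: &str, window_tokens: usize, threshold: f64) -> Option<f64> {
    let used = estimate_tokens(prompt) as f64 / window_tokens as f64;
    if used >= threshold {
        Some(used)
    } else {
        None
    }
}

fn main() {
    let prompt = "a".repeat(3600); // ~900 estimated tokens
    // With a 1,000-token window and an 80% alert threshold, this fires.
    match context_alert(&prompt, 1000, 0.8) {
        Some(ratio) => println!("ALERT: {:.0}% of context window used", ratio * 100.0),
        None => println!("context usage ok"),
    }
}
```

A real proxy would apply a check like this on each intercepted request and emit the alert on its observability channel instead of printing it.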