DebugBrain is a next-generation AI-powered debugging assistant that doesn't just find bugs, it remembers them. By combining local JSON persistence with Hindsight cloud vector storage in a hybrid memory engine, DebugBrain learns your unique coding patterns and delivers personalized, context-aware fixes based on your own debugging history.
- Features
- Project Structure
- Tech Stack
- Prerequisites
- Quick Start
- Environment Variables
- Hybrid Memory System
- API Reference
- Deployment
- Team
## Features

| Feature | Description |
|---|---|
| Vector Memory | Powered by Hindsight 2.0 to recall similar bugs via semantic search across all your past sessions. |
| Auto-Fix Engine | Generates corrected code with step-by-step logic explanations powered by Groq (Llama 3.1). |
| Pattern Recognition | Detects recurring errors and alerts you (e.g., "This is your 4th KeyError this week"). |
| Thread-Safe Sync | Non-blocking cloud synchronization using FastAPI concurrency, so cloud sync never stalls your UI. |
| Quality Scoring | Provides a 1-10 code quality score with actionable, prioritized improvement tips. |
| Debug Timeline | Searchable history of every debugging session, stored locally and synced to the cloud. |
## Project Structure

```
debugbrain/
├── backend/
│   ├── main.py            # FastAPI application & API routing
│   ├── analyzer.py        # Groq LLM logic & prompt engineering
│   ├── memory.py          # Hybrid Memory Manager (JSON + Hindsight 2.0)
│   ├── requirements.txt   # Python dependencies
│   └── .env               # API keys and cloud URLs (not committed)
│
├── frontend/
│   └── src/
│       ├── App.jsx        # Application state & core logic
│       ├── components/    # Monaco Editor & Results panel components
│       └── utils/api.js   # Axios config for backend communication
│
└── README.md
```
## Tech Stack

| Layer | Technology |
|---|---|
| LLM | Llama 3.1 via Groq |
| Vector Memory | Hindsight 2.0 |
| Local Memory | JSON flat-file persistence |
| Backend | FastAPI, Uvicorn, Pydantic v2 |
| Frontend | React 18, Vite, Monaco Editor |
| Backend Hosting | Render |
| Frontend Hosting | Vercel |
## Prerequisites

Make sure you have the following installed or configured before running DebugBrain:

- Python 3.9+
- Node.js 18+ and npm
- A Groq API key (from console.groq.com)
- A Hindsight API key (from hindsight.vectorize.io)
## Quick Start

### 1. Clone the repository

```bash
git clone https://github.com/Manshi4952/AI-Code-Debugging-Agent.git
cd AI-Code-Debugging-Agent
```

### 2. Configure your keys

Create a `.env` file inside the `backend/` directory:

```bash
cp backend/.env.example backend/.env
```

Then open `backend/.env` and fill in your keys:

```env
GROQ_API_KEY=gsk_your_groq_key_here
HINDSIGHT_API_KEY=hsk_your_hindsight_key_here
HINDSIGHT_API_URL=https://api.hindsight.vectorize.io
```

### 3. Run the backend

```bash
cd backend

# Create and activate a virtual environment
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Start the FastAPI server
python3 main.py
```

The backend will be available at http://localhost:8000.

### 4. Run the frontend

Open a new terminal:

```bash
cd frontend

# Install dependencies
npm install

# Start the development server
npm run dev
```

The frontend will be available at http://localhost:5173.
## Environment Variables

All backend configuration is handled through environment variables. Set these in `backend/.env`:

| Variable | Description | Example |
|---|---|---|
| `GROQ_API_KEY` | Your Groq secret key for LLM access | `gsk_...` |
| `HINDSIGHT_API_KEY` | Your Hindsight Personal Access Token | `hsk_...` |
| `HINDSIGHT_API_URL` | Hindsight cloud endpoint | `https://api.hindsight.vectorize.io` |

⚠️ Never commit your `.env` file. It is listed in `.gitignore` by default.
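As a sketch of how the backend might consume these variables at startup, here is a minimal stdlib-only loader. The variable names and the default endpoint come from the table above; the `load_config` function itself is illustrative (the real backend may use python-dotenv or pydantic-settings instead):

```python
import os

def load_config(env=os.environ):
    """Illustrative config loader; fails fast if required keys are absent."""
    missing = [k for k in ("GROQ_API_KEY", "HINDSIGHT_API_KEY") if not env.get(k)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")
    return {
        "groq_api_key": env["GROQ_API_KEY"],
        "hindsight_api_key": env["HINDSIGHT_API_KEY"],
        # Default matches the documented Hindsight endpoint.
        "hindsight_api_url": env.get("HINDSIGHT_API_URL", "https://api.hindsight.vectorize.io"),
    }
```

Failing fast on missing keys surfaces configuration mistakes at boot rather than mid-request.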
## Hybrid Memory System

DebugBrain uses a dual-layer memory architecture to ensure your debugging context is always fast, persistent, and intelligent.

```
                          ┌──────────────────────────┐
User submits code ──────► │  FastAPI /analyze route  │
                          └────────────┬─────────────┘
                                       │
                  ┌────────────────────┴────────────────────┐
                  ▼                                         ▼
   ┌──────────────────────────┐           ┌──────────────────────────────┐
   │    Local Layer (JSON)    │           │ Cloud Layer (Hindsight 2.0)  │
   │  - Frequency counting    │           │  - Semantic vector search    │
   │  - UI debug timeline     │           │  - .retain() to store memory │
   │  data/memory/<uid>.json  │           │  - .recall() to find matches │
   └──────────────────────────┘           └──────────────────────────────┘
```
**Layer 1 (Local JSON):**
High-speed persistence used for frequency counting (pattern detection) and rendering the debug timeline in the UI. Stored at `data/memory/<user_id>.json`.
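The frequency-counting side of this layer can be sketched in a few lines. This is an illustrative stand-in, not the project's actual `memory.py`; the `record_error` helper and its signature are hypothetical, while the `data/memory/<user_id>.json` path convention comes from the text above:

```python
import json
from collections import Counter
from pathlib import Path

def record_error(memory_dir: Path, user_id: str, error_type: str) -> int:
    """Bump a per-user error counter in <memory_dir>/<user_id>.json and
    return the new count, enabling alerts like 'your 4th KeyError this week'.
    (Hypothetical helper; not DebugBrain's real memory.py.)"""
    path = memory_dir / f"{user_id}.json"
    counts = Counter(json.loads(path.read_text())) if path.exists() else Counter()
    counts[error_type] += 1
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(dict(counts)))
    return counts[error_type]
```

A flat JSON file per user keeps reads fast enough to render the timeline synchronously, with no database dependency.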
**Layer 2 (Hindsight 2.0 Vector DB):**
Enables semantic search across all past debug sessions. The AI can surface matches like: "I remember you had a similar NullPointerException in a different project two weeks ago..."
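To make the `retain()`/`recall()` flow concrete without depending on the real Hindsight SDK (whose exact signatures this README does not document), here is a purely hypothetical in-memory stub that substitutes naive token overlap for real vector embeddings:

```python
class InMemoryRecallStub:
    """Hypothetical stand-in for a Hindsight-style retain()/recall() client.
    Uses naive token overlap instead of embeddings, purely to illustrate
    the flow; the actual SDK and its signatures may differ."""

    def __init__(self):
        self._memories: list[str] = []

    def retain(self, text: str) -> None:
        # Real layer: embed `text` and store the vector in the cloud.
        self._memories.append(text)

    def recall(self, query: str, top_k: int = 3) -> list[str]:
        # Real layer: nearest-neighbor search over stored embeddings.
        q = set(query.lower().split())
        scored = sorted(
            self._memories,
            key=lambda m: len(q & set(m.lower().split())),
            reverse=True,
        )
        return scored[:top_k]
```

The point of the sketch is the shape of the API: every analyzed bug is retained, and each new submission recalls its nearest past neighbors to enrich the LLM prompt.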
**Thread Safety:**
The memory engine uses `fastapi.concurrency.run_in_threadpool` to ensure cloud uploads are always non-blocking, keeping API responses snappy regardless of cloud latency.
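The pattern can be demonstrated with the stdlib alone: `fastapi.concurrency.run_in_threadpool` behaves much like `asyncio.to_thread`, used here so the sketch runs without FastAPI installed. The `slow_cloud_upload` function is a hypothetical stand-in for the blocking Hindsight upload:

```python
import asyncio
import time

def slow_cloud_upload(payload: dict) -> str:
    # Hypothetical stand-in for a blocking network call to the cloud layer.
    time.sleep(0.05)
    return f"stored {len(payload)} fields"

async def analyze(payload: dict) -> dict:
    # Offload the blocking upload to a worker thread so the event loop
    # keeps serving other requests while the upload is in flight. The real
    # backend uses fastapi.concurrency.run_in_threadpool for the same effect.
    result = await asyncio.to_thread(slow_cloud_upload, payload)
    return {"sync": result}
```

Because the event loop is never blocked, a slow cloud round-trip delays only the one request that triggered it, not the whole API.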
## API Reference

Base URL (local): `http://localhost:8000`

| Endpoint | Method | Description |
|---|---|---|
| `/analyze` | `POST` | Submit code for analysis. Syncs to Hindsight and recalls semantically similar past fixes. |
| `/history/{user_id}` | `GET` | Fetch the visual debug timeline for a specific user. |
| `/memories/{user_id}` | `GET` | Retrieve the most frequent bug patterns from the user's memory bank. |
| `/clear/{user_id}` | `DELETE` | Wipe all local and cloud memory for a user (clean slate mode). |
Example request:

```bash
curl -X POST http://localhost:8000/analyze \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "user_123",
    "code": "def divide(a, b):\n    return a / b\n\ndivide(10, 0)",
    "language": "python"
  }'
```

## Deployment

### Backend (Render)

- Push your code to GitHub.
- Create a new Web Service on Render.
- Set the build command to `pip install -r requirements.txt`.
- Set the start command to `python3 main.py`.
- Add all environment variables (`GROQ_API_KEY`, `HINDSIGHT_API_KEY`, `HINDSIGHT_API_URL`) in the Render dashboard under Environment.
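Optionally, the same settings can be captured in a `render.yaml` blueprint at the repo root so the service is reproducible. This is a sketch based on Render's blueprint format; the service name is illustrative, and secret values are still set in the dashboard:

```yaml
services:
  - type: web
    name: debugbrain-backend        # illustrative service name
    env: python
    rootDir: backend
    buildCommand: pip install -r requirements.txt
    startCommand: python3 main.py
    envVars:
      - key: GROQ_API_KEY
        sync: false                 # value entered in the Render dashboard
      - key: HINDSIGHT_API_KEY
        sync: false
      - key: HINDSIGHT_API_URL
        sync: false
```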
### Frontend (Vercel)

- Import the repository on Vercel.
- Set the root directory to `frontend/`.
- Update `frontend/src/utils/api.js` to point to your Render backend URL.
- Deploy. Vercel handles everything else automatically.
**Live Demo**
## Team

| Name | Role |
|---|---|
| Manshi Kumari Shaw | Team Leader & Full-Stack Lead |
| Nandani | Contributor |
| Laxmi | Contributor |
| Manisha | Contributor |
This project is open source. Feel free to use, modify, and distribute it. Contributions via pull requests are welcome!