AI-Query-Master


Python FastAPI React Vite ChromaDB Supabase


Gemini Groq OpenRouter MySQL PostgreSQL


A powerful AI-driven SQL assistant: review queries, analyze schemas, generate SQL from plain English, and connect to live databases through a full agentic AI pipeline featuring self-reflection and RAG-powered intelligence.


🚀 Quick Start · ☁️ Deploy · ✨ Features · 🏗️ Architecture · 🤖 AI Pipeline · 🛠️ Tech Stack


✨ Features

πŸ” Query Review

Paste any SQL query and get an instant AI-powered deep analysis: performance bottlenecks, security vulnerabilities, a readability score, and concrete optimization suggestions with severity tags.

πŸ—‚οΈ Schema Review

Upload your DDL or paste a schema and receive a comprehensive evaluation of design choices, normalization gaps, missing indexes, and foreign key recommendations.

💬 Natural Language → SQL

Describe what you want in plain English. The AI understands your intent, picks the right dialect (MySQL / PostgreSQL), and returns a clean, optimized query with explanation.

🔌 Live Database Connection

Connect directly to your MySQL or PostgreSQL instance (read-only, session-based). Explore the live schema, run queries or NL prompts, and get EXPLAIN plan analysis, all in one place.

πŸ–ΌοΈ Schema Builder

Upload a schema diagram image and the AI reverse-engineers it into complete SQL DDL: table definitions, constraints, indexes, and relationships included.

📜 Analysis History

Every analysis is saved. Search, filter by type, date, or database dialect, and revisit any past result with full detail view and re-analysis option.

Live Database: Bonus Capabilities

| Capability | Detail |
| --- | --- |
| 🔌 Secure Connection | Read-only, session-scoped; never modifies your data |
| 📊 Schema Explorer | Toggle panel with table count and expandable column details |
| 💬 Dual Query Mode | Switch between raw SQL and natural language on the fly |
| 🎙️ Voice Input | Speech-to-text for NL mode; just speak your question |
| 🤖 Smart Context | The agent automatically injects relevant schema context |
| 📋 EXPLAIN Plans | Run and visualize query execution plans instantly |
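EXPLAIN syntax differs between the two supported dialects. A minimal sketch of how a backend could wrap a query before execution; the helper name and the JSON-format choice are illustrative, not the project's actual code:

```python
def wrap_explain(sql: str, dialect: str = "postgresql") -> str:
    """Prefix a query with the dialect-appropriate EXPLAIN form.

    Illustrative only: PostgreSQL uses EXPLAIN (FORMAT JSON),
    while MySQL uses EXPLAIN FORMAT=JSON.
    """
    stmt = sql.strip().rstrip(";")
    if dialect == "postgresql":
        return f"EXPLAIN (FORMAT JSON) {stmt}"
    return f"EXPLAIN FORMAT=JSON {stmt}"
```

Because the wrapped statement only produces a plan, it still fits the session's read-only guarantee.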

πŸ—οΈ System Architecture

```
╔══════════════════════════════════════════════════════════════════╗
║                React Frontend  (Vite + React 18)                 ║
║  Dashboard │ Query Review │ Schema │ NL→SQL │ Live DB │ History  ║
╚═════════════════════════════╦════════════════════════════════════╝
                              ║  REST API (HTTP/JSON)
                              ▼
╔══════════════════════════════════════════════════════════════════╗
║                        FastAPI Backend                           ║
║    /api/review-query   /api/review-schema   /api/nl-to-query     ║
║    /api/connect-db     /api/history         /api/schema-build    ║
╚═══════════════╦═════════════════════════╦════════════════════════╝
                ║                         ║
                ▼                         ▼
╔══════════════════════╗    ╔══════════════════════════════════╗
║       AI Agent       ║    ║        Live DB Connection        ║
║  ┌────────────────┐  ║    ║  ┌───────────┐  ┌────────────┐   ║
║  │ Static Analysis│  ║    ║  │   MySQL   │  │ PostgreSQL │   ║
║  │ RAG Retrieval  │  ║    ║  │ (PyMySQL) │  │ (Psycopg2) │   ║
║  │ Tool Calling   │  ║    ║  └───────────┘  └────────────┘   ║
║  │ LLM Reasoning  │  ║    ║     Read-Only │ Session-Scoped   ║
║  │ Self-Reflection│  ║    ╚══════════════════════════════════╝
║  │ Evaluation     │  ║
║  └───────┬────────┘  ║
╚══════════╬═══════════╝
           ║
     ╔═════╩══════╗
     ▼            ▼
╔═════════╗  ╔══════════╗
║ChromaDB ║  ║ Supabase ║
║  (RAG)  ║  ║(History) ║
╚═════════╝  ╚══════════╝
```

🤖 AI Agent Pipeline

Every analysis runs through a fully agentic loop, not a simple one-shot prompt:

```
┌─────────────────────────────────────────────────────────────┐
│                     AGENTIC LOOP                            │
│                                                             │
│  📥 Input                                                   │
│      │                                                      │
│      ▼                                                      │
│  [1] 🔎 Static Analysis ──── Pattern-based issue detection  │
│      │                                                      │
│      ▼                                                      │
│  [2] 📚 RAG Retrieval ────── ChromaDB knowledge search      │
│      │                                                      │
│      ▼                                                      │
│  [3] 🔧 Tool Calling ─────── Specialized analysis modules   │
│      │                                                      │
│      ▼                                                      │
│  [4] 🧠 LLM Reasoning ────── Multi-model inference          │
│      │                                                      │
│      ▼                                                      │
│  [5] 🪞 Self-Reflection ──── Quality check (up to 3x)       │
│      │                                                      │
│      ▼                                                      │
│  [6] 📊 Evaluation ───────── Score: Perf│Sec│Read│Complex   │
│      │                                                      │
│      ▼                                                      │
│  📤 Final Response                                          │
└─────────────────────────────────────────────────────────────┘
```
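The six stages can be sketched as a single orchestration function. Every stage callable below is a caller-supplied stand-in for the real modules (agent.py, tools.py, reflection.py, evaluator.py); the names and signatures are assumptions, not the project's API:

```python
from dataclasses import dataclass, field

MAX_REFLECTIONS = 3  # mirrors the "up to 3x" quality-check cap above


@dataclass
class AgentResult:
    answer: str
    scores: dict = field(default_factory=dict)


def run_agent(user_input, static_analyze, retrieve, call_tools,
              reason, reflect, evaluate):
    """Run the six pipeline stages with a bounded self-reflection loop."""
    issues = static_analyze(user_input)            # [1] pattern-based detection
    context = retrieve(user_input)                 # [2] RAG knowledge search
    tool_out = call_tools(user_input, issues)      # [3] specialized modules
    draft = reason(user_input, context, tool_out)  # [4] LLM inference
    for _ in range(MAX_REFLECTIONS):               # [5] quality check, up to 3x
        accepted, revised = reflect(draft)
        if accepted:
            break
        draft = revised
    return AgentResult(answer=draft, scores=evaluate(draft))  # [6] scoring
```

Bounding the reflection loop keeps latency predictable: a draft that never passes the quality check is still returned after three revision attempts.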

🔀 Multi-LLM Fallback Chain

The agent automatically cascades through providers, so a single provider outage doesn't take the service down:

| # | Provider | Model | Speed | Notes |
| --- | --- | --- | --- | --- |
| 1️⃣ | Google Gemini | gemini-2.5-pro | Fast | Primary; best quality |
| 2️⃣ | Google Gemini | gemini-2.5-flash | Faster | Flash fallback |
| 3️⃣ | Groq | llama-3.1-8b-instant | Ultra-fast | Low-latency fallback |
| 4️⃣ | OpenRouter | gemma-3-4b (free) | Medium | Free-tier fallback |
| 5️⃣ | OpenRouter | llama-3.3-70b (free) | Medium | Final free fallback |
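The cascade is straightforward to express. A minimal sketch, assuming each provider is a (name, callable) pair; this is not the actual llm_provider.py interface:

```python
def complete_with_fallback(prompt, providers):
    """Try each (name, call) pair in chain order; return the first success.

    `providers` mirrors the table above:
    Gemini Pro -> Gemini Flash -> Groq -> OpenRouter.
    Raises only if every provider in the chain fails.
    """
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # rate limit, outage, timeout, ...
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))
```

Collecting per-provider errors instead of swallowing them makes the final failure message diagnosable.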

🚀 Quick Start

Prerequisites

Python Node Git

Step 1 - Clone & Set Up the Backend

```shell
git clone https://github.com/your-username/ai-query-master.git
cd ai-query-master/backend

# Create & activate a virtual environment
python -m venv venv
venv\Scripts\activate          # Windows
# source venv/bin/activate     # Linux / macOS

# Install all dependencies
pip install -r requirements.txt
```

Step 2 - Configure Environment

Create backend/.env and fill in your keys:

```
# ── AI Providers ──────────────────────────────────────
GEMINI_API_KEY=your_gemini_key
GROQ_API_KEY=your_groq_key
OPENROUTER_KEY_1=your_openrouter_key
OPENROUTER_KEY_2=your_openrouter_key_optional   # optional second key

# ── Supabase (Auth + History storage) ─────────────────
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_KEY=your_supabase_service_role_key
```

Step 3 - Set Up Supabase

Run migration.sql in your Supabase SQL Editor to create all required tables.

Step 4 - Add a Knowledge Base (optional but recommended)

Drop PDF documents into the RAG knowledge folders to supercharge analysis quality:

```
RAG_Knowledge/
├── Mysql/          ← MySQL best practices, docs, guides
└── PostgreSQL/     ← PostgreSQL best practices, docs, guides
```

Step 5 - Launch

Terminal 1 - Backend:

```shell
cd backend
venv\Scripts\activate
python main.py
# ✅ API running at http://localhost:8000
# ✅ Docs available at http://localhost:8000/docs
```

Terminal 2 - Frontend:

```shell
cd frontend
npm install
npm run dev
# ✅ App running at http://localhost:5173
```

Open http://localhost:5173 and start querying! 🎉


☁️ Deploy: Render + Supabase pgvector

Use this production setup to avoid Render free-tier memory issues from local embedding models.

  1. Create a new Supabase project and run migration.sql in the SQL Editor.
  2. Create a Render Web Service for the backend:
    • Root Directory: backend
    • Build Command: pip install -r requirements.txt
    • Start Command: uvicorn main:app --host 0.0.0.0 --port $PORT
  3. Set the backend environment variables in Render:

```
# App
FRONTEND_URL=https://your-frontend.vercel.app
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_KEY=your_supabase_service_role_key

# LLMs
GEMINI_API_KEY=your_gemini_key
GROQ_API_KEY=your_groq_key
OPENROUTER_KEY_1=your_openrouter_key

# RAG (Supabase pgvector)
RAG_ENABLED=true
RAG_BACKEND=supabase
RAG_EMBEDDING_PROVIDER=gemini
RAG_EMBEDDING_DIM=384
RAG_PRELOAD_ON_STARTUP=false
RAG_AUTO_INDEX=false
```

  4. Deploy the backend, then call POST /api/index once to build embeddings and vectors.
  5. Verify that GET /api/health shows rag_pipeline.backend = supabase and a non-zero chunk count.
  6. Point the frontend at your Render backend API URL (replace localhost in frontend/src/api.js for production).
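The health verification above can be automated once the /api/health payload is fetched. A hedged sketch, assuming the response JSON contains a rag_pipeline object whose chunk-count field is named chunks; the exact field names may differ:

```python
def rag_healthy(health: dict) -> bool:
    """Check that the RAG backend is Supabase pgvector with indexed chunks.

    Field names are assumptions based on the deployment notes above.
    """
    rag = health.get("rag_pipeline", {})
    return rag.get("backend") == "supabase" and rag.get("chunks", 0) > 0
```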

πŸ› οΈ Tech Stack

| Layer | Technology | Purpose |
| --- | --- | --- |
| Frontend | React 18 + Vite + React Router | SPA with fast HMR dev experience |
| Backend | FastAPI + Python 3.12 | Async REST API with auto-docs |
| AI / LLM | Gemini · Groq · OpenRouter | Multi-provider fallback inference |
| RAG | ChromaDB or Supabase pgvector · LangChain · Gemini Embeddings | Persistent vector search over the knowledge base |
| Auth & Storage | Supabase | Auth, user management, history DB |
| Live Database | PyMySQL · Psycopg2 | Read-only live DB connections |

πŸ“ Project Structure

```
AI Query Master/
│
├── 📂 backend/
│   ├── main.py                   ← FastAPI app entry point
│   ├── requirements.txt
│   ├── 📂 agent/
│   │   ├── agent.py              ← Agentic orchestration loop
│   │   ├── llm_provider.py       ← Multi-LLM fallback chain
│   │   ├── rag_pipeline.py       ← ChromaDB/Supabase pgvector RAG retrieval
│   │   ├── tools.py              ← Specialized analysis tools
│   │   ├── reflection.py         ← Self-reflection module
│   │   └── evaluator.py          ← Scoring: Perf/Sec/Read/Complex
│   ├── 📂 api/
│   │   ├── auth.py               ← Supabase auth endpoints
│   │   ├── nl_to_query.py        ← NL → SQL endpoint
│   │   ├── query_review.py       ← Query analysis endpoint
│   │   ├── schema_review.py      ← Schema analysis endpoint
│   │   ├── schema_builder.py     ← Image → DDL endpoint
│   │   ├── live_db.py            ← Live DB connection endpoint
│   │   └── history.py            ← Analysis history endpoint
│   └── 📂 db/
│       ├── mysql_connector.py    ← MySQL live connection
│       ├── postgres_connector.py ← PostgreSQL live connection
│       └── supabase_client.py    ← Supabase client
│
├── 📂 frontend/
│   ├── 📂 src/
│   │   ├── 📂 pages/             ← Full-page route components
│   │   ├── 📂 components/        ← Reusable UI components
│   │   ├── 📂 context/           ← Auth & Theme context
│   │   └── api.js                ← Centralized API client
│   └── package.json
│
├── 📂 RAG_Knowledge/
│   ├── Mysql/                    ← MySQL knowledge documents
│   └── PostgreSQL/               ← PostgreSQL knowledge documents
│
└── migration.sql                 ← Supabase schema setup
```

πŸ” Security Model

  • Live DB connections are strictly read-only and session-scoped: no write operations, no persistent credential storage
  • All API keys are loaded from environment variables, never hardcoded
  • Authentication is handled by Supabase with JWT tokens
  • RAG supports local (ChromaDB) or hosted (Supabase pgvector) storage for indexed chunks
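The read-only guarantee can be backed by a statement-level check before anything reaches a database driver. A minimal sketch of such a guard, assuming a simple allow-list approach; the project's actual enforcement in the live-DB connectors may differ:

```python
import re

# Allow-list of statement openers considered read-only (illustrative).
_READ_ONLY = re.compile(r"^\s*(SELECT|WITH|SHOW|EXPLAIN|DESCRIBE|DESC)\b",
                        re.IGNORECASE)


def is_read_only(sql: str) -> bool:
    """Return True only if every statement in the input looks read-only.

    Splitting on ';' rejects attempts to smuggle a write after a SELECT.
    (A naive split: semicolons inside string literals would need a real
    SQL tokenizer in production.)
    """
    statements = [s for s in sql.split(";") if s.strip()]
    return bool(statements) and all(_READ_ONLY.match(s) for s in statements)
```

On PostgreSQL this can be reinforced at the session level with `SET default_transaction_read_only = on`, so the server itself rejects writes even if a statement slips past the check.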

⭐ If this project helps you, give it a star!
