A zero-dependency context transfer system for seamlessly switching between AI coding assistants without losing project context.
When you switch between AI coding assistants — Claude, GPT, Codex, Gemini, Cursor, Copilot — each new AI starts with zero memory of your project. You lose:
- What was just built
- What's broken
- What to do next
- WHY things were done a certain way
- Design system consistency
You end up re-explaining everything. Context is lost. Momentum dies.
UHP maintains structured context files inside your project that any AI can read and write. When you switch:
- Tell the old AI: `rate limits dying, prepare for handoff`
- Tell the new AI: `Read the .handoff/ directory...` (see Quick Start)
- The new AI picks up exactly where the old one left off.
No dependencies. No installation. No configuration. Just files.
your-project/
└── .handoff/
├── README.md ← Protocol rules (13 rules for AI behavior)
├── PROJECT_BIBLE.md ← Project encyclopedia (auto-generated)
├── ACTIVE_SESSION.md ← Live work state (auto-updated)
├── SESSION_LOG.md ← Historical audit trail
└── PROMPTS.md ← User quick-reference card
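The layout above can be stubbed out by hand, though in practice the bootstrap prompt in Quick Start has the AI fetch the real template files. A minimal sketch, using only the file names shown above:

```shell
# Create an empty .handoff/ skeleton matching the layout above.
# The bootstrap prompt normally fills these with real template content;
# this just illustrates the on-disk structure.
mkdir -p .handoff
for f in README.md PROJECT_BIBLE.md ACTIVE_SESSION.md SESSION_LOG.md PROMPTS.md; do
  touch ".handoff/$f"
done
```

Because the whole system is just these five markdown files, "installation" is never more than creating a directory.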
Layer 1: Arrival prompt ← Critical rules in primary instruction
Layer 2: README.md ← Full 13-rule protocol
Layer 3: Embedded file headers ← Instructions inside each file
Layer 4: CRITICAL summary ← Recency bias exploit at end of README
Layer 5: Mid-session correction ← User can snap AI back on track
Layer 6: Self-healing ← Missing files auto-regenerated
Layer 7: Freshness check ← Stale context detected and flagged
Layer 8: Validation proof ← AI proves it read everything
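As a concrete illustration, the freshness check (Layer 7) can be approximated with a simple file-age test. The 48-hour threshold comes from the feature list below; using the file's modification time as a proxy for "last updated" is an assumption of this sketch, not part of the protocol:

```shell
# Hedged sketch of Layer 7: flag handoff context older than 48 hours.
# 2880 minutes = 48 hours; -f guard keeps this safe if the file is missing.
f=.handoff/ACTIVE_SESSION.md
if [ -f "$f" ] && [ -n "$(find "$f" -mmin +2880 2>/dev/null)" ]; then
  echo "WARNING: handoff context is more than 48 hours stale"
fi
```

In the real protocol this check is performed by the arriving AI when it reads the files, not by a script.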
Tell any AI assistant:
Set up the Universal Handoff Protocol in this project. Fetch the exact
template files from https://github.com/shreybansal365/universal-handoff-protocol/tree/main/template
— download README.md and PROMPTS.md exactly as they are, do not modify
or reinterpret them. Place them in a .handoff/ directory in the project
root. Then scan this project's codebase and generate a fresh
PROJECT_BIBLE.md, ACTIVE_SESSION.md, and SESSION_LOG.md. Add !.handoff/
to .gitignore if needed. Run the setup verification checklist. IMPORTANT:
Throughout this session, silently update .handoff/ACTIVE_SESSION.md after
every significant change.
Tell any AI assistant:
Read the .handoff/ directory in my project root. Start with README.md,
then PROJECT_BIBLE.md, ACTIVE_SESSION.md, and SESSION_LOG.md. After
reading, reply with a "UHP loaded" validation proof showing: project
name, current state, next task, session count, and last AI model.
Then continue from where the last AI left off. IMPORTANT: Throughout
this session, silently update .handoff/ACTIVE_SESSION.md after every
significant change you make.
| Say This | What Happens |
|---|---|
| `rate limits dying, prepare for handoff` | Full context save — ready to switch AIs |
| `checkpoint` or `save` | Quick mid-session save — you're staying |
| `follow the protocol` | Snap a misbehaving AI back on track |
| Feature | Description |
|---|---|
| Continuous Auto-Sync | AI silently updates handoff files as it works |
| Checkpoint Saves | User can force a save at any time with "checkpoint" |
| Rolling Log Compression | Keeps the last 5 sessions, compresses older ones into a summary |
| Integrity Checking | New AI validates project context against actual codebase |
| Freshness Detection | Warns if handoff data is >48 hours stale |
| Decision Documentation | Captures WHY things were built a certain way |
| Self-Healing | Missing or corrupted files auto-regenerated |
| Git Protection | .handoff/ survives git clone via .gitignore rules |
| Security | Protocol forbids plaintext secrets in handoff files |
| Offline Fallback | Works with web-only AIs via copy-paste |
| Validation Proof | AI must prove it read and understood all files |
| Mid-Session Correction | User can re-engage a misbehaving AI instantly |
| Setup Verification | Bootstrapping runs a 6-point checklist to confirm setup |
| Embedded Instructions | Each file has point-of-action rules inside it |
| Cross-File Reminders | Updating one file reminds AI about related files |
| Language Agnostic | Works with JS, Python, Rust, Go, or any language |
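The Git Protection row relies on `.gitignore` un-ignore rules (the bootstrap prompt's `!.handoff/` line). The exact rules ship in the template README, so treat the two lines below as an illustrative assumption:

```shell
# Append hypothetical un-ignore rules so .handoff/ stays tracked even when a
# broader ignore pattern (e.g. one matching dotfiles) would exclude it.
cat >> .gitignore <<'EOF'
!.handoff/
!.handoff/**
EOF
```

A `!` prefix in `.gitignore` re-includes paths that an earlier pattern excluded, which is what lets the handoff files survive `git clone`.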
- Zero Dependencies — No scripts, no tools, no extensions. Pure markdown files.
- Self-Contained — Everything lives in `.handoff/`. Nothing installed on the host.
- Self-Healing — The system repairs itself when files are lost or corrupted.
- Self-Bootstrapping — One prompt creates the entire system from scratch.
- Defense in Depth — 8 independent layers ensure protocol compliance.
- Graceful Degradation — Even if a weak AI follows 50% of rules, the system works. The next smart AI cleans up.
The template/ directory contains the exact protocol files. When setting up UHP on a new project, AIs should download these verbatim — not recreate them from a description.
| File | Lines | Purpose |
|---|---|---|
| `README.md` | ~340 | Complete 13-rule protocol with embedded templates |
| `PROMPTS.md` | ~70 | User quick-reference card with all keywords |
| `PROJECT_BIBLE.md` | template | Auto-generated by AI on first setup |
| `ACTIVE_SESSION.md` | template | Auto-populated by AI during work |
| `SESSION_LOG.md` | template | Auto-populated on each handoff |
Q: Does this work with ALL AI coding assistants?
A: Yes. If the AI can read local files, it works natively. If it can't (e.g., web-only ChatGPT), there's an offline fallback where you paste file contents.
Q: What happens if I forget to say "prepare for handoff"?
A: The continuous auto-sync rule ensures files are updated throughout the session. Even without an explicit handoff, the files are ~95% current.
Q: What if a weak AI doesn't follow the rules?
A: The embedded instructions inside each file catch most weak models. For stubborn cases, use the correction keyword. And the next strong model will self-heal any gaps.
Q: Can I use this for non-coding projects?
A: Yes. The protocol is generic enough for any project where AI assistants need context — writing, research, design, etc.
Q: What if I lose the prompt?
A: Every project with UHP has PROMPTS.md inside it. Open any existing project and the prompt is right there.
MIT — Use it anywhere, share it with anyone, modify it for your needs.