```sh
pip install karma
```

| Store | Scope | Backed by | Decay |
|---|---|---|---|
| CodeMemory | repo + env packages | codesigs + litesearch FTS5+vector | permanent |
| PracticeMemory | repo-scoped idioms | litesearch FTS5+vector (SHA1 dedup) | permanent |
| DecisionLog | global | litesearch FTS5+vector (append-only) | permanent |
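The SHA1 dedup noted for PracticeMemory is the standard content-hash technique; a minimal sketch of the idea (not karma's actual implementation — the normalisation rule here is an assumption for illustration):

```python
import hashlib

def content_key(text):
    # Collapse whitespace so trivially different copies hash to the same key
    normalised = ' '.join(text.split())
    return hashlib.sha1(normalised.encode()).hexdigest()

seen, unique = set(), []
practices = ['Use L.map not list comprehension',
             'Use  L.map not list comprehension',   # duplicate after normalisation
             'Always call pre(query) before FTS5']
for p in practices:
    key = content_key(p)
    if key not in seen:   # skip anything we already stored
        seen.add(key)
        unique.append(p)
print(len(unique))  # → 2
```

Hashing the normalised text rather than the raw string means re-seeding from an edited CLAUDE.md doesn't duplicate practices that only changed in whitespace.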
```python
#| eval: false
from karma import index_repo, search_code
# Index your repo
result = index_repo('.')
print(result)
```

```python
#| eval: false
from karma.memory import DecisionLog
dl = DecisionLog()
dl.log('Chose RRF k=60 for hybrid search — standard constant')
results = dl.search('RRF')
print(results)
```

```python
#| eval: false
from karma import arun, build_dev_context
ctx = arun(build_dev_context('how to extend a class', repo_root='.'))
print(ctx)
```

```python
#| eval: false
from karma import arun, PracticeMemory
pm = PracticeMemory(repo_root='.')
arun(pm.add('Use L.map not list comprehension', tags='fastcore', severity='must'))
results = arun(pm.query('how to map?'))
print(results)
```

Every repo develops a memory — the decisions you made, the packages you chose, the patterns that work and the anti-patterns you burned your hands on. Without karma that memory lives only in your head (or gets lost in a wall of commit history).
karma gives that memory a home:
- CodeMemory — permanent FTS5+vector index of your repo and installed packages. Ask "does a hybrid search helper already exist?" before writing one.
- DecisionLog — append-only log of architectural decisions, searchable by meaning. "Why did we choose RRF k=60?" → answered in milliseconds.
- PracticeMemory — repo-scoped idioms and anti-patterns seeded from CLAUDE.md/AGENTS.md. Copilot (or any agent) sees them in every context window.
- Upgrade hints — when your search hits code from an older installed version, karma surfaces the newer API automatically.
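The "RRF k=60" decision referenced above is the standard reciprocal rank fusion formula for merging a keyword ranking with a vector ranking. A minimal sketch of the formula itself (not karma's code; the hit lists are made up for illustration):

```python
def rrf_merge(rankings, k=60):
    # Reciprocal rank fusion: score(doc) = sum over rankings of 1 / (k + rank)
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

fts_hits = ['hybrid_search', 'search_code', 'index_repo']        # keyword ranking
vec_hits = ['search_code', 'async_wrapper', 'hybrid_search']     # vector ranking
print(rrf_merge([fts_hits, vec_hits]))
```

With k=60 the constant dominates the rank term, so the fusion rewards appearing in both lists more than being first in either — which is why the constant is insensitive to corpus size.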
karma ships async tool functions that drop straight into
Lisette's Chat(tools=[...]) interface —
the same pattern used by GitHub Copilot agents.
The system prompt fixes the agent's workflow:
dev_context_tool → search_code_tool → implement → log_decision_tool.
That ordering ensures karma is consulted before any code is written.
```python
#| eval: false
from lisette import Chat, bind
from karma.context import dev_context_tool, search_code_tool
from karma.memory import add_practice_tool, log_decision_tool, query_practices_tool
from karma.code import index_repo

index_repo('.')  # incremental — skips unchanged files on subsequent runs
xtra = {'editor-version': 'vscode/1.85.1', 'Copilot-Integration-Id': 'vscode-chat'}
sp = """You are a coding assistant with full memory of this repository.
ALWAYS call dev_context_tool first — surfaces existing code, past decisions, and practices.
Order: dev_context_tool → search_code_tool → implement → log_decision_tool."""
chat = Chat('github_copilot/claude-sonnet-4', sp,
            tools=[dev_context_tool, search_code_tool, add_practice_tool,
                   log_decision_tool, query_practices_tool])
c = bind(chat, max_steps=6, return_all=True, max_tokens=5000, extra_headers=xtra)
r = c('What should I know before adding a batch_search feature?')
print(chat.hist[-1].content)
```

- dev_context_tool — fetches code search results, repo practices, and past decisions for the query in parallel. Returns one formatted context block.
- search_code_tool — targeted FTS5+vector search for specific symbols.
- log_decision_tool — stores the decision made (e.g. "used RRF k=60 — standard constant") as a permanent entry in the DecisionLog.
- add_practice_tool — saves a new repo idiom (e.g. "always call pre(query) before FTS5").
Any LLM that speaks the OpenAI tool-call protocol works: Copilot, claude-3-5, GPT-4o, etc.
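The FTS5 half of that hybrid search can be pictured with plain stdlib sqlite3. This is a minimal illustration of SQLite full-text matching, not karma's schema — table and column names here are made up:

```python
import sqlite3

con = sqlite3.connect(':memory:')
# fts5 virtual table: every column is full-text indexed
con.execute("CREATE VIRTUAL TABLE chunks USING fts5(symbol, body)")
con.executemany("INSERT INTO chunks VALUES (?, ?)", [
    ('hybrid_search', 'combine fts5 and vector hits with rrf'),
    ('index_repo',    'walk the repo and index changed files'),
])
rows = con.execute(
    "SELECT symbol FROM chunks WHERE chunks MATCH ? ORDER BY rank", ('vector',)
).fetchall()
print(rows)  # only the chunk mentioning "vector" matches
```

The built-in `rank` column orders hits by BM25 relevance; the vector side then contributes a second ranking, and the two are fused.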
```python
from karma.memory import DecisionLog
dl = DecisionLog()
# Store an architectural decision (append-only, permanent)
dl.log('Chose RRF k=60 for hybrid search — standard constant, insensitive to corpus size')
# Later — in any session, any agent
results = dl.search('why RRF k=60')
print(results[0]['content'])  # → 'Chose RRF k=60 for hybrid search…'
# Recent decisions as a formatted context string
print(dl.as_context_str())
```

```python
from karma import arun, PracticeMemory
pm = PracticeMemory(repo_root='.')
# Reads CLAUDE.md, AGENTS.md, CONVENTIONS.md, CONTRIBUTING.md, README.md
result = arun(pm.seed_from_dir('.'))
print(result)  # {'added': 12, 'skipped': 0}
# Query a practice
r = arun(pm.query('how should I handle list operations?'))
print(r[0]['content'])  # → 'Use L.map not list comprehension' [severity: must]
```

On import, karma copies its bundled SKILL.md into .agents/skills/karma/SKILL.md
at the project root. This is automatically discovered by Claude Code, Cursor, GitHub
Copilot, Continue.dev, and OpenCode.
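The on-import copy can be sketched generically (a stand-in using tempfile paths, not karma's actual code — the real version sources the bundled SKILL.md from the installed package):

```python
import shutil, tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())       # stand-in for the project root
bundled = root / 'bundled_SKILL.md'   # stand-in for the packaged SKILL.md
bundled.write_text('# Karma skill\n')

dest = root / '.agents' / 'skills' / 'karma' / 'SKILL.md'
if not dest.exists():                 # idempotent: only copy on first import
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy(bundled, dest)
print(dest.read_text())
```

The existence check is what makes repeated imports cheap and keeps any local edits to the skill file intact.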
```python
from pyskills import list_pyskills, doc
import karma.skill

# Discovery: host sees karma without importing the full package
print(list_pyskills())
# → [('karma', 'Karma — repo memory for coding agents.'), ...]
# Documentation: all function signatures + docstrings
print(doc(karma.skill))
# Direct sync calls — works from any context
print(karma.skill.dev_context('retry logic', repo_root='.'))
print(karma.skill.search_code('async wrapper', repo_root='.'))
```

```python
import karma.skill as ks
ks.index_repo('.')
print(ks.dev_context('implement a cache', repo_root='.'))
```

karma registers itself as a pyskills skill via an entry point in pyproject.toml,
so any compatible harness discovers it without importing the full package.
```python
#| eval: false
from pyskills import list_pyskills, doc
import karma.skill

skills = list_pyskills()
print(skills)  # → {'karma': 'Karma — repo memory for coding agents.', ...}
# LLM-friendly docs — full signatures + docstrings, no heavy imports needed
print(doc(karma.skill))
```

pyskills also supports local skills placed outside any pip package, in
~/.local/share/pyskills/ (or $XDG_DATA_HOME/pyskills).
Use register_pyskill to create one alongside karma:
```python
#| eval: false
from pyskills import register_pyskill, disable_pyskill, enable_pyskill

# Create a local extension skill
register_pyskill(
    name='myproject.karma_ext',
    docstr='Project-specific karma helpers',
    code='from karma.skill import dev_context\n# your extensions here\n',
)
# Toggle any skill without uninstalling
disable_pyskill('karma')  # hides karma from list_pyskills()
enable_pyskill('karma')   # restores it
```

karma stores all data in $XDG_DATA_HOME/karma/ (default ~/.local/share/karma/):
| File | Contents | Scope |
|---|---|---|
| karma.db | Practices, decisions, conversations | global (all projects) |
| codesearch.db | Repo code sigs + chunks | global (all repos) |
| env.db | Installed package index + upgrade hints | global (all projects) |
Databases are global, not per-repo, because installed packages and past decisions
are shared across projects. Only PracticeMemory is repo-scoped (stored in karma.db
under a per-repo agent_id).
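The $XDG_DATA_HOME fallback described above follows the XDG Base Directory convention; a sketch of the standard resolution rule (not karma's _xdg_karma_dir itself):

```python
import os
from pathlib import Path

def xdg_data_dir(app):
    # $XDG_DATA_HOME wins when set; otherwise fall back to ~/.local/share
    base = os.environ.get('XDG_DATA_HOME') or str(Path.home() / '.local' / 'share')
    return Path(base) / app

print(xdg_data_dir('karma'))
```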
```sh
pip install -e ".[dev]"  # installs karma + dev deps (includes pyskills)
nbdev_export   # regenerate karma/*.py from nbs/ — do this after any notebook edit
nbdev_preview  # rebuild docs / README
```

The [project.entry-points.pyskills] entry in pyproject.toml makes
list_pyskills() return karma automatically after pip install -e ..
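That entry point might look roughly like this (the target module path is an assumption for illustration, not taken from karma's actual pyproject.toml):

```toml
[project.entry-points.pyskills]
# hypothetical; the real target module/attribute may differ
karma = "karma.skill"
```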
```python
#| eval: false
from karma.core import _xdg_karma_dir

db_dir = _xdg_karma_dir()
print(f'karma data dir: {db_dir}')
# karma.db — practices, decisions, conversations
# codesearch.db — repo code sigs + chunks (all repos)
# env.db — installed package index + upgrade hints
```