A Go-based autonomous AI coding agent with a TUI and an HTTP API. Works best with A²gent/caesar as the control app.
- Comprehensive tool system:
  - File operations: `read`, `write`, `edit`, `replace_lines`
  - Search: `glob`, `grep`, `find_files`
  - Execution: `bash` command execution
  - Media: screenshot capture and camera photo capture
- Extensible architecture for custom/server-backed tools
- Agentic loop: task -> LLM with tools -> tool execution -> result feedback -> repeat
- A2A bridge support: canonical message endpoint + outbound tunnel-based chat + agent-card discovery
- Multi-provider support: Anthropic Claude, Kimi, Google Gemini, LM Studio, OpenAI-compatible endpoints
- Auto-router and fallback chain support for reliability
- In-session provider/model switching support (web app flow)
- SQLite persistence for sessions, messages, jobs, integrations, and app settings
- Session resumption and parent/child session relationships
- Recurring jobs and project-aware session organization
- Interactive terminal UI with status bar, model display, token/context metrics, and session timer
- Multi-line input and command palette behavior
- Live message stream with tool call/result rendering
- REST API for web-app integration
- Session management endpoints (create/list/resume/manage)
- Speech and integration plumbing (including Whisper-related flows)
- Lightweight runtime footprint
- Context window tracking and management
- Structured logging and practical failure handling
- Go 1.24+
- `just` command runner (install)
- API key for at least one remote provider (unless you use local LM Studio)
- For macOS camera features: Xcode Command Line Tools (`xcode-select --install`)
- For local audio in the web app (Whisper/Piper bootstrap): `cmake`, `ffmpeg`, `pkg-config`. Install on macOS:

```sh
brew install cmake ffmpeg pkg-config
```
```sh
git clone https://github.com/A2gent/brute.git
cd brute

# pick one provider key
export ANTHROPIC_API_KEY=sk-ant-...
# or
export KIMI_API_KEY=sk-kimi-...
# or
export GEMINI_API_KEY=...
```
```sh
# build + run (TUI + API)
just start
# build only
just build
# API only
just server
```

Build image:

```sh
docker build -t a2gent-brute:latest -f Dockerfile .
```

Run directly:
```sh
docker run --rm -it \
  --name a2gent-brute \
  --read-only \
  --tmpfs /tmp:exec,size=256m \
  -p 8080:8080 \
  -v "$PWD":/workspace \
  -v "$HOME/.a2gent-data":/data \
  a2gent-brute:latest
```

Run with compose helpers:
```sh
# API mode
just docker-api
# API mode with explicit LM Studio endpoint (useful for Tailscale IP)
just docker-api-lmstudio http://100.x.y.z:1234/v1
# interactive TUI mode (must run in a real terminal)
just docker-tui
# stop
just docker-api-down
# or
just docker-stop
```

Docker notes:
- `/workspace` is the agent-visible project tree.
- `/data` stores DB, logs, and config (`AAGENT_DATA_PATH=/data`).
- Runtime image is Alpine-based and includes `ffmpeg`.
- Default LM Studio URL in compose is `http://host.docker.internal:1234/v1`.
- For Tailscale-hosted LM Studio, prefer the direct Tailscale IP over the MagicDNS hostname.
Requirement: install the Apple `container` CLI (github.com/apple/container).
```sh
# build image
just apple-build
# API mode
just apple-api
# API mode with explicit LM Studio endpoint (recommended for Tailscale)
LM_STUDIO_BASE_URL=http://100.x.y.z:1234/v1 just apple-api
# interactive TUI mode
just apple-tui
# stop running brute containers
just apple-stop
# stop Apple container runtime VM
just apple-system-stop
```

Canonical location (single-folder layout with DB/logs):
| Location | Scope |
|---|---|
| `$AAGENT_DATA_PATH/config.json` | user-level |
Defaults:
- `AAGENT_DATA_PATH=~/.local/share/aagent`
- config: `~/.local/share/aagent/config.json`
- database: `~/.local/share/aagent/aagent.db`
- logs: `~/.local/share/aagent/logs/`

Backward-compatible read fallbacks are still supported:
- `.aagent/config.json`
- `~/.config/aagent/config.json`
The app loads `.env` files from:
- the current directory
- `~/.env`
Required (choose one for remote providers):
| Variable | Description |
|---|---|
| `ANTHROPIC_API_KEY` | Anthropic key |
| `KIMI_API_KEY` | Kimi key |
| `GEMINI_API_KEY` | Gemini key |
| `OPENAI_API_KEY` | OpenAI-compatible key |
Common optional variables:
| Variable | Default | Description |
|---|---|---|
| `AAGENT_PROVIDER` | `auto` | active provider (`anthropic`, `kimi`, `gemini`, `lmstudio`, `auto-router`) |
| `AAGENT_MODEL` | provider-specific | model override |
| `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` | Anthropic endpoint |
| `KIMI_BASE_URL` | `https://api.kimi.com/coding/v1` | Kimi endpoint |
| `GEMINI_BASE_URL` | `https://generativelanguage.googleapis.com` | Gemini endpoint |
| `LM_STUDIO_BASE_URL` | `http://localhost:1234/v1` | LM Studio endpoint |
| `AAGENT_DATA_PATH` | `~/.local/share/aagent` | data directory |
| `AAGENT_FALLBACK_PROVIDERS` | - | fallback chain list |
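Putting the variables above together, a minimal `.env` for the auto-router with a fallback chain might look like this (keys are from the table above; the values are placeholders, not defaults):

```sh
AAGENT_PROVIDER=auto-router
AAGENT_FALLBACK_PROVIDERS=anthropic,kimi
ANTHROPIC_API_KEY=sk-ant-...
KIMI_API_KEY=sk-kimi-...
# optional: point at a local LM Studio instance
LM_STUDIO_BASE_URL=http://localhost:1234/v1
```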
| Command | Description |
|---|---|
| `a2` | launch TUI |
| `a2 "<task>"` | start with an initial task |
| `a2 --continue <session-id>` | resume session |
| `a2 session list` | list sessions |
| `a2 logs` | show logs |
| `a2 logs -f` | follow logs |
Brute supports A2A communication in two modes:
- Protocol-style HTTP endpoint for inbound A2A messages
- Tunnel-backed outbound messaging to remote agents via Square
Discovery: `GET /.well-known/agent-card.json`; `supportedInterfaces[0].url` points to `/a2a/messages/send`.
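For illustration, a minimal agent card could look like the following; only `supportedInterfaces[0].url` is documented here, and the remaining fields are hypothetical:

```json
{
  "name": "brute",
  "supportedInterfaces": [
    { "url": "/a2a/messages/send" }
  ]
}
```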
Inbound endpoints:

| Method | Path | Description |
|---|---|---|
| POST | `/a2a/messages/send` | Handle canonical A2A message (`content[]`) and return final response |
| POST | `/a2a/messages/send/stream` | SSE stream for inbound A2A message lifecycle (accepted, running, final message) |
Outbound endpoints:

| Method | Path | Description |
|---|---|---|
| POST | `/a2a/outbound/sessions` | Create local outbound A2A session bound to a target agent |
| POST | `/a2a/outbound/sessions/{sessionID}/chat` | Send outbound A2A message (sync) |
| POST | `/a2a/outbound/sessions/{sessionID}/chat/stream` | Send outbound A2A message (SSE progress stream) |
- Canonical requests use `content[]` parts (`text`, `image_url`, `image_base64`)
- Image-only and text+image requests are supported
- For compatibility, brute still understands legacy bridge fields (`task`, `images`, `result`) used by existing tunnel/proxy flows
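A request body for `POST /a2a/messages/send` using the `content[]` part types listed above might look like this (the part types are documented; the exact envelope shape and URL are illustrative):

```json
{
  "content": [
    { "type": "text", "text": "Describe this screenshot" },
    { "type": "image_url", "image_url": "https://example.com/screen.png" }
  ]
}
```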
- Sessions are persisted in a single SQLite DB (`AAGENT_DATA_PATH/aagent.db`).
- Session fields include `id`, `agent_id`, `title`, `status`, timestamps, and optional `parent_id` and `job_id`.
- Grouping available now: parent/child sessions and job sessions.
- Not currently in the HTTP session API: first-class project/folder filtering.
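A session row with the fields above could serialize to something like the following; the field names match the list above, while the concrete values and timestamp column names are illustrative:

```json
{
  "id": "sess_0123",
  "agent_id": "brute-local",
  "title": "Refactor storage layer",
  "status": "completed",
  "created_at": "2025-01-01T12:00:00Z",
  "updated_at": "2025-01-01T12:30:00Z",
  "parent_id": null,
  "job_id": null
}
```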
DB path:

```sh
~/.local/share/aagent/aagent.db
# or
$AAGENT_DATA_PATH/aagent.db
```

Main tables: `sessions`, `messages`, `recurring_jobs`, `job_executions`, `app_settings`, `integrations`, `mcp_servers`, `projects`
Quick query:

```sh
sqlite3 ~/.local/share/aagent/aagent.db
```

```sql
SELECT id, title, status, created_at FROM sessions ORDER BY created_at DESC LIMIT 10;
```

Development commands:

```sh
just run   # run with go run
just dev   # API hot reload (air)
just build # build binary
just test  # run tests
just fmt   # go fmt
just lint  # go vet
```
```sh
# all tests
just test
# with race + coverage
go test -v -race -coverprofile=coverage.out ./...
# one package
go test -v ./internal/tools/...
```

Set one provider key:
```sh
export ANTHROPIC_API_KEY=sk-ant-...
# or KIMI_API_KEY / GEMINI_API_KEY / OPENAI_API_KEY
```

To use the auto-router with a fallback chain:

```sh
export AAGENT_PROVIDER=auto-router
export AAGENT_FALLBACK_PROVIDERS=anthropic,kimi,gemini
```

Run the TUI from a real interactive terminal (`just docker-tui`) or use API mode (`just docker-api`).

For LM Studio over Tailscale, use the Tailscale IP, not the MagicDNS hostname, e.g.:

```sh
just docker-api-lmstudio http://100.x.y.z:1234/v1
```

```
aagent/
├── cmd/aagent/     # CLI entry point
├── internal/
│   ├── agent/      # orchestrator and loop
│   ├── config/     # config management
│   ├── llm/        # provider clients
│   ├── logging/    # logs
│   ├── session/    # session model + manager
│   ├── storage/    # SQLite store
│   ├── tools/      # built-in tools
│   └── tui/        # Bubble Tea UI
├── justfile
└── README.md
```
- Fork the repository.
- Create a branch (`git checkout -b feature/your-change`).
- Commit and push your changes.
- Open a pull request.
License: MIT
| Channel | Contact |
|---|---|
| Founder Telegram | @tot_ra |
| X / Twitter | @tot_ra |
| Schedule Demo | https://calendly.com/artkurapov/30min |
| Email | artkurapov at gmail.com |