AI-powered coding assistant CLI and VS Code extension using Ollama with multi-engine research (DuckDuckGo, Google, SciraAI, SearXNG) and mem0 memory layer for persistent context across sessions

License

GPL-3.0, Unknown licenses found

Licenses found

GPL-3.0
LICENSE
Unknown
LICENSE.md
Notifications You must be signed in to change notification settings

penguintechinc/PenguinCode

Penguin Code Logo

Penguin Code

    ____                        _          ______          __
   / __ \___  ____  ____ ___  _(_)___     / ____/___  ____/ /__
  / /_/ / _ \/ __ \/ __ `/ / / / / __ \   / /   / __ \/ __  / _ \
 / ____/  __/ / / / /_/ / /_/ / / / / /  / /___/ /_/ / /_/ /  __/
/_/    \___/_/ /_/\__, /\__,_/_/_/ /_/   \____/\____/\__,_/\___/
                 /____/

AI-powered coding assistant CLI and VS Code extension using Ollama


Penguin Tech Inc © 2025

Features

  • 🤖 Multi-Agent System - ChatAgent orchestrates specialized Explorer/Executor agents
  • 🔍 Multi-Engine Research - 5 search engines with MCP protocol support
  • 🧠 Persistent Memory - mem0 integration for context across sessions
  • 📚 Documentation RAG - Auto-detects your project's languages and libraries, fetches official documentation, and uses it for accurate, syntax-correct answers
  • 🔌 MCP Integration - Extend with N8N, Flowise, and custom MCP servers
  • 🌐 Client-Server Mode - gRPC server for remote Ollama and team deployments
  • ⚡ GPU Optimized - Smart model switching for RTX 4060 Ti (8GB VRAM) or higher
  • 🐧 Cross-Platform - Works on Linux, macOS, and Windows
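The multi-agent flow above can be sketched roughly as follows. This is a minimal illustration only: the routing rule, method names, and class interfaces are assumptions for the sketch, not PenguinCode's actual API (the source only names ChatAgent, Explorer, and Executor).

```python
# Hypothetical sketch of a chat agent routing work to specialist agents.
# Class/method signatures are illustrative, not the real PenguinCode API.

class ExplorerAgent:
    """Read-only specialist: gathers context from the workspace."""
    def run(self, task: str) -> str:
        return f"context gathered for: {task}"

class ExecutorAgent:
    """Write-capable specialist: applies edits or runs commands."""
    def run(self, task: str) -> str:
        return f"executed: {task}"

class ChatAgent:
    """Orchestrator: decides which specialist handles each request."""
    def __init__(self) -> None:
        self.explorer = ExplorerAgent()
        self.executor = ExecutorAgent()

    def handle(self, request: str) -> str:
        # Naive keyword routing, purely for illustration.
        if request.startswith(("find", "explain", "where")):
            return self.explorer.run(request)
        return self.executor.run(request)

agent = ChatAgent()
print(agent.handle("find the config loader"))   # routed to ExplorerAgent
print(agent.handle("rename the main function")) # routed to ExecutorAgent
```

In practice the orchestrator would consult the LLM to pick an agent rather than match keywords, but the dispatch shape is the same.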

Supported Languages

| Language              | Detection                               | Doc Sources                        |
|-----------------------|-----------------------------------------|------------------------------------|
| Python                | `pyproject.toml`, `requirements.txt`, `*.py` | Official docs + PyPI libraries |
| JavaScript/TypeScript | `package.json`, `tsconfig.json`         | MDN, npm packages                  |
| Go                    | `go.mod`, `*.go`                        | go.dev, pkg.go.dev                 |
| Rust                  | `Cargo.toml`, `*.rs`                    | docs.rs, crates.io                 |
| OpenTofu/Terraform    | `*.tf`, `*.tofu`, `.terraform.lock.hcl` | OpenTofu docs, provider registries |
| Ansible               | `ansible.cfg`, `playbook.yml`, `requirements.yml` | Ansible docs, Galaxy collections |
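Marker-file detection of the kind shown in the table can be sketched as below. The marker filenames come from the table; the function itself is an illustrative assumption, not PenguinCode's implementation, and glob patterns like `*.py` are omitted for brevity.

```python
from pathlib import Path

# Marker files per language, taken from the detection table above
# (glob-style markers such as *.py are left out of this sketch).
MARKERS = {
    "python": ["pyproject.toml", "requirements.txt"],
    "javascript": ["package.json", "tsconfig.json"],
    "go": ["go.mod"],
    "rust": ["Cargo.toml"],
    "opentofu": [".terraform.lock.hcl"],
    "ansible": ["ansible.cfg", "playbook.yml"],
}

def detect_languages(project_dir: str) -> set[str]:
    """Return the languages whose marker files exist in project_dir."""
    root = Path(project_dir)
    return {
        lang
        for lang, files in MARKERS.items()
        if any((root / f).is_file() for f in files)
    }
```

A project containing a `pyproject.toml` and a `go.mod`, for example, would be detected as both Python and Go, and documentation would be fetched for each.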

Quick Start

# Install
pip install -e .
penguincode setup

# Pull required models (ollama pull takes one model per invocation)
ollama pull llama3.2:3b
ollama pull qwen2.5-coder:7b
ollama pull nomic-embed-text

# Run
penguincode chat

VS Code Extension: Download VSIX from Releases

Server Mode (Team Deployment)

# Start gRPC server (connects to local Ollama)
python -m penguincode.server.main

# Or use Docker
docker compose up -d

# Connect from client
penguincode chat --server localhost:50051

See Architecture Documentation for remote deployment with TLS and authentication.

Documentation

License

AGPL-3.0 - See LICENSE for details

Support: support.penguintech.io | Homepage: www.penguintech.io
