LocalAI - Local LLM Android App with Agentic Capabilities

A comprehensive React Native/Expo-based Android application for running AI models locally with advanced agentic capabilities and an extensible addon harness system.

Project Overview

LocalAI is a production-ready mobile AI platform that enables users to:

  • Chat with Local Models: Connect to Ollama servers or use local GGUF models
  • Run AI Agents: Execute autonomous agents with tool use and multi-step planning
  • Manage Models: Download and manage models from HuggingFace and Ollama
  • Extend Functionality: Install harnesses (addon bundles) to add tools and capabilities
  • Sync Data: Optional Supabase cloud sync for conversations and settings

Architecture

Core Components

1. Inference Engine (lib/inference.ts)

  • Unified interface for model inference
  • Supports Ollama (HTTP API) and local models
  • Streaming token output for real-time responses
  • Model listing and pulling capabilities
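
Ollama streams responses as newline-delimited JSON, one object per line, so the client has to buffer partial lines between network chunks. A minimal parser sketch (the function name and chunk shape here are illustrative; the actual implementation lives in lib/inference.ts):

```typescript
// Ollama's streaming endpoints emit newline-delimited JSON: one object per
// line, each carrying a `message.content` token and a `done` flag on the
// final chunk.
interface OllamaChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Combines the leftover buffer with a new network chunk and returns all
// complete parsed lines plus the remaining partial line.
function parseNdjsonChunk(
  buffer: string,
  chunk: string
): { events: OllamaChatChunk[]; rest: string } {
  const combined = buffer + chunk;
  const lines = combined.split("\n");
  const rest = lines.pop() ?? ""; // last element may be an incomplete line
  const events = lines
    .filter((l) => l.trim().length > 0)
    .map((l) => JSON.parse(l) as OllamaChatChunk);
  return { events, rest };
}
```

Feeding `rest` back in as the buffer on the next chunk keeps partial lines intact across reads, which is what makes token-by-token UI updates reliable.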

2. Agent Runtime (lib/agent.ts)

  • Plan-act-observe agentic loop
  • Built-in tools: calculator, datetime, memory
  • Tool call parsing and execution
  • Step-by-step execution tracing
  • Integration with Supabase for run history
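
The act step of the loop hinges on recognizing a tool call in the model's raw output. A sketch of that parsing, assuming the model is prompted to emit a `TOOL_CALL: {...}` line (the real wire format in lib/agent.ts may differ):

```typescript
// One decision point in the plan-act-observe loop: the model either emits a
// tool call (act) or plain text (final answer). The TOOL_CALL prefix is an
// assumed convention for this sketch, not necessarily the app's format.
interface ToolCall {
  tool: string;
  params: Record<string, unknown>;
}

function parseToolCall(modelOutput: string): ToolCall | null {
  const match = modelOutput.match(/TOOL_CALL:\s*(\{[\s\S]*\})/);
  if (!match) return null;
  try {
    const parsed = JSON.parse(match[1]);
    if (typeof parsed.tool !== "string") return null;
    return { tool: parsed.tool, params: parsed.params ?? {} };
  } catch {
    return null; // malformed JSON falls through as plain text output
  }
}
```

Returning `null` for anything unparseable means a confused model degrades to a normal chat reply instead of crashing the agent run.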

3. Harness System (lib/harness.ts)

  • Extensible addon manifest format
  • Three built-in harnesses:
    • Code Assistant: Code execution, file management
    • Research Agent: Web search, note-taking, citations
    • Data Analyst: CSV parsing, charting, statistics
  • Manifest validation and loading from URLs
  • Tool registration and sandboxing
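
Before a harness fetched from a URL is registered, its manifest needs a shape check. A minimal validator sketch, mirroring the HarnessManifest interface shown under "Harness Development" (the real validator in lib/harness.ts may enforce more):

```typescript
// Collects human-readable errors for a manifest fetched from a remote URL;
// an empty array means the manifest passed this basic shape check.
function validateManifest(raw: unknown): string[] {
  const errors: string[] = [];
  if (typeof raw !== "object" || raw === null) {
    return ["manifest must be a JSON object"];
  }
  const m = raw as Record<string, unknown>;
  if (typeof m.name !== "string" || m.name.length === 0) errors.push("name is required");
  if (typeof m.version !== "string") errors.push("version is required");
  if (typeof m.harness_type !== "string") errors.push("harness_type is required");
  if (!Array.isArray(m.tools)) {
    errors.push("tools must be an array");
  } else {
    m.tools.forEach((t, i) => {
      if (typeof (t as { name?: unknown })?.name !== "string") {
        errors.push(`tools[${i}].name is required`);
      }
    });
  }
  return errors;
}
```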

4. State Management (lib/store.ts)

  • Zustand-based global state
  • Persistent storage via AsyncStorage
  • Models, agents, extensions, conversations
  • Theme and Ollama URL configuration
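
The state bundles UI settings and active selections in one store. A hand-rolled sketch of the get/set/subscribe pattern Zustand provides, with an illustrative (not actual) slice of the lib/store.ts state shape:

```typescript
// Illustrative subset of the persisted app state; the real fields in
// lib/store.ts may differ. The set/subscribe pair mimics what Zustand
// provides, without the library dependency.
interface AppState {
  theme: "light" | "dark";
  ollamaUrl: string;
  activeModel: string | null;
}

type Listener = (state: AppState) => void;

function createStore(initial: AppState) {
  let state = initial;
  const listeners = new Set<Listener>();
  return {
    getState: () => state,
    setState: (partial: Partial<AppState>) => {
      state = { ...state, ...partial };
      listeners.forEach((l) => l(state)); // e.g. persist to AsyncStorage here
    },
    subscribe: (l: Listener) => {
      listeners.add(l);
      return () => listeners.delete(l);
    },
  };
}
```

In the app, the subscribe hook is where AsyncStorage persistence plugs in: every state change is serialized and written through, so theme and server URL survive restarts.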

Database Schema (Supabase)

Tables:

  • conversations: Chat session metadata
  • messages: Individual messages with tool call tracking
  • models: LLM models from Ollama/HuggingFace
  • agents: AI agent configurations
  • extensions: Installed harnesses with manifests
  • agent_runs: Agent execution history with step traces

All tables have RLS enabled with user-scoped access control.

Features

Tab-Based Navigation

  1. Chat Tab

    • Real-time message streaming
    • Model selection dropdown
    • Conversation persistence
    • System prompt configuration per conversation
  2. Agents Tab

    • Create/edit/delete agents
    • Configure system prompts
    • Select available tools and harnesses
    • View tool badges per agent
  3. Models Tab

    • Installed: Manage downloaded models
    • Ollama: Browse and pull available models
    • HuggingFace: Search (placeholder for future)
    • Download progress tracking
    • Model metadata and quantization info
  4. Extensions Tab

    • Installed: Enable/disable harnesses
    • Browse: Example harness gallery
    • Install harnesses from repo URLs
    • View harness manifests and tools
  5. Settings Tab

    • Theme toggle (light/dark)
    • Ollama server URL configuration
    • Connection testing
    • Inference parameters (temperature, max_tokens, etc.)
    • Cloud sync toggle
    • Account management

Setup and Installation

Prerequisites

  • Node.js 18+
  • Expo CLI (bundled with the expo package and invoked via npx expo; the standalone global expo-cli is deprecated)
  • Android development environment (for building APK)
  • Ollama server running (optional, for model inference)

Installation Steps

# Install dependencies
npm install

# Set up environment variables
# Copy .env.example to .env and fill in:
# EXPO_PUBLIC_SUPABASE_URL=your_supabase_url
# EXPO_PUBLIC_SUPABASE_ANON_KEY=your_anon_key

# Run on web
npm run dev

# Build for web
npm run build:web

# Type check
npm run typecheck

Environment Variables

Required .env variables:

EXPO_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
EXPO_PUBLIC_SUPABASE_ANON_KEY=your_anon_key_here

Optional for Ollama:

  • Default: http://localhost:11434 (from the Android emulator, use 10.0.2.2 or your machine's LAN IP instead of localhost)
  • Configurable in Settings tab

Harness Development

Create a new harness by implementing the HarnessManifest interface:

export interface HarnessManifest {
  name: string;
  version: string;
  harness_type: string; // unique identifier
  description?: string;
  author?: string;
  tools: HarnessToolDefinition[];
  system_prompt_injection?: string;
  required_permissions?: string[];
}

Example: Custom Harness

const MY_HARNESS: HarnessManifest = {
  name: 'My Tool Suite',
  version: '1.0.0',
  harness_type: 'my-tools',
  description: 'Custom tools for specific use case',
  author: 'Your Name',
  tools: [
    {
      name: 'my_tool',
      description: 'Does something useful',
      parameters: {
        input: { type: 'string' }
      },
      handler: async (params) => {
        // Tool implementation
        return JSON.stringify({ result: 'output' });
      }
    }
  ]
};

Publish to a GitHub repository and users can install via the repo URL.
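
Once registered, the agent runtime invokes a harness tool through its async handler. A self-contained, trimmed-down version of the my_tool example above, showing the call path:

```typescript
// A harness tool handler receives the parsed parameters and returns a
// JSON string that is fed back into the agent loop as an observation.
type ToolHandler = (params: Record<string, unknown>) => Promise<string>;

const myTool: { name: string; handler: ToolHandler } = {
  name: "my_tool",
  handler: async (params) =>
    JSON.stringify({ result: `processed:${params.input}` }),
};

// How the runtime dispatches a call to the handler; error handling and
// tracing are omitted for brevity.
async function runTool(
  tool: { handler: ToolHandler },
  params: Record<string, unknown>
): Promise<string> {
  return tool.handler(params);
}
```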

Key Features

Real-time Streaming

  • Token-by-token response streaming from Ollama
  • Live UI updates as model generates text
  • No waiting for full response before seeing output

Persistent Storage

  • Local AsyncStorage for UI state (theme, URLs, active selections)
  • Supabase for conversations, messages, and agent configurations
  • Automatic sync between devices for authenticated users

Security

  • Row-level security (RLS) on all Supabase tables
  • User-scoped data access only
  • No global access to other users' conversations/agents
  • Optional local-only mode (no cloud sync)

Performance Optimizations

  • FlatList with content size optimization for long message threads
  • Lazy loading of conversations
  • Debounced Ollama connection checks
  • Efficient state management with Zustand

Android-Specific Features

  • Haptic feedback on messages sent and agent steps completed
  • Network status detection
  • Proper back button handling
  • Storage permission management
  • Dark mode system integration

API Integration Points

Ollama API (lib/inference.ts)

  • GET /api/tags - List available models
  • POST /api/pull - Download models with streaming
  • POST /api/chat - Chat inference with streaming
  • POST /api/embed - Embedding generation
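
Per the public Ollama API, the chat endpoint takes a model name and message history and streams by default. A sketch of building the request (the default base URL matches the app's; helper names are illustrative):

```typescript
// Builds the fetch arguments for a streaming POST /api/chat call,
// following the public Ollama API request shape.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  baseUrl = "http://localhost:11434"
) {
  return {
    url: `${baseUrl}/api/chat`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages, stream: true }),
    },
  };
}
```

Pass the result to fetch(url, init) and read the response body as the newline-delimited JSON stream described above.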

Supabase REST API

  • Full CRUD operations on all tables
  • Real-time subscriptions (future implementation)
  • Authentication via email/password

Future Enhancements

  • Real-time message subscriptions
  • Agent run visualization with step replay
  • Model fine-tuning support
  • Voice input/output integration
  • HuggingFace model downloading
  • Custom harness upload UI
  • Agent scheduler for background execution
  • Multi-model ensemble agents
  • Local model quantization tools
  • Web UI for harness development

Project Structure

app/
├── _layout.tsx                 # Root layout with app state initialization
├── (tabs)/
│   ├── _layout.tsx            # Tab navigation setup
│   ├── chat/index.tsx         # Chat interface
│   ├── agents/index.tsx       # Agent management
│   ├── models/index.tsx       # Model browser and management
│   ├── extensions/index.tsx   # Harness/extension management
│   └── settings/index.tsx     # App settings
└── +not-found.tsx

lib/
├── supabase.ts                # Supabase client and types
├── inference.ts               # Ollama and model inference
├── agent.ts                   # Agentic runtime
├── harness.ts                 # Harness manifest system
├── store.ts                   # Zustand global state
├── theme.ts                   # Design tokens
└── android.ts                 # Android-specific utilities

components/                     # Shared UI components (future)
hooks/
└── useFrameworkReady.ts       # Expo framework initialization

License

MIT

Contributing

Contributions welcome! Please follow the existing code style and add tests for new features.

Support

For issues and feature requests, please open a GitHub issue.


Built with Expo, React Native, and Supabase
