nhlpl/upLLM
Let's put Ultramemory to work. Here's a complete, runnable example that integrates Ultramemory with your Pydantic AI learning agent powered by DeepSeek. This demonstrates the key production features we've discussed: learning with versioning, time-aware recall, and automatic conflict resolution.

🧪 Step 1: Setup

First, install the required packages. Ultramemory uses LiteLLM under the hood, so it works seamlessly with DeepSeek.

pip install "ultramemory[local]" pydantic-ai openai duckduckgo-search

💻 Step 2: The Complete Code

Save this as learning_agent_ultramemory.py. It's a fully functional agent that learns from web searches and maintains a versioned, time-aware memory.

# learning_agent_ultramemory.py
import asyncio
import os
from dataclasses import dataclass, field
from typing import List, Optional
from datetime import datetime

from pydantic_ai import Agent, RunContext
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.common_tools.duckduckgo import duckduckgo_search_tool
from pydantic_ai.messages import ModelMessage

# Import Ultramemory
from ultramemory import UltraMemory

# ============================================================================
# 1. Memory Manager Wrapper (Ultramemory)
# ============================================================================
class UltraMemoryManager:
    """A thin wrapper around Ultramemory for our agent's tools."""
    
    def __init__(self, db_path: str = "./agent_memory.db"):
        # Initialize Ultramemory with DeepSeek as the extraction LLM
        # It will use local embeddings (all-MiniLM-L6-v2) for search by default
        self.memory = UltraMemory(
            db_path=db_path,
            model="deepseek-chat",  # Uses DeepSeek for fact extraction & relation detection
            embedding_model="all-MiniLM-L6-v2",  # Free local embeddings
            api_key=os.getenv("DEEPSEEK_API_KEY"),
            api_base="https://api.deepseek.com/v1"
        )
        self.agent_id = "learning_agent_001"
    
    async def learn(self, content: str, source: str = "web_search") -> dict:
        """Ingest new information. Ultramemory extracts facts and detects relations."""
        result = await self.memory.learn(
            content,
            agent_id=self.agent_id,
            metadata={"source": source, "ingested_at": datetime.now().isoformat()}
        )
        return result  # Contains 'facts_extracted', 'relations_detected', etc.
    
    async def recall(self, query: str, as_of: Optional[str] = None, k: int = 5) -> List[dict]:
        """Time-aware semantic search. Returns only current, non-superseded facts."""
        results = await self.memory.search(
            query,
            agent_id=self.agent_id,
            as_of=as_of,
            limit=k,
            resolve_versions=True  # Automatically filter out superseded facts
        )
        return results
    
    async def get_entity_profile(self, entity: str) -> dict:
        """Get a consolidated, conflict-free profile of everything known about an entity."""
        return await self.memory.get_entity_profile(entity, agent_id=self.agent_id)

# ============================================================================
# 2. Agent Dependencies
# ============================================================================
@dataclass
class LearningDeps:
    conversation_history: List[ModelMessage] = field(default_factory=list)
    memory_manager: UltraMemoryManager = field(default_factory=UltraMemoryManager)

# ============================================================================
# 3. Custom Tools (Leveraging Ultramemory)
# ============================================================================
async def remember_fact(ctx: RunContext[LearningDeps], fact: str) -> str:
    """Store a new piece of information in the versioned memory."""
    result = await ctx.deps.memory_manager.learn(fact)
    
    facts_count = len(result.get("facts_extracted", []))
    relations = result.get("relations_detected", [])
    
    response = f"✅ Extracted {facts_count} fact(s)."
    for rel in relations:
        if rel["type"] == "updates":
            response += f"\n📝 Updated existing fact: '{rel['old_fact']}' → '{rel['new_fact']}'"
        elif rel["type"] == "contradicts":
            response += f"\n⚠️ Contradiction detected! New fact '{rel['new_fact']}' conflicts with '{rel['old_fact']}'. Memory updated."
        elif rel["type"] == "extends":
            response += f"\n➕ Extended knowledge about '{rel['parent_fact']}'."
    return response

async def recall_knowledge(ctx: RunContext[LearningDeps], query: str, as_of: Optional[str] = None) -> str:
    """Search the memory. Use as_of for historical queries (e.g., '2025-06-01')."""
    results = await ctx.deps.memory_manager.recall(query, as_of=as_of, k=3)
    
    if not results:
        return f"❌ No knowledge found for '{query}'."
    
    response = f"📖 Found {len(results)} relevant fact(s):\n"
    for i, fact in enumerate(results, 1):
        response += f"{i}. {fact['text']} (confidence: {fact.get('confidence', 'N/A')})\n"
        if fact.get('event_date'):
            response += f"   📅 Event date: {fact['event_date']}\n"
    return response

async def get_profile(ctx: RunContext[LearningDeps], entity: str) -> str:
    """Get a consolidated summary of everything known about a person, place, or thing."""
    profile = await ctx.deps.memory_manager.get_entity_profile(entity)
    
    if not profile or not profile.get("facts"):
        return f"❌ No profile found for '{entity}'."
    
    response = f"🧠 Profile: {entity}\n"
    for fact in profile["facts"]:
        response += f"• {fact['text']} (source: {fact.get('metadata', {}).get('source', 'unknown')})\n"
    return response

# ============================================================================
# 4. Agent Setup
# ============================================================================
# Recent pydantic-ai versions configure base_url/api_key through a provider
from pydantic_ai.providers.openai import OpenAIProvider

model = OpenAIModel(
    "deepseek-chat",
    provider=OpenAIProvider(
        base_url="https://api.deepseek.com/v1",
        api_key=os.getenv("DEEPSEEK_API_KEY"),
    ),
)

learning_agent = Agent(
    model=model,
    deps_type=LearningDeps,
    tools=[duckduckgo_search_tool(), remember_fact, recall_knowledge, get_profile],
    system_prompt=(
        "You are a Learning Agent with a production-grade memory powered by Ultramemory. "
        "Your memory tracks how facts change over time and resolves contradictions automatically.\n\n"
        "For EVERY user question, follow this workflow:\n"
        "1. Use `recall_knowledge` to check if you already know something about this topic.\n"
        "2. If not, or if the user wants current info, use `duckduckgo_search_tool` to gather facts.\n"
        "3. Use `remember_fact` to store any new, verified information.\n"
        "4. Provide a complete, sourced answer.\n\n"
        "You can also use `get_profile` to retrieve a summary about a specific entity."
    )
)

# ============================================================================
# 5. Interactive Session
# ============================================================================
async def main():
    print("🧠 Ultramemory-Powered Learning Agent with DeepSeek")
    print("   Features: Versioned facts, conflict resolution, time-aware search")
    print("   Commands: /profile <entity> | /as_of YYYY-MM-DD | /quit\n")
    
    deps = LearningDeps()
    
    # Set an optional as_of date for historical queries
    current_as_of = None
    
    while True:
        user_input = input("\n👤 You: ").strip()
        if user_input.lower() in ("/quit", "/exit", "/q"):
            break
        
        # Special commands
        if user_input.startswith("/profile "):
            entity = user_input[9:]
            # RunContext is built by the framework during a run, so call the
            # memory manager directly rather than constructing one by hand.
            profile = await deps.memory_manager.get_entity_profile(entity)
            print(f"\n🧠 Profile: {entity}")
            for fact in (profile or {}).get("facts", []):
                print(f"• {fact['text']}")
            continue
        
        if user_input.startswith("/as_of "):
            current_as_of = user_input[7:]
            print(f"🕒 Time-travel mode enabled. Searching as of: {current_as_of}")
            continue
        
        print("\n🤖 Agent: ", end="", flush=True)
        
        # Inject as_of context into the prompt if set
        prompt = user_input
        if current_as_of:
            prompt = f"[Time context: Answer as if the date is {current_as_of}]\n{user_input}"
        
        result = await learning_agent.run(
            prompt,
            deps=deps,
            message_history=deps.conversation_history
        )
        
        deps.conversation_history = result.all_messages()
        print(result.output)
        
        # Reset as_of after one query (or keep it - your choice)
        # current_as_of = None

if __name__ == "__main__":
    asyncio.run(main())

🚀 Step 3: Run the Agent

Set your DeepSeek API key as an environment variable and run the script:

export DEEPSEEK_API_KEY="your-api-key"
python learning_agent_ultramemory.py

🧪 Step 4: Test the Production Features

Here are a few interactions that showcase Ultramemory's power:

1. Learning and Versioning

👤 You: What is the population of Paris?
🤖 Agent: Let me search for that. [Uses web search, returns answer]
✅ Extracted 1 fact(s).

👤 You: I just read that the population of Paris is actually 2.16 million, not 2.1 million.
🤖 Agent: Let me update that.
📝 Updated existing fact: 'Paris population is 2.1 million' → 'Paris population is 2.16 million'

2. Time-Travel Queries

👤 You: /as_of 2025-01-01
🕒 Time-travel mode enabled. Searching as of: 2025-01-01

👤 You: What was the population of Paris?
🤖 Agent: [Recalls the fact as it was known on that date, not the current one]
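A time-travel query is then just a filter over those version timestamps: a fact is visible at time T if it was learned at or before T and not yet superseded at T. A sketch of that filter (illustrative only; the `valid_from`/`superseded_at` field names are assumptions, not Ultramemory's schema):

```python
from datetime import datetime

def visible_as_of(facts: list[dict], as_of: str) -> list[dict]:
    """Return the facts that were current on the given ISO date."""
    t = datetime.fromisoformat(as_of)
    out = []
    for f in facts:
        learned = datetime.fromisoformat(f["valid_from"])
        superseded = f.get("superseded_at")
        if learned <= t and (superseded is None or datetime.fromisoformat(superseded) > t):
            out.append(f)
    return out

facts = [
    {"text": "Paris population is 2.1 million",
     "valid_from": "2024-06-01", "superseded_at": "2025-03-01"},
    {"text": "Paris population is 2.16 million",
     "valid_from": "2025-03-01", "superseded_at": None},
]

print([f["text"] for f in visible_as_of(facts, "2025-01-01")])
# ['Paris population is 2.1 million'] -- the answer as it was known back then
print([f["text"] for f in visible_as_of(facts, "2025-06-01")])
# ['Paris population is 2.16 million'] -- the current version
```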

3. Entity Profiles

👤 You: /profile Paris
🧠 Profile: Paris
• Paris population is 2.16 million (source: web_search)
• Paris is the capital of France (source: web_search)
• The Eiffel Tower is located in Paris (source: web_search)
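Conceptually, a profile like this is a grouped, deduplicated view over the current (non-superseded) facts about one entity. A toy version of that consolidation (an illustration of the idea, not the library's code):

```python
def build_profile(entity: str, facts: list[dict]) -> list[str]:
    """Collect current facts mentioning an entity, dropping exact duplicates."""
    seen: set[str] = set()
    profile = []
    for f in facts:
        if f.get("superseded_at") is not None:
            continue  # skip old versions
        if entity.lower() not in f["text"].lower():
            continue  # only facts about this entity
        if f["text"] not in seen:
            seen.add(f["text"])
            profile.append(f["text"])
    return profile

facts = [
    {"text": "Paris is the capital of France", "superseded_at": None},
    {"text": "Paris population is 2.1 million", "superseded_at": "2025-03-01"},
    {"text": "Paris population is 2.16 million", "superseded_at": None},
    {"text": "Paris is the capital of France", "superseded_at": None},  # duplicate
]
print(build_profile("Paris", facts))
# ['Paris is the capital of France', 'Paris population is 2.16 million']
```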

💎 What This Demonstrates

  • Automatic Fact Extraction: the agent passes raw text to remember_fact; Ultramemory extracts atomic facts.
  • Conflict Resolution: when a fact changes, Ultramemory detects updates or contradicts relations and reports them.
  • Time-Aware Recall: with /as_of, you can query the memory as it existed at any point in the past.
  • Entity Summaries: /profile gives a clean, deduplicated view of everything known about an entity.
  • Local-First: everything lives in agent_memory.db, a single SQLite file. No cloud required.

🔧 Production Considerations

  • Embeddings: The example uses all-MiniLM-L6-v2 (384 dimensions) for local embeddings. For better accuracy, you can switch to a paid embedding API in the UltraMemory constructor.
  • Scaling: Ultramemory is designed for single-machine deployments. For high-throughput multi-agent systems, consider using its REST API server mode.
  • Backup: Since all data is in a single SQLite file, backing up is as simple as copying agent_memory.db.
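On the backup point: copying agent_memory.db is fine while the agent is stopped, but for a live database SQLite's online backup API takes a consistent snapshot without pausing writers. Python's standard library exposes it directly:

```python
import sqlite3

def backup_memory(src_path: str = "agent_memory.db",
                  dest_path: str = "agent_memory.backup.db") -> None:
    """Take a consistent snapshot of a (possibly live) SQLite database."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    with dest:
        src.backup(dest)  # SQLite online backup: safe while src is in use
    dest.close()
    src.close()
```

Run it from a cron job (or call it before deployments) and you get point-in-time snapshots without taking the agent offline.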

This is a solid foundation for a production-ready learning agent. Natural next steps include deploying the memory as a separate service, adding custom fact extractors, or integrating with a vector database at larger scale.

About

Ultramemory with Pydantic AI learning agent powered by DeepSeek
