Repository files navigation

Mistral AI Test Project


This project provides a scaffold for testing Mistral AI services. All development and examples have been created using Mistral AI Vibe CLI 2.2.1 and the Devstral 2 model.

Powered by:

  • 🤖 Mistral AI Vibe CLI 2.2.1
  • 🧠 Devstral 2 Model

Features

  • Python 3.12+ environment with venv
  • Mistral AI Vibe CLI 2.2.1 integration with determinism control
  • Devstral 2 model support
  • Git version control
  • Environment variable management with python-dotenv
  • Code formatting with Black
  • Linting with Ruff
  • Type checking with mypy
  • Testing with pytest
  • Determinism controller for precise AI response control
  • Streaming responses for real-time output
  • Performance metrics tracking (tokens, duration, etc.)
  • Comprehensive error handling and validation
  • Multiple response modes (regular, streaming, with metrics)
  • Standardized output with colorama for better UX
  • Batch processing with JSONL format support
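
The batch feature above expects requests in JSONL format (one JSON object per line). A minimal sketch of building such a file, assuming a simple request shape; the exact schema used by the project's batch examples may differ:

```python
import json

def write_batch_file(prompts, path="batch_requests.jsonl"):
    """Write one JSON request object per line (JSONL)."""
    with open(path, "w", encoding="utf-8") as f:
        for i, prompt in enumerate(prompts):
            request = {
                "custom_id": f"request-{i}",
                "messages": [{"role": "user", "content": prompt}],
            }
            f.write(json.dumps(request) + "\n")

write_batch_file(["Summarize AI safety.", "Explain embeddings."])
```

Each line is an independent JSON document, so a batch job can stream and process requests without loading the whole file.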

Setup

  1. Clone the repository:

    git clone https://github.com/ibitato/MistralAITests.git
    cd MistralAITests
  2. Create and activate virtual environment:

    python3 -m venv .venv
    source .venv/bin/activate
  3. Install dependencies:

    pip install -r requirements.txt
    pip install -r requirements-dev.txt
  4. Set up environment variables:

    cp .env.example .env
    # Edit .env with your Mistral AI API key
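
Step 4 relies on python-dotenv to load the key at runtime. The sketch below mimics that lookup using only the standard library (the project itself uses python-dotenv's `load_dotenv()`); the variable name `MISTRAL_API_KEY` is an assumption:

```python
import os

def load_env_file(path=".env"):
    """Parse KEY=VALUE lines, mirroring what python-dotenv's load_dotenv() does."""
    if not os.path.exists(path):
        return
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and malformed lines
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

load_env_file()
api_key = os.environ.get("MISTRAL_API_KEY")
```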

Usage

Run tests:

pytest

Format code:

black .

Lint code:

ruff check .

Type check:

mypy .

Run example with determinism control:

python src/example_determinism.py

New Features

Streaming Responses

Get real-time responses chunk by chunk:

from src.mistral_client import MistralAIClient

client = MistralAIClient(api_key="your_api_key")
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Tell me a story about AI."}
]

# Chunks print as they arrive; flush=True keeps output real-time.
print("Streaming response:")
for chunk in client.chat_completion_stream(messages):
    print(chunk, end='', flush=True)

Performance Metrics

Get detailed metrics about your API calls:

result = client.chat_completion_with_metrics(messages)
print(f"Response: {result['content']}")
print(f"Duration: {result['duration']:.3f} seconds")
print(f"Tokens used: {result['tokens']['total']}")
print(f"Response time: {result['metrics']['response_time_ms']:.1f} ms")
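
The metrics dict makes it easy to derive throughput. A small helper, assuming the result shape shown in the example above:

```python
def tokens_per_second(result):
    """Derive throughput from a chat_completion_with_metrics()-style result."""
    duration = result["duration"]
    total_tokens = result["tokens"]["total"]
    if duration <= 0:
        return 0.0
    return total_tokens / duration

# Hypothetical result illustrating the assumed shape
sample = {"content": "...", "duration": 2.0, "tokens": {"total": 150},
          "metrics": {"response_time_ms": 2000.0}}
print(f"{tokens_per_second(sample):.1f} tokens/s")  # 75.0 tokens/s
```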

Enhanced Error Handling

try:
    response = client.chat_completion([])  # An empty message list fails validation
except ValueError as e:
    print(f"Validation error: {e}")

try:
    response = client.chat_completion(messages)
except RuntimeError as e:
    # API and network failures are surfaced as RuntimeError by the client
    print(f"API error: {e}")

Project Structure

MistralAITests/
├── .venv/                     # Virtual environment
├── .git/                      # Git repository
├── .gitignore                 # Git ignore rules
├── .env                       # Environment variables
├── .env.example               # Example environment variables
├── src/                       # Source code
│   ├── __init__.py
│   ├── mistral_client.py      # Mistral AI client
│   ├── determinism_controller.py # Determinism control
│   └── utils.py               # Utilities
├── tests/                     # Tests
│   ├── __init__.py
│   └── test_mistral.py        # Test cases
├── README.md                  # Documentation
├── requirements.txt           # Production dependencies
├── requirements-dev.txt       # Development dependencies
└── pyproject.toml             # Project configuration

📊 Mistral AI SDK Capabilities Coverage

This project demonstrates comprehensive coverage of Mistral AI SDK capabilities:

✅ Implemented Features

| Category | Feature | Status | Example | Coverage |
|----------|---------|--------|---------|----------|
| Core Chat | Basic chat completion | ✅ | example_determinism.py | Complete |
| Core Chat | Streaming responses | ✅ | chat_completion_stream() | Complete |
| Core Chat | Performance metrics | ✅ | chat_completion_with_metrics() | Complete |
| Core Chat | Determinism control | ✅ | determinism_controller.py | Complete |
| Core Chat | Temperature control | ✅ | All examples | Complete |
| Tool Calling | Function calling | ✅ | example_tool_calling.py | Complete |
| Tool Calling | Tool execution | ✅ | execute_tool_calls() | Complete |
| Tool Calling | Parallel tools | ⚠️ Partial | | Planned |
| Vision | Image analysis | ✅ | example_vision.py | Complete |
| Vision | Multimodal chat | ✅ | vision_with_text() | Complete |
| Batch Processing | Batch jobs | ✅ | example_batch_processing.py | Complete |
| Batch Processing | Status monitoring | ✅ | check_batch_status() | Complete |
| Document Intelligence | OCR processing | ✅ | example_advanced_ocr.py | Complete |
| Document Intelligence | PDF metadata | ✅ | example_complex_pdf.py | Complete |
| Document Intelligence | Table extraction | ✅ | example_complex_pdf.py | Complete |
| Document Intelligence | Structure analysis | ✅ | example_complex_pdf.py | Complete |
| Document Intelligence | PDF to JSON | ✅ | example_complex_pdf.py | Complete |
| Reasoning | Step-by-step reasoning | ✅ | example_determinism.py | Complete |
| Reasoning | Thinking process | ✅ | All determinism levels | Complete |
| Embeddings | Text embeddings | ✅ | embeddings() | Basic |
| File Management | Document upload | ✅ | document_manager.py | Complete |
| File Management | File listing | ✅ | list_documents() | Complete |
| File Management | File retrieval | ✅ | get_document_info() | Complete |
| File Management | File deletion | ✅ | delete_document() | Complete |
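
The execute_tool_calls() entry in the table refers to routing model tool calls to local Python functions. A hedged sketch of that dispatch pattern, with a hypothetical get_weather tool; the project's actual implementation in example_tool_calling.py may differ:

```python
import json

# Local functions the model is allowed to call (hypothetical example tool)
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def execute_tool_calls(tool_calls):
    """Run each requested tool and collect 'tool' role messages for the follow-up turn."""
    results = []
    for call in tool_calls:
        func = TOOLS[call["name"]]          # Look up the requested function
        args = json.loads(call["arguments"])  # Arguments arrive as a JSON string
        results.append({"role": "tool", "name": call["name"],
                        "content": func(**args)})
    return results

calls = [{"name": "get_weather", "arguments": '{"city": "Paris"}'}]
print(execute_tool_calls(calls))
```

The returned tool messages are appended to the conversation and sent back to the model, which then produces the final user-facing answer.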

⏳ Planned Features

| Category | Feature | Status | Priority |
|----------|---------|--------|----------|
| Agents API | Web search | 🔄 | High |
| Agents API | Code execution | 🔄 | High |
| Agents API | Multi-agent workflows | 🔄 | Medium |
| RAG | Document retrieval | 🔄 | High |
| RAG | Vector database | 🔄 | High |
| RAG | Hybrid search | 🔄 | Medium |
| Advanced | JSON mode | 🔄 | Medium |
| Advanced | Function calling v2 | 🔄 | Medium |
| Advanced | Parallel function calls | 🔄 | High |
| Advanced | Structured outputs | 🔄 | Medium |

📊 Coverage Statistics

  • Core Features: 18/18 implemented (100%)
  • Advanced Features: 8/16 implemented (50%)
  • Document Intelligence: 6/6 implemented (100%)
  • Reasoning: 2/2 implemented (100%)
  • Overall: 26/32 features (81% coverage)

Project Information

Developer: David R. Lopez B.
Email: ibitato@gmail.com
Tools: Mistral AI Vibe CLI 2.2.1 with Devstral 2 Medium LLM

Determinism Control

This project includes a determinism controller that allows fine-grained control over AI response creativity vs. precision:

  • Level 1 (Exact): Deterministic responses, minimal variation
  • Level 2 (Focused): Highly controlled generation, minimal creativity
  • Level 3 (Balanced): Balanced generation (default)
  • Level 4 (Creative): More freedom and variation
  • Level 5 (Free): Highly creative, maximum variation

The determinism controller automatically handles Mistral AI requirements, such as setting top_p=1.0 when using greedy sampling (temperature=0.0).
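
A minimal sketch of how such a level-to-parameter mapping might look; the actual values in determinism_controller.py may differ, but the top_p=1.0 rule for greedy sampling matches the behavior described above:

```python
# Hypothetical mapping of determinism level to sampling parameters
LEVELS = {
    1: {"temperature": 0.0, "top_p": 1.0},   # Exact: greedy sampling
    2: {"temperature": 0.3, "top_p": 0.9},   # Focused
    3: {"temperature": 0.7, "top_p": 0.95},  # Balanced (default)
    4: {"temperature": 1.0, "top_p": 0.95},  # Creative
    5: {"temperature": 1.3, "top_p": 1.0},   # Free
}

def sampling_params(level: int = 3) -> dict:
    """Return sampling parameters for a level, enforcing the greedy-sampling rule."""
    params = dict(LEVELS[level])
    if params["temperature"] == 0.0:
        params["top_p"] = 1.0  # Mistral AI requires top_p=1.0 with temperature=0.0
    return params
```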

🎯 Development Credits

This project and all examples were developed using:

Mistral AI Vibe CLI 2.2.1 - The powerful CLI tool that enabled rapid development and testing of Mistral AI services.

Devstral 2 Model - The advanced language model that powers all the AI responses and demonstrations in this project.

About Mistral AI

Mistral AI provides state-of-the-art language models and tools for building intelligent applications. This project demonstrates best practices for integrating with Mistral AI services using their official Python SDK and CLI tools.

Development Environment

  • CLI Tool: Mistral AI Vibe CLI 2.2.1
  • Model: Devstral 2
  • SDK: mistralai Python package
  • Framework: Python 3.12+

Developer

David R. Lopez B.

License

MIT
