AI Test Case Generator

An intelligent test case generation system powered by open-source LLMs served through Ollama, with a FAISS vector database for semantic search and context-aware test case creation.

πŸš€ Features

  • AI-Powered Test Generation: Leverages Ollama-served LLMs (Llama 2, Mistral) for intelligent test case creation
  • Vector-Based Context Search: Uses FAISS for semantic similarity search over the knowledge base
  • Knowledge Base Integration: Processes documentation and examples to generate contextually relevant test cases
  • RESTful API: Flask-based backend with CORS support (see the sketch below)
  • Token Usage Tracking: Built-in token counter for monitoring LLM usage
  • Extensible Architecture: Modular design for easy integration and customization
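
To make the backend bullet concrete, here is a minimal sketch of what a Flask entry point with CORS might look like. It mirrors the endpoints documented below, but the handler bodies and helper logic are illustrative assumptions, not the actual app.py:

# Minimal Flask + CORS sketch mirroring the documented endpoints.
# Handler internals are hypothetical; see the real app.py for details.
from flask import Flask, jsonify, request
from flask_cors import CORS

app = Flask(__name__)
CORS(app)  # allow the frontend to call the API from another origin

@app.route("/health", methods=["GET"])
def health():
    # The real implementation also reports Ollama connectivity and token stats.
    return jsonify({"status": "ok"})

@app.route("/generate-test-cases", methods=["POST"])
def generate():
    payload = request.get_json(force=True)
    requirements = payload.get("requirements", "")
    # Hypothetical flow: retrieve context from FAISS, then prompt the Ollama model.
    return jsonify({"test_cases": [], "requirements": requirements})

if __name__ == "__main__":
    app.run(port=5000)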

πŸ—οΈ Architecture

AITestCaseGenerator/
β”œβ”€β”€ backend/                    # Flask backend application
β”‚   β”œβ”€β”€ app.py                 # Main Flask application
β”‚   β”œβ”€β”€ vector_store.py        # FAISS vector store implementation
β”‚   β”œβ”€β”€ token_counter.py       # Token usage tracking
β”‚   β”œβ”€β”€ knowledge_base/        # Documentation and examples
β”‚   β”‚   β”œβ”€β”€ overall_functionality.txt
β”‚   β”‚   β”œβ”€β”€ test_case_examples/
β”‚   β”‚   └── best_practices/
β”‚   β”œβ”€β”€ vector_store/          # FAISS index files (generated)
β”‚   └── requirements.txt       # Python dependencies
β”œβ”€β”€ AITestAgent/               # Frontend application
└── README.md

πŸ› οΈ Prerequisites

  • Python 3.8+
  • Ollama installed and running
  • Git (for version control)

Ollama Models Required

  • llama2:latest
  • mistral:latest

πŸ“¦ Installation

  1. Clone the repository

    git clone <repository-url>
    cd AITestCaseGenerator
  2. Set up Python virtual environment

    cd backend
    python3 -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install dependencies

    pip install -r requirements.txt
  4. Install and start Ollama

    # Install Ollama (macOS)
    brew install ollama
    
    # Start Ollama service
    ollama serve
    
    # Pull required models
    ollama pull llama2
    ollama pull mistral
  5. Initialize the vector store

    python -c "from vector_store import initialize_vector_store; initialize_vector_store()"

πŸš€ Usage

Starting the Backend Server

cd backend
source venv/bin/activate
python app.py

The server will start on http://localhost:5000

API Endpoints

Health Check

GET /health

Returns system status including Ollama connectivity and token usage statistics.
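
A quick way to exercise it from Python (the fields in the printed payload are illustrative, since the exact response shape is defined in app.py):

# Query the health endpoint; printed fields depend on the actual app.py.
import requests

resp = requests.get("http://localhost:5000/health", timeout=5)
print(resp.json())  # e.g. {"status": "ok", "ollama": "connected", "total_tokens": 1234}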

Generate Test Cases

POST /generate-test-cases
Content-Type: application/json

{
  "requirements": "User login functionality with email validation",
  "test_type": "functional",
  "complexity": "medium"
}
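
A minimal client call from Python, using the JSON body shown above (the response shape is an assumption; inspect the actual reply from your server):

# Send a generation request; payload keys mirror the example above.
import requests

payload = {
    "requirements": "User login functionality with email validation",
    "test_type": "functional",
    "complexity": "medium",
}
resp = requests.post("http://localhost:5000/generate-test-cases", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json())  # response shape depends on app.py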

Vector Store Operations

The FAISS vector store automatically processes documents from the knowledge_base/ directory:

  • Functionality specifications: overall_functionality.txt
  • Test case examples: test_case_examples/
  • Best practices: best_practices/
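
During initialization, each document is typically split into overlapping chunks before being embedded into the index. The sketch below shows that step using a LangChain-style splitter with the CHUNK_SIZE/CHUNK_OVERLAP values from the configuration section; this is an assumption about how vector_store.py works, not a copy of it:

# Hypothetical ingestion sketch; the real logic lives in vector_store.py.
from pathlib import Path
from langchain.text_splitter import RecursiveCharacterTextSplitter

splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
chunks = []
for path in Path("knowledge_base").rglob("*.txt"):
    text = path.read_text(encoding="utf-8")
    chunks.extend(splitter.split_text(text))
print(f"{len(chunks)} chunks ready for embedding into the FAISS index")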

πŸ”§ Configuration

Environment Variables

Create a .env file in the backend directory:

OLLAMA_BASE_URL=http://localhost:11434
VECTOR_STORE_PATH=./vector_store
KNOWLEDGE_BASE_PATH=./knowledge_base
CHUNK_SIZE=1000
CHUNK_OVERLAP=200
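
These values can be read with python-dotenv, as sketched below (assuming the backend loads them this way; adjust to match the actual code):

# Load .env values with python-dotenv; defaults mirror the settings above.
import os
from dotenv import load_dotenv

load_dotenv()  # reads backend/.env when run from that directory
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
VECTOR_STORE_PATH = os.getenv("VECTOR_STORE_PATH", "./vector_store")
CHUNK_SIZE = int(os.getenv("CHUNK_SIZE", "1000"))
CHUNK_OVERLAP = int(os.getenv("CHUNK_OVERLAP", "200"))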

Ollama Configuration

Ensure Ollama is running and accessible:

# Check Ollama status
curl http://localhost:11434/api/tags

# Test model availability
ollama list

πŸ“Š Monitoring

Token Usage Tracking

The system tracks token usage so you can monitor model consumption (useful even though local Ollama models incur no per-token cost):

# View token usage statistics
curl http://localhost:5000/health
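
Conceptually, a running counter of this kind can be as simple as the class below. This is a sketch of the idea, not the actual token_counter.py:

# Sketch of a running token counter; the real token_counter.py may differ.
class TokenCounter:
    def __init__(self):
        self.prompt_tokens = 0
        self.completion_tokens = 0

    def record(self, prompt_tokens: int, completion_tokens: int) -> None:
        # Accumulate counts reported after each LLM call.
        self.prompt_tokens += prompt_tokens
        self.completion_tokens += completion_tokens

    def stats(self) -> dict:
        return {
            "prompt_tokens": self.prompt_tokens,
            "completion_tokens": self.completion_tokens,
            "total_tokens": self.prompt_tokens + self.completion_tokens,
        }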

Vector Store Statistics

from vector_store import get_vector_store

vs = get_vector_store()
stats = vs.get_stats()
print(stats)

πŸ§ͺ Development

Adding New Knowledge Base Content

  1. Add documents to appropriate directories in knowledge_base/
  2. Reinitialize the vector store:
    from vector_store import initialize_vector_store
    initialize_vector_store(force_recreate=True)

Testing Vector Search

from vector_store import get_vector_store

vs = get_vector_store()
# similarity_search returns matching Documents (no scores)
results = vs.similarity_search("user authentication test cases", k=5)
for doc in results:
    print(f"Metadata: {doc.metadata}")
    print(f"Content: {doc.page_content[:200]}...")
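
If the store wraps a LangChain FAISS index (an assumption about vector_store.py), similarity scores can be retrieved with the with-score variant:

# Hypothetical: works if vs exposes LangChain's FAISS API.
results = vs.similarity_search_with_score("user authentication test cases", k=5)
for doc, score in results:
    print(f"Score: {score:.4f}")
    print(f"Content: {doc.page_content[:200]}...")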

🀝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

πŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ†˜ Troubleshooting

Common Issues

  1. Ollama Connection Error

    # Check if Ollama is running
    ps aux | grep ollama
    
    # Restart Ollama
    ollama serve
  2. Vector Store Initialization Failed

    # Check knowledge base files exist
    ls -la backend/knowledge_base/
    
    # Recreate vector store
    python -c "from vector_store import initialize_vector_store; initialize_vector_store(force_recreate=True)"
  3. FAISS Import Error

    # Reinstall FAISS
    pip uninstall faiss-cpu
    pip install faiss-cpu==1.11.0

πŸ“ž Support

For support and questions:

  • Create an issue in the GitHub repository
  • Check the troubleshooting section above
  • Review Ollama documentation for LLM-related issues

πŸ”„ Version History

  • v1.0.0 - Initial release with basic test case generation
  • v1.1.0 - Added FAISS vector store integration
  • v1.2.0 - Enhanced context-aware generation

Built with ❀️ using Ollama, FAISS, and Flask
