A backend service for a chatbot application built with FastAPI and MongoDB, using Ollama for local language model integration. This backend streams real-time AI responses to the frontend via WebSocket and stores user data and conversation history in MongoDB.
- ⚡ FastAPI: High-performance web framework for building APIs
- 🗄️ MongoDB: NoSQL database for storing conversations and user data
- 🧠 Ollama: For running language models locally
- 🔄 WebSocket: For real-time streaming responses to the frontend
- 🐍 Python 3.8+
- 🗄️ MongoDB installed and running
- 🧠 Ollama installed (for local LLM support)
📂 Clone the repository:

```bash
git clone https://github.com/yourusername/chatbot-be.git
cd chatbot-be
```
🌐 Create a virtual environment:

```bash
python -m venv venv
```
⚙️ Activate the virtual environment:

```bash
# Windows
venv\Scripts\activate

# Linux/Mac
source venv/bin/activate
```
📦 Install dependencies:

```bash
pip install -r requirements.txt
```
🔧 Set up environment variables: Create a `.env` file in the root directory with the following variables:

```env
MONGODB_URL=mongodb://localhost:27017
MONGODB_DB=chatbot
OLLAMA_API_URL=http://localhost:11434/api
```
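As a minimal sketch, these variables might be read at startup like this (a hypothetical `load_settings` helper; the actual service may use Pydantic settings or another loader instead):

```python
import os

def load_settings() -> dict:
    """Read the backend's configuration from the environment.

    The defaults mirror the local-development values shown in the
    .env example above.
    """
    return {
        "mongodb_url": os.environ.get("MONGODB_URL", "mongodb://localhost:27017"),
        "mongodb_db": os.environ.get("MONGODB_DB", "chatbot"),
        "ollama_api_url": os.environ.get("OLLAMA_API_URL", "http://localhost:11434/api"),
    }
```

Keeping defaults that match local development means the service starts with no `.env` at all, while production values can still override them through the environment.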
Start the FastAPI server:

```bash
uvicorn app.main:app --reload
```

The API will be available at http://localhost:8000.
Once the application is running, you can access:
- 📘 Interactive API documentation: http://localhost:8000/docs
- 📕 Alternative API documentation: http://localhost:8000/redoc
```
chatbot-be/
├── app/
│   ├── api/          # API routes
│   ├── core/         # Core functionality, config
│   ├── db/           # Database models and connections
│   ├── models/       # Pydantic models
│   ├── services/     # Business logic
│   └── main.py       # Application entry point
├── tests/            # Test files
├── .env              # Environment variables
├── .gitignore
├── requirements.txt
└── README.md
```
- 💬 Chat conversation management
- 🧠 Integration with Ollama for text generation
- 🗄️ Conversation history storage in MongoDB
- 👤 User management
- 🔄 Real-time response streaming via WebSocket
The frontend for this project is available at: 🌐 Chatbot Frontend
📄 MIT License