@Madzionator
Collaborator
Ollama Backend

  • Added OllamaService – an OpenAI-compatible service for chat completions, including API key management and memory-based chat.
  • Updated configuration (BackendType, MaINSettings) to support Ollama and its API key.
  • Registered OllamaService in the LLM service factory and added API URLs and HTTP client constants.
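Ollama exposes an OpenAI-compatible HTTP API (by default at `http://localhost:11434/v1`), which is why an OpenAI-style service can target it directly. The actual service is C#, but the request shape is language-agnostic; a minimal sketch in Python (model name and message content are illustrative only):

```python
def build_chat_request(model: str, messages: list[dict], stream: bool = False) -> dict:
    """Build an OpenAI-compatible chat completion payload.

    Ollama accepts this schema at POST /v1/chat/completions,
    which is what lets an OpenAI-style service back onto it.
    """
    return {"model": model, "messages": messages, "stream": stream}

# Hypothetical model name; any model pulled into the local Ollama instance works.
payload = build_chat_request(
    "llama3",
    [{"role": "user", "content": "Hello"}],
)
```

The same payload sent to an OpenAI endpoint or an Ollama endpoint yields a response in the same schema, so the service only needs a different base URL and API key handling.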

Application & Examples Integration

  • Integrated Ollama backend selection in the InferPage app, including environment variable support.
  • Added a chat example ChatExampleOllama and registered it in the examples project with dependency injection.
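Backend selection via environment variable typically reduces to a lookup with a fallback. A sketch of the idea in Python (the variable name `BACKEND` and the default value are assumptions; the PR does not state the exact key):

```python
import os

def resolve_backend(default: str = "OpenAi") -> str:
    """Pick the backend from the environment, falling back to a default.

    "BACKEND" is a hypothetical variable name used for illustration.
    """
    return os.environ.get("BACKEND", default)
```

Setting `BACKEND=Ollama` before launch would then route InferPage to the Ollama service without a code change.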

Limitations / Changes

  • MCP integration for Ollama is explicitly disabled (throws a NotSupportedException).
  • Image generation for Ollama is unsupported (the image generation factory returns null).
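The two limitations use two different "unsupported" signals: MCP fails fast with an exception, while image generation returns null so callers can probe for support. A sketch of the pattern in Python (the real factories are C#; function names here are illustrative):

```python
def create_mcp_client(backend: str):
    """Fail fast, mirroring the NotSupportedException thrown for BackendType.Ollama."""
    if backend == "Ollama":
        raise NotImplementedError("Ollama models are not supported for MCP integration")
    return object()  # stand-in for a real MCP client

def create_image_generator(backend: str):
    """Unsupported backends yield None instead of raising, so callers can check."""
    if backend == "Ollama":
        return None
    return object()  # stand-in for a real image generator
```

Raising suits MCP because reaching that code path is a misconfiguration; returning null suits image generation because a caller may legitimately ask "is this available?" and degrade gracefully.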

Dependency Cleanup

  • Removed several unused or redundant package references from project files.

Introduces support for the Ollama backend by updating the configuration, service factory, and constants. Adds OllamaService, which implements OpenAI-compatible API calls, updates the examples project to include ChatExampleOllama, and provides utility setup for the Ollama API key. Also updates InferPage to allow selection and configuration of Ollama as a backend.
Deleted the now-unused OllamaImageGenerations constant from ServiceConstants, set the Ollama backend to null in ImageGenServiceFactory, and cleaned up unused usings in ImageGenService.cs.
Throws a NotSupportedException when BackendType.Ollama is used with MCP integration, clarifying that Ollama models are not supported there. This fails fast and gives developers clearer feedback.
Cleaned up .csproj files by removing unnecessary and redundant NuGet package references across multiple projects, reducing build complexity and the risk of dependency conflicts.