Add Ollama Integration Support #1
Open
moah0911 wants to merge 4 commits into
Conversation
- Added `llm/ollama_generator.py` with Ollama connection testing and SQL generation functions
- Created `test_ollama.py` for end-to-end validation of the Ollama integration
- Updated `app/cli.py` to support dynamic LLM provider switching between NVIDIA and Ollama
- Modified `llm/__init__.py` to expose the new Ollama functions in the public API
- Enhanced `llm/generator.py` with Ollama-specific logic and fallback handling
- Updated `main.py` to include Ollama provider management and runtime validation
- Updated `pyproject.toml` with a version bump and an added Ollama keyword
- Improved `.gitignore` to exclude additional temporary and IDE files
- Maintained consistent error handling and user feedback across both LLM providers
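The connection-testing and SQL-generation functions mentioned above could be sketched as follows. This is a minimal illustration, not the PR's actual code: the function names, prompt format, and default model are assumptions; only the endpoint paths (`/api/tags`, `/api/generate`) and default port follow Ollama's documented REST API.

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def test_ollama_connection(base_url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server responds at base_url.

    /api/tags lists installed models, so a 200 response confirms the
    server is up without triggering any model inference.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


def generate_sql(question: str, schema: str, model: str = "llama3",
                 base_url: str = OLLAMA_URL, timeout: float = 60.0) -> str:
    """Ask a local Ollama model to translate a question into SQL.

    The prompt layout here is a placeholder; a real integration would
    carry the project's own prompt template and error handling.
    """
    payload = json.dumps({
        "model": model,
        "prompt": f"Schema:\n{schema}\n\nWrite a SQL query for: {question}\nSQL:",
        "stream": False,  # request a single JSON response instead of a stream
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"].strip()
```

When no server is listening, `test_ollama_connection` swallows the connection error and returns `False`, which lets a caller fall back to another provider instead of crashing.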
…-0c1431da4906 Update from task b8d7a3a6-68cc-4abd-8a0d-0c1431da4906
Owner

I appreciate your work integrating Ollama support. I noticed a few concerns. Once these are addressed, I'll be happy to continue the review.
This PR introduces support for Ollama, enabling users to run and interact with local LLMs seamlessly within SnapBase. The integration maintains compatibility with existing workflows while expanding the range of supported inference backends.
The changes included are listed above.
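The provider switching and fallback handling the PR describes could look roughly like the sketch below. The stub generators and provider names are hypothetical stand-ins (the real logic lives in `llm/generator.py` and `llm/ollama_generator.py`); the point is the dispatch-then-fallback shape with consistent user feedback across both providers.

```python
from typing import Callable, Dict


def _nvidia_generate(prompt: str) -> str:
    # Stand-in for the NVIDIA-backed generator; here it simulates an outage
    # so the fallback path below is exercised.
    raise ConnectionError("NVIDIA endpoint unreachable")


def _ollama_generate(prompt: str) -> str:
    # Stand-in for the local Ollama-backed generator.
    return f"SELECT 1; -- generated locally for: {prompt}"


PROVIDERS: Dict[str, Callable[[str], str]] = {
    "nvidia": _nvidia_generate,
    "ollama": _ollama_generate,
}


def generate(prompt: str, provider: str = "nvidia") -> str:
    """Run the requested provider; on failure, fall back to the other one."""
    if provider not in PROVIDERS:
        raise ValueError(f"Unknown provider: {provider!r}")
    try:
        return PROVIDERS[provider](prompt)
    except Exception as exc:
        fallback = "ollama" if provider == "nvidia" else "nvidia"
        # Consistent, user-visible feedback regardless of which side failed.
        print(f"[warn] {provider} failed ({exc}); falling back to {fallback}")
        return PROVIDERS[fallback](prompt)
```

A dispatcher keyed by provider name keeps the CLI's `--provider` switch a one-line lookup, and the symmetric fallback means neither backend is privileged when the other is down.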