Que

LLM question answering + local document vector storage.

How it works

The que command-line utility recursively indexes the documents in the directory it is invoked from and stores them in a vector database (ChromaDB) at ~/.config/que/index.chroma. It then uses cosine similarity to find the texts most related to your query, feeds them as context to a Llama model, and has the model answer the question.
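
The query path can be sketched roughly as below with ChromaDB's Python client and llama-cpp-python; the collection name, model path and prompt wording are illustrative assumptions, not que's actual internals:

```python
import os

import chromadb
from llama_cpp import Llama

# Open (or create) the persistent index kept under ~/.config/que
client = chromadb.PersistentClient(path=os.path.expanduser("~/.config/que/index.chroma"))
collection = client.get_or_create_collection(
    name="que-index",                   # hypothetical collection name
    metadata={"hnsw:space": "cosine"},  # retrieve by cosine similarity
)

def answer(question: str, model_path: str, k: int = 4) -> str:
    # Find the k document chunks closest to the query
    hits = collection.query(query_texts=[question], n_results=k)
    context = "\n\n".join(hits["documents"][0])

    # Ask the Llama model to answer, grounded in the retrieved context
    llm = Llama(model_path=model_path, n_ctx=4096, verbose=False)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    out = llm.create_completion(prompt, max_tokens=256)
    return out["choices"][0]["text"].strip()
```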

On subsequent invocations, que checks whether the indexed files have changed or been deleted, or whether new documents are present, and updates the vector database incrementally, avoiding an expensive full re-index.
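
A minimal sketch of that synchronization step, assuming each file's modification time is stored as ChromaDB metadata and entry ids are file paths (both are illustrative choices, not que's actual schema):

```python
import os

import chromadb

client = chromadb.PersistentClient(path=os.path.expanduser("~/.config/que/index.chroma"))
collection = client.get_or_create_collection("que-index")  # hypothetical collection name

def sync(root: str) -> None:
    # Everything currently on disk, mapped to its last modification time
    on_disk = {
        os.path.join(dirpath, name): os.path.getmtime(os.path.join(dirpath, name))
        for dirpath, _, files in os.walk(root)
        for name in files
    }

    # What the index already knows about
    stored = collection.get(include=["metadatas"])
    indexed = {
        doc_id: (meta or {}).get("mtime", 0.0)
        for doc_id, meta in zip(stored["ids"], stored["metadatas"])
    }

    deleted = [path for path in indexed if path not in on_disk]
    changed = [path for path, mtime in on_disk.items() if mtime > indexed.get(path, -1.0)]

    if deleted:
        collection.delete(ids=deleted)

    for path in changed:
        with open(path, errors="ignore") as fh:
            text = fh.read()
        # upsert replaces a stale entry or adds a brand-new one
        collection.upsert(
            ids=[path],
            documents=[text],
            metadatas=[{"mtime": on_disk[path]}],
        )
```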

que relies on llama-cpp, a fast inference implementation compatible with MPS, CUDA and Vulkan.
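
With llama-cpp-python, loading a model with GPU offload might look like the sketch below; the backend (Metal/MPS, CUDA or Vulkan) is selected when llama.cpp itself is built, and the model path here is only a placeholder:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder model file
    n_gpu_layers=-1,  # offload all layers to whichever accelerator is available
    n_ctx=4096,
)
```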

See it in action:

[example: demo recording of que]
