Extract and summarize key information from historical or factual queries.
Chronicle AI is a Python package that helps users obtain well-structured information from complex or detailed queries. Given a user's input text, such as a question about historical events or facts, it returns a structured summary containing the main points, context, and relevant details, presented in a clear and organized manner.
- Structured Summaries: Obtain clear and organized summaries of complex queries.
- Predefined Format: Output adheres to a predefined format, enforced with the help of `llmatch-messages`.
- Easy to Use: Leverage the package to quickly obtain structured information from detailed queries.
Install the package:

```bash
pip install chronicle_ai
```

Basic usage:

```python
from chronicle_ai import chronicle_ai

response = chronicle_ai(user_input="What were the main causes of World War I?")
print(response)
```

Parameters:

- `user_input` (`str`): The user input text to process.
- `llm` (`Optional[BaseChatModel]`): The LangChain LLM instance to use. If not provided, the default `ChatLLM7` is used.
- `api_key` (`Optional[str]`): The API key for LLM7. If not provided, the free-tier API key is used.
- The package uses `ChatLLM7` from `langchain_llm7` by default. Developers can safely pass their own LLM instance (based on https://docs.langchain.com/) if they want to use another LLM.
With OpenAI:

```python
from langchain_openai import ChatOpenAI
from chronicle_ai import chronicle_ai

llm = ChatOpenAI()
response = chronicle_ai(user_input="What were the main causes of World War I?", llm=llm)
```

With Anthropic:

```python
from langchain_anthropic import ChatAnthropic
from chronicle_ai import chronicle_ai

llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # ChatAnthropic requires a model name
response = chronicle_ai(user_input="What were the main causes of World War I?", llm=llm)
```

With Google Generative AI:

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from chronicle_ai import chronicle_ai

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # ChatGoogleGenerativeAI requires a model name
response = chronicle_ai(user_input="What were the main causes of World War I?", llm=llm)
```

The default rate limits of the LLM7 free tier are sufficient for most use cases of this package. For higher rate limits, pass your own API key via the `LLM7_API_KEY` environment variable or directly, e.g. `chronicle_ai(user_input, api_key="your_api_key")`. You can get a free API key by registering at https://token.llm7.io/.
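To illustrate the two ways of supplying a key, here is a minimal sketch of how an explicit `api_key` argument, the `LLM7_API_KEY` environment variable, and the free-tier default could be prioritized. The `resolve_api_key` helper and the `FREE_TIER_KEY` placeholder are hypothetical, written for illustration only; the precedence order (explicit argument first, then environment variable, then free tier) is an assumption, not confirmed package internals.

```python
import os

FREE_TIER_KEY = "free-tier"  # hypothetical placeholder, not the real free-tier key


def resolve_api_key(api_key=None):
    """Hypothetical sketch of key resolution: an explicitly passed key wins,
    otherwise the LLM7_API_KEY environment variable is consulted, and the
    free-tier default is the fallback."""
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY", FREE_TIER_KEY)
```

In this sketch, `chronicle_ai(user_input, api_key="your_api_key")` corresponds to the explicit-argument branch, while exporting `LLM7_API_KEY` before running your script corresponds to the environment-variable branch.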
Eugene Evstafev (GitHub: chigwell, hi@eugegne.plus)