A Python package that extracts and structures key features and use cases from text descriptions of blurring tools, providing a clean, formatted summary without sensitive or technical details.
textblur_summary processes user-provided text about a blurring tool (e.g., its purpose, benefits, and limitations) and returns a structured summary of its features. It highlights non-sensitive aspects like being free, instant, and watermark-free, while omitting technical or proprietary details.
Install via pip:
```bash
pip install textblur_summary
```

```python
from textblur_summary import textblur_summary

# Example input: A user-provided description of a blurring tool
user_input = """
TextBlur is a free, instant image blurring tool. It allows users to blur faces or sensitive details in photos without watermarks.
"""
# Call the function (LLM7 API key is fetched from environment variable LLM7_API_KEY)
response = textblur_summary(user_input)
print(response)
```

You can pass your own LLM instance (e.g., OpenAI, Anthropic, or Google) for flexibility:

```python
from langchain_openai import ChatOpenAI
from textblur_summary import textblur_summary
llm = ChatOpenAI()
response = textblur_summary(user_input, llm=llm)
print(response)
```

```python
from langchain_anthropic import ChatAnthropic
from textblur_summary import textblur_summary
llm = ChatAnthropic()
response = textblur_summary(user_input, llm=llm)
print(response)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from textblur_summary import textblur_summary
llm = ChatGoogleGenerativeAI()
response = textblur_summary(user_input, llm=llm)
print(response)
```

- Default LLM: Uses `ChatLLM7` (from `langchain_llm7`) with the API key fetched from:
  - Environment variable: `LLM7_API_KEY` (see the sketch after this list for setting it from code)
  - Fallback: a hardcoded default (if no key is provided).
- Custom API Key: Pass it directly:

  ```python
  response = textblur_summary(user_input, api_key="your_llm7_api_key")
  ```

- Get a Free API Key: Register at LLM7 Token.
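If you prefer to set the key from code rather than in your shell profile, a minimal sketch is shown below. The environment variable name `LLM7_API_KEY` comes from the list above; the key value is a placeholder.

```python
import os

from textblur_summary import textblur_summary

# Placeholder key for illustration; replace with your real LLM7 key,
# or export LLM7_API_KEY in your shell instead.
os.environ["LLM7_API_KEY"] = "your_llm7_api_key"

response = textblur_summary("TextBlur is a free, instant image blurring tool.")
print(response)
```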
| Parameter | Type | Description |
|---|---|---|
| `user_input` | `str` | The text description of the blurring tool to analyze. |
| `api_key` | `Optional[str]` | LLM7 API key (optional if using the environment variable). |
| `llm` | `Optional[BaseChatModel]` | Custom LLM instance (e.g., `ChatOpenAI`, `ChatAnthropic`). Defaults to `ChatLLM7`. |
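For reference, the call below spells out the parameters from the table as keyword arguments. It is illustrative only: the model choice is arbitrary, and `api_key` is omitted because it only applies to the default `ChatLLM7` backend described above.

```python
from langchain_openai import ChatOpenAI

from textblur_summary import textblur_summary

# Keyword arguments mirror the parameter table above.
response = textblur_summary(
    user_input="TextBlur is a free, instant image blurring tool.",
    llm=ChatOpenAI(),  # any LangChain chat model; omit to fall back to ChatLLM7
)
print(response)
```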
- Rate Limits: The default LLM7 free tier is sufficient for most use cases.
- Output Format: Returns a list of structured key points (e.g., features, benefits); see the sketch after this list.
- Safety: Avoid sharing sensitive or proprietary details in `user_input`.
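The exact shape of the returned key points isn't specified beyond the Output Format note above, so the sketch below assumes each key point is a plain string; adjust it to whatever structure your installed version actually returns.

```python
from textblur_summary import textblur_summary

description = "TextBlur is a free, instant image blurring tool without watermarks."
key_points = textblur_summary(description)

# Assumes the response is an iterable of string key points,
# per the Output Format note above.
for i, point in enumerate(key_points, start=1):
    print(f"{i}. {point}")
```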
Report bugs or feature requests at: GitHub Issues
- Name: Eugene Evstafev
- Email: hi@euegne.plus
- GitHub: @chigwell