A lightweight Python package that extracts structured information from holiday‑themed narratives.
Given a free‑form text (e.g., “The Christmas Tree that came in from the Cold”), the parser identifies:
- Main theme
- Emotional tone
- Key story elements
- Relevant details
The extracted data is returned as a list of strings that match a predefined regular‑expression pattern, making it easy to feed into downstream analysis, categorisation, or summarisation pipelines.
```
pip install holiday_narrative_parser
```

```python
from holiday_narrative_parser import holiday_narrative_parser

story = """
The Christmas tree arrived shivering from the frosty north,
its branches still dusted with snow. Yet inside the living room,
the glow of candlelight warmed its needles, and the family sang
songs of joy and togetherness.
"""
result = holiday_narrative_parser(user_input=story)
print(result)
# Example output:
# [
# "Theme: Christmas",
# "Tone: Warm, nostalgic",
# "Elements: Tree, Snow, Candlelight, Family singing",
# "Details: Tree came from cold, interior warmth"
# ]
```
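Because each returned entry follows the "Key: value" shape shown in the example output, the list is easy to post-process. Below is a minimal, illustrative sketch; the parse_result helper is not part of the package:

```python
# Illustrative helper (not part of holiday_narrative_parser): split the
# "Key: value" strings from the example output above into a dictionary.
def parse_result(lines):
    parsed = {}
    for line in lines:
        key, _, value = line.partition(":")
        parsed[key.strip()] = value.strip()
    return parsed

# e.g. {"Theme": "Christmas", "Tone": "Warm, nostalgic", ...}
print(parse_result(result))
```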
```python
holiday_narrative_parser(
    user_input: str,
    llm: Optional[BaseChatModel] = None,
    api_key: Optional[str] = None
) -> List[str]
```

| Parameter | Type | Description |
|---|---|---|
| user_input | str | The narrative text to be processed. |
| llm | Optional[BaseChatModel] | A LangChain BaseChatModel instance. If omitted, the default ChatLLM7 from langchain-llm7 is used. |
| api_key | Optional[str] | API key for the LLM7 service. If not supplied, the function looks for the LLM7_API_KEY environment variable; otherwise it falls back to the placeholder "None", which triggers the default behaviour of ChatLLM7. |
Returns: A list of strings that match the extraction pattern defined in the package.

You can provide any LangChain-compatible chat model (OpenAI, Anthropic, Google Gemini, etc.). This is useful if you need a different provider, higher token limits, or specialised model behaviour.
```python
from langchain_openai import ChatOpenAI
from holiday_narrative_parser import holiday_narrative_parser
my_llm = ChatOpenAI(model="gpt-4o-mini")
response = holiday_narrative_parser(user_input=story, llm=my_llm)
print(response)
```

```python
from langchain_anthropic import ChatAnthropic
from holiday_narrative_parser import holiday_narrative_parser
my_llm = ChatAnthropic(model="claude-3-haiku-20240307")
response = holiday_narrative_parser(user_input=story, llm=my_llm)
print(response)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from holiday_narrative_parser import holiday_narrative_parser
my_llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = holiday_narrative_parser(user_input=story, llm=my_llm)
print(response)
```

- By default, the package uses ChatLLM7 from langchain-llm7.
- The free tier of LLM7 provides sufficient request-per-minute limits for typical development and small-scale production use.
- For higher limits, supply your personal API key either via the LLM7_API_KEY environment variable or directly:
```python
response = holiday_narrative_parser(user_input=story, api_key="YOUR_LLM7_API_KEY")
```

You can obtain a free API key by registering at https://token.llm7.io/.
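Alternatively, the key can be supplied through the environment so individual calls stay unchanged. A minimal sketch of that route, relying on the LLM7_API_KEY lookup described in the parameter table above (the placeholder value is illustrative):

```python
import os

# holiday_narrative_parser checks the LLM7_API_KEY environment variable
# when no api_key argument is passed (see the parameter table above).
os.environ["LLM7_API_KEY"] = "YOUR_LLM7_API_KEY"  # illustrative placeholder

response = holiday_narrative_parser(user_input=story)
print(response)
```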
If you encounter bugs, have feature requests, or need help, please open an issue:
- GitHub Issues: https://github.com/chigwell/holiday_narrative_parser/issues
Eugene Evstafev
📧 Email: hi@euegne.plus
🐙 GitHub: chigwell
This project is licensed under the MIT License. See the LICENSE file for details.