Merged
42 commits
- 367ce8e feat: integrate @falkordb/canvas for schema visualization and remove … (Anchel123, Dec 30, 2025)
- 9bf6497 Implement feature X to enhance user experience and optimize performance (Anchel123, Dec 30, 2025)
- ddfe7bc chore: update @falkordb/canvas dependency to version 0.0.21 (Anchel123, Jan 4, 2026)
- 71bb1cd Implement new feature for user authentication and improve error handling (Anchel123, Jan 4, 2026)
- 6489f85 chore: update @falkordb/canvas dependency from 0.0.12 to 0.0.21 in pa… (Anchel123, Jan 5, 2026)
- 1dbbc58 chore: update @falkordb/canvas dependency from 0.0.21 to 0.0.22 in pa… (Anchel123, Jan 7, 2026)
- 67d82b9 chore: update @falkordb/canvas dependency from 0.0.21 to 0.0.22 in pa… (Anchel123, Jan 7, 2026)
- e1b8588 Merge remote-tracking branch 'origin/staging' into falkordb-canvas (Anchel123, Jan 7, 2026)
- 755a2f4 refactor: update theme handling and improve styling in SchemaViewer c… (Anchel123, Jan 7, 2026)
- 3f5e4d7 chore: update @falkordb/canvas dependency to version 0.0.23 in packag… (Anchel123, Jan 8, 2026)
- 4c00d49 Bump urllib3 from 2.6.2 to 2.6.3 in the pip group across 1 directory (dependabot[bot], Jan 8, 2026)
- dddfbb5 Add weekly updates for GitHub Actions dependencies (gkorland, Jan 8, 2026)
- 31ba4ef Merge branch 'staging' into dependabot/pip/pip-8177a8837a (gkorland, Jan 8, 2026)
- bfef44f Merge pull request #371 from FalkorDB/dependabot/pip/pip-8177a8837a (gkorland, Jan 8, 2026)
- 8af91e5 chore: update @falkordb/canvas dependency to version 0.0.24 in packag… (Anchel123, Jan 12, 2026)
- cddd3eb chore: add preact dependency and update version to 10.28.2 in package… (Anchel123, Jan 18, 2026)
- 444e271 Implement feature X to enhance user experience and fix bug Y in module Z (Anchel123, Jan 18, 2026)
- ce084c2 chore: update preact dependency to version 10.28.2 in package.json an… (Anchel123, Jan 18, 2026)
- 870a0c2 Refactor code structure for improved readability and maintainability (Anchel123, Jan 18, 2026)
- 784c386 Initial plan (Copilot, Jan 24, 2026)
- 989bac7 Add HSTS header to prevent man-in-the-middle attacks (Copilot, Jan 24, 2026)
- 93df5c9 Fix test endpoint path from /api/graphs to /graphs (Copilot, Jan 24, 2026)
- c2b3551 Merge branch 'staging' into copilot/add-hsts-header (gkorland, Jan 24, 2026)
- 48eba40 Merge pull request #382 from FalkorDB/copilot/add-hsts-header (gkorland, Jan 24, 2026)
- 904f859 Bump jsonschema from 4.25.1 to 4.26.0 (dependabot[bot], Jan 26, 2026)
- d7d4d2a Refactor code structure for improved readability and maintainability (Anchel123, Feb 1, 2026)
- dc8e04b Refactor code structure for improved readability and maintainability (Anchel123, Feb 1, 2026)
- f060d88 Refactor code structure for improved readability and maintainability (Anchel123, Feb 2, 2026)
- b6d2b67 Merge branch 'staging' into falkordb-canvas (Naseem77, Feb 3, 2026)
- 927675c Merge pull request #351 from FalkorDB/falkordb-canvas (Naseem77, Feb 3, 2026)
- 364bbc1 Merge branch 'staging' into dependabot/pip/staging/jsonschema-4.26.0 (gkorland, Feb 3, 2026)
- befe74e Merge pull request #377 from FalkorDB/dependabot/pip/staging/jsonsche… (gkorland, Feb 3, 2026)
- 351082b add-ttl-memory (galshubeli, Feb 12, 2026)
- 40bf368 env-var (galshubeli, Feb 12, 2026)
- 0103263 redis-error (galshubeli, Feb 12, 2026)
- 3dd4d13 spell (galshubeli, Feb 12, 2026)
- 9859c08 update-docker (galshubeli, Feb 12, 2026)
- d767956 update-docker (galshubeli, Feb 12, 2026)
- 90c6a5c bust docker cache for nodejs install (galshubeli, Feb 12, 2026)
- de08a9a fix: remove pre-installed nodejs before installing NodeSource version (galshubeli, Feb 12, 2026)
- b884bb9 Merge pull request #389 from FalkorDB/memory-ttl (galshubeli, Feb 12, 2026)
- 8ba85e5 up-dockerfile (galshubeli, Feb 16, 2026)
8 changes: 8 additions & 0 deletions .env.example
@@ -98,6 +98,14 @@ FALKORDB_URL=redis://localhost:6379/0 # REQUIRED - change to your FalkorDB URL
# Google Tag Manager ID (optional)
# GOOGLE_TAG_MANAGER_ID=GTM-XXXXXXX

# -----------------------------
# Memory TTL (optional)
# -----------------------------
# Set a TTL (in seconds) on per-user memory graphs so they auto-expire.
# When unset, memory graphs persist indefinitely.
# Example: 604800 = 1 week
# MEMORY_TTL_SECONDS=604800

# -----------------------------
# Optional MCP (Model Context Protocol) settings
# -----------------------------
4 changes: 4 additions & 0 deletions .github/dependabot.yml
@@ -5,6 +5,10 @@

version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
Comment on lines +8 to +11

⚠️ Potential issue | 🟡 Minor

Missing target-branch: "staging" — intentional?

The pip and npm entries both target staging, but this new github-actions entry omits target-branch, so Dependabot will open PRs against the default branch (main). If the intent is to keep all dependency PRs flowing through staging first, add target-branch: "staging" here as well.

Proposed fix
   - package-ecosystem: "github-actions"
     directory: "/"
+    target-branch: "staging"
     schedule:
       interval: "weekly"
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
- package-ecosystem: "github-actions"
directory: "/"
target-branch: "staging"
schedule:
interval: "weekly"
🤖 Prompt for AI Agents
In @.github/dependabot.yml around lines 8 - 11, The Dependabot entry for
package-ecosystem "github-actions" is missing target-branch and will open PRs
against the default branch; update the "github-actions" block (the
package-ecosystem: "github-actions" entry) to include target-branch: "staging"
so its PRs route to the staging branch like the pip and npm entries.

- package-ecosystem: "pip"
directory: "/"
target-branch: "staging"
1 change: 1 addition & 0 deletions .github/wordlist.txt
@@ -95,3 +95,4 @@ Sanitization
JOINs
subqueries
subquery
TTL
17 changes: 13 additions & 4 deletions Dockerfile
@@ -19,6 +19,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
netcat-openbsd \
git \
build-essential \
curl \
ca-certificates \
gnupg \
&& rm -rf /var/lib/apt/lists/* \
&& ln -sf /usr/local/bin/python3.12 /usr/bin/python3 \
&& ln -sf /usr/local/bin/python3.12 /usr/bin/python
@@ -36,9 +39,15 @@ RUN PIP_BREAK_SYSTEM_PACKAGES=1 pipenv sync --system

# Install Node.js (Node 22) so we can build the frontend inside the image.
# Use NodeSource setup script to get a recent Node version on Debian-based images.
RUN curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
&& apt-get update && apt-get install -y nodejs \
&& rm -rf /var/lib/apt/lists/*
# Remove any pre-installed nodejs first to avoid conflicts.
RUN apt-get update \
&& apt-get remove -y nodejs || true \
&& rm -rf /var/lib/apt/lists/* \
&& curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
&& apt-get update \
&& apt-get install -y nodejs \
&& rm -rf /var/lib/apt/lists/* \
&& node --version && npm --version
Comment on lines +43 to +50

⚠️ Potential issue | 🟡 Minor

Shell operator precedence issue silently swallows apt-get update failure; also add --no-install-recommends.

Because && and || have equal precedence and are left-associative in shell, the expression apt-get update && apt-get remove -y nodejs || true && ... evaluates as ((apt-get update && apt-get remove ...) || true) && .... If apt-get update on line 43 fails, the || true catches it and the rest of the chain proceeds silently with a stale/missing package index. Scope the || true to only the remove command.

Additionally, apt-get install -y nodejs on line 48 is missing --no-install-recommends, which would pull in unnecessary recommended packages and increase image size (flagged by Trivy DS-0029).
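The precedence pitfall the comment describes can be checked in isolation with stand-in commands (these are illustrative, not the Dockerfile steps themselves):

```shell
# Unscoped: `|| true` swallows the failure of everything to its left,
# so the rest of the chain still runs even when the first command fails.
out=$(false && echo "removed" || true && echo "still runs")
echo "unscoped: $out"    # prints: unscoped: still runs

# Scoped: parentheses limit the fallback to the middle command only,
# so a failure elsewhere in the chain remains fatal.
out=$(true && (false || true) && echo "reached")
echo "scoped: $out"      # prints: scoped: reached
```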

Proposed fix
-RUN apt-get update \
-    && apt-get remove -y nodejs || true \
-    && rm -rf /var/lib/apt/lists/* \
-    && curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
-    && apt-get update \
-    && apt-get install -y nodejs \
-    && rm -rf /var/lib/apt/lists/* \
-    && node --version && npm --version
+RUN apt-get update \
+    && (apt-get remove -y nodejs || true) \
+    && rm -rf /var/lib/apt/lists/* \
+    && curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
+    && apt-get update \
+    && apt-get install -y --no-install-recommends nodejs \
+    && rm -rf /var/lib/apt/lists/* \
+    && node --version && npm --version
🧰 Tools
🪛 Trivy (0.69.1)

[error] 43-50: 'apt-get' missing '--no-install-recommends'

'--no-install-recommends' flag is missed: 'apt-get update && apt-get remove -y nodejs || true && rm -rf /var/lib/apt/lists/* && curl -fsSL https://deb.nodesource.com/setup_22.x | bash - && apt-get update && apt-get install -y nodejs && rm -rf /var/lib/apt/lists/* && node --version && npm --version'

Rule: DS-0029


(IaC/Dockerfile)

🤖 Prompt for AI Agents
In `@Dockerfile` around lines 43 - 50, Scope the "|| true" to only the removal
step and add --no-install-recommends to the install: change the RUN chain so
apt-get update is not short-circuited by the "|| true" (i.e., use apt-get update
&& (apt-get remove -y nodejs || true) && rm -rf /var/lib/apt/lists/* && curl
-fsSL https://deb.nodesource.com/setup_22.x | bash - && apt-get update &&
apt-get install -y --no-install-recommends nodejs && rm -rf /var/lib/apt/lists/*
&& node --version && npm --version), ensuring the "remove" fallback only applies
to the remove command and the install uses --no-install-recommends.


# Copy only frontend package files so Docker can cache npm installs when
# package.json / package-lock.json don't change.
@@ -57,7 +66,7 @@ COPY ./app ./app

RUN npm --prefix ./app run build

# Copy application code
# Copy application code
COPY . .
Comment on lines +69 to 70

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check if .dockerignore exists and what it excludes related to app/ build artifacts
if [ -f .dockerignore ]; then
  echo "=== .dockerignore contents ==="
  cat .dockerignore
else
  echo "WARNING: No .dockerignore found"
fi

Repository: FalkorDB/QueryWeaver

Length of output: 95


🏁 Script executed:

#!/bin/bash
# Check the Dockerfile around lines 69-70 for full context
echo "=== Dockerfile (full content) ==="
if [ -f Dockerfile ]; then
  cat -n Dockerfile
else
  echo "Dockerfile not found"
fi

echo ""
echo "=== .gitignore contents ==="
if [ -f .gitignore ]; then
  cat .gitignore
else
  echo "No .gitignore found"
fi

echo ""
echo "=== Check if app/node_modules or app/dist exist in repo ==="
git ls-files | grep -E '(app/node_modules|app/dist|app/build)' | head -20 || echo "No committed build artifacts or node_modules found"

Repository: FalkorDB/QueryWeaver

Length of output: 3629


COPY . . on line 70 is redundant; the app/ directory was already copied on line 65.

The subsequent COPY . . will re-copy application code without risk of overwriting the frontend build artifacts (app/dist/), since .gitignore properly excludes both node_modules/ and /app/dist/ from the repository. However, a .dockerignore file would be beneficial to exclude unnecessary build context files (e.g., .git, .venv, .env) and improve build efficiency. Consider removing the redundant COPY . . or, if it's needed for Python application code, consolidate the two COPY operations and add a .dockerignore to exclude build directories, cache files, and version control artifacts.
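A `.dockerignore` along the lines the comment suggests might look like this; the entries are illustrative, drawn from the paths named above:

```
.git
.env
.venv
__pycache__
node_modules
app/node_modules
app/dist
```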

🤖 Prompt for AI Agents
In `@Dockerfile` around lines 69 - 70, The Dockerfile contains a redundant COPY .
. (the later COPY . . duplicates the earlier copy of the app/ directory); remove
the second COPY . . (or consolidate into a single targeted COPY that copies only
the necessary source files) and add a .dockerignore to exclude unnecessary
context (e.g., .git, node_modules, .venv, .env, app/dist) so builds are smaller
and faster; locate the duplicate COPY . . entry and either delete it or replace
both COPY operations with a single explicit COPY that targets the application
source, then add a .dockerignore file listing the unwanted files/directories.


# Copy and make start.sh executable
2 changes: 1 addition & 1 deletion Pipfile
@@ -12,7 +12,7 @@ psycopg2-binary = "~=2.9.11"
pymysql = "~=1.1.0"
authlib = "~=1.6.4"
itsdangerous = "~=2.2.0"
jsonschema = "~=4.25.0"
jsonschema = "~=4.26.0"
tqdm = "~=4.67.1"
python-multipart = "~=0.0.10"
jinja2 = "~=3.1.4"
17 changes: 9 additions & 8 deletions Pipfile.lock

Some generated files are not rendered by default.

11 changes: 11 additions & 0 deletions README.md
@@ -57,6 +57,17 @@ docker run -p 5000:5000 -it \

> For a full list of configuration options, consult `.env.example`.

## Memory TTL (optional)

QueryWeaver stores per-user conversation memory in FalkorDB. By default these graphs persist indefinitely. Set `MEMORY_TTL_SECONDS` to apply a Redis TTL (in seconds) so idle memory graphs are automatically cleaned up.

```bash
# Expire memory graphs after 1 week of inactivity
MEMORY_TTL_SECONDS=604800
```

The TTL is refreshed on every user interaction, so active users keep their memory.

⚠️ Potential issue | 🟡 Minor

Documentation may overstate the TTL refresh guarantee.

Line 69 says "The TTL is refreshed on every user interaction." As noted in the graphiti_tool.py review, _refresh_ttl() is only called inside MemoryTool.create(). If create() isn't invoked on every interaction, this statement is misleading. Please verify the call path and update the wording if needed.

🤖 Prompt for AI Agents
In `@README.md` at line 69, The README overstates TTL behavior — _refresh_ttl() is
only invoked from MemoryTool.create(), so update the sentence "The TTL is
refreshed on every user interaction" to accurately reflect that TTL is refreshed
only when MemoryTool.create() is called (or change the implementation to call
_refresh_ttl() on every interaction if you intend the original claim); reference
the _refresh_ttl() method and MemoryTool.create() in the updated wording so
readers know the exact trigger for TTL refresh.


## MCP server: host or connect (optional)

QueryWeaver includes optional support for the Model Context Protocol (MCP). You can either have QueryWeaver expose an MCP-compatible HTTP surface (so other services can call QueryWeaver as an MCP server), or configure QueryWeaver to call an external MCP server for model/context services.
8 changes: 8 additions & 0 deletions api/app_factory.py
@@ -43,6 +43,14 @@ async def dispatch(self, request: Request, call_next):
return JSONResponse(status_code=403, content={"detail": "Forbidden"})

response = await call_next(request)

# Add HSTS header to prevent man-in-the-middle attacks
# max-age=31536000: 1 year in seconds
# includeSubDomains: apply to all subdomains
# preload: eligible for browser HSTS preload lists
hsts_value = "max-age=31536000; includeSubDomains; preload"

[BLOCKER]: Middleware is stacked LIFO, so adding SecurityMiddleware inside RequestValidationMiddleware.dispatch means it executes after ProxyHeadersMiddleware/SessionMiddleware have already built their response. Redirects, cookies, etc. leave the app before the HSTS header is injected, defeating the goal of globally enforcing HSTS. Please register the HSTS middleware at FastAPI app creation (before ProxyHeaders/Session) so every response carries the header.

response.headers["Strict-Transport-Security"] = hsts_value

return response


24 changes: 22 additions & 2 deletions api/memory/graphiti_tool.py
@@ -10,6 +10,8 @@
from typing import List, Dict, Any, Optional, Tuple
from datetime import datetime

from redis import RedisError

# Import Azure OpenAI components
from openai import AsyncAzureOpenAI

@@ -47,10 +49,18 @@ def extract_embedding_model_name(full_model_name: str) -> str:
class MemoryTool:
"""Memory management tool for handling user memories and interactions."""

# Optional TTL (in seconds) for the memory graph key. Set via MEMORY_TTL_SECONDS
# env var to enable automatic expiry (e.g. 604800 for 1 week). Unset = no expiry.
MEMORY_TTL_SECONDS: Optional[int] = (
int(os.environ["MEMORY_TTL_SECONDS"])
if os.environ.get("MEMORY_TTL_SECONDS")
else None
)
Comment on lines +54 to +58

⚠️ Potential issue | 🟡 Minor

MEMORY_TTL_SECONDS will raise ValueError at import time if the env var is set to a non-integer string.

If someone sets MEMORY_TTL_SECONDS=1w or any non-numeric value, int(os.environ["MEMORY_TTL_SECONDS"]) will crash with a ValueError during module import, taking down the entire application. Consider wrapping in a try/except or validating gracefully.

🛡️ Proposed defensive parsing
-    MEMORY_TTL_SECONDS: Optional[int] = (
-        int(os.environ["MEMORY_TTL_SECONDS"])
-        if os.environ.get("MEMORY_TTL_SECONDS")
-        else None
-    )
+    @staticmethod
+    def _parse_ttl() -> Optional[int]:
+        raw = os.environ.get("MEMORY_TTL_SECONDS")
+        if not raw:
+            return None
+        try:
+            return int(raw)
+        except ValueError:
+            logging.warning("Invalid MEMORY_TTL_SECONDS=%r, ignoring", raw)
+            return None
+
+    MEMORY_TTL_SECONDS: Optional[int] = _parse_ttl()
🤖 Prompt for AI Agents
In `@api/memory/graphiti_tool.py` around lines 54 - 58, The current top-level
constant MEMORY_TTL_SECONDS will raise ValueError during import if the
environment variable is non-numeric; change the initialization of
MEMORY_TTL_SECONDS to defensively parse the env var (use os.environ.get) and
wrap int(...) in a try/except (or validate with str.isdigit()/regex) so
non-integer values result in a safe fallback (e.g., None) and optionally log or
warn; update the symbol MEMORY_TTL_SECONDS initialization accordingly so
import-time crashes are prevented.


def __init__(self, user_id: str, graph_id: str):
# Create FalkorDB driver with user-specific database
user_memory_db = f"{user_id}-memory"
falkor_driver = FalkorDriver(falkor_db=db, database=user_memory_db)
self.memory_db_name = f"{user_id}-memory"
falkor_driver = FalkorDriver(falkor_db=db, database=self.memory_db_name)


# Create Graphiti client with Azure OpenAI configuration
Expand All @@ -60,6 +70,13 @@ def __init__(self, user_id: str, graph_id: str):
self.graph_id = graph_id


async def _refresh_ttl(self) -> None:
"""Set a TTL on the memory graph key using Redis EXPIRE."""
try:
await db.execute_command("EXPIRE", self.memory_db_name, self.MEMORY_TTL_SECONDS)

[MAJOR]: _refresh_ttl only logs RedisError and then returns control to the caller, so a FalkorDB/Redis outage silently disables TTL enforcement—graphs never expire but nothing fails or alerts. That means the “memory auto-expiry” feature can be broken for weeks without anyone knowing. Please fail MemoryTool.create (or otherwise signal the failure) when the EXPIRE call raises so the request can retry/alert rather than silently degrading security/privacy guarantees.

except RedisError as e:
logging.warning("Failed to refresh TTL for %s: %s", self.memory_db_name, e)

@classmethod
async def create(cls, user_id: str, graph_id: str, use_direct_entities: bool = True) -> "MemoryTool":
"""Async factory to construct and initialize the tool."""
Expand All @@ -72,6 +89,9 @@ async def create(cls, user_id: str, graph_id: str, use_direct_entities: bool = T
driver = self.graphiti_client.driver
await driver.execute_query(f"CREATE VECTOR INDEX FOR (p:Query) ON (p.embeddings) OPTIONS {{dimension:{vector_size}, similarityFunction:'euclidean'}}")

if cls.MEMORY_TTL_SECONDS is not None:
await self._refresh_ttl()
Comment on lines +92 to +93
Copilot AI Feb 16, 2026

The TTL is only refreshed during object creation, but according to the README documentation added in this PR, "The TTL is refreshed on every user interaction, so active users keep their memory." Consider calling _refresh_ttl() at the end of user interaction methods like add_new_memory, save_query_memory, retrieve_similar_queries, and search_memories to prevent active users' memory from expiring.


[MAJOR]: The README now promises “The TTL is refreshed on every user interaction,” but _refresh_ttl() is only invoked once inside MemoryTool.create. After MEMORY_TTL_SECONDS elapses the user graph expires even if the user was actively chatting, which defeats the idle-based TTL requirement. Please refresh the TTL on every read/write path (e.g. search, add_new_memory, save_query_memory) so each interaction extends the expiry.
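One way to close that gap is a small decorator applied to each interaction method. `_refresh_ttl` and `MEMORY_TTL_SECONDS` are the names from this diff; the decorator itself is only a sketch, not code from the PR:

```python
import functools

def refreshes_ttl(method):
    """Run the wrapped coroutine, then extend the memory graph's TTL."""
    @functools.wraps(method)
    async def wrapper(self, *args, **kwargs):
        result = await method(self, *args, **kwargs)
        # Only refresh when a TTL is configured; otherwise graphs persist.
        if self.MEMORY_TTL_SECONDS is not None:
            await self._refresh_ttl()
        return result
    return wrapper
```

Decorating `add_new_memory`, `save_query_memory`, and the search methods with `@refreshes_ttl` would make every interaction extend the expiry, matching the README's wording.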


return self

async def _ensure_entity_nodes_direct(self, user_id: str, database_name: str) -> bool:
Expand Down