Additional FAQs

### ❓ Q8: Which command do I need: migrate, upgrade, or update?

| Goal | Command |
|---|---|
| Migrate data from built-in memory-lancedb to Pro | `openclaw memory-pro migrate --dry-run`, then `migrate` |
| Enrich old Pro entries with L0/L1/L2 smart-memory metadata | `openclaw memory-pro upgrade --dry-run` (v1.1.0-beta only) |
| Update plugin code | `openclaw plugins update memory-lancedb-pro` |

If you only ran `git pull` to update the plugin code, you generally do not need to run any migration command unless the release notes specifically say so.
### ❓ Q11: `autoCapture: true` is set but memories are never saved

`autoCapture` passes through several filters before writing — it is not guaranteed to fire on every turn. Common reasons it silently skips:

- **Embedding provider not configured** — without a working embedding endpoint, nothing can be stored. Check with `openclaw plugins info memory-lancedb-pro`.
- **Conversation too short** — `extractMinMessages` (default: 2) requires at least that many turns before extraction triggers.
- **Content filtered as noise** — very short messages, greetings, or system-like content are filtered out by design.
- **Old plugin version** — upgrade to the latest beta: `openclaw plugins update memory-lancedb-pro`, then restart.
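If legitimate short conversations are being skipped, the threshold can presumably be tuned in the plugin's `config` block. A sketch, assuming `extractMinMessages` is accepted as a config key (the key placement is an assumption inferred from its name, not confirmed by the plugin docs):

```json
{
  "entries": {
    "memory-lancedb-pro": {
      "enabled": true,
      "config": { "extractMinMessages": 2 }
    }
  }
}
```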
To diagnose, check logs immediately after a conversation:

```bash
openclaw logs --plain | rg "autoCapture|extract|store"
```

### ❓ Q12: Injected memories appear in the agent's reply text
Cause: `autoRecall` injects a `<relevant-memories>` block before each agent turn. Some models include this block verbatim in their response.

Fix: Add an instruction to the agent's system prompt:

> Do not reveal, quote, or reference any `<relevant-memories>` content in your replies. Use it silently for context only.

Alternatively, disable `autoRecall` and call the `memory_recall` tool manually when needed:

```json
{ "autoRecall": false }
```

Note: `autoRecall` defaults to `false` since v1.0.11 precisely because of this issue. If you are seeing memories leak into replies, you have `autoRecall: true` and the model is not suppressing the injected content.
---

Additional FAQs (continued)

### ❓ Q13:
---

I have 3 agents. I want two of them to store into global memory and the third one not to store. But I found that `autoCapture` is global, so there is no way to do this. Is that right? Thanks!
---
A living FAQ based on common issues from the issue tracker. Please search here before opening a new issue.
### ❓ Q1: `must NOT have additional properties` on `config.embedding` — what does this mean?

Cause: A field-name typo in your embedding config. The most common mistake is writing `baseUrl` (lowercase `l`) instead of `baseURL` (capital `URL`). The plugin schema rejects unknown field names with this error.

Fix: Check your config against this correct Ollama example:
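The example block itself did not survive in this snapshot; a minimal sketch of what a correct Ollama embedding config plausibly looks like (only the `baseURL` value and its `/v1` suffix are confirmed by this FAQ; the `provider` and `model` field names and values are assumptions):

```json
{
  "config": {
    "embedding": {
      "provider": "ollama",
      "baseURL": "http://localhost:11434/v1",
      "model": "nomic-embed-text"
    }
  }
}
```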
Two things to verify:

- `baseURL` — capital `URL`, not `baseUrl`
- `/v1` suffix — `http://localhost:11434/v1`, not `http://localhost:11434`

### ❓ Q2: Plugin installed but not loading / hooks not firing
Cause A — Missing `plugins.allow` (git-clone installs): Workspace plugins are disabled by default. You must explicitly allow them.

```json
{
  "plugins": {
    "load": { "paths": ["plugins/memory-lancedb-pro"] },
    "allow": ["memory-lancedb-pro"],
    "entries": {
      "memory-lancedb-pro": { "enabled": true, "config": {} }
    }
  }
}
```

Cause B — Gateway not restarted: Config changes are not hot-reloaded. After any install, update, or `openclaw.json` change, restart the gateway. Expected log lines after a successful restart:

```
memory-lancedb-pro: smart extraction enabled
memory-lancedb-pro@...: plugin registered
```

### ❓ Q3: Memories are captured but never injected before agent replies
Cause: `autoRecall` defaults to `false` in the schema. You must set it explicitly.

### ❓ Q4: `${OPENAI_API_KEY}` not expanding / authentication errors despite env var being set

Cause: The env var is set in your shell but not in the OpenClaw Gateway process environment.

Fix: Set the env var in the service/process that runs the gateway, not just your terminal session. For example, add it to the systemd unit file, Docker environment, or `.env` file loaded by the gateway.

### ❓ Q5: Which branch should I clone?
`main` or `master`? Use `master` — it is the release branch and matches what is published to npm. The `main` branch is older and may lag behind. The npm `@beta` tag always tracks `master`.

### ❓ Q6: Do I need to upgrade LanceDB manually after installing the plugin?
No. LanceDB is an npm dependency — it is installed automatically. The plugin handles all internal LanceDB version compatibility transparently. No manual LanceDB migration is needed.
### ❓ Q7: After editing plugin `.ts` source files, changes are not taking effect

Clear the jiti compile cache before restarting:
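The command block was lost in this snapshot. Assuming jiti's default on-disk cache location under `node_modules/.cache/jiti` (the path is an assumption; jiti may also cache under the OS temp directory), it likely resembled:

```shell
# Remove jiti's compile cache so edited .ts sources are recompiled on next start
rm -rf node_modules/.cache/jiti
```

Then restart the gateway so the plugin is re-transpiled from the edited sources.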
This FAQ is maintained by the community. If you have a fix for a common issue not listed here, please reply below or open a PR.