no models in custom provider #392
Description
Describe the bug
ECA 0.123.2 reads a custom provider from =config.json= correctly, but the provider is not registered as an available runtime model. As a result:
- =/config= shows the provider and =defaultModel= correctly
- =/providers-list= shows the provider as configured, but with =0 models=
- =/doctor= shows empty =Default model:=
- chat requests can fail with:
#+begin_src text :results output
API url not found.
Make sure you have provider '' configured properly.
#+end_src
This seems to affect custom OpenAI-compatible providers configured with =api: "openai-chat"= and explicit static models in config.
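The empty provider name in the error above is consistent with the runtime default model ending up empty. A minimal sketch (hypothetical Python, not ECA's actual code; =provider_of= is an illustrative name) of how splitting an empty model string would produce =provider ''=:

#+begin_src python :results output
def provider_of(model: str) -> str:
    """Extract the provider prefix from a "provider/model" string."""
    # "aitunnel/qwen3.5-397b-a17b" -> "aitunnel"
    return model.split("/", 1)[0] if "/" in model else ""

assert provider_of("aitunnel/qwen3.5-397b-a17b") == "aitunnel"
assert provider_of("") == ""  # empty default model -> provider ''
#+end_src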
To Reproduce
- Create =~/.config/eca/config.json= with a custom provider, for example:
#+begin_src json :results output
{
"$schema": "https://eca.dev/config.json",
"defaultModel": "aitunnel/qwen3.5-397b-a17b",
"providers": {
"aitunnel": {
"api": "openai-chat",
"url": "https://api.aitunnel.ru/v1",
"key": "REDACTED",
"fetchModels": false,
"models": {
"qwen3.5-397b-a17b": {}
}
}
}
}
#+end_src
- Start ECA server.
- Open chat and run =/config=.
- Run =/providers-list= and =/doctor=.
- Send a normal chat message.
Expected behavior
If a custom provider is present in =/config= and has explicit static models in =providers.<name>.models=, ECA should register those models as available runtime models even when =fetchModels= is =false=.
Expected result:
- =/providers-list= should show at least 1 model for the custom provider
- =/doctor= should show a non-empty =Default model:=
- chat should use =aitunnel/qwen3.5-397b-a17b= successfully
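As a sketch of the expected registration behavior (hypothetical Python, not ECA's actual implementation; =register_models= is an illustrative name): when =fetchModels= is =false=, the static =models= entries from config should be merged into the runtime model table keyed as =provider/model=:

#+begin_src python :results output
def register_models(config: dict) -> dict:
    """Build a runtime model map from the merged config."""
    models = {}
    for provider_name, provider in config.get("providers", {}).items():
        if provider.get("fetchModels", True):
            continue  # these models would come from the provider's API instead
        for model_name in provider.get("models", {}):
            models[f"{provider_name}/{model_name}"] = provider
    return models

config = {
    "defaultModel": "aitunnel/qwen3.5-397b-a17b",
    "providers": {
        "aitunnel": {
            "api": "openai-chat",
            "url": "https://api.aitunnel.ru/v1",
            "fetchModels": False,
            "models": {"qwen3.5-397b-a17b": {}},
        }
    },
}

runtime = register_models(config)
# What this issue expects ECA to guarantee (the sketch satisfies it):
assert config["defaultModel"] in runtime
#+end_src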
Actual behavior
- =/config= shows the merged config correctly, including:
  - =defaultModel "aitunnel/qwen3.5-397b-a17b"=
  - the custom provider
  - =fetchModels false=
  - the static =models= entry
- =/providers-list= shows:
#+begin_src text :results output
aitunnel ✓ configured
API Key · (config) · active
api: openai-chat
url: https://api.aitunnel.ru/v1
fetchModels: false
0 models
#+end_src
- =/doctor= shows:
#+begin_src text :results output
Default model:
Logged providers:
openai: Unknown
#+end_src
- sending a chat message can fail with:
#+begin_src text :results output
API url not found.
Make sure you have provider '' configured properly.
#+end_src
Doctor
#+begin_src text :results output
ECA version: 0.123.2
Server cmd: /home/zoya/.emacs.d/eca/eca server
Workspaces: /home/zoya/.config/eca
Default model:
Logged providers:
openai: Unknown
#+end_src
Additional context
From source inspection, this looks like a runtime model registration / initialization issue rather than a config parsing issue:
- =/config= proves the config is loaded correctly
- custom providers with =api: "openai-chat"= appear to be supported by runtime dispatch
- the failure seems to happen because the custom model does not end up in =db[:models]=
- chat/model selection then has no valid runtime model to use
- there may also be an initialization race, because model sync is started asynchronously during startup
Relevant symptoms are consistent with:
- static models from custom providers not being surfaced into available models
- or chat starting before model sync completes
- or both
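The suspected startup race can be illustrated with a minimal sketch (hypothetical Python, not ECA code; the shared =db= dict stands in for =db[:models]=, and the sleep simulates an asynchronous model sync):

#+begin_src python :results output
import threading
import time

db = {"models": {}}  # shared runtime state, analogous to db[:models]

def sync_models():
    time.sleep(0.1)  # simulate slow asynchronous provider model sync
    db["models"]["aitunnel/qwen3.5-397b-a17b"] = {"api": "openai-chat"}

t = threading.Thread(target=sync_models)
t.start()

# Chat startup reads the model table before the sync finishes:
print(db["models"])  # typically {} -> "0 models", empty Default model
t.join()
print(db["models"])  # populated only after the sync completes
#+end_src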