
no models in custom provider #392

@11111000000

Description

Describe the bug

ECA 0.123.2 reads a custom provider from =config.json= correctly, but the provider's static models are never registered as available runtime models. As a result:

  • =/config= shows the provider and =defaultModel= correctly
  • =/providers-list= shows the provider as configured, but with =0 models=
  • =/doctor= shows empty =Default model:=
  • chat requests can fail with:
    #+begin_src text :results output
    API url not found.
    Make sure you have provider '' configured properly.
    #+end_src

This seems to affect custom OpenAI-compatible providers configured with =api: "openai-chat"= and explicit static models in config.

To Reproduce

  1. Create =~/.config/eca/config.json= with a custom provider, for example:
    #+begin_src json :results output
    {
      "$schema": "https://eca.dev/config.json",
      "defaultModel": "aitunnel/qwen3.5-397b-a17b",
      "providers": {
        "aitunnel": {
          "api": "openai-chat",
          "url": "https://api.aitunnel.ru/v1",
          "key": "REDACTED",
          "fetchModels": false,
          "models": {
            "qwen3.5-397b-a17b": {}
          }
        }
      }
    }
    #+end_src
  2. Start ECA server.
  3. Open chat and run =/config=.
  4. Run =/providers-list= and =/doctor=.
  5. Send a normal chat message.

Expected behavior

If a custom provider is present in =/config= and declares explicit static models in =providers..models=, ECA should register those models as available runtime models even when =fetchModels= is =false=.

Expected result:

  • =/providers-list= should show at least 1 model for the custom provider
  • =/doctor= should show a non-empty =Default model:=
  • chat should use =aitunnel/qwen3.5-397b-a17b= successfully
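The expected registration behavior can be sketched as follows (illustrative plain Python, not ECA's actual Clojure internals; the function name and registry shape are assumptions): static models declared in config are surfaced unconditionally, and only the live fetch is skipped when =fetchModels= is false.

#+begin_src python :results output
# Illustrative sketch of the expected logic; all names are assumptions,
# not ECA source code.

def register_runtime_models(providers: dict) -> dict:
    """Collect runtime models as 'provider/model' ids.

    Static models from the config should be registered even when
    fetchModels is false, i.e. without contacting the provider.
    """
    models = {}
    for provider_name, provider in providers.items():
        # Static models declared in config are always surfaced.
        for model_name, capabilities in provider.get("models", {}).items():
            models[f"{provider_name}/{model_name}"] = capabilities
        # Only contact the provider's API when asked to.
        if provider.get("fetchModels", True):
            pass  # fetch live model list here (omitted in this sketch)
    return models

config_providers = {
    "aitunnel": {
        "api": "openai-chat",
        "url": "https://api.aitunnel.ru/v1",
        "fetchModels": False,
        "models": {"qwen3.5-397b-a17b": {}},
    }
}

print(sorted(register_runtime_models(config_providers)))
# → ['aitunnel/qwen3.5-397b-a17b']
#+end_src

With this behavior, =defaultModel "aitunnel/qwen3.5-397b-a17b"= would resolve to a registered runtime model instead of falling through to an empty provider lookup.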

Actual behavior

  • =/config= shows the merged config correctly, including:
    • =defaultModel "aitunnel/qwen3.5-397b-a17b"=
    • the custom provider
    • =fetchModels false=
    • the static =models= entry
  • =/providers-list= shows:
    #+begin_src text :results output
    aitunnel ✓ configured
    API Key · (config) · active
    api: openai-chat
    url: https://api.aitunnel.ru/v1
    fetchModels: false
    0 models
    #+end_src
  • =/doctor= shows:
    #+begin_src text :results output
    Default model:

    Logged providers:
    openai: Unknown
    #+end_src

  • sending a chat message can fail with:
    #+begin_src text :results output
    API url not found.
    Make sure you have provider '' configured properly.
    #+end_src

Doctor

#+begin_src text :results output
ECA version: 0.123.2

Server cmd: /home/zoya/.emacs.d/eca/eca server

Workspaces: /home/zoya/.config/eca

Default model:

Logged providers:
openai: Unknown
#+end_src

Additional context

From source inspection, this looks like a runtime model registration / initialization issue rather than a config parsing issue:

  • =/config= proves the config is loaded correctly
  • custom providers with =api "openai-chat"= appear to be supported by runtime dispatch
  • the failure seems to happen because the custom model does not end up in =db[:models]=
  • chat/model selection then has no valid runtime model to use
  • there may also be an initialization race, because model sync is started asynchronously during startup

Relevant symptoms are consistent with:

  • static models from custom providers not being surfaced into available models
  • or chat starting before model sync completes
  • or both
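The race hypothesis can be illustrated with a minimal sketch (plain Python, not ECA's Clojure source; the =db[:models]= layout and all names here are assumptions): if model sync runs asynchronously at startup, a chat request issued before it completes sees an empty registry and the provider lookup degrades to ='=.

#+begin_src python :results output
# Minimal sketch of the suspected initialization race; illustrative
# only, every name is an assumption.
import threading

db = {"models": {}}          # runtime model registry, empty at startup
models_synced = threading.Event()

def sync_models_async():
    # Startup kicks this off in the background; until it finishes,
    # db["models"] is empty and chat has no valid model to select.
    db["models"]["aitunnel/qwen3.5-397b-a17b"] = {}
    models_synced.set()

def select_chat_model(default_model: str):
    # A chat started before sync completes gets None here, and the
    # downstream provider lookup ends up with an empty provider name.
    return db["models"].get(default_model)

threading.Thread(target=sync_models_async).start()
models_synced.wait()  # a guard like this would avoid the race
assert select_chat_model("aitunnel/qwen3.5-397b-a17b") is not None
#+end_src

Either fix (registering static models synchronously, or gating chat on sync completion) would make the sketch's assertion hold without the explicit wait.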
