Instructions for local LLMs unclear for newbies #563

@stijn-uva

Description

Discussed in #536

Originally posted by kolenyo2099 October 15, 2025
Hi all,

I was trying to use the local LLM connection feature, but it took me a while to remember that, when connecting to the local host from inside Docker, one should replace http://127.0.0.1 with http://host.docker.internal. So, when using Ollama or LM Studio, the URL should be something like http://host.docker.internal:1234/v1.
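The substitution described above could be sketched as a small helper; the port 1234 in the example is an assumption (LM Studio's default), and `rewrite_for_docker` is a hypothetical name, not part of any existing codebase:

```python
from urllib.parse import urlparse, urlunparse

def rewrite_for_docker(url: str) -> str:
    """Replace a loopback hostname with host.docker.internal so that a
    containerized app can reach a server running on the Docker host."""
    parts = urlparse(url)
    if parts.hostname in ("127.0.0.1", "localhost"):
        port = f":{parts.port}" if parts.port else ""
        parts = parts._replace(netloc=f"host.docker.internal{port}")
    return urlunparse(parts)

print(rewrite_for_docker("http://127.0.0.1:1234/v1"))
# → http://host.docker.internal:1234/v1
```

Something along these lines (or simply a hint next to the URL field) could surface the correct base URL in the UI when the app detects it is running inside Docker.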

I thought perhaps there's a fast way of communicating this in the UI.

Cheers,

g.

Metadata


    Labels

    (mostly) front-end: Primarily involves the web interface or API.
    enhancement: New feature or request
