Add local-only LM Studio model profile (local.json) #150

Status: Open

82deutschmark wants to merge 1 commit into PlanExeOrg:main from VoynichLabs:lmstudio-local-profile
Conversation

@82deutschmark (Collaborator)

This PR adds a clean, local-only LM Studio model profile for PlanExe.

What it does

  • Provides a dedicated local.json configuration file for LM Studio-based local models
  • Includes both primary and fallback model configurations using the LMStudio class
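As a rough sketch of what such a profile could contain (the field names and model names here are illustrative assumptions, not copied from the PR):

```json
{
  "primary": {
    "class": "LMStudio",
    "model": "your-local-model",
    "context_window": 32768,
    "num_output": 4096
  },
  "fallback": {
    "class": "LMStudio",
    "model": "your-fallback-model",
    "context_window": 32768,
    "num_output": 4096
  }
}
```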

Why context_window and num_output matter

The llama_index library uses conservative defaults for context_window (3900 tokens) and num_output (256 tokens). For local models that actually support much larger windows, these defaults can cause the framework to silently truncate or over-chunk prompts. This profile explicitly sets:

  • context_window: 32768 — prevents silent truncation of long prompts
  • num_output: 4096 — allows for more detailed responses

These settings ensure that local models receive complete prompts and can generate fuller responses without the underlying llama_index framework silently cutting content.
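To illustrate the override behavior described above, here is a minimal, hypothetical loader that merges a profile's explicit values over llama_index-style defaults (the function and field names are assumptions for illustration, not part of the PR):

```python
import json

# Conservative llama_index-style defaults, which can silently
# truncate long prompts when left in place for local models.
DEFAULT_CONTEXT_WINDOW = 3900
DEFAULT_NUM_OUTPUT = 256

def load_model_settings(raw_json: str) -> dict:
    """Merge a profile's explicit values over the conservative defaults.

    Any key missing from the profile falls back to the default,
    so an empty profile reproduces the truncation-prone behavior.
    """
    profile = json.loads(raw_json)
    return {
        "context_window": profile.get("context_window", DEFAULT_CONTEXT_WINDOW),
        "num_output": profile.get("num_output", DEFAULT_NUM_OUTPUT),
    }

settings = load_model_settings('{"context_window": 32768, "num_output": 4096}')
print(settings)  # {'context_window': 32768, 'num_output': 4096}
```

An empty profile would fall back to the 3900/256 defaults, which is exactly the silent-truncation case the explicit local.json values are meant to avoid.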
