
feat: concurrent requests limit #1043

Open

linkliti wants to merge 2 commits into agent0ai:development from linkliti:concurrent-requests

Conversation

@linkliti
Contributor

Adds concurrent request limiting via semaphores, which caps how many requests can run simultaneously per model type. This is useful when running multiple chats in parallel (and using #1030) to avoid overwhelming the API or a local model.

Also fixes missing rate limit settings for browser and embedding models in initialize.py.
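For illustration, here is a minimal sketch of the per-model-type semaphore approach the PR describes, using `asyncio.Semaphore`. The function names, the `limit` default, and the `"chat"` model type key are assumptions for the example, not the PR's actual settings or code:

```python
import asyncio

# Hypothetical sketch: one semaphore per model type, capping concurrency.
_semaphores: dict[str, asyncio.Semaphore] = {}

def get_semaphore(model_type: str, limit: int) -> asyncio.Semaphore:
    """Return (or lazily create) the semaphore for a given model type."""
    if model_type not in _semaphores:
        _semaphores[model_type] = asyncio.Semaphore(limit)
    return _semaphores[model_type]

async def call_model(model_type: str, prompt: str, limit: int = 2) -> str:
    # At most `limit` requests per model type run at once;
    # additional callers wait here until a slot frees up.
    async with get_semaphore(model_type, limit):
        # placeholder for the real API / local model call
        await asyncio.sleep(0.1)
        return f"response for {prompt!r} via {model_type}"

async def main():
    # e.g. several parallel chats all hitting the same chat model
    results = await asyncio.gather(*(call_model("chat", f"msg {i}") for i in range(5)))
    print(results)

asyncio.run(main())
```

With a limit of 2, the five calls above complete in roughly three batches instead of all hitting the backend at once, which is the behavior the PR aims for when many chats run in parallel.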
