- Lafayette, CA
Pinned
- generic-copilot (Public): Use frontier LLMs from nearly any provider in VS Code with GitHub Copilot Chat - no Copilot subscription needed. Powered by any Vercel AI-SDK compatible provider, including Claude Code, LiteLLM, z.a…
- llm-speed-test (Public): `llm-speed-bench` is a command-line interface (CLI) tool for benchmarking the performance of Large Language Model (LLM) providers that offer an OpenAI-compatible API. JavaScript · 6
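
A minimal sketch of the kind of timing such a benchmark performs, assuming a hypothetical OpenAI-compatible endpoint; `BASE_URL`, `API_KEY`, and `MODEL` are placeholders, and this is illustrative rather than code taken from `llm-speed-bench` itself:

```javascript
// Hypothetical sketch: time one chat completion against an OpenAI-compatible
// endpoint and report completion tokens per second (Node 18+, built-in fetch).
// BASE_URL, API_KEY, and MODEL are placeholders, not llm-speed-bench values.
const BASE_URL = process.env.BASE_URL ?? "https://api.example.com/v1";
const API_KEY = process.env.API_KEY ?? "";
const MODEL = process.env.MODEL ?? "example-model";

async function timeCompletion(prompt) {
  const start = Date.now();
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: MODEL,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  const elapsedSec = (Date.now() - start) / 1000;
  // Standard OpenAI-style responses report token counts under `usage`.
  const tokens = data.usage?.completion_tokens ?? 0;
  console.log(
    `${tokens} completion tokens in ${elapsedSec.toFixed(2)}s ` +
      `(${(tokens / elapsedSec).toFixed(1)} tok/s)`
  );
}

timeCompletion("Write one sentence about benchmarking.").catch(console.error);
```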