23 changes: 12 additions & 11 deletions README.md
@@ -13,26 +13,27 @@

[![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2Fwavetermdev%2Fwaveterm.svg?type=shield)](https://app.fossa.com/projects/git%2Bgithub.com%2Fwavetermdev%2Fwaveterm?ref=badge_shield)

Wave is an open-source terminal that combines traditional terminal features with graphical capabilities like file previews, web browsing, and AI assistance. It runs on MacOS, Linux, and Windows.
Wave is an open-source, AI-integrated terminal for macOS, Linux, and Windows. It works with any AI model. Bring your own API keys for OpenAI, Claude, or Gemini, or run local models via Ollama and LM Studio. No accounts required.
⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

Wave Terminal AI model support documentation

💡 Result:

Wave Terminal’s current (“v0.13+”) documentation for AI model support is centered on Wave AI Modes:

  • Wave AI (Local Models + BYOK) — how to configure supported providers/models via ~/.config/waveterm/waveai.json, including provider presets and examples for Ollama, LM Studio, vLLM, plus supported providers (openai, openrouter, google/gemini, azure, azure-legacy, custom) and API types (openai-chat, openai-responses, google-gemini). [1]
  • Release notes (v0.13.0+) — background on the shift to local models/BYOK and OpenAI-compatible /v1/chat/completions support, plus Gemini integration. [2]
  • AI Presets (Deprecated) — older preset-based configuration (kept for legacy/reference). [3]
  • Legacy docs (deprecated pre-0.8.0 Wave) — older “supported LLMs”/OpenAI setup docs. [4]

Sources: [1] [2] [3] [4]
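Based on the summary above, a `~/.config/waveterm/waveai.json` entry for a local Ollama model and a BYOK Gemini model might look roughly like the following. This is a hypothetical sketch: the key names and overall structure are assumptions inferred from the query result, not the documented schema, so consult the Wave AI Modes documentation for the real format.

```json
{
  "note": "hypothetical sketch of waveai.json, not the documented schema",
  "ollama-local": {
    "provider": "custom",
    "apitype": "openai-chat",
    "endpoint": "http://localhost:11434/v1/chat/completions",
    "model": "llama3.1"
  },
  "gemini": {
    "provider": "google",
    "apitype": "google-gemini",
    "model": "gemini-2.0-flash",
    "apikey": "YOUR_API_KEY"
  }
}
```

The two entries illustrate the two configuration paths the docs describe: an OpenAI-compatible local endpoint (Ollama, LM Studio, vLLM) and a hosted provider with a bring-your-own API key.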


Qualify the "any AI model" claim—it's inaccurate as stated.

The statement "It works with any AI model" overstates capabilities. Wave Terminal actually supports specific providers (Ollama, LM Studio, vLLM, OpenAI, OpenRouter, Gemini, Azure, and custom) with specific API types (openai-chat, openai-responses, google-gemini). While it does support OpenAI-compatible models and custom providers via configuration, this is not the same as supporting "any AI model." Revise to accurately reflect that it works with major AI providers and OpenAI-compatible models, or specify the supported providers/configurations explicitly.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` at line 16, The phrase "It works with any AI model" is too
broad—update the README sentence to accurately list supported providers and API
types: mention support for major providers (Ollama, LM Studio, vLLM, OpenAI,
OpenRouter, Gemini, Azure, and custom providers) and clarify support for
OpenAI-compatible models and specific API types (openai-chat, openai-responses,
google-gemini) instead of claiming universal compatibility; replace the original
sentence with a concise line that either enumerates these providers or says
"major AI providers and OpenAI-compatible models" and optionally references
custom provider configuration.


Modern development involves constantly switching between terminals and browsers - checking documentation, previewing files, monitoring systems, and using AI tools. Wave brings these graphical tools directly into the terminal, letting you control them from the command line. This means you can stay in your terminal workflow while still having access to the visual interfaces you need.
Wave also supports durable SSH sessions that survive network interruptions and restarts, with automatic reconnection. Edit remote files with a built-in graphical editor and preview files inline without leaving the terminal.

![WaveTerm Screenshot](./assets/wave-screenshot.webp)

## Key Features

- Wave AI - Context-aware terminal assistant that reads your terminal output, analyzes widgets, and performs file operations
- Durable SSH Sessions - Remote terminal sessions survive connection interruptions, network changes, and Wave restarts with automatic reconnection
- Flexible drag & drop interface to organize terminal blocks, editors, web browsers, and AI assistants
- Built-in editor for seamlessly editing remote files with syntax highlighting and modern editor features
- Built-in editor for editing remote files with syntax highlighting and modern editor features
- Rich file preview system for remote files (markdown, images, video, PDFs, CSVs, directories)
- Quick full-screen toggle for any block - expand terminals, editors, and previews for better visibility, then instantly return to multi-block view
- Wave AI - Context-aware terminal assistant that reads your terminal output, analyzes widgets, and performs file operations
- AI chat widget with support for multiple models (OpenAI, Claude, Azure, Perplexity, Ollama)
- Command Blocks for isolating and monitoring individual commands with auto-close options
- Command Blocks for isolating and monitoring individual commands
- One-click remote connections with full terminal and file system access
- Secure secret storage using native system backends - store API keys and credentials locally, access them across SSH sessions
- Rich customization including tab themes, terminal styles, and background images
- Powerful `wsh` command system for managing your workspace from the CLI and sharing data between terminal sessions
- Connected file management with `wsh file` - seamlessly copy and sync files between local, remote SSH hosts, Wave filesystem, and S3
- Connected file management with `wsh file` - seamlessly copy and sync files between local and remote SSH hosts
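The `wsh file` workflow in the last bullet might look like this at the shell. These are illustrative invocations only: the subcommand name and remote-path syntax are assumptions, so check the `wsh` documentation for the exact form.

```
# copy a local file to a remote SSH host (hypothetical syntax)
wsh file copy ./notes.md myhost:~/notes.md
```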

## Wave AI

@@ -41,10 +42,12 @@ Wave AI is your context-aware terminal assistant with access to your workspace:
- **Terminal Context**: Reads terminal output and scrollback for debugging and analysis
- **File Operations**: Read, write, and edit files with automatic backups and user approval
- **CLI Integration**: Use `wsh ai` to pipe output or attach files directly from the command line
- **BYOK Support**: Bring your own API keys for OpenAI, Claude, Gemini, Azure, and other providers
- **Local Models**: Run local models with Ollama, LM Studio, and other OpenAI-compatible providers
- **Free Beta**: Included AI credits while we refine the experience
- **Coming Soon**: Command execution (with approval), local model support, and alternate AI providers (BYOK)
- **Coming Soon**: Command execution (with approval)
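The CLI integration bullet above ("use `wsh ai` to pipe output") might be used like this. The exact flags and argument order are assumptions, not documented syntax:

```
# pipe a failing build log into Wave AI for analysis (hypothetical invocation)
make 2>&1 | wsh ai "why did this build fail?"
```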

Learn more in our [Wave AI documentation](https://docs.waveterm.dev/waveai).
Learn more in our [Wave AI documentation](https://docs.waveterm.dev/waveai) and [Wave AI Modes documentation](https://docs.waveterm.dev/waveai-modes).

## Installation

@@ -65,7 +68,7 @@ Wave Terminal runs on the following platforms:
The WSH helper runs on the following platforms:

- macOS 11 or later (arm64, x64)
- Windows 10 or later (arm64, x64)
- Windows 10 or later (x64)
- Linux Kernel 2.6.32 or later (x64), Linux Kernel 3.1 or later (arm64)

## Roadmap
@@ -79,8 +82,6 @@ Want to provide input to our future releases? Connect with us on [Discord](https
- Homepage — https://www.waveterm.dev
- Download Page — https://www.waveterm.dev/download
- Documentation — https://docs.waveterm.dev
- Legacy Documentation — https://legacydocs.waveterm.dev
- Blog — https://blog.waveterm.dev
- X — https://x.com/wavetermdev
- Discord Community — https://discord.gg/XfvZ334gwU
