AI coding tools evolve fast, and OpenCode stands out as an open-source agent rivaling Cursor or Claude Code CLI - fully terminal-native with LSP support, multi-session handling, and 75+ LLM providers via Models.dev.
Compared to locked-in, subscription-based alternatives, you can pick your own models and pay as you go, prioritizing privacy and flexibility.
Check my 5-minute setup video for a quick demo.
Installation
Start with the one-liner curl install - no Docker or complex deps needed:
curl -fsSL https://opencode.ai/install | bash
This adds the opencode CLI globally. Verify with opencode --version, then launch via opencode. For VS Code integration, it pairs seamlessly as an LSP client. Full docs: opencode.ai/docs.
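If your shell can't find opencode immediately after installing, open a new terminal or re-source your shell config so the PATH update lands; assuming zsh, a quick check looks like this:
source ~/.zshrc        # reload your shell config so the new PATH entry applies
opencode --version     # confirm the CLI is reachable
opencode               # launch the TUI in the current project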
OpenRouter Setup
OpenRouter powers model access via its OpenAI-compatible API - sign up at openrouter.ai, grab an API key, and fund credits (billing is pay-per-token, with some free-tier models to start).
Export these env vars (add to ~/.zshrc for persistence):
export OPENAI_API_BASE=https://openrouter.ai/api/v1
export OPENAI_API_KEY=your-openrouter-api-key
Source your shell (source ~/.zshrc) and relaunch OpenCode; your key lives on the API Keys page of the OpenRouter dashboard. This unlocks 300+ models without switching providers.
I added some credits to OpenRouter so I could reach the more advanced models across different projects.
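Before launching OpenCode, you can sanity-check the key and base URL with a plain curl call to OpenRouter's OpenAI-compatible models endpoint - a quick sketch assuming the two env vars above are exported:
# a JSON model list back means the key and base URL are good
curl -s "$OPENAI_API_BASE/models" \
  -H "Authorization: Bearer $OPENAI_API_KEY" | head -c 400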
Basic Usage
Run opencode inside a repo to initialize it: the TUI prompts for project context, and from there you work through slash commands.
- /connect: Link OpenRouter (auto-detects the env vars above).
- /models: List and select models (e.g., Claude 3.5 Sonnet or GPT-4o; you can filter for free ones).
- /init: Scans the repo and suggests a plan (e.g., "Analyze main.rs and create layers").

Core flow: /plan for an outline, /build to generate and run code, /improve for refinements.
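Putting that together, a first session might look roughly like this - illustrative only, since the output depends on your repo and whichever model you pick, and the task text after /plan and /improve is just a placeholder:
opencode                                          # start the TUI inside your project
/models                                           # pick a model (a free one is fine to start)
/init                                             # scan the repo and get a suggested plan
/plan add input validation to the config loader   # hypothetical task, swap in your own
/build                                            # generate and run the code for that plan
/improve tighten error handling in the new code   # iterate on the result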


