Complete Guide to Your AI-Powered Coding Assistant
Version 1.0 | opencan.ai
OCCode is a powerful AI coding assistant that runs directly in your terminal. It combines the capabilities of multiple AI models with advanced features like convergence orchestration, model profiles, and intelligent context management.
Support for 11 providers and 40+ models including Claude, GPT-4, Gemini, DeepSeek, and more.
Run multiple AI models in parallel and synthesize their outputs for higher quality results.
Save named configurations and switch instantly with prefix shortcuts like "quick:".
Real-time token and cost tracking with model-specific pricing across 20+ tiers.
Create save points and undo file changes with a single command.
Interactive terminal with syntax highlighting, streaming responses, and visual progress indicators.
Download OCCode from opencan.ai/downloads (account required), then install:
# Extract the archive (Linux/macOS)
tar xzf occode-0.1.0-linux-x64.tar.gz
sudo mv occode /usr/local/bin/
# Activate with your license key
occode activate --key YOUR-LICENSE-KEY
# Verify installation
occode --version
# Set your API key
export ANTHROPIC_API_KEY="your-key-here"
# Start OCCode
occode
# Ask a question
What is recursion?
# Generate code
write a binary search function in TypeScript
# Add files to context
/context src/main.ts
# Get help
/help
Model Profiles let you create named configurations for different AI models and switch between them instantly using prefix shortcuts.
/profile template fast # Claude Haiku - quick answers
/profile template power # Claude Opus - complex tasks
/profile template creative # High temperature Sonnet
/profile template gpt # GPT-4 Turbo
/profile template local # Ollama (offline)
/profile create myprofile
You'll be prompted for:
/profile fast # Switch to fast profile
/profile power # Switch to power profile
/profile off # Deactivate (use global config)
Set a prefix trigger to temporarily use a profile for a single message:
# Set prefix for a profile
/profile prefix fast quick:
/profile prefix power think:
# Use in conversation
quick: what is recursion?
think: design a distributed cache system
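Conceptually, a prefix trigger is simple message routing: if a message starts with a registered prefix, that profile's settings apply to that message only and the prefix is stripped before sending. A minimal sketch of the idea (the profile fields mirror profiles.json; the routing logic is illustrative, not OCCode's actual implementation):

```python
# Illustrative sketch of prefix-trigger routing (not OCCode's source).
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    model: str
    trigger_prefix: str  # e.g. "quick:"

PROFILES = [
    Profile("fast", "claude-haiku-4-5-20251001", "quick:"),
    Profile("power", "claude-opus-4-20250514", "think:"),
]

def route(message: str, default_model: str = "claude-sonnet-4-20250514"):
    """Return (model, message) after stripping a matching prefix, if any."""
    for p in PROFILES:
        if message.startswith(p.trigger_prefix):
            return p.model, message[len(p.trigger_prefix):].lstrip()
    return default_model, message

model, text = route("quick: what is recursion?")
# model is the fast profile's model; text has the prefix stripped
```

Messages without a prefix fall through to the default (or globally configured) model unchanged.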
/profile - Show status and list all profiles
/profile edit myprofile - Edit existing profile
/profile delete myprofile - Delete a profile
/profile default fast - Set default profile (auto-activates on startup)
/profile export - Export profiles for sharing
/profile import profiles.json - Import profiles from team
| Template | Model | Prefix | Use Case |
|---|---|---|---|
| fast | Claude Haiku | quick: | Quick questions, simple tasks |
| default | Claude Sonnet 4 | - | Balanced quality/speed |
| power | Claude Opus 4 | think: | Complex reasoning, critical tasks |
| creative | Sonnet (high temp) | create: | Creative writing, brainstorming |
| gpt | GPT-4 Turbo | gpt: | Alternative perspective |
| local | Ollama | local: | Privacy, offline work |
The Convergence Engine runs multiple AI models in parallel and combines their outputs using the Mixture-of-Agents pattern for higher quality results.
# View available presets
/converge preset
# Apply a preset
/converge preset duo-merge
# Enable convergence
/converge on
# Your messages now use convergence
write a binary search function
# Disable when done
/converge off
| Preset | Models | Strategy | Description |
|---|---|---|---|
| duo-merge | Sonnet + GPT-4 → Sonnet | merge | Best balance of quality and cost |
| trio-merge | Sonnet + GPT-4 + Haiku → Sonnet | merge | Maximum quality, diverse perspectives |
| code-review | Sonnet → GPT-4 (review) | review | Cost-effective quality assurance |
| debate | Sonnet ↔ GPT-4 (2 rounds) | debate | Thorough analysis through dialogue |
| vote | 3 models generate + vote | vote | Democratic selection |
| local-merge | CodeLlama + DeepSeek (local) | merge | Private, no API costs |
Merge: All participant models generate responses in parallel, then an aggregator synthesizes the best parts.
Best for: Code generation, complex explanations, creative tasks
Vote: All models generate responses, then each votes for the best response (excluding its own).
Best for: Tasks with an objectively correct answer, choosing between options
Debate: Multi-round critique and refinement - models critique each other, then revise their responses.
Best for: Important decisions, thorough analysis, exploring trade-offs
Review: Pair-programming workflow - Model A generates, Model B reviews, Model A revises.
Best for: Code quality, catching bugs, cost-effective quality boost
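The parallel fan-out plus synthesis step described above can be sketched as a two-stage pipeline. This is a conceptual illustration of the Mixture-of-Agents pattern, not OCCode's source; `call_model` is a placeholder standing in for a real provider API call:

```python
# Conceptual sketch of the Mixture-of-Agents merge strategy.
from concurrent.futures import ThreadPoolExecutor

def call_model(model: str, prompt: str) -> str:
    # Placeholder: a real implementation would call the provider's API.
    return f"[{model}] draft for: {prompt}"

def converge_merge(prompt: str, participants: list[str], aggregator: str) -> str:
    # Stage 1: all participant models answer in parallel.
    with ThreadPoolExecutor() as pool:
        drafts = list(pool.map(lambda m: call_model(m, prompt), participants))
    # Stage 2: the aggregator synthesizes the best parts of each draft.
    synthesis_prompt = prompt + "\n\nCandidate answers:\n" + "\n---\n".join(drafts)
    return call_model(aggregator, synthesis_prompt)

result = converge_merge("write a binary search function",
                        ["sonnet", "gpt4o"], "sonnet")
```

The vote, debate, and review strategies reuse the same fan-out machinery but change what happens in stage 2.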
/converge add sonnet - Add model using short alias
/converge add gpt4o - Add another model
/converge remove haiku - Remove a model
/converge strategy merge - Set convergence strategy
/converge aggregator sonnet - Set which model synthesizes results
/converge rounds 3 - Set debate rounds (1-5)
/converge show on - Show individual model outputs
/converge models - View configured models
/converge last - View statistics from last run
The Model Catalog is a centralized registry of 40+ AI models from 11 providers with short aliases, pricing data, and automatic API key detection.
Use sonnet instead of claude-sonnet-4-20250514
Automatically finds available models based on your API keys
Real-time pricing information for accurate cost estimates
Find models by capability, price, or provider
| Provider | Models Available | Environment Variable |
|---|---|---|
| Anthropic | Claude Opus 4, Sonnet 4, Haiku 4.5 | ANTHROPIC_API_KEY |
| OpenAI | GPT-4o, GPT-4, o1, o3-mini, GPT-4o-mini | OPENAI_API_KEY |
| Google | Gemini 2.0 Flash, Gemini 2.5 Pro | GOOGLE_API_KEY |
| DeepSeek | DeepSeek V3, DeepSeek R1 | DEEPSEEK_API_KEY |
| Mistral | Mistral Large, Codestral | MISTRAL_API_KEY |
| Groq | Llama 3.3 70B, DeepSeek R1 Distill | GROQ_API_KEY |
| Together | Various open models | TOGETHER_API_KEY |
| OpenRouter | 100+ models via unified API | OPENROUTER_API_KEY |
| OpenCan | Custom models | OPENCAN_API_KEY |
| Ollama | Local models (CodeLlama, Qwen, etc.) | None (local) |
| LM Studio | Local models | None (local) |
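Automatic provider detection reduces to checking which of the environment variables above are set. A minimal sketch of that logic (illustrative, not OCCode's implementation; local providers need no key):

```python
# Sketch: detect configured providers from environment variables.
import os

PROVIDER_ENV = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "google": "GOOGLE_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "groq": "GROQ_API_KEY",
    "together": "TOGETHER_API_KEY",
    "openrouter": "OPENROUTER_API_KEY",
    "opencan": "OPENCAN_API_KEY",
}

def available_providers() -> list[str]:
    """Cloud providers whose API key is set, plus always-available local ones."""
    cloud = [name for name, var in PROVIDER_ENV.items() if os.environ.get(var)]
    return cloud + ["ollama", "lmstudio"]  # local providers require no key
```

Commands like /converge available presumably filter the model catalog through a check of this kind.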
| Alias | Full Model ID | Tier | Use Case |
|---|---|---|---|
| opus | claude-opus-4-20250514 | flagship | Highest quality, complex reasoning |
| sonnet | claude-sonnet-4-20250514 | balanced | Best balance of speed/quality |
| haiku | claude-haiku-4-5-20251001 | fast | Quick responses, low cost |
| Alias | Full Model ID | Tier | Use Case |
|---|---|---|---|
| gpt4o | gpt-4o | flagship | General purpose, multimodal |
| gpt4 | gpt-4-turbo | flagship | Complex tasks, long context |
| mini | gpt-4o-mini | fast | Fast, cost-effective |
| o1 | o1 | flagship | Advanced reasoning |
| o3mini | o3-mini | balanced | Efficient reasoning |
# See complete catalog
/converge catalog
# Search by keyword
/converge search code
/converge search fast
/converge search free
# Show only models with API keys set
/converge available
| Tier | Description | Examples | Use Case |
|---|---|---|---|
| flagship | Highest quality, most capable | Opus, GPT-4, Gemini Pro | Critical tasks, complex reasoning |
| balanced | Best quality/cost ratio | Sonnet, o3-mini | General development work |
| fast | Quick responses, lower cost | Haiku, mini, Gemini Flash | Simple questions, iteration |
| economy | Maximum cost efficiency | Local models | High volume, privacy needs |
OCCode provides powerful tools for managing which files and information the AI can access.
/context - Show current context overview
/context add <file> - Add file to context
/context remove <file> - Remove file from context
/context src/**/*.ts - Add files using glob patterns
Pin files to keep them in context across all sessions:
/pin src/config.ts
/pin README.md
/unpin src/config.ts
Exclude files or directories from being added to context:
/exclude node_modules
/exclude "*.test.ts"
/exclude "**/tests/**"
/include node_modules # Remove exclusion
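Exclusion patterns behave like ordinary glob matching against project-relative paths. A sketch of the filtering idea using Python's fnmatch (illustrative; OCCode's actual matcher may handle globs differently):

```python
# Sketch: filter candidate context files through exclusion patterns.
from fnmatch import fnmatch

EXCLUDES = ["node_modules/*", "*.test.ts", "**/tests/**"]

def is_excluded(path: str) -> bool:
    # A path is excluded if any pattern matches it.
    return any(fnmatch(path, pat) for pat in EXCLUDES)

def filter_context(paths: list[str]) -> list[str]:
    return [p for p in paths if not is_excluded(p)]
```

So with the patterns above, src/main.ts would survive the filter while src/main.test.ts and anything under node_modules would be dropped.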
The /context command shows:
/provider [name] - Show current or switch provider (anthropic, openai, google, etc.)
/model [name] - Show current or change model (sonnet, gpt4o, etc.)
/api_key [key] - Show status or set API key
/api_url [url] - Show or change custom API endpoint
/profile - Show active profile and list all
/profile <name> - Activate profile
/profile create <name> - Create new profile
/profile template <name> - Create from template
/profile edit <name> - Edit profile
/profile delete <name> - Delete profile
/profile off - Deactivate profile
/profile default <name> - Set default profile
/profile prefix <name> <prefix> - Set message prefix
/profile export - Export profiles
/profile import <file> - Import profiles
/converge - Show status
/converge on|off - Enable/disable
/converge strategy <name> - Set strategy (merge/vote/debate/review)
/converge add <alias> - Add model
/converge remove <name> - Remove model
/converge aggregator <name> - Set aggregator
/converge preset <name> - Load preset
/converge rounds <n> - Set debate rounds
/converge show on|off - Toggle individual outputs
/converge catalog - Browse models
/converge available - Show models with API keys
/converge search <keyword> - Search models
/context - Show context overview
/context add <file> - Add file
/context remove <file> - Remove file
/pin <file> - Pin file (persistent)
/unpin <file> - Unpin file
/exclude <pattern> - Exclude files
/include <pattern> - Remove exclusion
/clear - Clear conversation history
/status - Show session statistics
/cost - Show cost/token breakdown
/compact - Compact history to save tokens
/export <file> - Export session
/debug - Toggle debug mode
/git - Show git status
/commit - Generate and apply commit
/diff <files> - Show enhanced diff
/checkpoint [message] - Create checkpoint
/checkpoint list - List checkpoints
/checkpoint restore <id> - Restore checkpoint
/undo - Undo last file change
/index - Show indexing status
/index rebuild - Force rebuild index
/daemon - Show daemon status
/daemon start - Start daemon
/daemon stop - Stop daemon
/mode interactive - Approve each action
/mode auto - Execute without approval
/help - Show all commands
/git
Shows:
/commit
OCCode will:
/diff src/main.ts
/diff --side-by-side src/main.ts src/utils.ts
/diff --no-syntax src/**/*.ts
Features:
OCCode provides real-time cost tracking with model-specific pricing across 20+ pricing tiers.
/status # Quick overview
/cost # Detailed breakdown
When using convergence, costs are tracked for:
The /converge last command shows exact costs from your last run.
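Per-message cost is simply token counts multiplied by the model's per-million-token rates, summed over input and output. A sketch of that arithmetic (the rates below are illustrative placeholders, not OCCode's actual pricing table):

```python
# Sketch: compute message cost from token counts and per-million-token rates.
# The prices below are illustrative placeholders, not real pricing data.
PRICING = {  # model -> (input $/1M tokens, output $/1M tokens)
    "claude-sonnet-4-20250514": (3.00, 15.00),
    "claude-haiku-4-5-20251001": (1.00, 5.00),
}

def message_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    in_rate, out_rate = PRICING[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

cost = message_cost("claude-sonnet-4-20250514", 1_000, 500)
# 1000 * 3.00/1M + 500 * 15.00/1M = 0.003 + 0.0075 = 0.0105
```

A convergence run would sum this over every participant call plus the aggregator call, which is why multi-model strategies cost more.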
Save a snapshot of all file states:
/checkpoint "Before refactoring auth module"
/checkpoint "Working state before experiment"
/checkpoint list # List all checkpoints
/checkpoint restore cp_123 # Restore a specific checkpoint
Undo the last file change with a single command:
/undo
Checkpoints are stored in ~/.occode/checkpoints/. They persist across sessions but can accumulate disk space over time.
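Conceptually, a checkpoint is a snapshot of tracked file contents keyed by an id, and restore hands those contents back. A simplified in-memory sketch of the idea (OCCode actually persists snapshots to disk, and a real restore would write the files back):

```python
# Simplified in-memory sketch of checkpoint create/restore (illustrative only).
checkpoints: dict[str, dict] = {}

def create_checkpoint(files: dict[str, str], message: str = "") -> str:
    """Snapshot the given file contents under a new checkpoint id."""
    cp_id = f"cp_{len(checkpoints) + 1}"
    checkpoints[cp_id] = {"message": message, "files": dict(files)}
    return cp_id

def restore_checkpoint(cp_id: str) -> dict[str, str]:
    """Return the saved contents; a real tool would write them to disk."""
    return dict(checkpoints[cp_id]["files"])

cp = create_checkpoint({"src/auth.ts": "old code"},
                       "Before refactoring auth module")
```

/undo can then be seen as restoring just the most recently changed file from the latest snapshot.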
OCCode supports 11 AI providers with 40+ models.
# Anthropic (Claude)
export ANTHROPIC_API_KEY="sk-ant-..."
# OpenAI (GPT)
export OPENAI_API_KEY="sk-..."
# Google (Gemini)
export GOOGLE_API_KEY="..."
# DeepSeek
export DEEPSEEK_API_KEY="..."
# Mistral
export MISTRAL_API_KEY="..."
# Groq
export GROQ_API_KEY="..."
/provider anthropic
/api_key sk-ant-your-key-here
/provider anthropic
/model sonnet
/provider openai
/model gpt4o
/provider google
/model gemini-pro
For self-hosted or custom API endpoints:
/api_url https://your-custom-api.com/v1
/api_url reset # Reset to default
| Provider | Tool Use | Vision | Streaming | Cost |
|---|---|---|---|---|
| Anthropic | ✅ | ✅ | ✅ | Medium-High |
| OpenAI | ✅ | ✅ | ✅ | Medium-High |
| Google | ✅ | ✅ | ✅ | Low-Medium |
| DeepSeek | ✅ | ❌ | ✅ | Very Low |
| Ollama (Local) | Varies | Varies | ✅ | Free |
OCCode stores configuration in ~/.occode/:
config.json - Global settings
profiles.json - Model profiles
convergence.json - Convergence configuration
sessions/ - Session history
checkpoints/ - File snapshots

Edit ~/.occode/config.json:
{
"provider": "anthropic",
"model": "claude-sonnet-4-20250514",
"apiEndpoint": null,
"maxTokens": 4096,
"temperature": 0.7,
"autoApprove": false,
"theme": "auto"
}
Profiles are stored in ~/.occode/profiles.json:
{
"profiles": {
"fast": {
"name": "fast",
"provider": "anthropic",
"model": "claude-haiku-4-5-20251001",
"temperature": 0.3,
"maxTokens": 2048,
"triggerPrefix": "quick:",
"description": "Quick answers, low cost"
}
},
"activeProfile": "fast",
"defaultProfile": null
}
Configuration priority (highest to lowest):
Environment variables (OCCODE_*)

Solution:
# Check if environment variable is set
echo $ANTHROPIC_API_KEY
# Set via OCCode
/api_key your-key-here
# Or set environment variable
export ANTHROPIC_API_KEY="your-key"
Solution:
Verify the key format (keys start with sk-ant- for Anthropic)

Solution:
# List all profiles to check spelling
/profile
# Recreate if needed
/profile template fast
Solution:
Run /profile (should show [quick:] next to the profile)
Include the colon: type quick: not quick
Re-set the prefix with /profile prefix fast quick:

Solution:
# Use a preset
/converge preset duo-merge
# Or add models manually
/converge add sonnet
/converge add gpt4o
/converge aggregator sonnet
Solution:
# Use cost-effective strategy
/converge preset code-review
# Use cheaper models
/converge add haiku
/converge add mini
/converge aggregator haiku
# Disable when not needed
/converge off
Solution:
# Remove large files
/context remove large-file.json
# Exclude test files
/exclude "*.test.ts"
/exclude "**/tests/**"
# Compact conversation history
/compact
Solution:
Run /help to see all commands
Use /debug for more detailed error messages

Solution:
# Check sessions directory
ls ~/.occode/sessions/
# Start fresh session
occode --new-session
/help
/debug