1. Introduction
What is OCCode CLI?
OCCode CLI is a terminal-based AI coding assistant developed by OpenCan.ai. It brings the power of advanced language models directly to your command line with multi-provider support, real-time cost tracking, checkpoint-based undo, and over 60 interactive commands.
Multi-Provider AI
11 providers, 40+ models including Claude, GPT, Gemini, DeepSeek, Ollama & more
Checkpoint System
Save and restore file states for safe undo/rollback of any change
Cost Tracking
Real-time token usage and cost display across all providers
Session Persistence
Resume conversations without losing context
Convergence Engine
Multi-model ensemble with Merge, Vote, Debate, and Review strategies
Git Integration
AI-generated commit messages, enhanced diffs, auto-stage
System Requirements
| Requirement | Details |
|---|---|
| Node.js | 18.0 or higher (for npm install method) |
| OS | Windows 10+, macOS 12+, Linux (x64/arm64) |
| Disk Space | ~50MB (binary) or ~5MB (npm) |
| Network | Required for cloud AI providers; optional for Ollama/local |
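If you plan to use the npm install method, you can confirm your Node.js version meets the 18.0 minimum with a quick check. The sketch below parses a hardcoded version string for illustration; in practice, replace it with the real `node --version` output.

```shell
# Check that the Node.js major version is >= 18 (needed for the npm method).
# In practice, replace the hardcoded string with: version=$(node --version)
version="v20.11.0"
major=${version#v}       # strip the leading "v"
major=${major%%.*}       # keep only the major component
if [ "$major" -ge 18 ]; then
  echo "Node.js $version is new enough"
else
  echo "Node.js $version is too old; install 18.0 or newer"
fi
```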
2. Installation
Download (Account Required)
Download OCCode from opencan.ai/downloads. An opencan.ai account is required, and a valid license key is needed to activate the application.
| Platform | File |
|---|---|
| Windows x64 | occode-0.1.0-windows-x64.zip |
| macOS Intel | occode-0.1.0-macos-x64.tar.gz |
| macOS Apple Silicon | occode-0.1.0-macos-arm64.tar.gz |
| Linux x64 | occode-0.1.0-linux-x64.tar.gz |
# Linux/macOS install
tar xzf occode-0.1.0-linux-x64.tar.gz
sudo mv occode /usr/local/bin/
# Windows - extract the .zip, then add the folder to your PATH
# Activate with your license key
occode activate --key YOUR-LICENSE-KEY
Verify Installation
occode --version
occode --help
3. Quick Start
Step 1: Configure Your API Key
# Interactive configuration wizard
occode config
# Or set directly
occode config --set-key --provider anthropic
Step 2: Start an Interactive Session
occode
This launches the REPL (Read-Eval-Print Loop) where you can chat with the AI and give coding instructions.
Step 3: Run a One-Shot Task
# Execute a task and exit
occode run "Add unit tests for UserService"
# Autonomous mode (no approval prompts)
occode run "Fix all TypeScript errors" --yes
# With specific files in context
occode run "Refactor this component" -c src/App.tsx
Step 4: Understanding the Interface
# The REPL shows a timestamp prompt:
2026-02-10 14:30:12 EST ❯ your message here
# Use slash commands for quick actions:
/help # Show all commands
/cost # Show token costs
/model # Show current model
/status # Show session info
4. Core Commands
occode / occode chat — Interactive Mode
occode [chat]
-m, --model <model> # Model to use (e.g. claude-sonnet-4-20250514)
-p, --provider <provider> # AI provider (anthropic, openai, etc.)
-c, --context <files...> # Add files to context
--session <id> # Resume previous session
occode run <task> — Single Task Execution
occode run <task>
-y, --yes # Auto-approve all actions (autonomous)
-s, --supervised # Approve all file changes
-m, --model <model> # Model to use
-c, --context <files...> # Add files to context
--dry-run # Preview without executing
--max-turns <n> # Maximum agent turns (default: 50)
--timeout <seconds> # Timeout limit
occode watch — Watch Mode
occode watch
-p, --pattern <glob> # File pattern to watch
-i, --ignore <patterns> # Patterns to ignore
-t, --task <task> # Task to run on changes
occode config — Configuration
occode config
--set <key=value> # Set config value
--get <key> # Get config value
--list # List all settings
--reset # Reset to defaults
--set-key # Set API key securely (keychain)
occode commit — AI Git Commit
occode commit
-a, --all # Stage all changes
-p, --push # Push after commit
occode explain — Explain Code
occode explain [file]
-d, --detailed # Detailed explanation
--architecture # Explain project structure
occode review — Code Review
occode review [path]
--security # Focus on security
--performance # Focus on performance
--diff # Review staged changes
occode search — Semantic Search
occode search <query>
-n, --limit <n> # Max results
-t, --type <type> # Symbol type filter
occode generate — Generate Code/Docs
occode generate <type>
# Types: readme, tests, docs, types
occode checkpoint — Manage Checkpoints
occode checkpoint list # List all checkpoints
occode checkpoint create <msg> # Create named checkpoint
occode checkpoint restore <id> # Restore to checkpoint
occode checkpoint delete <id> # Delete checkpoint
occode undo — Undo Last Change
occode undo
occode history — Session History
occode history
-n, --limit <n> # Number of sessions
--clear # Clear all history
--export <file> # Export to file
--import <file> # Import from file
Other Commands
occode init # Initialize in current directory
occode status # Show current status
occode update # Check for updates
5. Interactive Mode (REPL)
While in chat mode, use these slash commands:
Provider & Model Commands
/provider [name] # Show current or switch provider
/model [name] # Show current or change model
/api_key [key] # Show status or set API key
/api_url [url] # Show or change custom API endpoint
/api_url reset # Reset to default endpoint
Model Profile Commands (12)
/profile # Show active profile and list all
/profile <name> # Activate a profile
/profile create <name> # Create new profile interactively
/profile template <name> # Create from built-in template
/profile templates # List available templates
/profile edit <name> # Edit existing profile
/profile delete <name> # Delete a profile
/profile off # Deactivate current profile
/profile default <name> # Set default profile (auto-activates on startup)
/profile prefix <name> <prefix> # Set message prefix trigger
/profile export # Export profiles to JSON
/profile import <file> # Import profiles from JSON
Convergence Commands (18)
/converge # Show convergence status
/converge on | enable # Enable convergence mode
/converge off | disable # Disable convergence mode
/converge strategy <name> # Set strategy (merge/vote/debate/review)
/converge add <alias> # Add model by short alias
/converge remove <name> # Remove a participant model
/converge aggregator <name> # Set aggregator model for synthesis
/converge preset <name> # Load preset configuration
/converge presets # List available presets
/converge rounds <n> # Set debate rounds (1-5)
/converge show on|off # Show/hide individual model outputs
/converge models # List configured participant models
/converge catalog # Browse all models in catalog
/converge available # Show models with API keys detected
/converge search <keyword> # Search models
/converge last # Show stats from last run
/converge reset # Reset to defaults
/converge export|import # Export/import configuration
Context Management (6)
/context # Show context overview with token usage
/context add <file> # Add file to context
/context remove <file> # Remove file from context
/pin <file> # Pin file to persistent context
/unpin <file> # Remove pin from file
/exclude <pattern> # Exclude files matching glob
/include <pattern> # Remove exclusion pattern
Session Management (7)
/clear # Clear conversation history
/status # Show session statistics
/cost # Show detailed cost breakdown
/compact # Compact history to save tokens
/export <file> # Export session to JSON
/debug # Toggle debug mode
/mode interactive|auto # Set execution mode
Git Integration (3)
/git # Show git status
/commit # Generate AI-powered commit message
/diff <files> # Enhanced diff with syntax highlighting
/diff --side-by-side <files> # Side-by-side comparison
Checkpoint & Undo (4)
/checkpoint [message] # Create named checkpoint
/checkpoint list # List all checkpoints
/checkpoint restore <id> # Restore to specific checkpoint
/undo # Undo last file change
Daemon & Indexing (5)
/index # Show indexing status
/index rebuild # Force rebuild codebase index
/daemon # Show daemon status
/daemon start|stop|restart # Manage background daemon
Subscription (3)
OCCode includes a 7-day free trial with full access. Create an opencan.ai account and choose a plan to start your trial; you won't be charged until day 8, and you can cancel before then to avoid charges. After the trial, a license key is required. No refunds after billing.
/subscription # Show subscription status (trial days remaining or plan info)
/subscription activate <key> # Activate license key
/subscription plans # View available plans
Feature Toggles
/features # Show all features and status
/features enable <feature> # Enable a feature
/features disable <feature> # Disable a feature
/features cost # Show token cost impact
/features reset # Reset to defaults
/tdg # Toggle Test-Driven Generation
/autofix # Toggle Auto-Fix in LSP Loop
Help & Transcripts
/help | /h | /? # Show all commands
/transcripts # Toggle transcript saving
/transcripts status # Show transcript configuration
/transcripts export [path] # Export to unencrypted JSON
/transcripts clear # Clear all entries
Keyboard Shortcuts
| Key | Action |
|---|---|
| Ctrl+C | Stop current operation |
| Ctrl+D | Exit session |
| Up/Down | Navigate command history |
| Tab | Auto-complete file paths and commands |
6. AI Providers
OCCode supports 11 AI providers with 40+ models.
| Provider | Speed | Quality | Cost | Key Required |
|---|---|---|---|---|
| Anthropic (Claude) | Fast | Excellent | $$$ | Yes |
| OpenAI (GPT) | Fast | Excellent | $$$ | Yes |
| DeepSeek | Fast | Excellent | $ | Yes |
| Google (Gemini) | Medium | Good | $$ | Yes |
| Mistral AI | Fast | Good | $$ | Yes |
| Groq | Ultra Fast | Good | Free* | Yes |
| Together AI | Fast | Good | $$ | Yes |
| OpenRouter | Varies | Varies | Varies | Yes |
| Ollama (Local) | Medium | Good | Free | No |
| OpenCan | Varies | Varies | Varies | Yes |
| Custom/Local | Varies | Varies | Varies | Optional |
Anthropic (Claude) Setup
occode config --set provider=anthropic
occode config --set model=claude-sonnet-4-20250514
occode config --set-key --provider anthropic
# Get key from: https://console.anthropic.com
OpenAI (GPT) Setup
occode config --set provider=openai
occode config --set model=gpt-4o
occode config --set-key --provider openai
# Get key from: https://platform.openai.com
DeepSeek Setup
occode config --set provider=deepseek
occode config --set model=deepseek-coder
occode config --set-key --provider deepseek
# Get key from: https://platform.deepseek.com
Ollama (Local - Free, Offline)
# 1. Install Ollama: https://ollama.ai/download
# 2. Pull a model:
ollama pull llama3.3
# 3. Configure OCCode:
occode config --set provider=local
occode config --set apiEndpoint=http://localhost:11434/v1
occode config --set model=llama3.3
OpenRouter (100+ Models via One Key)
occode config --set provider=openrouter
occode config --set model=anthropic/claude-sonnet-4
occode config --set-key --provider openrouter
# Get key from: https://openrouter.ai/keys
Recommended Setups
| User Type | Primary | Fallback |
|---|---|---|
| Professional Dev | Anthropic + Claude Sonnet | OpenRouter + DeepSeek R1 |
| Student/Learner | Groq + Llama 3.3 70B (free) | Ollama + Llama 3.3 (free) |
| Team/Enterprise | OpenCan (centralized billing) | Anthropic or OpenAI |
| Budget-Conscious | DeepSeek + deepseek-coder ($) | Ollama (free) |
7. Model Profiles
Profiles are named model configurations for quick switching.
Built-in Templates
| Template | Provider | Model | Use Case |
|---|---|---|---|
| fast | Anthropic | Claude Haiku | Quick responses, low cost |
| power | Anthropic | Claude Opus | Complex reasoning |
| creative | Anthropic | High-temp Sonnet | Creative tasks |
| gpt | OpenAI | GPT-4 Turbo | Alternative perspective |
| local | Ollama | Local models | Offline, private |
Usage
# Create from template
/profile template fast
# Activate
/profile fast
# Create custom profile
/profile create my-custom
# Set prefix trigger (e.g., "quick: your message" auto-activates fast profile)
/profile prefix fast quick
# Set a default profile
/profile default fast
# Export/import profiles for team sharing
/profile export
/profile import profiles.json
8. Convergence Engine
The Convergence Engine runs your query through multiple AI models simultaneously and synthesizes the best response.
4 Strategies
| Strategy | How It Works | Best For |
|---|---|---|
| Merge (MoA) | Parallel generation + synthesis | Maximum quality |
| Vote | Democratic selection via model voting | Consensus decisions |
| Debate | Multi-round critique and refinement | Thorough analysis |
| Review | Generate + review + revise workflow | Code quality |
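The core idea behind the Vote strategy, majority selection among candidate answers, can be sketched outside OCCode with ordinary shell tools. This toy example is purely illustrative and is not OCCode's implementation: each letter stands in for one participant model's answer.

```shell
# Toy majority vote over candidate answers (illustration only).
votes="B A B C B"                      # one answer label per participant model
winner=$(printf '%s\n' $votes | sort | uniq -c | sort -rn | head -1 | awk '{print $2}')
echo "Winner: $winner"                 # prints: Winner: B
```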
Built-in Presets
/converge preset duo-merge # 2 models with merge
/converge preset trio-merge # 3 models for max quality
/converge preset code-review # Cost-effective review workflow
/converge preset debate # Thorough analysis
/converge preset vote # Democratic consensus
/converge preset local-merge # Zero API cost (local models)
Custom Configuration
/converge on # Enable convergence
/converge strategy merge # Set strategy
/converge add sonnet # Add Claude Sonnet
/converge add gpt4 # Add GPT-4
/converge aggregator opus # Use Opus as synthesizer
/converge rounds 3 # Set 3 debate rounds
9. Context Management
Adding Files to Context
/context add src/api/routes.ts
/context add src/**/*.ts # Glob patterns supported
Pinning Files (Persistent Across Sessions)
/pin src/types.ts # Always in context
/unpin src/types.ts # Remove pin
Excluding Files
/exclude "*.test.ts"
/exclude "**/node_modules/**"
/include "*.test.ts" # Remove exclusion
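To build intuition for what an exclusion pattern like `*.test.ts` matches, shell `case` glob matching behaves similarly. This is only an analogy; OCCode's internal matcher is not shown in this document.

```shell
# Illustration of glob-style exclusion (shell `case` matching stands in
# for OCCode's internal matcher).
is_excluded() {
  case "$1" in
    *.test.ts) return 0 ;;   # matches the exclusion pattern
    *)         return 1 ;;
  esac
}
for f in src/app.ts src/app.test.ts; do
  if is_excluded "$f"; then echo "$f: excluded"; else echo "$f: in context"; fi
done
```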
Context Overview
/context
# Output:
# Context Overview:
# Working directory: /home/user/my-project
# Token usage: 12,450 / 128,000 (9.7%)
# [###.............................]
#
# Pinned files:
# src/types.ts (2,500 tokens)
# Active context files:
# src/api/routes.ts (1,800 tokens)
# ... and 15 more files
Automatic Detection
OCCode automatically detects project type (Node.js, Python, Rust, Go, Java), frameworks (React, Django, Express), and loads relevant context.
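Detection of this kind typically keys off marker files in the project root. The sketch below shows the general technique under that assumption; the file names checked here are common conventions, not a description of OCCode's actual logic.

```shell
# Sketch of marker-file project detection (assumed technique, not
# OCCode's internal implementation).
detect_project() {
  if   [ -f package.json ];   then echo "Node.js"
  elif [ -f Cargo.toml ];     then echo "Rust"
  elif [ -f go.mod ];         then echo "Go"
  elif [ -f pyproject.toml ] || [ -f requirements.txt ]; then echo "Python"
  elif [ -f pom.xml ] || [ -f build.gradle ]; then echo "Java"
  else echo "unknown"
  fi
}
detect_project   # run from your project root
```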
10. Session Management
Automatic Persistence
Sessions are automatically saved to ~/.occode/sessions/. Each session gets a unique ID.
Resume a Previous Session
occode --session sess_abc123
Session Commands
/status # Messages, tokens, cost, turns
/clear # Reset conversation (current session)
/compact # Summarize to save tokens
/export session.json # Save session to file
History Management
occode history # View recent sessions
occode history -n 10 # Last 10 sessions
occode history --export sessions.json
occode history --clear # Delete all history
11. Checkpoints & Undo
Checkpoints save file states before changes so you can always roll back.
Creating Checkpoints
/checkpoint Before refactoring auth module
# or
occode checkpoint create "Before refactoring"
Restoring
/checkpoint list # See all saved states
/checkpoint restore ckpt_123 # Restore to that state
Quick Undo
/undo # Reverts last file change
# or
occode undo
12. Cost Tracking
Viewing Costs
/cost
# Output:
# Cost Report
# ────────────────────────────
# Provider: anthropic
# Model: claude-sonnet-4-20250514
#
# Token Usage:
# Input: 12,450 tokens
# Output: 3,200 tokens
# Total: 15,650 tokens
#
# Total Cost: $0.0856
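The total is just input and output tokens multiplied by the provider's per-million-token rates. Using the token counts from the report above and hypothetical rates of $3 per million input tokens and $15 per million output tokens (check your provider's current pricing), the arithmetic looks like:

```shell
# Hypothetical per-million-token rates -- substitute your provider's pricing.
awk 'BEGIN {
  input_tokens  = 12450
  output_tokens = 3200
  in_rate  = 3.00      # USD per million input tokens (assumed)
  out_rate = 15.00     # USD per million output tokens (assumed)
  cost = input_tokens * in_rate / 1e6 + output_tokens * out_rate / 1e6
  printf "Total Cost: $%.4f\n", cost
}'
```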
Model Tiers
| Tier | Examples | Cost |
|---|---|---|
| Flagship | Opus, GPT-4, Gemini Pro | $$$ |
| Balanced | Sonnet, o3-mini | $$ |
| Fast | Haiku, GPT-4o-mini, Gemini Flash | $ |
| Economy | Ollama, local models | Free |
Use /profile fast for simple tasks and /profile power only when needed. Switch to Ollama for unlimited free usage on private/offline tasks.
13. Git Integration
AI-Powered Commits
/commit
# OCCode analyzes your diff and generates:
# "feat: Add user authentication with JWT tokens"
# Approve? [y/N]
Enhanced Diff
/diff src/auth.ts # Syntax-highlighted diff
/diff --side-by-side src/auth.ts # Side-by-side view
/diff --no-syntax src/auth.ts # Plain text diff
Git Status
/git
# Shows branch, staged files, unstaged changes
CLI Commit Command
occode commit # Interactive commit
occode commit -a # Stage all + commit
occode commit -a -p # Stage all + commit + push
14. Feature Toggles
OCCode includes powerful features that use additional AI tokens. Toggle them on/off to control costs.
High Token Impact
| Feature | Command | Default | Description |
|---|---|---|---|
| Test-Driven Generation | /tdg | Off | Generate tests + iterate until passing (3-10x tokens) |
| Visual UI Repair | /features enable visualUIRepair | Off | Screenshot analysis + auto-fix UI issues |
| Browser E2E Testing | /features enable browserTesting | Off | Puppeteer/Playwright auto-test generation |
| Auto Code Review | /features enable autoCodeReview | Off | Automatic PR review + suggestions |
Medium Token Impact
| Feature | Command | Default |
|---|---|---|
| Proactive Monitoring | /features enable proactiveMonitoring | Off |
| Coverage-Guided Tests | /features disable coverageGuidedTests | On |
| Auto Documentation | /features enable autoDocumentation | Off |
| Refactoring Suggestions | /features enable refactoringSuggestions | Off |
| Performance Optimization | /features enable performanceOptimization | Off |
Low Token Impact
| Feature | Command | Default |
|---|---|---|
| Auto-Fix (LSP Loop) | /autofix | On |
Token Cost Multiplier
/features cost
# Shows: Token Multiplier: 1.2x (baseline + coverage-guided tests)
# With TDG enabled: 5.0x
# With multiple features: up to 10x+
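To see what a multiplier means in absolute terms, multiply a typical session's token count by it. Using the 15,650-token session from the cost report and the 5.0x TDG figure above:

```shell
# Back-of-envelope estimate: baseline tokens times the feature multiplier.
# Integer arithmetic in tenths, since POSIX sh has no floating point.
baseline_tokens=15650
multiplier_tenths=50     # 5.0x expressed in tenths
estimated=$(( baseline_tokens * multiplier_tenths / 10 ))
echo "Estimated tokens with TDG enabled: $estimated"
```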
15. Configuration Reference
Global Config: ~/.occode/config.json
{
"provider": "anthropic",
"model": "claude-sonnet-4-20250514",
"maxTokens": 4096,
"temperature": 0.7,
"autoApprove": false,
"features": {
"tdg": false,
"autoFix": true,
"coverageGuidedTests": true
}
}
Project Config: .occode.json
{
"provider": "anthropic",
"model": "claude-sonnet-4-20250514",
"mode": "interactive",
"contextPatterns": ["src/**/*"],
"ignorePatterns": ["node_modules", "dist"]
}
Environment Variables
| Variable | Description | Example |
|---|---|---|
| OCCODE_PROVIDER | AI provider | anthropic |
| OCCODE_MODEL | Model name | claude-sonnet-4-20250514 |
| OCCODE_API_ENDPOINT | Custom endpoint | http://localhost:11434/v1 |
| OCCODE_MAX_TOKENS | Max output tokens | 4096 |
| OCCODE_TEMPERATURE | Temperature (0-1) | 0.7 |
| OCCODE_API_KEY | Fallback API key | — |
| ANTHROPIC_API_KEY | Anthropic key | sk-ant-... |
| OPENAI_API_KEY | OpenAI key | sk-... |
| GOOGLE_API_KEY | Google/Gemini key | — |
| DEEPSEEK_API_KEY | DeepSeek key | — |
| MISTRAL_API_KEY | Mistral key | — |
| GROQ_API_KEY | Groq key | — |
| TOGETHER_API_KEY | Together AI key | — |
| OPENROUTER_API_KEY | OpenRouter key | sk-or-v1-... |
| OPENCAN_API_KEY | OpenCan key | — |
Configuration Priority (highest to lowest)
- Command-line flags (--model, --provider)
- Environment variables (OCCODE_MODEL)
- Project config (.occode.json)
- Global config (~/.occode/config.json)
- Built-in defaults
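The order above amounts to "first non-empty source wins." The sketch below illustrates that resolution pattern in plain shell; it is an illustration of the priority rule, not OCCode's actual loader, and the fallback model name is just the document's example default.

```shell
# Sketch of "first non-empty source wins" resolution (illustrative only).
resolve_model() {
  flag="$1"; env_val="$2"; project="$3"; global="$4"
  for v in "$flag" "$env_val" "$project" "$global"; do
    if [ -n "$v" ]; then echo "$v"; return; fi
  done
  echo "claude-sonnet-4-20250514"    # built-in default (example value)
}
resolve_model "gpt-4o" "deepseek-coder" "" ""   # prints: gpt-4o (flag wins)
resolve_model "" "deepseek-coder" "" ""         # prints: deepseek-coder (env var wins)
```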
16. Troubleshooting
API Key Not Found
occode config --set-key --provider anthropic
# Or set via environment:
export ANTHROPIC_API_KEY="sk-ant-..."
Model Not Available
# Check configuration:
occode config --list
# Verify model name for your provider:
/model # Shows current model
/converge catalog # Browse all available models
Command Timeout
occode run "task" --timeout 600 # 10 minutes
Local Model Connection (Ollama)
# Ensure Ollama is running:
curl http://localhost:11434/api/tags
# Start if needed:
ollama serve
# Check available models:
ollama list
Session Won't Resume
# Check session ID:
occode history
# Sessions stored at:
ls ~/.occode/sessions/
High Token Usage
/features cost # Check token multiplier
/features # See which features are enabled
/compact # Compress conversation history
17. Tips & Best Practices
- Add relevant files with /context add or the -c flag so the AI understands your code.
- Check /cost regularly, and use /profile fast for simple tasks to save money.
- Create a safety net before risky changes: /checkpoint "before big refactor".
- Use --dry-run to preview what the AI would do before executing.
- Run /compact to summarize conversation history and reduce token usage.
- Match the model to the task: fast for quick queries, power for complex work.