OCCode CLI — User Guide

1. Introduction

What is OCCode CLI?

OCCode CLI is a terminal-based AI coding assistant developed by OpenCan.ai. It brings the power of advanced language models directly to your command line with multi-provider support, real-time cost tracking, checkpoint-based undo, and over 60 interactive commands.

Multi-Provider AI: 11 providers, 40+ models including Claude, GPT, Gemini, DeepSeek, Ollama & more
Checkpoint System: Save and restore file states for safe undo/rollback of any change
Cost Tracking: Real-time token usage and cost display across all providers
Session Persistence: Resume conversations without losing context
Convergence Engine: Multi-model ensemble with Merge, Vote, Debate, and Review strategies
Git Integration: AI-generated commit messages, enhanced diffs, auto-stage

System Requirements

Requirement   Details
Node.js       18.0 or higher (for npm install method)
OS            Windows 10+, macOS 12+, Linux (x64/arm64)
Disk Space    ~50MB (binary) or ~5MB (npm)
Network       Required for cloud AI providers; optional for Ollama/local

2. Installation

Download (Account Required)

Download OCCode from opencan.ai/downloads; an opencan.ai account is required. You will also need a valid license key to activate the application.

Platform              File
Windows x64           occode-0.1.0-windows-x64.zip
macOS Intel           occode-0.1.0-macos-x64.tar.gz
macOS Apple Silicon   occode-0.1.0-macos-arm64.tar.gz
Linux x64             occode-0.1.0-linux-x64.tar.gz

# Linux/macOS install
tar xzf occode-0.1.0-linux-x64.tar.gz
sudo mv occode /usr/local/bin/

# Windows - extract the .zip, then add the folder to your PATH

# Activate with your license key
occode activate --key YOUR-LICENSE-KEY

Verify Installation

occode --version
occode --help

3. Quick Start

Step 1: Configure Your API Key

# Interactive configuration wizard
occode config

# Or set directly
occode config --set-key --provider anthropic

Step 2: Start an Interactive Session

occode

This launches the REPL (Read-Eval-Print Loop) where you can chat with the AI and give coding instructions.

Step 3: Run a One-Shot Task

# Execute a task and exit
occode run "Add unit tests for UserService"

# Autonomous mode (no approval prompts)
occode run "Fix all TypeScript errors" --yes

# With specific files in context
occode run "Refactor this component" -c src/App.tsx

Step 4: Understanding the Interface

# The REPL shows a timestamp prompt:
2026-02-10 14:30:12 EST ❯ your message here

# Use slash commands for quick actions:
/help        # Show all commands
/cost        # Show token costs
/model       # Show current model
/status      # Show session info

4. Core Commands

occode / occode chat — Interactive Mode

occode [chat]
  -m, --model <model>         # Model to use (e.g. claude-sonnet-4-20250514)
  -p, --provider <provider>   # AI provider (anthropic, openai, etc.)
  -c, --context <files...>    # Add files to context
  --session <id>              # Resume previous session

occode run <task> — Single Task Execution

occode run <task>
  -y, --yes                   # Auto-approve all actions (autonomous)
  -s, --supervised            # Approve all file changes
  -m, --model <model>         # Model to use
  -c, --context <files...>    # Add files to context
  --dry-run                   # Preview without executing
  --max-turns <n>             # Maximum agent turns (default: 50)
  --timeout <seconds>         # Timeout limit

occode watch — Watch Mode

occode watch
  -p, --pattern <glob>        # File pattern to watch
  -i, --ignore <patterns>     # Patterns to ignore
  -t, --task <task>           # Task to run on changes

occode config — Configuration

occode config
  --set <key=value>           # Set config value
  --get <key>                 # Get config value
  --list                      # List all settings
  --reset                     # Reset to defaults
  --set-key                   # Set API key securely (keychain)

occode commit — AI Git Commit

occode commit
  -a, --all                   # Stage all changes
  -p, --push                  # Push after commit

occode explain — Explain Code

occode explain [file]
  -d, --detailed              # Detailed explanation
  --architecture              # Explain project structure

occode review — Code Review

occode review [path]
  --security                  # Focus on security
  --performance               # Focus on performance
  --diff                      # Review staged changes

occode search — Semantic Search

occode search <query>
  -n, --limit <n>             # Max results
  -t, --type <type>           # Symbol type filter

occode generate — Generate Code/Docs

occode generate <type>
  # Types: readme, tests, docs, types

occode checkpoint — Manage Checkpoints

occode checkpoint list             # List all checkpoints
occode checkpoint create <msg>     # Create named checkpoint
occode checkpoint restore <id>     # Restore to checkpoint
occode checkpoint delete <id>      # Delete checkpoint

occode undo — Undo Last Change

occode undo

occode history — Session History

occode history
  -n, --limit <n>             # Number of sessions
  --clear                     # Clear all history
  --export <file>             # Export to file
  --import <file>             # Import from file

Other Commands

occode init                  # Initialize in current directory
occode status                # Show current status
occode update                # Check for updates

5. Interactive Mode (REPL)

While in chat mode, use these slash commands:

Provider & Model Commands

/provider [name]             # Show current or switch provider
/model [name]                # Show current or change model
/api_key [key]               # Show status or set API key
/api_url [url]               # Show or change custom API endpoint
/api_url reset               # Reset to default endpoint

Model Profile Commands (12)

/profile                     # Show active profile and list all
/profile <name>              # Activate a profile
/profile create <name>       # Create new profile interactively
/profile template <name>     # Create from built-in template
/profile templates           # List available templates
/profile edit <name>         # Edit existing profile
/profile delete <name>       # Delete a profile
/profile off                 # Deactivate current profile
/profile default <name>      # Set default profile (auto-activates on startup)
/profile prefix <name> <prefix>  # Set message prefix trigger
/profile export              # Export profiles to JSON
/profile import <file>       # Import profiles from JSON

Convergence Commands (18)

/converge                    # Show convergence status
/converge on | enable        # Enable convergence mode
/converge off | disable      # Disable convergence mode
/converge strategy <name>    # Set strategy (merge/vote/debate/review)
/converge add <alias>        # Add model by short alias
/converge remove <name>      # Remove a participant model
/converge aggregator <name>  # Set aggregator model for synthesis
/converge preset <name>      # Load preset configuration
/converge presets            # List available presets
/converge rounds <n>         # Set debate rounds (1-5)
/converge show on|off        # Show/hide individual model outputs
/converge models             # List configured participant models
/converge catalog            # Browse all models in catalog
/converge available          # Show models with API keys detected
/converge search <keyword>   # Search models
/converge last               # Show stats from last run
/converge reset              # Reset to defaults
/converge export|import      # Export/import configuration

Context Management (6)

/context                     # Show context overview with token usage
/context add <file>          # Add file to context
/context remove <file>       # Remove file from context
/pin <file>                  # Pin file to persistent context
/unpin <file>                # Remove pin from file
/exclude <pattern>           # Exclude files matching glob
/include <pattern>           # Remove exclusion pattern

Session Management (7)

/clear                       # Clear conversation history
/status                      # Show session statistics
/cost                        # Show detailed cost breakdown
/compact                     # Compact history to save tokens
/export <file>               # Export session to JSON
/debug                       # Toggle debug mode
/mode interactive|auto       # Set execution mode

Git Integration (3)

/git                         # Show git status
/commit                      # Generate AI-powered commit message
/diff <files>                # Enhanced diff with syntax highlighting
/diff --side-by-side <files> # Side-by-side comparison

Checkpoint & Undo (4)

/checkpoint [message]        # Create named checkpoint
/checkpoint list             # List all checkpoints
/checkpoint restore <id>     # Restore to specific checkpoint
/undo                        # Undo last file change

Daemon & Indexing (5)

/index                       # Show indexing status
/index rebuild               # Force rebuild codebase index
/daemon                      # Show daemon status
/daemon start|stop|restart   # Manage background daemon

Subscription (3)

OCCode includes a 7-day free trial with full access. Create an opencan.ai account and choose a plan to start your trial — you won't be charged until day 8. After the trial, a license key is required. Cancel before day 8 to avoid charges. No refunds after billing.

/subscription                # Show subscription status (trial days remaining or plan info)
/subscription activate <key> # Activate license key
/subscription plans          # View available plans

Feature Toggles

/features                    # Show all features and status
/features enable <feature>   # Enable a feature
/features disable <feature>  # Disable a feature
/features cost               # Show token cost impact
/features reset              # Reset to defaults
/tdg                         # Toggle Test-Driven Generation
/autofix                     # Toggle Auto-Fix in LSP Loop

Help & Transcripts

/help | /h | /?              # Show all commands
/transcripts                 # Toggle transcript saving
/transcripts status          # Show transcript configuration
/transcripts export [path]   # Export to unencrypted JSON
/transcripts clear           # Clear all entries

Keyboard Shortcuts

Key       Action
Ctrl+C    Stop current operation
Ctrl+D    Exit session
Up/Down   Navigate command history
Tab       Auto-complete file paths and commands

6. AI Providers

OCCode supports 11 AI providers with 40+ models.

Provider             Speed        Quality    Cost     Key Required
Anthropic (Claude)   Fast         Excellent  $$$      Yes
OpenAI (GPT)         Fast         Excellent  $$$      Yes
DeepSeek             Fast         Excellent  $        Yes
Google (Gemini)      Medium       Good       $$       Yes
Mistral AI           Fast         Good       $$       Yes
Groq                 Ultra Fast   Good       Free*    Yes
Together AI          Fast         Good       $$       Yes
OpenRouter           Varies       Varies     Varies   Yes
Ollama (Local)       Medium       Good       Free     No
OpenCan              Varies       Varies     Varies   Yes
Custom/Local         Varies       Varies     Varies   Optional

Anthropic (Claude) Setup

occode config --set provider=anthropic
occode config --set model=claude-sonnet-4-20250514
occode config --set-key --provider anthropic
# Get key from: https://console.anthropic.com

OpenAI (GPT) Setup

occode config --set provider=openai
occode config --set model=gpt-4o
occode config --set-key --provider openai
# Get key from: https://platform.openai.com

DeepSeek Setup

occode config --set provider=deepseek
occode config --set model=deepseek-coder
occode config --set-key --provider deepseek
# Get key from: https://platform.deepseek.com

Ollama (Local - Free, Offline)

# 1. Install Ollama: https://ollama.ai/download
# 2. Pull a model:
ollama pull llama3.3

# 3. Configure OCCode:
occode config --set provider=local
occode config --set apiEndpoint=http://localhost:11434/v1
occode config --set model=llama3.3

OpenRouter (100+ Models via One Key)

occode config --set provider=openrouter
occode config --set model=anthropic/claude-sonnet-4
occode config --set-key --provider openrouter
# Get key from: https://openrouter.ai/keys

Recommended Setups

User Type          Primary                         Fallback
Professional Dev   Anthropic + Claude Sonnet       OpenRouter + DeepSeek R1
Student/Learner    Groq + Llama 3.3 70B (free)     Ollama + Llama 3.3 (free)
Team/Enterprise    OpenCan (centralized billing)   Anthropic or OpenAI
Budget-Conscious   DeepSeek + deepseek-coder ($)   Ollama (free)

7. Model Profiles

Profiles are named model configurations for quick switching.

Built-in Templates

Template   Provider    Model              Use Case
fast       Anthropic   Claude Haiku       Quick responses, low cost
power      Anthropic   Claude Opus        Complex reasoning
creative   Anthropic   High-temp Sonnet   Creative tasks
gpt        OpenAI      GPT-4 Turbo        Alternative perspective
local      Ollama      Local models       Offline, private

Usage

# Create from template
/profile template fast

# Activate
/profile fast

# Create custom profile
/profile create my-custom

# Set prefix trigger (e.g., "quick: your message" auto-activates fast profile)
/profile prefix fast quick

# Set a default profile
/profile default fast

# Export/import profiles for team sharing
/profile export
/profile import profiles.json

8. Convergence Engine

The Convergence Engine runs your query through multiple AI models simultaneously and synthesizes the best response.

4 Strategies

Strategy      How It Works                            Best For
Merge (MoA)   Parallel generation + synthesis         Maximum quality
Vote          Democratic selection via model voting   Consensus decisions
Debate        Multi-round critique and refinement     Thorough analysis
Review        Generate + review + revise workflow     Code quality
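
To make the Merge (MoA) flow concrete, here is a toy shell sketch with stub functions standing in for real model API calls (all names below are illustrative, not OCCode internals):

```shell
# Toy Merge (MoA) flow: collect drafts from several "models", then synthesize.
model_sonnet() { echo "draft from sonnet"; }   # stub for an API call
model_gpt4()   { echo "draft from gpt4"; }     # stub for an API call

aggregate() {                  # stub synthesizer; the real one is itself a model
  printf 'merged: '
  paste -s -d ';' -            # join all drafts into one line
}

{ model_sonnet; model_gpt4; } | aggregate
# prints: merged: draft from sonnet;draft from gpt4
```

In the real engine, drafts are generated in parallel and the aggregator model rewrites them into a single answer rather than merely concatenating them.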

Built-in Presets

/converge preset duo-merge      # 2 models with merge
/converge preset trio-merge     # 3 models for max quality
/converge preset code-review    # Cost-effective review workflow
/converge preset debate         # Thorough analysis
/converge preset vote           # Democratic consensus
/converge preset local-merge    # Zero API cost (local models)

Custom Configuration

/converge on                     # Enable convergence
/converge strategy merge         # Set strategy
/converge add sonnet             # Add Claude Sonnet
/converge add gpt4               # Add GPT-4
/converge aggregator opus        # Use Opus as synthesizer
/converge rounds 3               # Set 3 debate rounds

9. Context Management

Adding Files to Context

/context add src/api/routes.ts
/context add src/**/*.ts         # Glob patterns supported

Pinning Files (Persistent Across Sessions)

/pin src/types.ts                # Always in context
/unpin src/types.ts              # Remove pin

Excluding Files

/exclude "*.test.ts"
/exclude "**/node_modules/**"
/include "*.test.ts"             # Remove exclusion
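
Exclusion patterns behave like shell globs. A rough sketch of the matching idea, using shell `case` globbing (an approximation; OCCode's real matcher also supports extended globs such as `**`):

```shell
# Rough sketch: glob exclusion via shell `case` pattern matching.
matches() {
  case "$1" in
    $2) echo excluded ;;   # file matches the exclusion pattern
    *)  echo kept ;;       # file stays in context
  esac
}

matches "auth.test.ts" "*.test.ts"   # prints: excluded
matches "auth.ts"      "*.test.ts"   # prints: kept
```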

Context Overview

/context
# Output:
# Context Overview:
#   Working directory: /home/user/my-project
#   Token usage: 12,450 / 128,000 (9.7%)
#   [###.............................]
#
#   Pinned files:
#     src/types.ts (2,500 tokens)
#   Active context files:
#     src/api/routes.ts (1,800 tokens)
#     ... and 15 more files

Automatic Detection

OCCode automatically detects project type (Node.js, Python, Rust, Go, Java), frameworks (React, Django, Express), and loads relevant context.
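
Such detection is commonly done via marker files. The function below is a hypothetical sketch of that idea, not OCCode's actual heuristic:

```shell
# Hypothetical sketch: detect project type from well-known marker files.
detect_project() {
  [ -f "$1/package.json" ]   && { echo node;   return; }
  [ -f "$1/pyproject.toml" ] && { echo python; return; }
  [ -f "$1/Cargo.toml" ]     && { echo rust;   return; }
  [ -f "$1/go.mod" ]         && { echo go;     return; }
  echo unknown
}

mkdir -p demo-rust && touch demo-rust/Cargo.toml
detect_project demo-rust   # prints: rust
```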

10. Session Management

Automatic Persistence

Sessions are automatically saved to ~/.occode/sessions/. Each session gets a unique ID.

Resume a Previous Session

occode --session sess_abc123

Session Commands

/status                # Messages, tokens, cost, turns
/clear                 # Reset conversation (current session)
/compact               # Summarize to save tokens
/export session.json   # Save session to file

History Management

occode history            # View recent sessions
occode history -n 10      # Last 10 sessions
occode history --export sessions.json
occode history --clear    # Delete all history

11. Checkpoints & Undo

Checkpoints save file states before changes so you can always roll back.
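
Conceptually, a checkpoint is a snapshot of file contents that can be copied back into place. A minimal sketch of the idea (the directory names and layout here are illustrative, not OCCode's actual on-disk format):

```shell
# Illustrative checkpoint mechanics: snapshot before an edit, copy back to undo.
mkdir -p demo/src .checkpoints/ckpt_001
printf 'original code\n' > demo/src/auth.ts

cp demo/src/auth.ts .checkpoints/ckpt_001/    # create checkpoint
printf 'ai-edited code\n' > demo/src/auth.ts  # a (possibly unwanted) change lands
cp .checkpoints/ckpt_001/auth.ts demo/src/    # restore: back to "original code"
```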

Creating Checkpoints

/checkpoint Before refactoring auth module
# or
occode checkpoint create "Before refactoring"

Tip: Checkpoints are automatically created before every destructive operation (write, edit, delete).

Restoring

/checkpoint list              # See all saved states
/checkpoint restore ckpt_123  # Restore to that state

Quick Undo

/undo                         # Reverts last file change
# or
occode undo

12. Cost Tracking

Viewing Costs

/cost
# Output:
# Cost Report
# ────────────────────────────
# Provider: anthropic
# Model: claude-sonnet-4-20250514
#
# Token Usage:
#   Input:  12,450 tokens
#   Output: 3,200 tokens
#   Total:  15,650 tokens
#
# Total Cost: $0.0856
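
The cost is derived from token counts and per-model rates. A back-of-envelope sketch of the arithmetic (the $3/$15 per-million-token rates below are assumptions for illustration; real rates vary by model and provider):

```shell
# Reproduce the math behind a cost report (rates are illustrative assumptions).
input=12450    # input tokens
output=3200    # output tokens
total=$((input + output))   # 15650 total tokens
cost=$(awk -v i="$input" -v o="$output" \
  'BEGIN { printf "%.4f", (i * 3 + o * 15) / 1000000 }')
echo "Total: $total tokens"
echo "Cost:  \$$cost"
```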

Model Tiers

Tier       Examples                           Cost
Flagship   Opus, GPT-4, Gemini Pro            $$$
Balanced   Sonnet, o3-mini                    $$
Fast       Haiku, GPT-4o-mini, Gemini Flash   $
Economy    Ollama, local models               Free

Budget Tip: Use /profile fast for simple tasks and /profile power only when needed. Switch to Ollama for unlimited free usage on private/offline tasks.

13. Git Integration

AI-Powered Commits

/commit
# OCCode analyzes your diff and generates:
# "feat: Add user authentication with JWT tokens"
# Approve? [y/N]

Enhanced Diff

/diff src/auth.ts                 # Syntax-highlighted diff
/diff --side-by-side src/auth.ts  # Side-by-side view
/diff --no-syntax src/auth.ts     # Plain text diff

Git Status

/git
# Shows branch, staged files, unstaged changes

CLI Commit Command

occode commit                 # Interactive commit
occode commit -a              # Stage all + commit
occode commit -a -p           # Stage all + commit + push

14. Feature Toggles

OCCode includes powerful features that use additional AI tokens. Toggle them on/off to control costs.

High Token Impact

Feature                  Command                           Default   Description
Test-Driven Generation   /tdg                              Off       Generate tests + iterate until passing (3-10x tokens)
Visual UI Repair         /features enable visualUIRepair   Off       Screenshot analysis + auto-fix UI issues
Browser E2E Testing      /features enable browserTesting   Off       Puppeteer/Playwright auto-test generation
Auto Code Review         /features enable autoCodeReview   Off       Automatic PR review + suggestions

Medium Token Impact

Feature                    Command                                    Default
Proactive Monitoring       /features enable proactiveMonitoring       Off
Coverage-Guided Tests      /features disable coverageGuidedTests      On
Auto Documentation         /features enable autoDocumentation         Off
Refactoring Suggestions    /features enable refactoringSuggestions    Off
Performance Optimization   /features enable performanceOptimization   Off

Low Token Impact

Feature               Command    Default
Auto-Fix (LSP Loop)   /autofix   On

Token Cost Multiplier

/features cost
# Shows: Token Multiplier: 1.2x (baseline + coverage-guided tests)
# With TDG enabled: 5.0x
# With multiple features: up to 10x+

Warning: Enabling TDG + Visual UI + Code Review can increase token costs 10x+. Enable features only when needed.

15. Configuration Reference

Global Config: ~/.occode/config.json

{
  "provider": "anthropic",
  "model": "claude-sonnet-4-20250514",
  "maxTokens": 4096,
  "temperature": 0.7,
  "autoApprove": false,
  "features": {
    "tdg": false,
    "autoFix": true,
    "coverageGuidedTests": true
  }
}

Project Config: .occode.json

{
  "provider": "anthropic",
  "model": "claude-sonnet-4-20250514",
  "mode": "interactive",
  "contextPatterns": ["src/**/*"],
  "ignorePatterns": ["node_modules", "dist"]
}

Environment Variables

Variable              Description         Example
OCCODE_PROVIDER       AI provider         anthropic
OCCODE_MODEL          Model name          claude-sonnet-4-20250514
OCCODE_API_ENDPOINT   Custom endpoint     http://localhost:11434/v1
OCCODE_MAX_TOKENS     Max output tokens   4096
OCCODE_TEMPERATURE    Temperature (0-1)   0.7
OCCODE_API_KEY        Fallback API key
ANTHROPIC_API_KEY     Anthropic key       sk-ant-...
OPENAI_API_KEY        OpenAI key          sk-...
GOOGLE_API_KEY        Google/Gemini key
DEEPSEEK_API_KEY      DeepSeek key
MISTRAL_API_KEY       Mistral key
GROQ_API_KEY          Groq key
TOGETHER_API_KEY      Together AI key
OPENROUTER_API_KEY    OpenRouter key      sk-or-v1-...
OPENCAN_API_KEY       OpenCan key
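
For example, a shell profile (~/.bashrc or ~/.zshrc) might pin provider and model globally (all values below are placeholders; substitute your own):

```shell
# Placeholder values: substitute your own provider, model, and key.
export OCCODE_PROVIDER=anthropic
export OCCODE_MODEL=claude-sonnet-4-20250514
export OCCODE_MAX_TOKENS=4096
export ANTHROPIC_API_KEY="sk-ant-your-key-here"
```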

Configuration Priority (highest to lowest)

  1. Command-line flags (--model, --provider)
  2. Environment variables (OCCODE_MODEL)
  3. Project config (.occode.json)
  4. Global config (~/.occode/config.json)
  5. Built-in defaults
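
In other words, the first source that defines a value wins. A hypothetical shell sketch of that resolution logic for the model setting (the function name and arguments are invented for illustration):

```shell
# Hypothetical resolver: flag > env var > project config > global config > default.
resolve_model() {
  flag=$1; project=$2; global=$3
  for v in "$flag" "$OCCODE_MODEL" "$project" "$global"; do
    [ -n "$v" ] && { echo "$v"; return; }   # first non-empty source wins
  done
  echo "claude-sonnet-4-20250514"           # built-in default
}

unset OCCODE_MODEL
resolve_model "" "project-model" "global-model"   # prints: project-model
resolve_model "gpt-4o" "project-model" ""         # prints: gpt-4o
```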

16. Troubleshooting

API Key Not Found

occode config --set-key --provider anthropic
# Or set via environment:
export ANTHROPIC_API_KEY="sk-ant-..."

Model Not Available

# Check configuration:
occode config --list

# Verify model name for your provider:
/model                    # Shows current model
/converge catalog         # Browse all available models

Command Timeout

occode run "task" --timeout 600   # 10 minutes

Local Model Connection (Ollama)

# Ensure Ollama is running:
curl http://localhost:11434/api/tags

# Start if needed:
ollama serve

# Check available models:
ollama list

Session Won't Resume

# Check session ID:
occode history

# Sessions stored at:
ls ~/.occode/sessions/

High Token Usage

/features cost         # Check token multiplier
/features              # See which features are enabled
/compact               # Compress conversation history

17. Tips & Best Practices

1. Be Specific: Clear, detailed instructions produce much better results than vague requests.
2. Use Context: Add relevant files with /context add or -c flag so the AI understands your code.
3. Monitor Costs: Check /cost regularly. Use /profile fast for simple tasks to save money.
4. Use Checkpoints: Before risky operations, create a checkpoint with /checkpoint "before big refactor".
5. Preview First: Use --dry-run to see what the AI would do before executing.
6. Compact Long Sessions: Use /compact to summarize conversation history and reduce token usage.
7. Use Profiles: Set up model profiles for different tasks: fast for quick queries, power for complex work.
8. Local for Privacy: Use Ollama for sensitive code that should never leave your machine.

OCCode CLI User Guide — Version 1.0 — February 2026

© 2025-2026 OpenCan.ai — All Rights Reserved

Source references: occode/README.md, occode/COMPLETE_FEATURE_LIST.md, occode/CONFIGURATION.md, occode/AI_PROVIDERS.md, occode/ENVIRONMENT_VARIABLES.md, occode/DAEMON_ARCHITECTURE.md, occode/FEATURE_TOGGLES.md, occode/OCCODE_CLI_DESIGN_DOCUMENT.md, occode/src/cli/run.ts
