## Quick Reference
| Flag | Provider | Type | Notes |
|---|---|---|---|
| --ollama / --ol | Ollama | Local | Free, no API costs, cloud option |
| --lmstudio / --lm | LM Studio | Local | MLX models (fast on Apple Silicon) |
| --aws | AWS Bedrock | Cloud | Requires AWS credentials |
| --vertex | Google Vertex AI | Cloud | Requires GCP project |
| --apikey | Anthropic API | Cloud | Direct API access |
| --azure | Microsoft Azure | Cloud | Azure Foundry |
| --vercel | Vercel AI Gateway | Cloud | Any model: Anthropic, OpenAI, xAI, Google, Meta, more |
| --pro | Claude Pro | Subscription | Default if logged in |
## How Configuration Works
All provider credentials are stored in one file: `~/.ai-runner/secrets.sh`
### Initial Setup
This file is created automatically by `./setup.sh` from the `secrets.example.sh` template. There is no need to export credentials in your `.bashrc`; just add them to `secrets.sh`, then switch providers freely with `ai --aws`, `ai --vertex`, etc.
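The file simply exports environment variables. As a sketch, a minimal `secrets.sh` for two providers might look like the following; the variable names are the providers' standard credential variables, and the values are placeholders (check `secrets.example.sh` for exactly what the script expects):

```shell
# ~/.ai-runner/secrets.sh -- sourced per session; keep out of version control

# Anthropic API (--apikey)
export ANTHROPIC_API_KEY="sk-ant-..."        # placeholder value

# AWS Bedrock (--aws)
export AWS_ACCESS_KEY_ID="AKIA..."           # placeholder value
export AWS_SECRET_ACCESS_KEY="..."           # placeholder value
export AWS_REGION="us-east-1"
```

Only the providers you actually configure need entries; the rest can stay commented out.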
### Session-Scoped Behavior
All provider configurations are session-scoped:

- Changes only affect the active terminal session
- On exit, the original settings are automatically restored
- Plain `claude` always runs in its native state
- Running `claude` in another terminal is unaffected

This means you can run `ai --lmstudio` in one terminal while using `claude` normally in another.
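The underlying mechanism is ordinary shell scoping: environment changes made in one session never leak into another. A quick illustration (whether `ai` sets this exact variable is an assumption; `ANTHROPIC_BASE_URL` is just a plausible example of an override a flag like `--lmstudio` could apply):

```shell
# Provider overrides live only in the session (subshell) that sets them.
(
  export ANTHROPIC_BASE_URL="http://localhost:1234/v1"  # illustrative override
  echo "inside:  ${ANTHROPIC_BASE_URL}"
)
echo "outside: ${ANTHROPIC_BASE_URL:-unset}"  # parent session is untouched
```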
## Provider Detection and Defaults
### Automatic Provider Selection
If you don't specify a provider flag, Andi AIRun automatically detects and uses:

- Claude Pro (if logged in with `claude login`)
- The first configured provider in `secrets.sh`
### Setting a Default Provider
You can set a default provider to avoid typing the flag every time.

## Model Tier System

Andi AIRun uses a three-tier model system to balance performance and cost.

### Tier Levels
| Tier | Aliases | Use Case | Claude Models |
|---|---|---|---|
| High | --opus, --high | Complex reasoning, large refactors | Claude Opus 4.6 |
| Mid | --sonnet, --mid | General coding tasks (default) | Claude Sonnet 4.6 |
| Low | --haiku, --low | Fast operations, small edits | Claude Haiku 4.5 |
### Usage Examples
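Provider and tier flags combine freely. A sketch, using only the flags documented above:

```shell
ai --opus              # high tier on the default provider
ai --aws --sonnet      # mid tier on AWS Bedrock
ai --ollama --low      # low tier on a local Ollama model
```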
### Background Model
Andi AIRun uses a "small/fast" model for background operations (such as file searches and quick checks). By default, this is the Low tier model (Haiku). For local providers (Ollama, LM Studio), the background model defaults to the same model as the main tier to avoid costly model swapping.

### Configuring Model Tiers
You can customize model tiers per provider in `secrets.sh`:
#### Cloud Providers
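As a sketch, tier overrides for a cloud provider could look like the following. The variable names (`AI_MODEL_HIGH` and friends) and values are assumptions for illustration only; check `secrets.example.sh` for the names the script actually reads:

```shell
# Hypothetical tier variables -- names and values are illustrative
export AI_MODEL_HIGH="your-opus-model-id"     # --opus / --high
export AI_MODEL_MID="your-sonnet-model-id"    # --sonnet / --mid (default)
export AI_MODEL_LOW="your-haiku-model-id"     # --haiku / --low
```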
#### Local Providers
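For a local provider, the same (assumed) variables would point at locally pulled models. The model tags below are just examples of models available through Ollama; pick whatever you have installed:

```shell
# Hypothetical tier variables -- model tags depend on what you've pulled locally
export AI_MODEL_HIGH="qwen2.5-coder:32b"
export AI_MODEL_MID="qwen2.5-coder:14b"
export AI_MODEL_LOW="qwen2.5-coder:14b"   # same as mid: avoids costly model swapping
```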
### Custom Models
Override the tier system with a specific model.

## Agent Teams
All providers support agent teams (`ai --team`). Coordination uses Claude Code's internal task list and mailbox, not provider-specific features.
### Learn More

Read the Claude Code Agent Teams documentation.
## Next Steps
- **Local Providers**: Set up Ollama or LM Studio for free, private AI
- **Cloud Providers**: Configure AWS, Google, Anthropic, Azure, or Vercel
- **Switching Providers**: Learn to switch providers to avoid rate limits