LLM Providers
JARVIS supports multiple LLM providers and can fall back between them automatically.
The provider system is configured under the `llm` block in `~/.jarvis/config.yaml`.
Supported Providers
The merged product currently exposes these provider blocks:
- Anthropic
- OpenAI
- Groq
- Gemini
- Ollama
- OpenRouter
Recommended Starting Point
For most users:

- Primary: `anthropic`
- Fallback: `["openai", "ollama"]`
That gives you a strong default cloud model plus a cloud fallback and an optional local fallback.
Example Configuration
```yaml
llm:
  primary: "anthropic"
  fallback: ["openai", "ollama"]

  anthropic:
    api_key: "sk-ant-..."
    model: "claude-sonnet-4-6"

  openai:
    api_key: "sk-..."
    model: "gpt-5.4"

  ollama:
    base_url: "http://localhost:11434"
    model: "llama3"
```

Provider Notes
Anthropic
Best default for many users.
Strengths:
- Strong instruction following
- Strong tool use
- Good balance for complex autonomous tasks
OpenAI
A strong alternative or fallback.
Strengths:
- Broad model family
- Good tool use
- Good fallback for cloud-hosted setups
Useful when you care about latency and compatible API access.
Gemini
A strong additional provider if you want another cloud fallback option.
Ollama
Best when you want local inference or reduced cloud dependence.
Strengths:
- No per-token cloud billing
- Local model hosting
- Useful as a fallback when cloud providers are unavailable
Important operational note:
- `ollama.base_url` is resolved from the daemon’s network point of view, not your browser’s.
- If the daemon runs on a VPS and Ollama runs on your laptop, `http://localhost:11434` will not work unless the daemon and Ollama are on the same host.
This is one of the most common setup mistakes. See Troubleshooting.
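One quick way to confirm reachability from the daemon’s own host is to probe Ollama’s `/api/tags` endpoint (the route Ollama uses to list installed models). This is a minimal stdlib-only sketch; `check_ollama` is a hypothetical helper name, not part of JARVIS:

```python
import urllib.request
import urllib.error


def check_ollama(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    Any HTTP response from /api/tags means the host running this
    code (ideally, the daemon's host) can reach the Ollama server.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False
```

Run it on the same machine as the daemon, with the same `base_url` as your config; if it returns `False` there, the config value will fail too, regardless of what your laptop can see.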
OpenRouter
Useful if you want access to many models through one provider API key.
Fallback Strategy
JARVIS tries:

- The `primary` provider
- Each provider in `fallback`, in order
That means provider order matters. Keep the list short and intentional.
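The ordering above can be sketched as a simple loop. This is an illustration only, assuming nothing about JARVIS internals; `providers`, `complete`, and `AllProvidersFailedError` are hypothetical names:

```python
class AllProvidersFailedError(Exception):
    """Raised when the primary and every fallback provider fail."""


def complete(providers: dict, primary: str, fallback: list, prompt: str) -> str:
    """Try the primary provider first, then each fallback in order.

    `providers` maps provider names to callables that either return a
    completion string or raise on failure. The first success wins.
    """
    errors = {}
    for name in [primary, *fallback]:
        try:
            return providers[name](prompt)
        except Exception as exc:  # a failed provider just advances the loop
            errors[name] = exc
    raise AllProvidersFailedError(errors)
```

Because the loop stops at the first success, a provider late in `fallback` is only ever consulted when everything before it has failed, which is why order matters.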
Good Provider Setups
Cloud-first

```yaml
llm:
  primary: "anthropic"
  fallback: ["openai", "gemini"]
```

Hybrid cloud + local

```yaml
llm:
  primary: "anthropic"
  fallback: ["openai", "ollama"]
```

Local-first

```yaml
llm:
  primary: "ollama"
  fallback: []
```

Video Tutorial Placeholder
Video tutorial placeholder: choosing providers, API keys, and fallback order.
Add your future video link here.