LLM Providers

JARVIS supports multiple LLM providers and can fall back between them automatically.

The provider system is configured under llm in ~/.jarvis/config.yaml.

JARVIS currently exposes these provider blocks:

  • Anthropic
  • OpenAI
  • Groq
  • Gemini
  • Ollama
  • OpenRouter

For most users:

  • Primary: anthropic
  • Fallback: ["openai", "ollama"]

That gives you a strong default cloud model plus a cloud fallback and an optional local fallback.

llm:
  primary: "anthropic"
  fallback: ["openai", "ollama"]
  anthropic:
    api_key: "sk-ant-..."
    model: "claude-sonnet-4-6"
  openai:
    api_key: "sk-..."
    model: "gpt-5.4"
  ollama:
    base_url: "http://localhost:11434"
    model: "llama3"

Anthropic

The best default for many users.

Strengths:

  • Strong instruction following
  • Strong tool use
  • Good balance for complex autonomous tasks

OpenAI

A strong alternative or fallback.

Strengths:

  • Broad model family
  • Good tool use
  • Good fallback for cloud-hosted setups

Groq

Useful when you care about low latency and OpenAI-compatible API access.

Gemini

A strong additional provider if you want another cloud fallback option.

Ollama

Best when you want local inference or reduced cloud dependence.

Strengths:

  • No per-token cloud billing
  • Local model hosting
  • Useful as a fallback when cloud providers are unavailable

Important operational note:

  • ollama.base_url is resolved from the daemon’s network point of view, not your browser’s.
  • If the daemon runs on a VPS and Ollama runs on your laptop, http://localhost:11434 will not work unless the daemon and Ollama are on the same host.

This is one of the most common setup mistakes. See Troubleshooting.
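If Ollama runs on a different machine that the daemon can reach, point base_url at that machine's address instead of localhost. The IP below is a placeholder for illustration; substitute the address of your own Ollama host:

```yaml
llm:
  ollama:
    # Replace with the address of the host actually running Ollama,
    # as seen from the machine running the JARVIS daemon.
    base_url: "http://192.168.1.50:11434"
    model: "llama3"
```

Make sure Ollama is configured to listen on a reachable interface, not only on loopback, before pointing the daemon at it.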

OpenRouter

Useful if you want access to many models through one provider API key.

JARVIS tries:

  1. The primary provider
  2. Each provider in fallback, in order

That means provider order matters. Keep the list short and intentional.
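As a mental model, ordered fallback can be sketched in a few lines of Python. This is illustrative only; the function names and signatures here are assumptions, not JARVIS's actual internals:

```python
# Sketch of ordered provider fallback: try each provider in turn and
# return the first successful response. Illustrative, not JARVIS code.

def complete(prompt, providers, order):
    """Try each provider in `order`; return the first successful response."""
    errors = {}
    for name in order:
        try:
            return providers[name](prompt)
        except Exception as exc:  # a real client would catch specific errors
            errors[name] = exc
    raise RuntimeError(f"all providers failed: {errors}")

def flaky_provider(prompt):
    # Stand-in for a provider that is currently failing.
    raise TimeoutError("rate limited")

# Stub providers: the primary fails, the first fallback succeeds.
providers = {
    "anthropic": flaky_provider,
    "openai": lambda p: f"openai: {p}",
}

print(complete("hello", providers, ["anthropic", "openai"]))
```

If every provider fails, the loop surfaces one error listing each attempt, which is another reason a short, intentional fallback list is easier to debug than a long one.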

Cloud primary with cloud fallback:

llm:
  primary: "anthropic"
  fallback: ["openai", "gemini"]

Cloud primary with a local fallback:

llm:
  primary: "anthropic"
  fallback: ["openai", "ollama"]

Local-only, no fallback:

llm:
  primary: "ollama"
  fallback: []
