CrabTalk

Providers

How CrabTalk routes model requests — API standards, provider config, and supported providers.

CrabTalk supports multiple LLM providers through a unified Model trait. All providers are API-based — configure an endpoint, point the daemon at it, and go. Use crabtalk config for interactive setup, or edit config.toml directly.
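The doc doesn't show the `Model` trait itself; as a rough sketch of what a unified model abstraction looks like (the trait shape, method names, and `EchoModel` here are hypothetical, not CrabTalk's actual API):

```rust
// Hypothetical sketch of a unified model abstraction. CrabTalk's real
// `Model` trait is not shown in this doc; names here are illustrative.
trait Model {
    /// Model identifier, e.g. "deepseek-chat".
    fn name(&self) -> &str;
    /// Send a prompt and return the completion text.
    fn complete(&self, prompt: &str) -> Result<String, String>;
}

// A stand-in implementation: every provider backend would implement
// the same trait, so the daemon can treat them interchangeably.
struct EchoModel;

impl Model for EchoModel {
    fn name(&self) -> &str {
        "echo"
    }
    fn complete(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("echo: {prompt}"))
    }
}

fn main() {
    let m = EchoModel;
    println!("{}", m.complete("hi").unwrap());
}
```

The point of the trait is that routing code never needs to know which wire format sits behind a model name.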

API standards

Each provider uses a wire format selected by the kind field:

| Kind | Protocol | Used by |
|------|----------|---------|
| `openai` (default) | OpenAI chat completions API | OpenAI, DeepSeek, Grok, Qwen, Kimi, and any compatible endpoint |
| `anthropic` | Anthropic Messages API | Claude |
| `google` | Google Gemini API | Gemini |
| `ollama` | Ollama local API | Ollama |
| `azure` | Azure OpenAI API | Azure OpenAI |

If kind is omitted, CrabTalk defaults to openai. If the base_url contains "anthropic", the Anthropic kind is auto-detected.
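Those two rules can be sketched as a small resolution function (the function name is illustrative, not CrabTalk's actual code):

```rust
// Sketch of the documented kind-selection rules: an explicit `kind`
// wins; otherwise auto-detect "anthropic" from the base_url; otherwise
// fall back to the default, "openai".
fn resolve_kind(kind: Option<&str>, base_url: Option<&str>) -> String {
    if let Some(k) = kind {
        return k.to_string();
    }
    if let Some(url) = base_url {
        if url.contains("anthropic") {
            return "anthropic".to_string();
        }
    }
    "openai".to_string()
}

fn main() {
    assert_eq!(resolve_kind(None, None), "openai");
    assert_eq!(
        resolve_kind(None, Some("https://api.anthropic.com/v1")),
        "anthropic"
    );
    assert_eq!(resolve_kind(Some("google"), None), "google");
}
```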

Provider configuration

Each provider is a [provider.<name>] section in config.toml. A provider owns one or more models:

[provider.deepseek]
models = ["deepseek-chat", "deepseek-thinking"]
api_key = "sk-..."

Model names must be unique across all providers.
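Uniqueness matters because requests are routed by model name alone: the name-to-provider table cannot hold the same key twice. A minimal sketch of that constraint (illustrative, not CrabTalk's actual code):

```rust
use std::collections::HashMap;

// Build a model-name → provider-name routing table, rejecting any
// model name that appears under more than one provider.
fn build_routes(providers: &[(&str, Vec<&str>)]) -> Result<HashMap<String, String>, String> {
    let mut routes = HashMap::new();
    for (provider, models) in providers {
        for model in models {
            if routes
                .insert(model.to_string(), provider.to_string())
                .is_some()
            {
                return Err(format!("duplicate model name: {model}"));
            }
        }
    }
    Ok(routes)
}

fn main() {
    // Distinct names across providers: fine.
    let ok = build_routes(&[
        ("deepseek", vec!["deepseek-chat"]),
        ("openai", vec!["gpt-4o"]),
    ]);
    assert!(ok.is_ok());

    // Same model name under two providers: rejected.
    let dup = build_routes(&[("a", vec!["gpt-4o"]), ("b", vec!["gpt-4o"])]);
    assert!(dup.is_err());
}
```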

Selecting the active model

Set the default model in [system.crab]:

[system.crab]
model = "deepseek-chat"

Override per agent in [agents.*]:

[agents.researcher]
model = "claude-opus-4-6"
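The precedence is simple: an agent's `model` wins over the `[system.crab]` default. Sketched as a helper (name illustrative):

```rust
// Sketch of the documented precedence: a per-agent override, when
// present, beats the system-wide default model.
fn active_model<'a>(agent_model: Option<&'a str>, system_model: &'a str) -> &'a str {
    agent_model.unwrap_or(system_model)
}

fn main() {
    assert_eq!(active_model(None, "deepseek-chat"), "deepseek-chat");
    assert_eq!(
        active_model(Some("claude-opus-4-6"), "deepseek-chat"),
        "claude-opus-4-6"
    );
}
```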

Supported providers

OpenAI

[provider.openai]
models = ["gpt-4o", "gpt-4o-mini"]
api_key = "sk-..."

Uses the OpenAI chat completions API (the default standard). Supports GPT-4o, GPT-4o-mini, o-series, and all OpenAI models.

Anthropic Claude

[provider.anthropic]
models = ["claude-sonnet-4-20250514", "claude-opus-4-6"]
api_key = "sk-ant-..."
kind = "anthropic"

Uses the Anthropic Messages API. Set kind = "anthropic" explicitly, or it will be auto-detected if your base_url contains "anthropic".

DeepSeek

[provider.deepseek]
models = ["deepseek-chat", "deepseek-thinking"]
api_key = "sk-..."

OpenAI-compatible API. Supports deepseek-chat and deepseek-thinking.

Google Gemini

[provider.google]
models = ["gemini-2.5-pro"]
api_key = "..."
kind = "google"

Grok

[provider.grok]
models = ["grok-3"]
api_key = "..."
base_url = "https://api.x.ai/v1"

OpenAI-compatible API. Requires explicit base_url.

Qwen

[provider.qwen]
models = ["qwen-plus"]
api_key = "..."
base_url = "https://dashscope.aliyuncs.com/compatible-mode/v1"

OpenAI-compatible via DashScope.

Kimi

[provider.kimi]
models = ["kimi-latest"]
api_key = "..."
base_url = "https://api.moonshot.cn/v1"

OpenAI-compatible API by Moonshot AI.

Ollama

[provider.ollama]
models = ["llama3.1"]
kind = "ollama"
base_url = "http://localhost:11434/v1"

No API key needed.

Azure OpenAI

[provider.azure]
models = ["gpt-4o"]
api_key = "..."
kind = "azure"
api_version = "2024-02-01"
base_url = "https://your-resource.openai.azure.com"

Custom OpenAI-compatible endpoints

Any OpenAI-compatible API works with a base_url:

[provider.my-provider]
models = ["my-model"]
api_key = "..."
base_url = "https://my-endpoint.com/v1"

Provider manager

The ProviderManager holds all configured providers and routes requests by model name. It supports hot-reload — update the config and the active provider changes without restarting the daemon.
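The routing-plus-hot-reload idea can be sketched as a table behind a lock that a reload swaps atomically (this is an assumption about the shape, not CrabTalk's actual internals):

```rust
use std::collections::HashMap;
use std::sync::RwLock;

// Sketch: the manager maps model names to providers; a config reload
// replaces the whole table without restarting the daemon.
struct ProviderManager {
    routes: RwLock<HashMap<String, String>>,
}

impl ProviderManager {
    fn provider_for(&self, model: &str) -> Option<String> {
        self.routes.read().unwrap().get(model).cloned()
    }

    fn reload(&self, new_routes: HashMap<String, String>) {
        *self.routes.write().unwrap() = new_routes;
    }
}

fn main() {
    let mgr = ProviderManager {
        routes: RwLock::new(HashMap::from([(
            "deepseek-chat".to_string(),
            "deepseek".to_string(),
        )])),
    };
    assert_eq!(mgr.provider_for("deepseek-chat").as_deref(), Some("deepseek"));

    // Simulate a config reload that adds a provider.
    mgr.reload(HashMap::from([
        ("deepseek-chat".to_string(), "deepseek".to_string()),
        ("gpt-4o".to_string(), "openai".to_string()),
    ]));
    assert_eq!(mgr.provider_for("gpt-4o").as_deref(), Some("openai"));
}
```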

CrabTalk's provider system is powered by CrabLLM, our open-source LLM gateway.

What's next

  • Configuration — full config setup
  • Commands — telegram, search, and custom commands
  • Auth — manage providers with the config TUI
