# AI Providers

Taskmaster works with 15+ AI providers. You can mix and match — use Anthropic for task generation, Perplexity for research, and a local model for fallback. Bring your own keys and switch models anytime.

## Three Model Roles

Taskmaster uses three independent model roles, each configurable with its own provider and model:

| Role | Purpose | Default |
| --- | --- | --- |
| Main | Task generation, expansion, updates, and analysis | Anthropic Claude Sonnet |
| Research | Live web search for current information and citations | Perplexity Sonar |
| Fallback | Used when the main model fails or is unavailable | Anthropic Claude Sonnet |
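The fallback role is essentially a retry layer: if a call to the main model fails, the same request is sent to the fallback model instead. A minimal sketch of that behavior (illustrative only; `call_with_fallback` and the stand-in model functions are hypothetical, not Taskmaster's actual API):

```python
def call_with_fallback(prompt, main, fallback):
    """Try the main model first; on any failure, retry with the fallback."""
    try:
        return main(prompt)
    except Exception:
        return fallback(prompt)

# Stand-in model functions for illustration:
def flaky_main(prompt):
    raise RuntimeError("provider unavailable")

def stable_fallback(prompt):
    return f"answer to: {prompt}"

print(call_with_fallback("expand task 3", flaky_main, stable_fallback))
# prints "answer to: expand task 3"
```

This is why the fallback can point at a different provider entirely: an outage at the main provider does not block task generation.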

## Configuring Models

```bash
# Interactive setup — walks through all three roles
tm models --setup

# Set models directly
tm models --set-main claude-sonnet-4-20250514
tm models --set-research sonar
tm models --set-fallback gpt-4o-mini

# View current configuration
tm models
```
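The `tm models --set-*` commands persist your choices to Taskmaster's project config file. As an illustration, the state written by the commands above might look roughly like this (the exact file location and field names vary by version; treat this schema as an assumption, not a reference):

```json
{
  "models": {
    "main": { "provider": "anthropic", "modelId": "claude-sonnet-4-20250514" },
    "research": { "provider": "perplexity", "modelId": "sonar" },
    "fallback": { "provider": "openai", "modelId": "gpt-4o-mini" }
  }
}
```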

## Supported Cloud Providers

| Provider | Models | Best for |
| --- | --- | --- |
| Anthropic | Claude Opus, Sonnet, Haiku | General-purpose (recommended default) |
| OpenAI | GPT-4o, o3, o3-mini | Reasoning-heavy tasks |
| Google | Gemini Pro, Gemini Flash | Fast generation, large context |
| Perplexity | Sonar, Sonar Deep Research | Research with live web access |
| xAI | Grok 4, Grok 3, Grok 2 | Large codebase context |
| Groq | Various models | Fast inference |
| OpenRouter | 100+ models | Access any model through one API |
| Mistral | Mistral models | European AI provider |
| Azure OpenAI | GPT models via Azure | Enterprise deployments |
| Amazon Bedrock | Claude via AWS | AWS-native deployments |
| Google Vertex | Gemini via GCP | Google Cloud deployments |

## Local & CLI Providers

These providers run models locally or route through a CLI tool, so no cloud API key is required:

| Provider | Setup | Cost |
| --- | --- | --- |
| Ollama | Install Ollama, pull a model | Free, fully offline |
| LM Studio | Download models locally | Free, fully offline |
| Claude Code | Install Claude Code CLI | Uses your Claude Code subscription |
| Gemini CLI | Google account | Free tier available |
| OpenAI-compatible | Any compatible endpoint | Varies |

For details on local model setup, see Local Models.
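For example, pointing the fallback role at a local Ollama model takes two steps (this assumes Ollama is already installed; the model name is just an example):

```bash
# Pull a model into the local Ollama library
ollama pull llama3

# Point Taskmaster's fallback role at it
tm models --set-fallback ollama/llama3
```

With this setup, task generation keeps working offline even if your cloud provider is unreachable.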

## Switching Providers

You can change providers at any time without affecting your tasks:

```bash
# Switch main model to OpenAI
tm models --set-main gpt-4o

# Switch to a local model
tm models --set-main ollama/llama3

# Use Claude Code (no API key needed)
tm models --set-main claude-code/sonnet
```

## API Key Setup

Each cloud provider requires its own API key. See API Keys for the complete setup guide.

Keys can be configured via:

- A `.env` file in your project root
- Environment variables in your shell
- MCP server config in `.mcp.json` or editor settings
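As an example, a minimal `.env` covering the default main and research providers might look like this (key names follow the common `<PROVIDER>_API_KEY` convention; the values are placeholders, not real keys):

```bash
# .env (read from the project root)
ANTHROPIC_API_KEY=sk-ant-your-key-here
PERPLEXITY_API_KEY=pplx-your-key-here
```

Only the roles you actually use need a key: if all three roles point at local models, no keys are required at all.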