Before getting started with Task Master, you’ll need to set up your API keys. There are a couple of ways to do this depending on whether you’re using the CLI or working inside MCP. It’s also a good time to start getting familiar with the other configuration options available — even if you don’t need to adjust them yet, knowing what’s possible will help down the line.
Optimize context usage: you can control which Task Master MCP tools are loaded using the TASK_MASTER_TOOLS environment variable. This helps reduce LLM context usage by loading only the tools you need.
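If you're working inside MCP, this variable (along with your API keys) is typically set in the env block of the Task Master server entry in your MCP client's config file (for example .cursor/mcp.json in Cursor or .mcp.json for Claude Code). The snippet below is an illustrative sketch rather than a generated config: the exact command, arguments, and accepted TASK_MASTER_TOOLS values ("standard" here is just an example) depend on your Task Master version, so check the tool-loading docs for the current option list.

```json
{
  "mcpServers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "--package=task-master-ai", "task-master-ai"],
      "env": {
        "TASK_MASTER_TOOLS": "standard",
        "ANTHROPIC_API_KEY": "sk-ant-api03-your-key-here",
        "PERPLEXITY_API_KEY": "pplx-your-key-here"
      }
    }
  }
}
```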
Create a .env file in your project root and include the keys for the providers you plan to use:
.env

```bash
# Required API keys for providers configured in .taskmaster/config.json
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
PERPLEXITY_API_KEY=pplx-your-key-here
# OPENAI_API_KEY=sk-your-key-here
# GOOGLE_API_KEY=AIzaSy...
# AZURE_OPENAI_API_KEY=your-azure-openai-api-key-here
# etc.

# Optional Endpoint Overrides
# Use a specific provider's base URL, e.g., for an OpenAI-compatible API
# OPENAI_BASE_URL=https://api.third-party.com/v1
#
# Azure OpenAI Configuration
# AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/ or https://your-endpoint-name.cognitiveservices.azure.com/openai/deployments
# OLLAMA_BASE_URL=http://custom-ollama-host:11434/api

# Google Vertex AI Configuration (Required if using 'vertex' provider)
# VERTEX_PROJECT_ID=your-gcp-project-id
```
The main configuration file (.taskmaster/config.json) allows you to control nearly every aspect of Task Master’s behavior. Here’s a high-level look at what you can customize:
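In recent versions the generated file groups settings into per-role model choices (main, research, fallback) and global defaults. The sketch below is illustrative only: the field names, model IDs, and numeric values are placeholders and vary by version, so prefer letting Task Master generate and update this file rather than hand-copying the example.

```json
{
  "models": {
    "main": { "provider": "anthropic", "modelId": "claude-3-7-sonnet-20250219", "maxTokens": 64000, "temperature": 0.2 },
    "research": { "provider": "perplexity", "modelId": "sonar-pro", "maxTokens": 8700, "temperature": 0.1 },
    "fallback": { "provider": "anthropic", "modelId": "claude-3-5-sonnet", "maxTokens": 8192, "temperature": 0.2 }
  },
  "global": {
    "logLevel": "info",
    "defaultSubtasks": 5,
    "defaultPriority": "medium",
    "projectName": "My Project"
  }
}
```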
You don’t need to configure everything up front. Most settings can be left as defaults or updated later as your workflow evolves.