MCP Server Integration
Connect your TraceKit account to AI assistants like Claude, ChatGPT, and Cursor, or to any client that supports the Model Context Protocol (MCP). Query your observability data using natural language.
Overview
The TraceKit MCP server exposes 11 read-only tools that AI assistants can invoke:
- search_traces -- Search and filter distributed traces
- get_trace_detail -- Get full span waterfall for a trace
- list_services -- List services with health metrics
- get_service_metrics -- Latency percentiles, error rate, throughput
- get_active_alerts -- Currently firing alert rules
- search_anomalies -- Recent anomalies with severity
- list_breakpoints -- Active code monitoring breakpoints
- search_snapshots -- Live debugging snapshot captures
- search_llm_calls -- Individual LLM API call details
- get_llm_stats -- Aggregated LLM cost and usage stats
- get_rca -- Root cause analysis for anomalies
All tools are read-only. No data is modified, deleted, or sent to third parties. Authentication uses OAuth 2.1 with PKCE.
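Under the hood, a client invokes any of these tools with a standard MCP `tools/call` message. A minimal sketch of that JSON-RPC 2.0 payload for `search_traces` is below; the `service` and `min_duration_ms` arguments are illustrative assumptions, not the tool's documented schema -- the real parameter names come from the schema returned by `tools/list`:

```python
import json

# Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
# The argument names below are illustrative; the actual input schema for
# each tool is advertised by the server via tools/list.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_traces",
        "arguments": {"service": "checkout-service", "min_duration_ms": 500},
    },
}
print(json.dumps(request, indent=2))
```

Your AI assistant constructs messages like this automatically; you never write them by hand.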
Quick Start
1. Get your MCP server URL
Your server URL is the same for all accounts:
https://app.tracekit.dev/mcp
2. Connect in Claude
- Go to Settings > Connectors > Add connector
- Paste the URL: https://app.tracekit.dev/mcp
- Click Connect
- Enter your TraceKit API key on the authorization page
- Done -- TraceKit appears as a connector in your conversations
3. Connect in other MCP clients
For Claude Code, Cursor, Windsurf, or any MCP client that supports remote servers, add the server URL and authenticate with your API key when prompted.
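For clients configured through a JSON file, an entry along these lines is typical. This is a sketch only: the `mcpServers` key follows the convention used by several clients, but the exact file location and schema vary, so check your client's documentation:

```json
{
  "mcpServers": {
    "tracekit": {
      "url": "https://app.tracekit.dev/mcp"
    }
  }
}
```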
4. Start asking questions
Once connected, you can ask your AI assistant questions like:
- "How are my services performing?"
- "Show me the slowest requests from checkout-service"
- "How much am I spending on LLM API calls?"
- "Are there any anomalies in my backend?"
Authentication
The MCP server uses OAuth 2.1 with PKCE for authentication:
- When you connect, your AI client discovers the auth endpoints automatically
- You are redirected to a TraceKit authorization page
- Enter your API key (found at app.tracekit.dev/api-keys)
- The client receives a bearer token and uses it for all subsequent requests
Your API key is exchanged directly between your browser and TraceKit. The AI model never sees your credentials.
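To illustrate why the flow is safe, here is a sketch of the PKCE step (RFC 7636) that MCP clients perform automatically: the client generates a secret verifier, sends only its SHA-256 challenge with the authorization request, and later proves possession of the verifier when exchanging the authorization code for a token:

```python
import base64
import hashlib
import secrets

# PKCE sketch (RFC 7636): generate a random code_verifier, then derive
# the code_challenge as BASE64URL(SHA256(code_verifier)) with padding removed.
code_verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
code_challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest())
    .rstrip(b"=")
    .decode()
)

# The challenge travels with the authorization request; the verifier is
# revealed only at the token endpoint, so an intercepted code is useless.
print(code_challenge)
```

This happens entirely between your MCP client and TraceKit's auth endpoints; the AI model sees neither value.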
Example Prompts
Service Health
What services are running in my account and how are they performing?
Are there any issues I should be concerned about?
Trace Investigation
Show me the slowest requests from payment-service in the last 24 hours.
What's causing the latency?
LLM Cost Analysis
How much am I spending on LLM API calls?
Break it down by model and show me which service is using the most tokens.
Error Detection
Are there any recent errors or anomalies in my backend?
If so, what's the root cause?
Supported Clients
| Client | Status |
|---|---|
| Claude (claude.ai) | Supported |
| Claude Code (CLI) | Supported |
| ChatGPT | Pending (once ChatGPT ships MCP connector support) |
| Cursor | Supported |
| Windsurf | Supported |
| Cline | Supported |
| Custom MCP clients | Supported |
Technical Details
- Transport: Streamable HTTP (JSON-RPC 2.0 over POST)
- Protocol version: 2025-06-18
- Authentication: OAuth 2.1 with PKCE (RFC 7636)
- Discovery: RFC 9728 (Protected Resource Metadata) + RFC 8414 (Authorization Server Metadata)
- Dynamic registration: RFC 7591
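Discovery starts before any credentials change hands: the client takes the server URL and derives the protected-resource metadata URL (RFC 9728), which in turn points it at the authorization server metadata (RFC 8414). A sketch of that first derivation, using TraceKit's well-known path from the endpoints table below:

```python
from urllib.parse import urlparse

# RFC 9728 discovery sketch: derive the protected-resource metadata URL
# from the MCP server URL. The client fetches this document to learn
# which authorization server protects the resource.
server_url = "https://app.tracekit.dev/mcp"
parts = urlparse(server_url)
metadata_url = f"{parts.scheme}://{parts.netloc}/.well-known/oauth-protected-resource"
print(metadata_url)
```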
Endpoints
| Path | Method | Purpose |
|---|---|---|
| /mcp | POST | MCP JSON-RPC endpoint |
| /mcp | GET | SSE streaming (optional) |
| /.well-known/oauth-protected-resource | GET | Resource metadata discovery |
| /.well-known/oauth-authorization-server | GET | Auth server metadata |
| /mcp/register | POST | Dynamic client registration |
| /mcp/authorize | GET | Authorization page |
| /mcp/token | POST | Token exchange |
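Once a client holds a bearer token, the first message it POSTs to /mcp is the MCP `initialize` handshake, which pins the protocol version. A sketch of that request, with an illustrative placeholder token and client name:

```python
import json

# Sketch of the first request after auth: the MCP initialize handshake
# POSTed to /mcp. The token and clientInfo values are placeholders.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
    "Authorization": "Bearer <token>",
}
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-06-18",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}
print(json.dumps(initialize))
```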