MCP Server Integration

Connect your TraceKit account to AI assistants like Claude, ChatGPT, and Cursor, or to any other Model Context Protocol (MCP) compatible client, and query your observability data using natural language.

Overview

The TraceKit MCP server exposes 11 read-only tools that AI assistants can invoke:

  • search_traces -- Search and filter distributed traces
  • get_trace_detail -- Get full span waterfall for a trace
  • list_services -- List services with health metrics
  • get_service_metrics -- Latency percentiles, error rate, throughput
  • get_active_alerts -- Currently firing alert rules
  • search_anomalies -- Recent anomalies with severity
  • list_breakpoints -- Active code monitoring breakpoints
  • search_snapshots -- Live debugging snapshot captures
  • search_llm_calls -- Individual LLM API call details
  • get_llm_stats -- Aggregated LLM cost and usage stats
  • get_rca -- Root cause analysis for anomalies

All tools are read-only. No data is modified, deleted, or sent to third parties. Authentication uses OAuth 2.1 with PKCE.
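
Under the hood, each tool invocation is a JSON-RPC 2.0 `tools/call` request. The sketch below builds one such request body for `search_traces`; the argument names (`service`, `limit`) are illustrative assumptions here, since the authoritative parameter schema for each tool is what the server reports via `tools/list`.

```python
import json

def make_tool_call(request_id, tool, arguments):
    """Build a JSON-RPC 2.0 request envelope for an MCP tools/call invocation."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Hypothetical arguments -- the real parameter names for search_traces
# come from the server's tool schema, discoverable via tools/list.
req = make_tool_call(1, "search_traces", {"service": "checkout-service", "limit": 10})
print(json.dumps(req, indent=2))
```

Your AI client constructs and sends these requests for you; the sketch is only to show what "invoking a tool" means at the protocol level.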

Quick Start

1. Get your MCP server URL

Your server URL is the same for all accounts:

https://app.tracekit.dev/mcp

2. Connect in Claude

  1. Go to Settings > Connectors > Add connector
  2. Paste the URL: https://app.tracekit.dev/mcp
  3. Click Connect
  4. Enter your TraceKit API key on the authorization page
  5. Done -- TraceKit appears as a connector in your conversations

3. Connect in other MCP clients

For Claude Code, Cursor, Windsurf, or any MCP client that supports remote servers, add the server URL and authenticate with your API key when prompted.

4. Start asking questions

Once connected, you can ask your AI assistant questions like:

  • "How are my services performing?"
  • "Show me the slowest requests from checkout-service"
  • "How much am I spending on LLM API calls?"
  • "Are there any anomalies in my backend?"

Authentication

The MCP server uses OAuth 2.1 with PKCE for authentication:

  1. When you connect, your AI client discovers the auth endpoints automatically
  2. You are redirected to a TraceKit authorization page
  3. Enter your API key (found at app.tracekit.dev/api-keys)
  4. The client receives a bearer token and uses it for all subsequent requests

Your API key is exchanged directly between your browser and TraceKit. The AI model never sees your credentials.
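
PKCE (RFC 7636) is what lets the client prove it initiated the flow without a pre-shared secret: it generates a random code verifier, sends only the SHA-256 derived challenge with the authorization request, and reveals the verifier at token exchange. A minimal sketch of the S256 derivation, using only the Python standard library:

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes -> a 43-character base64url verifier, padding stripped
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode("ascii")
    # challenge = BASE64URL(SHA256(ASCII(verifier))), also without padding
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return verifier, challenge

verifier, challenge = make_pkce_pair()
```

MCP clients implement this themselves; you never handle the verifier or challenge directly.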

Example Prompts

Service Health

What services are running in my account and how are they performing?
Are there any issues I should be concerned about?

Trace Investigation

Show me the slowest requests from payment-service in the last 24 hours.
What's causing the latency?

LLM Cost Analysis

How much am I spending on LLM API calls?
Break it down by model and show me which service is using the most tokens.

Error Detection

Are there any recent errors or anomalies in my backend?
If so, what's the root cause?

Supported Clients

Client                 Status
Claude (claude.ai)     Supported
Claude Code (CLI)      Supported
ChatGPT                When MCP support is available
Cursor                 Supported
Windsurf               Supported
Cline                  Supported
Custom MCP clients     Supported

Technical Details

  • Transport: Streamable HTTP (JSON-RPC 2.0 over POST)
  • Protocol version: 2025-06-18
  • Authentication: OAuth 2.1 with PKCE (RFC 7636)
  • Discovery: RFC 9728 (Protected Resource Metadata) + RFC 8414 (Authorization Server Metadata)
  • Dynamic registration: RFC 7591
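
A request to the transport described above is an authenticated HTTP POST carrying a JSON-RPC body. The sketch below assembles one such request without sending it; the bearer token is a placeholder (a real one comes out of the OAuth flow), and the headers reflect the streamable HTTP transport and the protocol version listed above.

```python
import json
import urllib.request

# Placeholder token -- a real bearer token is issued by the OAuth 2.1 flow.
TOKEN = "tk_example_bearer_token"

body = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode("utf-8")
req = urllib.request.Request(
    "https://app.tracekit.dev/mcp",
    data=body,
    method="POST",
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
        "Authorization": f"Bearer {TOKEN}",
        "MCP-Protocol-Version": "2025-06-18",
    },
)
# urllib.request.urlopen(req) would send it; omitted so the sketch stays offline.
```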

Endpoints

Path                                      Method   Purpose
/mcp                                      POST     MCP JSON-RPC endpoint
/mcp                                      GET      SSE streaming (optional)
/.well-known/oauth-protected-resource     GET      Resource metadata discovery
/.well-known/oauth-authorization-server   GET      Auth server metadata
/mcp/register                             POST     Dynamic client registration
/mcp/authorize                            GET      Authorization page
/mcp/token                                POST     Token exchange
