Uptrack

Developer Guide

Install Uptrack in your AI — Claude, ChatGPT, Cursor, VS Code

Uptrack now ships as a remote MCP server with OAuth. Click "Connect", approve the consent screen, done — no npm, no API key paste, no JSON config. Stdio and direct REST still work for the CI-shaped use cases.

April 19, 2026

Option 1: Remote MCP over OAuth (recommended)

The hosted MCP server lives at https://api.uptrack.app/mcp. It speaks the MCP Streamable HTTP transport and authenticates every request against Uptrack's OAuth 2.0 authorization server. Your AI client handles the token dance.

Claude.ai (Connectors)

Settings → Connectors → "Add custom connector" → paste https://api.uptrack.app/mcp. Claude walks you through the OAuth consent screen. Scopes like monitors:read and monitors:write are requested per the tools Claude needs.

Cursor

Settings → MCP → "Add new MCP server" → choose "URL", paste the endpoint:

{
  "mcpServers": {
    "uptrack": {
      "url": "https://api.uptrack.app/mcp"
    }
  }
}

Cursor detects the OAuth requirement from the server's 401 WWW-Authenticate header and opens the consent flow.
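Concretely, the first unauthenticated request gets a response along these lines (shape per RFC 9728; the exact header value is illustrative), which is what points the client at the OAuth metadata:

```http
HTTP/1.1 401 Unauthorized
WWW-Authenticate: Bearer resource_metadata="https://api.uptrack.app/.well-known/oauth-protected-resource"
```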

VS Code (1.100+)

Command Palette → "MCP: Add Server" → "HTTP" → paste the URL. VS Code discovers our /.well-known/oauth-authorization-server metadata automatically and completes the OAuth flow inside the editor.

ChatGPT / Windsurf

Same pattern — add a custom MCP server pointing at https://api.uptrack.app/mcp. Both clients handle OAuth discovery.

Manage or revoke the connection anytime from your dashboard under Settings → Connected apps.

Option 2: Stdio MCP with API key

When you want zero network round-trips to a third-party authorization server (say, in an offline agent, a CI runner, or a scripted flow), the stdio server still works:

# Add to your MCP client config (Claude Desktop, etc.)
{
  "mcpServers": {
    "uptrack": {
      "command": "npx",
      "args": ["-y", "@uptrack-app/mcp"],
      "env": {
        "UPTRACK_API_KEY": "uk_live_..."
      }
    }
  }
}

Generate an API key under Dashboard → API keys. Same tool surface as the remote server.
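To sanity-check a key outside of an MCP client, you can launch the stdio server directly from the config above; it speaks MCP JSON-RPC over stdin/stdout, so a clean start with no auth error is a good sign:

```shell
# Same invocation the MCP client config performs; substitute a real key.
UPTRACK_API_KEY="uk_live_..." npx -y @uptrack-app/mcp
```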

Option 3: Direct REST

Not every integration is an LLM conversation. For cron jobs, webhooks, backfills, or your own services — hit the REST API directly. Both API keys and OAuth Bearer tokens work:

# API key
curl https://api.uptrack.app/monitors \
  -H "X-API-Key: uk_live_..."

# OAuth Bearer token (issued by the authorization server your AI client registered with)
curl https://api.uptrack.app/monitors \
  -H "Authorization: Bearer ey..."

OpenAPI 3.0 spec: uptrack.app/openapi.yaml. Drop it into your codegen and you have a typed client in any language.
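One option for that codegen step, as a sketch: OpenAPI Generator can turn the spec into a typed client. The output directory name is arbitrary, and any generator target (`-g`) works:

```shell
# Generate a TypeScript fetch-based client from the published spec.
npx @openapitools/openapi-generator-cli generate \
  -i https://uptrack.app/openapi.yaml \
  -g typescript-fetch \
  -o ./uptrack-client
```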

Discovery docs live at the well-known paths:

  • /.well-known/oauth-authorization-server — AS metadata (RFC 8414)
  • /.well-known/oauth-protected-resource — resource metadata (RFC 9728)
  • /.well-known/mcp/server-card.json — MCP server card (SEP-1649)
  • /.well-known/agent-skills/index.json — Agent Skills v0.2
  • /.well-known/api-catalog — API linkset (RFC 9727)

What the agent can do

Ten tools across monitors and incidents:

list_monitors          get_monitor          create_monitor
update_monitor         pause_monitor        resume_monitor
delete_monitor         list_incidents       get_incident
acknowledge_incident

Typical prompts that Just Work: "add a monitor for staging.acme.com at 30-second intervals", "which monitors went down in the last 24 hours?", "acknowledge the incident on the checkout API".
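Under the hood, a prompt like the first one resolves to an MCP tools/call request. A sketch of the wire format — note the create_monitor argument names here are assumptions for illustration, not confirmed parameter names:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "create_monitor",
    "arguments": {
      "url": "https://staging.acme.com",
      "interval_seconds": 30
    }
  }
}
```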

Let your AI manage your monitors

50 free monitors — 10 at 30-second checks, 40 at 1-minute. Remote MCP, stdio, REST — same account, same data, pick your interface.

Start Free