MCP server · live

Ask Claude "how much does this model cost?"

Install once. Get live LLM pricing tools inside Claude Desktop, Claude.ai custom connectors, Cursor, Windsurf, Zed, Cline, and any other MCP client. Free. Read-only. No auth.

Live now · Free forever · Read-only · No auth required

Endpoint https://usagewall.vercel.app/api/mcp · Protocol MCP 2025-06-18

What it does

A read-only MCP server backed by our live pricing table. Six tools, no surprises.

Two resources are also exposed: usagewall://pricing/full (the entire table) and usagewall://pricing/<provider> (one provider, e.g. usagewall://pricing/openai).
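Reading one of these resources is a single JSON-RPC call. A minimal sketch of the request body, assuming the standard MCP `resources/read` method (the `openai` URI is just the example from above):

```python
import json

def read_resource_request(uri, req_id=1):
    """Build a JSON-RPC 2.0 body for the standard MCP resources/read method."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "resources/read",
        "params": {"uri": uri},
    }

payload = read_resource_request("usagewall://pricing/openai")
print(json.dumps(payload))
```

POST that body to the endpoint with `Content-Type: application/json`, the same way the curl examples below call the tools.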

Install in Claude

01

Open Claude settings

Claude Desktop → Settings → Developer → Edit Config.
Claude.ai → Settings → Connectors → Add custom connector.

02

Add the server

For a Claude.ai custom connector, paste the URL below. For Claude Desktop, use the JSON config snippet.

03

Ask a question

"What's the cheapest reasoning model right now?" or "Cost of 100K input + 5K output on Claude Sonnet 4.6?"

Claude.ai custom connector

Paste this URL into the custom connector dialog:

https://usagewall.vercel.app/api/mcp

Claude Desktop config

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "usagewall": {
      "url": "https://usagewall.vercel.app/api/mcp"
    }
  }
}
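If your Claude Desktop build rejects `url`-style entries (older versions only launched local stdio servers), a common workaround is to bridge through the community `mcp-remote` package. This is an assumption about your setup, not part of UsageWall itself, and it requires Node.js:

```json
{
  "mcpServers": {
    "usagewall": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://usagewall.vercel.app/api/mcp"]
    }
  }
}
```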

Cursor / Windsurf / Zed / Cline

Use the same URL in the MCP server settings of any of these clients. UsageWall speaks the standard streamable HTTP MCP transport, so there is no SDK to install and no auth handshake.

Quick check from the terminal

List tools without any client:

curl -s https://usagewall.vercel.app/api/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'

Estimate cost of 50K input + 2K output on GPT-4o:

curl -s https://usagewall.vercel.app/api/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"estimate_cost","arguments":{"provider":"openai","model_id":"gpt-4o","input_tokens":50000,"output_tokens":2000}}}'
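The same call works from any language that can POST JSON. A minimal Python sketch using only the standard library; the tool name and argument names are taken from the curl example above, and `call` is a hypothetical helper, not part of any SDK:

```python
import json
import urllib.request

ENDPOINT = "https://usagewall.vercel.app/api/mcp"

def estimate_cost_request(provider, model_id, input_tokens, output_tokens, req_id=2):
    """Build the JSON-RPC body for a tools/call on estimate_cost."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {
            "name": "estimate_cost",
            "arguments": {
                "provider": provider,
                "model_id": model_id,
                "input_tokens": input_tokens,
                "output_tokens": output_tokens,
            },
        },
    }

def call(body):
    """POST a JSON-RPC body to the MCP endpoint and return the raw response."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            # Streamable HTTP MCP clients are expected to accept both
            # plain JSON and SSE responses.
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

# Example: call(estimate_cost_request("openai", "gpt-4o", 50_000, 2_000))
```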


Source & reliability

Built on our public pricing JSON, hand-checked monthly against the providers' own pricing pages plus an OpenRouter live overlay. Models pending official pricing are marked verified: false. Bugs and price corrections welcome at hello@agenciabaku.com or on GitHub.

← Back to usagewall.dev