The AI proxy lets your apps call Anthropic and OpenAI APIs through Major’s managed gateway. Instead of managing API keys and billing for each provider separately, your apps authenticate via JWT tokens and usage is tracked automatically.

Setup

Enable the AI proxy for your organization in Organization Settings. Once enabled, your apps can make LLM API calls through the proxy.
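As a rough illustration of what a proxied call looks like, here is a minimal Python sketch that routes an Anthropic-style messages request through the gateway. The base URL (`ai-proxy.example.com`), the environment variable name (`MAJOR_APP_JWT`), and the model name are placeholders, not the actual Major endpoints; substitute the values shown in your Organization Settings.

```python
import os
import json
import urllib.request

# Placeholder values -- replace with the proxy URL and token source
# provided for your organization.
PROXY_BASE_URL = "https://ai-proxy.example.com/v1/anthropic"
APP_JWT = os.environ.get("MAJOR_APP_JWT", "example-jwt")


def build_proxy_request(prompt: str) -> urllib.request.Request:
    """Build a messages request routed through the AI proxy.

    The app's JWT goes in the Authorization header in place of a
    provider API key; the proxy handles provider auth and billing.
    """
    body = json.dumps({
        "model": "claude-sonnet-4-20250514",  # example model name
        "max_tokens": 256,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{PROXY_BASE_URL}/messages",
        data=body,
        headers={
            "Authorization": f"Bearer {APP_JWT}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the request (e.g. with `urllib.request.urlopen`) then works exactly as it would against the provider directly; only the base URL and credential change.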

Spending Limits

Set a monthly spending limit to control costs. When the limit is reached, proxy requests are blocked until the next billing cycle or until you increase the limit.
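Because blocked requests surface as errors in your app, it is worth handling them explicitly. The sketch below assumes the proxy signals a blocked request with HTTP 429; that status code is an assumption, so check the error code your gateway actually returns.

```python
class SpendingLimitReached(Exception):
    """Raised when the proxy blocks a request because the monthly limit is hit."""


def check_proxy_response(status_code: int) -> None:
    """Translate proxy HTTP status codes into app-level errors.

    Assumption: a 429 status indicates the monthly spending limit was
    reached -- verify against the proxy's actual error responses.
    """
    if status_code == 429:
        raise SpendingLimitReached(
            "Monthly spending limit reached; requests are blocked until "
            "the next billing cycle or until the limit is increased."
        )
    if status_code >= 400:
        raise RuntimeError(f"Proxy request failed with status {status_code}")
```

An app can catch `SpendingLimitReached` to degrade gracefully (for example, queueing work or showing a notice) instead of retrying requests that will keep failing until the limit resets.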

Usage

AI proxy usage appears in your Usage Dashboard under the “AI Proxy” category alongside coding sessions and app compute.