Proxy MCP (Model Context Protocol) requests to upstream MCP servers with identity injection, per-tool authorization, and delegation token exchange. The MCP Proxy app sits between AI agents and your existing MCP servers, adding enterprise identity and access control without modifying the upstream servers.
Console terminology: In the Maverics Console, the combination of applications, policies, and connector bindings is managed through User Flows. In YAML, these elements are configured directly within each app’s configuration block under apps[].

Overview

The MCP Proxy app type is used in AI Identity Gateway mode to secure MCP server traffic. It acts as an identity-aware proxy between AI agents and upstream MCP servers, applying inbound authorization policies and performing outbound token exchange so that upstream servers receive properly scoped delegation tokens. The MCP Proxy app requires an MCP Provider to be configured.

How It Works

The MCP Proxy operates as an identity-aware intermediary for MCP traffic:
  1. Agent connects — The AI agent connects to the Orchestrator’s MCP endpoint (SSE or Streamable HTTP).
  2. OAuth authentication — The Orchestrator validates the agent’s OAuth token against the configured authorization server.
  3. Tool discovery — The Orchestrator proxies tool discovery to the upstream MCP server and returns the available tools (optionally namespaced with a prefix).
  4. Policy evaluation — When the agent invokes a tool, the Orchestrator evaluates OPA policies before forwarding the request.
  5. Token exchange — The Orchestrator exchanges the agent’s token for a delegation token scoped to the upstream MCP server, with per-tool scopes and TTLs.
  6. Upstream forwarding — The tool call is forwarded to the upstream MCP server via the configured HTTP streaming transport.
  7. Response passthrough — The upstream server’s response flows back through the Orchestrator to the agent, with audit logging capturing the interaction.
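
In YAML terms, the flow above (steps 3, 5, and 6) maps onto a single app entry. The sketch below is illustrative: `upstream.stream.url` and `toolNamespace` are key paths used later in this guide, while the remaining names and values are assumptions.

```yaml
# Illustrative MCP Proxy app entry; only upstream.stream.url and
# toolNamespace are key paths confirmed elsewhere in this guide.
apps:
  - name: missionControlProxy
    type: mcpProxy                       # assumed type identifier
    toolNamespace:
      name: mission_control_            # prefix applied during tool discovery (step 3)
    upstream:
      stream:
        url: http://mcp-server:8080/mcp # upstream MCP server (step 6)
    tokenExchange:
      audience: mcp-server-audience     # delegation token audience (step 5)
```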

Use Cases

  • Securing existing MCP servers — Add identity and authorization to upstream MCP servers without modifying them. The Orchestrator handles OAuth authentication and token exchange.
  • Per-tool authorization — Apply different scopes and TTLs to different tools based on the tool name. Read-only tools get read scopes; write tools get write scopes.
  • HTTP streaming transport — Connect to upstream MCP servers over the network via HTTP streaming.
  • Centralized token exchange — Exchange agent tokens for service-specific delegation tokens with per-tool granularity, preserving the user’s identity through the agent-to-server chain.

Key Concepts

Upstream Transport

MCP Proxy connects to upstream MCP servers over HTTP streaming. Configure the upstream server URL, optional TLS profile, and connection pooling settings.

Tool Namespacing

When multiple MCP Proxy apps connect to different upstream MCP servers, tool names can collide. The toolNamespace feature adds a configurable prefix to all tools from an app, enabling agents to access tools from multiple MCP servers through a single endpoint.
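
For example, two apps exposed through the same MCP endpoint might be configured with distinct prefixes. The app names below are illustrative; `toolNamespace.name` is the key path referenced in the Troubleshooting section.

```yaml
# Two MCP Proxy apps sharing one endpoint; unique prefixes prevent
# tool-name collisions between the upstream servers.
apps:
  - name: serviceAProxy
    toolNamespace:
      name: service_a_   # agent sees e.g. service_a_list_items
  - name: serviceBProxy
    toolNamespace:
      name: service_b_   # agent sees e.g. service_b_list_items
```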

Token Exchange for MCP Servers

MCP Proxy uses RFC 8693 token exchange to convert the agent’s inbound OAuth token into a scoped token for the upstream MCP server. Each tool gets its own scopes and TTL, enabling least-privilege access. The delegation token carries both agent and user identity through the entire chain.
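
A minimal outbound configuration for this exchange might look like the following sketch. The `tokenExchange` and `audience` key names appear in the Troubleshooting section of this guide; the other keys and values are assumptions matching the console fields described under Setup.

```yaml
# Illustrative outbound token-exchange settings for one MCP Proxy app.
tokenExchange:
  exchangeType: delegation        # preserves both agent and user identity
  audience: mcp-server-audience   # must match the OIDC Provider's expectedAudiences
```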

Inbound OPA Policies

OPA Rego policies evaluate inbound MCP requests before forwarding to the upstream server. This adds a centralized authorization layer to MCP servers that may lack their own authorization. Policies can block specific tool invocations based on agent identity, user attributes, or requested tool name.
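
A minimal inline policy might look like the sketch below. The `policies` key and the shape of the `input` document are assumptions (consult your Orchestrator's policy reference for the exact fields); only the Rego language itself is standard.

```yaml
# Hypothetical inline inbound policy; input.tool and input.agent.id
# are assumed field names for the requested tool and agent identity.
policies:
  inbound: |
    package mcp.authz

    default allow := false

    # Allow read-only tools for any authenticated agent
    allow if startswith(input.tool, "get_")

    # Allow a write tool only for a specific agent
    allow if {
      input.tool == "create_ticket"
      input.agent.id == "ops-agent"
    }
```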

Upstream Token Validation Best Practice

The upstream MCP servers receive standard MCP requests, and the Orchestrator adds identity, authorization, and audit transparently — upgrading existing MCP servers with enterprise identity without modifying them. While no protocol-level changes are required, it is best practice for upstream MCP servers to validate the received delegation token. Specifically, upstreams should:
  • Validate the token signature and expiry — Confirm the token has not been tampered with and is still valid.
  • Confirm the issuer matches the Auth Provider Orchestrator — The iss claim should match the expected authorization server.
  • Verify the audience claim matches the server’s own identifier — The aud claim should contain the MCP server’s registered audience.
  • Check that the token’s scopes authorize the requested tool invocation — The scope claim should include the permissions required for the specific tool.
This defense-in-depth approach ensures that even if the gateway is misconfigured or compromised, the upstream server independently enforces authorization.
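
As a concrete illustration of what those checks inspect, a delegation token produced by an RFC 8693 exchange might decode to claims like these. All values are examples; the `act` (actor) claim is the standard RFC 8693 mechanism for carrying the agent's identity alongside the user's.

```yaml
# Illustrative decoded delegation-token claims (values are examples)
iss: https://auth.example.com   # must match the expected authorization server
sub: user@example.com           # the end user's identity
act:
  sub: agent-client-id          # the acting agent (RFC 8693 "act" claim)
aud: mcp-server-audience        # must match the MCP server's registered audience
scope: tickets.read             # must authorize the requested tool invocation
exp: 1735689600                 # must not be expired
```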

Setup

1. Navigate to Applications

Go to Applications in the sidebar and click Create. Select MCP Proxy App from the application type list.
2. Set the application name

Enter a Name to identify this MCP Proxy application.
3. Configure tool namespacing

Tool namespacing is enabled by default. Enter a Namespace prefix for tool names (e.g., mission_control_). Only alphanumeric characters, dots, dashes, and underscores are allowed. Disable the Enable Tool Namespacing toggle only if this is the sole MCP app.
4. Upload an application icon (optional)

Drag or click to upload an icon image (JPEG, PNG, or SVG, up to 2MB).
5. Configure the upstream MCP server

Under Upstream MCP Server, the Transport Type defaults to HTTP Streaming. Enter the Upstream URL of the remote MCP server (e.g., http://mcp-server:8080/mcp).
6. Configure TLS settings (optional)

Under TLS Settings, optionally set a CA Path for self-signed certificates on the upstream. Enable Skip TLS Verification only for testing. Under mTLS, provide the TLS Cert and TLS Key file paths or secret provider references if the upstream server requires client certificate authentication.
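
In YAML, these TLS settings might be expressed as follows. The key names are assumptions based on the console field labels above; only the `upstream.stream.url` path is confirmed elsewhere in this guide.

```yaml
# Illustrative TLS settings for the upstream connection
upstream:
  stream:
    url: https://mcp-server.internal:8443/mcp
    tls:
      caPath: /etc/maverics/ca.pem        # trust a private or self-signed CA
      certPath: /etc/maverics/client.pem  # mTLS client certificate
      keyPath: /etc/maverics/client.key   # mTLS client key
      insecureSkipVerify: false           # enable only for testing
```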
7. Configure inbound authorization

Under Authorization > Inbound Request Policy, upload an OPA policy definition (.rego file) or click Add to write an inline policy. The policy enforces fine-grained access control for incoming MCP requests based on agent identity, requested tool name, and user attributes.
8. Configure outbound authorization

Under Outbound Request Authorization, select the Authorization Type (Token Exchange or Unprotected). For Token Exchange, choose the Exchange Type: Delegation (recommended — preserves both agent and user identity) or Impersonation (fully assumes user identity). Select the OIDC Identity Provider and enter the Audience for the upstream MCP server.
9. Add tool configurations (optional)

Click Add Tool Configuration to define per-tool authorization. Enter the Tool Name (exact match, or a regex pattern prefixed with ~), set an optional Token Lifetime (TTL), and add the OAuth Scopes to request during token exchange for that tool.
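
A YAML sketch of per-tool authorization might look like the following. The tool names, scopes, and TTLs are examples, and the key names are assumptions based on the console fields; the ~ prefix marks a regex pattern as described above.

```yaml
# Illustrative per-tool authorization: read-only tools get read scopes
# and a longer TTL; write tools get write scopes and a shorter TTL.
toolConfigurations:
  - toolName: get_ticket      # exact match
    scopes: [tickets.read]
    tokenLifetime: 5m
  - toolName: ~^create_.*     # regex: all create_* tools
    scopes: [tickets.write]
    tokenLifetime: 1m
```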
10. Add operation configurations (optional)

Click Add Operation Configuration to configure authorization for MCP protocol operations. Select a Method (initialize or tools/list), set an optional Token Lifetime (TTL), and add OAuth Scopes for the operation.
11. Save

Click Save to create the MCP Proxy application.

Troubleshooting

Tool discovery fails

Symptoms: The agent connects to the proxy but tool discovery fails with a connection error. No tools are returned.

Causes:
  • The upstream.stream.url is incorrect or unreachable from the Orchestrator.
  • TLS certificate issues when connecting to the upstream MCP server over HTTPS.
  • The upstream MCP server is not running or is not accepting connections.
Resolution:
  • Verify the upstream.stream.url is correct and reachable from the Orchestrator host.
  • If using TLS, check the TLS profile configuration and ensure the upstream server’s certificate is trusted.
  • Confirm the upstream MCP server is running and accepting connections on the expected port.

Tool name collisions

Symptoms: Tools from different upstream MCP servers have name collisions, causing unexpected behavior when agents invoke tools.

Causes:
  • The toolNamespace feature is not configured on one or more MCP Proxy apps.
  • Namespace prefixes are not unique across apps, so tools from different servers still collide.
Resolution:
  • Configure unique toolNamespace.name values for each MCP Proxy app to prefix tool names (e.g., service_a_ and service_b_).
  • Ensure toolNamespace.disabled is set to false (the default) on all apps that share a single MCP endpoint.

Token exchange fails

Symptoms: The agent authenticates successfully but tool invocation fails. Orchestrator logs show a token exchange failure.

Causes:
  • The Auth Provider Orchestrator’s token endpoint is unreachable from the Gateway Orchestrator.
  • The urn:ietf:params:oauth:grant-type:token-exchange grant type is not enabled on the OIDC Provider app.
  • Audience mismatch — the audience in the token exchange configuration does not match the OIDC Provider’s expectedAudiences.
Resolution:
  • Verify the Auth Provider Orchestrator’s token endpoint is reachable from the Gateway Orchestrator.
  • Ensure the OIDC Provider app has urn:ietf:params:oauth:grant-type:token-exchange in its grantTypes list.
  • Check that the audience value in the MCP Proxy app’s tokenExchange configuration matches the OIDC Provider app’s expectedAudiences.

Connections drop during tool invocation

Symptoms: The agent connection drops during tool invocation. Logs show “connection reset” or timeout errors.

Causes:
  • SSE keep-alive is not configured, so the connection appears idle to intermediate proxies or load balancers.
  • A proxy or load balancer is enforcing a timeout that kills long-lived connections.
Resolution:
  • Configure upstream.stream.connection.keepAlive.interval with an appropriate interval (e.g., 15s) to keep connections alive.
  • Configure any intermediate proxy or load balancer to allow long-lived connections for the MCP endpoint.
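
The key paths named in this section can be combined into a connection-tuning sketch like the following; the interval and timeout values are examples to adjust for your environment.

```yaml
# Connection tuning; key paths upstream.stream.connection.keepAlive.interval
# and upstream.stream.connection.dialTimeout are taken from this section.
upstream:
  stream:
    connection:
      dialTimeout: 10s    # fail fast if the upstream is unreachable
      keepAlive:
        interval: 15s     # ping often enough that proxies don't idle-kill
```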

Sessions lost mid-workflow

Symptoms: Multi-step agent interactions fail partway through. The agent loses its session between tool invocations.

Causes:
  • The session timeout is too short for the expected agent workflow duration.
  • The upstream MCP server takes longer to respond than the configured timeout allows.
Resolution:
  • Increase the session timeout to accommodate the expected workflow duration.
  • Review the upstream MCP server’s response times and adjust connection timeouts (upstream.stream.connection.dialTimeout) accordingly.