The server speaks standard MCP over Streamable HTTP. Any HTTP client can drive it; you just need to handle the JSON-RPC envelope, session id, and SSE framing.
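The SSE framing is the only non-obvious part: each response body is a `text/event-stream` in which the JSON-RPC message sits on `data:` lines. A minimal sketch of decoding it (real SSE also allows `event:`, `id:`, and comment lines, which are ignored here):

```python
import json

def parse_sse_body(body: str):
    """Yield the JSON-RPC messages carried in a text/event-stream body."""
    for event in body.split("\n\n"):
        # Collect the payload from the event's data: lines.
        data = [line[6:] for line in event.splitlines()
                if line.startswith("data: ")]
        if data:
            yield json.loads("\n".join(data))

body = 'event: message\ndata: {"jsonrpc": "2.0", "id": 1, "result": {}}\n\n'
print(list(parse_sse_body(body)))  # [{'jsonrpc': '2.0', 'id': 1, 'result': {}}]
```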

Initialize

curl -sD - -X POST "http://localhost:8081/mcp" \
  -H "Authorization: Bearer $MCP_API_KEY" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "X-Chat-Id: my-session" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2024-11-05",
      "capabilities": {},
      "clientInfo": {"name": "my-client", "version": "1.0"}
    }
  }'
Save the Mcp-Session-Id response header — you’ll pass it on every subsequent call.
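One way to pull the session id out of the dumped headers (here `$HEADERS` stands in for the output of `curl -sD -` above; the extraction logic is an illustration, not part of the server's API):

```shell
# $HEADERS stands in for the response headers printed by `curl -sD -`.
HEADERS='HTTP/1.1 200 OK
Mcp-Session-Id: abc123
Content-Type: text/event-stream'

# Strip CRs, match the header case-insensitively, print its value.
SESSION_ID=$(printf '%s\n' "$HEADERS" | tr -d '\r' \
  | awk -F': ' 'tolower($1) == "mcp-session-id" {print $2}')
echo "$SESSION_ID"   # abc123
```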

List tools

curl -s -X POST "http://localhost:8081/mcp" \
  -H "Authorization: Bearer $MCP_API_KEY" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: $SESSION_ID" \
  -H "X-Chat-Id: my-session" \
  -d '{"jsonrpc": "2.0", "id": 2, "method": "tools/list"}'
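Once the response is decoded from its SSE framing, the tool names live under `result.tools`. A quick sketch (the payload below is illustrative, not an actual server response):

```python
import json

# Illustrative tools/list response, already extracted from the SSE stream.
raw = ('{"jsonrpc": "2.0", "id": 2, "result": {"tools": '
       '[{"name": "bash_tool", "description": "Run a shell command"}]}}')

names = [tool["name"] for tool in json.loads(raw)["result"]["tools"]]
print(names)  # ['bash_tool']
```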

Call a tool

curl -s -X POST "http://localhost:8081/mcp" \
  -H "Authorization: Bearer $MCP_API_KEY" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: $SESSION_ID" \
  -H "X-Chat-Id: my-session" \
  -d '{
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
      "name": "bash_tool",
      "arguments": {"command": "echo Hello", "description": "test"}
    }
  }'
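The result of `tools/call` carries a list of content blocks; for `bash_tool` the output comes back as text blocks. A sketch of collecting them (the sample payload is illustrative):

```python
import json

def tool_text(response_json: str) -> str:
    """Concatenate the text content blocks of a tools/call result."""
    result = json.loads(response_json)["result"]
    return "".join(block["text"] for block in result.get("content", [])
                   if block.get("type") == "text")

# Illustrative tools/call response, already extracted from the SSE stream.
raw = ('{"jsonrpc": "2.0", "id": 3, "result": {"content": '
       '[{"type": "text", "text": "Hello\\n"}], "isError": false}}')
print(tool_text(raw))
```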

List / read resources

Uploaded files show up as resources/list entries with URIs of the form file://uploads/{chat_id}/{encoded-rel-path}. See MCP methods → resources.
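Building such a URI from a relative path might look like this — assuming the encoding is plain percent-encoding with slashes preserved, which is an assumption here (check against an actual `resources/list` entry):

```python
from urllib.parse import quote

def upload_uri(chat_id: str, rel_path: str) -> str:
    """Build a file://uploads/{chat_id}/{encoded-rel-path} URI.

    Assumption: the server percent-encodes the relative path and
    keeps path separators intact.
    """
    return f"file://uploads/{chat_id}/{quote(rel_path)}"

print(upload_uri("my-session", "reports/Q1 notes.txt"))
# file://uploads/my-session/reports/Q1%20notes.txt
```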

Dynamic system prompt

Non-MCP clients can fetch the per-session system prompt over plain HTTP:
curl "http://localhost:8081/system-prompt?chat_id=my-session&user_email=user@example.com"
Stream it into the model’s system message. The response carries an X-Public-Base-URL header with the server’s public URL — use that when rewriting file links.
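One way to do the rewrite, assuming `file://` links in the prompt should simply be re-rooted under the public base URL (the mapping rule is an assumption for illustration):

```python
def rewrite_file_links(prompt: str, public_base: str) -> str:
    """Re-root file:// links under the server's public base URL.

    public_base comes from the X-Public-Base-URL response header;
    the file:// -> {public_base}/ mapping is an assumption.
    """
    return prompt.replace("file://", public_base.rstrip("/") + "/")

prompt = "see file://uploads/my-session/a.txt"
print(rewrite_file_links(prompt, "https://example.com"))
# see https://example.com/uploads/my-session/a.txt
```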

Python example (mcp SDK)

import asyncio
import os

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

MCP_API_KEY = os.environ["MCP_API_KEY"]

async def main():
    async with streamablehttp_client(
        url="http://localhost:8081/mcp",
        headers={
            "Authorization": "Bearer " + MCP_API_KEY,
            "X-Chat-Id": "my-session",
        },
    ) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            result = await session.call_tool(
                "bash_tool",
                {"command": "echo Hello", "description": "test"},
            )
            print(result)

asyncio.run(main())

Managed Yambr

For managed Yambr, the same MCP endpoint is public at https://api.yambr.com/mcp/computer_use with Authorization: Bearer sk-yambr-.... Use the exact same initialize / tools/list / tools/call / resources/* calls shown above; only the URL and the Bearer token change. Your LLM traffic stays on your own provider (OpenAI, Anthropic, or any other); Yambr only provides the Computer Use tools. See LiteLLM gateway and Access model.

See also