n8n’s MCP Tool node speaks MCP over Streamable HTTP — add Computer Use as one more tool provider.

Config (managed Yambr)

| Field | Value |
| --- | --- |
| URL | `https://api.yambr.com/mcp/computer_use` |
| Auth | `Authorization: Bearer sk-yambr-...` |
| Headers | `X-Chat-Id: {{ $execution.id }}` |
| Optional headers | `X-User-Email`, `X-MCP-Servers` |

Get the key at app.yambr.com.
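For reference, the first thing an MCP client like n8n sends over Streamable HTTP is a JSON-RPC `initialize` call. A minimal Python sketch of that request body plus the headers from the table above — the chat id stands in for n8n's `{{ $execution.id }}` expression, and the protocol version string is an assumption:

```python
import json

# Headers from the config table above; "run-123" stands in for
# the value n8n fills in from {{ $execution.id }}.
headers = {
    "Authorization": "Bearer sk-yambr-...",
    "X-Chat-Id": "run-123",
    "Content-Type": "application/json",
}

# JSON-RPC `initialize`, the opening message of an MCP session over
# Streamable HTTP; the protocolVersion value is an assumption.
initialize = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "n8n", "version": "1.0"},
    },
})
```

n8n performs this handshake itself; the sketch only shows what travels over the wire to the endpoint.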

Config (self-hosted)

| Field | Value |
| --- | --- |
| URL | `http://computer-use-server:8081/mcp` (same Docker network) or `https://cu.example.com/mcp` (public) |
| Auth | `Authorization: Bearer <MCP_API_KEY>` |
| Headers | `X-Chat-Id: {{ $execution.id }}` |

Pattern: AI-assisted pipeline

A useful pattern is to wire Computer Use into an AI agent node:
  1. Trigger — webhook / schedule / Telegram message.
  2. AI Agent — any chat model, configured to use your own OpenAI / Anthropic credentials.
  3. MCP Tool — points at Yambr (or your self-hosted server). Gives the agent bash, file I/O, browser, skills.
  4. Post-process — forward artifacts to email / Slack / DB.
Using the per-execution X-Chat-Id gives each run its own sandbox. For long-lived agents, reuse a chat id across runs so the sandbox's state persists.
The n8n AI Agent node’s model credential is your OpenAI / Anthropic / Google key. Yambr stays a tool provider — we don’t take your LLM traffic.
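The per-run vs. persistent sandbox choice comes down to which value you put in `X-Chat-Id`. A small sketch of the two strategies — the function name and `agent_key` argument are illustrative, not part of any API:

```python
def chat_id_for(execution_id, agent_key=None):
    """Pick the X-Chat-Id value for one n8n run.

    - Default: use the execution id, so each run gets a fresh sandbox.
    - Long-lived agent: pass a stable agent_key so every run lands in
      the same sandbox and its filesystem state persists across runs.
    """
    return agent_key if agent_key else execution_id

fresh = chat_id_for("exec-42")                    # new sandbox per run
sticky = chat_id_for("exec-42", "support-bot")    # shared, persistent sandbox
```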

Headers reference

| Header | Required | Purpose |
| --- | --- | --- |
| `Authorization` | If `MCP_API_KEY` is set | `Bearer <token>` |
| `X-Chat-Id` | Yes | One sandbox per id |
| `X-User-Email` | No | Per-user skills |
| `X-MCP-Servers` | No | Comma-separated MCP servers passed to `sub_agent` |
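The table above can be summarized as: only `X-Chat-Id` is always required, `Authorization` is required only when the server enforces `MCP_API_KEY`, and the rest are optional. A hedged sketch of assembling the full header set (function name and argument shapes are illustrative):

```python
def mcp_headers(chat_id, api_key=None, user_email=None, mcp_servers=None):
    """Build the request headers; only X-Chat-Id is always required."""
    headers = {"X-Chat-Id": chat_id}            # one sandbox per id
    if api_key:                                 # required iff MCP_API_KEY is set server-side
        headers["Authorization"] = f"Bearer {api_key}"
    if user_email:
        headers["X-User-Email"] = user_email    # optional: per-user skills
    if mcp_servers:
        headers["X-MCP-Servers"] = ",".join(mcp_servers)  # passed to sub_agent
    return headers

full = mcp_headers(
    "run-123",
    api_key="sk-yambr-...",
    user_email="dev@example.com",
    mcp_servers=["github", "notion"],
)
minimal = mcp_headers("run-123")  # self-hosted server with no MCP_API_KEY
```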