n8n’s MCP Tool node speaks MCP over Streamable HTTP, so Computer Use plugs in as one more tool provider.
Config (managed Yambr)
| Field | Value |
|---|---|
| URL | https://api.yambr.com/mcp/computer_use |
| Auth | Authorization: Bearer sk-yambr-... |
| Headers | X-Chat-Id: {{ $execution.id }} |
| Optional headers | X-User-Email, X-MCP-Servers |
Get an API key at app.yambr.com.
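To sanity-check the endpoint and headers outside n8n, you can assemble the same request by hand. A minimal sketch (the key and chat id below are placeholders; the JSON-RPC `tools/list` call is the standard MCP method for enumerating tools over Streamable HTTP):

```python
import json
import urllib.request

# Placeholder credentials -- substitute your real key and a per-run chat id.
API_KEY = "sk-yambr-example"
CHAT_ID = "exec-42"

# JSON-RPC body the MCP Tool node would POST over Streamable HTTP.
payload = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode()

req = urllib.request.Request(
    "https://api.yambr.com/mcp/computer_use",
    data=payload,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "X-Chat-Id": CHAT_ID,
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    },
)
# urllib.request.urlopen(req) would actually send it; omitted so the
# sketch stays offline-runnable.
```

Note that `urllib` normalizes header keys to capitalized form internally, so `req.get_header("X-chat-id")` is how you read the value back.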
Config (self-hosted)
| Field | Value |
|---|---|
| URL | http://computer-use-server:8081/mcp (same Docker network) or https://cu.example.com/mcp (public) |
| Auth | Authorization: Bearer <MCP_API_KEY> |
| Headers | X-Chat-Id: {{ $execution.id }} |
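For the same-Docker-network case, a compose sketch shows the shape of the setup. This is an illustration only: the `yambr/computer-use` image name and the shared `mcp` network are assumptions, not the project's published compose file.

```yaml
# Sketch -- image name for the Computer Use server is hypothetical.
services:
  n8n:
    image: n8nio/n8n
    ports: ["5678:5678"]
    networks: [mcp]
  computer-use-server:
    image: yambr/computer-use   # hypothetical image name
    environment:
      - MCP_API_KEY=change-me   # must match the Bearer token in n8n
    expose: ["8081"]
    networks: [mcp]
networks:
  mcp: {}
```

With both services on one network, the MCP Tool node reaches the server at `http://computer-use-server:8081/mcp` without exposing it publicly.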
Pattern: AI-assisted pipeline
A useful pattern is to wire Computer Use into an AI agent node:
- Trigger — webhook / schedule / Telegram message.
- AI Agent — any chat model, configured to use your own OpenAI / Anthropic credentials.
- MCP Tool — points at Yambr (or your self-hosted server). Gives the agent bash, file I/O, browser, skills.
- Post-process — forward artifacts to email / Slack / DB.
A per-execution X-Chat-Id gives each run its own sandbox. For long-lived agents, reuse one chat id across runs so the sandbox's state persists.
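The two lifetimes differ only in the header expression. A sketch of both (the `-agent` suffix is an arbitrary label; `$workflow.id` is n8n's built-in workflow identifier):

```
Per-run sandbox (fresh on every execution):
  X-Chat-Id: {{ $execution.id }}

Persistent sandbox (state survives across runs):
  X-Chat-Id: {{ $workflow.id }}-agent
```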
The n8n AI Agent node’s model credential is your OpenAI / Anthropic / Google key. Yambr stays a tool provider — we don’t take your LLM traffic.
| Header | Required | Purpose |
|---|---|---|
| Authorization | If MCP_API_KEY is set | `Bearer <token>` |
| X-Chat-Id | Yes | One sandbox per id |
| X-User-Email | No | Per-user skills |
| X-MCP-Servers | No | Comma-separated MCP servers passed to sub_agent |
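The header rules above can be sketched as a small helper. This is an illustrative function, not part of any SDK; the names are hypothetical:

```python
def build_headers(api_key, chat_id, user_email=None, mcp_servers=None):
    """Assemble the MCP request headers per the table above."""
    headers = {"X-Chat-Id": chat_id}          # always required
    if api_key:                               # only when MCP_API_KEY is set server-side
        headers["Authorization"] = f"Bearer {api_key}"
    if user_email:                            # optional: per-user skills
        headers["X-User-Email"] = user_email
    if mcp_servers:                           # optional: servers for sub_agent
        headers["X-MCP-Servers"] = ",".join(mcp_servers)
    return headers
```

For example, `build_headers("sk-test", "chat-1", mcp_servers=["linear", "notion"])` yields the Authorization, X-Chat-Id, and a comma-joined X-MCP-Servers header, while omitting X-User-Email.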