The eight scenarios were a self-contained tour: crawl with one-off prompts, walk with the live browser and sub-agents, then run by packaging a recurring task into a skill. Scenario 8 ended at the moment the skill exists. This page is about what happens after that.

The pivot

Once a skill is built, the next question isn’t “how do I deploy this skill.” Skills aren’t deployed as standalone services and they aren’t HTTP endpoints. They live inside the Computer Use sandbox alongside the browser, terminal, file system, and sub-agents, and the model picks them up during a tool-call session. The right question is: where else can I open a Computer Use session that has this skill available? That’s an MCP question.

Computer Use is an MCP server

api.yambr.com — and any self-hosted Open Computer Use instance — exposes the Computer Use tool over the Model Context Protocol. Anything that speaks MCP connects to the same /mcp endpoint and gets the full sandbox: browser, shell, files, sub-agents, and every skill installed on that instance. The chat at chat.yambr.com is just one MCP client among many. It isn’t special — it’s Open WebUI pointed at /mcp. Anything else you point at the same URL gets the same engine. A few client shapes worth picturing:
  • A workflow tool. n8n has a built-in MCP client node. Add the Yambr endpoint, point a webhook or schedule at it, and the workflow can run a Computer Use session as a step.
  • A messenger. A Telegram or Slack bot configured as an MCP client — no custom backend, just an MCP-aware bot framework — forwards user messages into a Yambr session and posts the artifact back.
  • A desktop chat. Claude Desktop lets you add MCP servers in settings; once you add api.yambr.com/mcp, every conversation has access to the same sandbox the demo course used.
  • An in-house tool. A panel inside your CRM, a CLI for the ops team, an internal app — anything that wires up an MCP client gets the same surface. Integrations → Custom walks through the protocol-level wiring; LiteLLM is the gateway layer if you want unified billing.
The list isn’t exhaustive on purpose. The point is that the answer is always “add an MCP client.” No bespoke integration, no per-product API.
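Every client shape above reduces to the same protocol exchange. As a sketch of what "add an MCP client" means on the wire, here is the JSON-RPC initialize message an MCP client sends first over the streamable-HTTP transport. The endpoint URL, the Bearer auth scheme, and the client name are illustrative assumptions, not Yambr's documented configuration; the message shape follows the MCP specification.

```python
import json

# Illustrative endpoint and key -- substitute your own.
MCP_URL = "https://api.yambr.com/mcp"
API_KEY = "sk-example"

# Headers for MCP's streamable-HTTP transport (per the MCP spec);
# Bearer auth here is an assumption about the gateway, not a documented fact.
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

# The first JSON-RPC message every MCP client sends: initialize.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

wire_body = json.dumps(initialize)
print(json.loads(wire_body)["method"])
```

An off-the-shelf client (n8n's MCP node, Claude Desktop, an MCP-aware bot framework) handles this handshake for you; the point is that it is the same handshake regardless of which client opens the session.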

Skills travel with the instance

The leverage of this model comes from where skills live. A skill is a folder inside the Computer Use sandbox; every session on that instance can see it. If the team adds an invoice-builder skill to a self-hosted instance, every MCP client that talks to that instance — the chat, the n8n workflow, the Telegram bot, the CRM panel — gets it for free, without a redeploy on the client side. The user-facing surface changes. The engine doesn’t. That’s also why the skills overview, creating skills, and dynamic skills pages are worth a second read once you start thinking about multiple clients. The skill you write once is the skill every client invokes.
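The "skill is a folder; every session sees it" model can be illustrated with a tiny filesystem sketch. The directory layout, the `skills/` path, and the `SKILL.md` manifest name are assumptions for illustration only, not the product's actual on-disk convention.

```python
from pathlib import Path
import tempfile

# Stand-in for the sandbox's working directory; real paths will differ.
sandbox = Path(tempfile.mkdtemp())

# Adding one skill folder to the instance...
skill_dir = sandbox / "skills" / "invoice-builder"
skill_dir.mkdir(parents=True)
(skill_dir / "SKILL.md").write_text(
    "# invoice-builder\nBuild a PDF invoice from a CRM record.\n"
)

def list_skills(root: Path) -> list[str]:
    """What any session on this instance sees: every installed skill folder."""
    return sorted(p.name for p in (root / "skills").iterdir() if p.is_dir())

# Two independent MCP clients (chat, n8n, a Telegram bot...) open sessions
# against the same instance and therefore share one skill list:
chat_session_view = list_skills(sandbox)
n8n_session_view = list_skills(sandbox)
print(chat_session_view)  # ['invoice-builder']
```

Nothing was redeployed on either client; the skill became available to both the moment the folder existed on the instance.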

What to do next

If you want to try this:
  1. Get an API key from app.yambr.com — same key works for every MCP client.
  2. Browse Integrations and pick the client closest to where your users already are.
  3. If you’re writing your own client, read Platform → CU endpoint and the MCP API reference first.
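For step 3, the protocol-level wiring after the handshake is a single JSON-RPC `tools/call` message. The tool name `computer_use` and the argument shape below are illustrative assumptions, not Yambr's documented schema; check the CU endpoint and MCP API reference for the real contract.

```python
import json

# After initialize, an MCP client invokes a server tool with tools/call.
# Tool name and arguments here are illustrative, not a documented schema.
call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "computer_use",
        "arguments": {
            "task": "Run the invoice-builder skill for the latest order",
        },
    },
}

wire_call = json.dumps(call)
print(json.loads(wire_call)["method"])  # tools/call
```

Whichever client sends this message, the same sandbox session runs the same skill; only the surface that opened the session differs.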
The rest is the same loop you already know — pick a recurring task, build a skill for it in the chat, then change which MCP client opens the session. The skill stays put. Back to the course overview →