The technical design of OpenClaw: a gateway that routes messages from 23+ channels to an LLM-powered agent with MCP tool integrations, running in a Docker sandbox.
OpenClaw is structured as three layers. The gateway layer handles inbound messages from 23+ channels (WhatsApp, Slack, Discord, Telegram, web chat) and routes them to the agent. The agent layer processes messages using an LLM, decides on actions, and calls tools through MCP. The sandbox layer isolates the agent in a Docker container with network controls.
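The gateway-to-agent hand-off described above can be sketched as a simple dispatch step. This is a minimal illustration, not OpenClaw's actual code: the names (`Message`, `route`, `agent_handle`, the channel set) are all hypothetical.

```python
# Sketch of the gateway layer's routing logic (illustrative names only).
from dataclasses import dataclass

@dataclass
class Message:
    channel: str   # e.g. "slack", "whatsapp", "telegram"
    sender: str
    text: str

def agent_handle(msg: Message) -> str:
    # Stand-in for the agent layer: the LLM call and MCP tool use
    # would happen here in the real runtime.
    return f"agent reply to {msg.sender}: {msg.text!r}"

SUPPORTED_CHANNELS = {"whatsapp", "slack", "discord", "telegram", "webchat"}

def route(msg: Message) -> str:
    # The gateway validates the channel, then forwards to the agent runtime.
    if msg.channel not in SUPPORTED_CHANNELS:
        raise ValueError(f"unknown channel: {msg.channel}")
    return agent_handle(msg)
```

In the real system this dispatch also covers authentication, rate limiting, and channel-specific protocol handling, but the shape is the same: normalize the inbound message, validate it, forward it to the agent.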
The architecture is designed for self-hosting. A single Docker Compose deployment includes the gateway, agent runtime, and all dependencies. The system prompt (SOUL) defines the agent's personality and behavior.
Clawctl wraps this architecture with managed infrastructure: automated deployment, health monitoring, auto-recovery, encrypted secrets, and audit logging.
Understanding the architecture helps you debug issues, optimize performance, and make informed decisions about security and scaling. It is the foundation for everything OpenClaw does.
Clawctl manages the entire architecture for you. Deploy with one click. Health monitoring watches all three layers. Auto-recovery restarts failed components. You focus on the agent, not the infrastructure.
OpenClaw is open source. Clawctl adds managed infrastructure, security, and operational tooling on top.
Minimum: 1 vCPU, 2GB RAM, Docker. Recommended: 2 vCPU, 4GB RAM for production workloads.
Each agent runs in its own container. Scale by adding more containers on more nodes.
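A per-agent launcher might assemble a locked-down `docker run` invocation along these lines. The flags shown are standard Docker CLI options; the image name, network name, and helper function are placeholders, not OpenClaw's actual configuration.

```python
# Sketch: build a restricted `docker run` command for one agent container.
# "openclaw/agent" and "agent-net" are placeholder names.

def sandbox_command(agent_id: str, image: str = "openclaw/agent") -> list[str]:
    return [
        "docker", "run", "--rm", "--detach",
        "--name", f"agent-{agent_id}",
        "--memory", "2g",                      # cap RAM (matches the 2GB minimum)
        "--cpus", "1",                         # cap CPU (matches the 1 vCPU minimum)
        "--cap-drop", "ALL",                   # drop all Linux capabilities
        "--security-opt", "no-new-privileges", # block privilege escalation
        "--read-only",                         # read-only root filesystem
        "--network", "agent-net",              # attach to a controlled network
        image,
    ]
```

Scaling out is then a matter of running this command once per agent, across as many nodes as needed.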
Agent Gateway
The control plane that routes messages between users and AI agents across multiple channels, managing authentication, rate limiting, and channel-specific protocols.
AI Agent Runtime
The execution environment that hosts an AI agent, managing its lifecycle, tool access, memory, and communication with LLM providers.
Docker Sandbox
A Docker container configured with restricted permissions that isolates an AI agent from the host system and other containers.
MCP Server
A service that exposes tools and data to an AI agent through the Model Context Protocol, enabling standardized integrations with external systems.
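MCP messages are framed as JSON-RPC 2.0, and tool invocations use a `tools/call` request. The toy handler below shows that request/response shape in-process; real MCP servers use an SDK and a transport (stdio or HTTP), and the `add` tool here is purely illustrative.

```python
# Sketch: the shape of an MCP tool call (JSON-RPC 2.0 framing).
# The "add" tool and in-process handler are illustrative only.
import json

def handle(request_json: str) -> str:
    req = json.loads(request_json)
    if req.get("method") == "tools/call":
        params = req["params"]
        if params["name"] == "add":
            args = params["arguments"]
            result = {"content": [{"type": "text",
                                   "text": str(args["a"] + args["b"])}]}
            return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                               "result": result})
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                       "error": {"code": -32601,
                                 "message": "method not found"}})
```

For example, calling `handle` with a `tools/call` request naming the `add` tool and arguments `{"a": 2, "b": 3}` returns a result whose text content is `"5"`; the agent runtime parses that result and feeds it back to the LLM.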