Self-hosted AI platform showcase

OpenClaw Platform

A full AI agent stack running on a single host — VPS or Mac Mini. Multi-model support, messaging integrations, cron automation, persistent memory, and a skill system — without cloud platform lock-in or per-seat pricing.

OpenClaw is a self-hosted AI agent orchestration platform. It runs on standard Linux or macOS infrastructure, connects to multiple AI model providers, and integrates with Telegram, Slack, and HTTP APIs. All agent behaviour, memory, and configuration lives on infrastructure you control.

Infrastructure

VPS / Mac Mini

Entire platform on a single host — your choice of hardware

Model providers

5+

Anthropic, OpenAI, xAI, Google, local

Platform cost

Token-only

No per-seat or platform subscription fees

Business framing

Why this mattered

Cloud AI platforms charge per seat, limit customisation, and put conversation history on someone else's infrastructure. For teams that want full control over their AI stack — who can use it, which models they use, how conversations are stored, and what the agents can do — the right path is self-hosted. OpenClaw was built to make that path practical.

Observed pain

  • Cloud AI platforms become expensive at team scale and restrict customisation.
  • Conversation history and agent configuration stored on third-party infrastructure is a compliance and privacy concern for some organisations.
  • The perception that self-hosted means limited, when in practice the platform supports the same frontier models as any cloud product.

Guided walkthrough

Each block shows the business reason, the system move, and the operational implication.

Slide 01

Full AI stack on infrastructure you own

OpenClaw runs as a systemd service on Linux or a launchd agent on macOS. The gateway manages agent sessions, routes requests to model providers, and handles all integrations. Deploying a new agent or changing model configuration takes seconds and requires no cloud console access.

  • Runs on any Ubuntu / Debian VPS or a Mac Mini — no specialised hardware
  • Configuration managed via CLI and JSON files under version control
  • Gateway restarts automatically on failure; all state persisted locally
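The deployment described above can be sketched as a systemd user unit. This is an illustrative sketch only: the unit name, paths, and start command are assumptions, not the platform's shipped file.

```ini
# ~/.config/systemd/user/openclaw.service — illustrative sketch; paths and
# the start command are assumptions, not the platform's actual unit file.
[Unit]
Description=OpenClaw gateway
After=network-online.target

[Service]
# Hypothetical install location and entry point
WorkingDirectory=%h/openclaw
ExecStart=/usr/bin/env pnpm start
# Automatic restart on failure, as described above
Restart=on-failure
RestartSec=5
# Keep secrets out of the unit file; load them from the local .env
EnvironmentFile=%h/openclaw/.env

[Install]
WantedBy=default.target
```

Enabled with `systemctl --user enable --now openclaw`; `Restart=on-failure` provides the automatic-restart behaviour, and all state stays under the user's home directory.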

Slide 02

Messaging integrations out of the box

The Telegram bot and Slack integration are first-class platform features. Agents are available to users through their preferred messaging app without additional setup. The same agent can respond in Telegram, Slack, and the HTTP API simultaneously, with session context kept separate per channel.

  • Telegram bot — personal and group chat support
  • Slack bot — channel, DM, and thread integration
  • HTTP API — for custom front-ends and automation pipelines
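A minimal sketch of driving an agent through the HTTP API from a custom front-end. The endpoint path, port, and payload field names here are assumptions for illustration, not the platform's documented API; the point is that the request names its channel explicitly, since session context is kept separate per channel.

```python
import json
from urllib import request

# Hypothetical endpoint — the real path and port depend on your deployment.
GATEWAY_URL = "http://localhost:8080/api/chat"

def build_message(agent: str, user: str, text: str) -> dict:
    """Assemble a chat request. Session context is keyed per channel,
    so the payload names its channel explicitly."""
    return {
        "agent": agent,
        "channel": "http",   # Telegram and Slack sessions stay separate
        "user": user,
        "message": text,
    }

def send(payload: dict) -> bytes:
    """POST the request to the gateway and return the raw response body."""
    req = request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return resp.read()

payload = build_message("support-bot", "alice", "Summarise today's tickets")
```

The same `build_message` shape would serve an automation pipeline as easily as a chat front-end; only the `channel` value changes.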

Slide 03

Cron jobs and skills extend the platform

Scheduled cron jobs run agent tasks on a timer — daily summaries, monitoring checks, data pulls, or any task that should happen automatically. The skill system lets custom capabilities be added to agents as modular plugins without changing the core platform.

  • Cron scheduler with per-job model and agent configuration
  • Skills: reusable capability plugins attached to agents
  • Memory system: per-agent and per-user persistent context
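A cron job entry might look like the following sketch. The field names are illustrative assumptions, chosen to mirror the capabilities listed above (per-job model and agent configuration, scheduled delivery); the schedule uses standard five-field cron syntax.

```json
{
  "cron": [
    {
      "name": "daily-summary",
      "schedule": "0 7 * * *",
      "agent": "ops-bot",
      "model": "claude-sonnet",
      "prompt": "Summarise yesterday's monitoring alerts",
      "deliverTo": "slack:#ops"
    }
  ]
}
```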

Workflow anatomy

Each stage is small enough to inspect, yet together they form a coherent system.

01

01 · Deploy

Gateway installed and started on VPS

OpenClaw is installed via pnpm, configured with API keys in a local .env file, and started as a systemd user service. The gateway process manages all agent sessions, the model router, and integration listeners.
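The local .env file might look like the sketch below. The variable names are assumptions, not the platform's actual schema; the point is that every credential — model provider keys and integration tokens alike — lives in one local file on the host, never in a cloud console.

```shell
# .env — illustrative; variable names are assumptions, not the real schema
ANTHROPIC_API_KEY=sk-ant-xxxx
OPENAI_API_KEY=sk-xxxx
TELEGRAM_BOT_TOKEN=123456:xxxx
SLACK_BOT_TOKEN=xoxb-xxxx
```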

02

02 · Configure

Agents defined in openclaw.json

Each agent is defined with a name, system prompt, model preference, tool access level, and enabled integrations. Configuration changes take effect on gateway restart. No code changes are required for most customisations.
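An agent entry in openclaw.json might look like the following sketch. The field names are illustrative assumptions that mirror the attributes listed above — name, system prompt, model preference, tool access level, and enabled integrations — not the platform's exact schema.

```json
{
  "agents": [
    {
      "name": "support-bot",
      "systemPrompt": "You answer customer questions concisely.",
      "model": "claude-sonnet",
      "tools": "read-only",
      "integrations": ["telegram", "slack", "http"]
    }
  ]
}
```

Because this is a plain JSON file, it sits naturally under version control alongside the rest of the configuration.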

03

03 · Connect

Integrations registered with external services

The Telegram bot token and Slack app credentials are added to configuration. The gateway registers webhook listeners and begins accepting messages through both channels, along with the local HTTP API.

04

04 · Run

Agents respond, cron jobs execute, memory accumulates

Live agents respond to messages in real time. Cron jobs run on schedule and post results to configured channels. Session memory builds up over time, giving agents persistent context about users and ongoing tasks.

Business impact

What changed for operations

  • Infrastructure costs scale with token usage only — no per-seat platform fees regardless of how many team members interact with the agents.
  • Full audit trail of every conversation stored locally — compliant with data residency requirements that prohibit sending conversation data to third-party platforms.
  • Operational teams can build and deploy custom agents without waiting for a vendor to support their use case.

Architecture note

Routing logic in plain English

  • systemd service → OpenClaw gateway → integration listeners (Telegram, Slack, HTTP) → session manager → model router → provider API → response → integration delivery
  • All state (sessions, memory, configuration) is local — the platform has no dependency on external databases or cloud storage for core operation.
  • The skill system extends agent capabilities as plugins; cron jobs extend platform automation as scheduled agents.
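The routing chain above can be sketched in a few lines. Class and method names are illustrative, not the platform's internal API; the sketch shows the essential moves — one session per (channel, user) pair, a router mapping each agent's preferred model to a provider call, and all state held locally.

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """Per-channel session context — state lives locally, never in a cloud store."""
    channel: str
    user: str
    history: list = field(default_factory=list)

class ModelRouter:
    """Maps a model name to a provider call; providers are injected, so
    swapping Anthropic for OpenAI or a local model is a config change."""
    def __init__(self, providers: dict):
        self.providers = providers  # model name -> callable

    def route(self, model: str, prompt: str) -> str:
        return self.providers[model](prompt)

class Gateway:
    """Receives messages from integration listeners and returns responses."""
    def __init__(self, router: ModelRouter):
        self.router = router
        self.sessions: dict = {}

    def handle(self, channel: str, user: str, model: str, text: str) -> str:
        # Session manager: one context per (channel, user) pair,
        # so Telegram and Slack conversations never mix.
        key = (channel, user)
        session = self.sessions.setdefault(key, Session(channel, user))
        session.history.append(text)
        # Model router -> provider API -> response -> integration delivery
        return self.router.route(model, text)

# Stub provider standing in for a real model API
gateway = Gateway(ModelRouter({"stub-model": lambda p: f"echo: {p}"}))
reply = gateway.handle("telegram", "alice", "stub-model", "hello")
```

The stub provider echoes its prompt; in a real deployment each callable would wrap a provider SDK, and the gateway's key management would sit behind it.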

Stack in play

Use what the business already has, then make it behave like a coherent system instead of a collection of tabs.

OpenClaw gateway

The core runtime process. Manages agent sessions, model routing, integration listeners, cron scheduler, and the skill execution engine.

Telegram Bot API

First-class integration. Agents respond to private messages and group chat mentions through the Telegram platform.

Slack Bot / Events API

Team workspace integration. The same agents available in Slack channels and DMs, with separate session context from Telegram.

Anthropic / OpenAI / xAI APIs

Model providers. Each agent can specify a preferred model; the gateway routes requests and handles API key management.

Local persistence layer

Session memory, conversation history, and agent configuration stored as versioned files on the host (VPS or Mac Mini). No external database required for basic deployments.

Reusable pattern

The platform behind the agents

OpenClaw is the infrastructure that powers the AI Agent, Voice Intake, and other agent-based showcases on this site. Understanding the platform explains how those agents are deployed, configured, and maintained in production.