Updated 2025-02-23

Products

OpenClaw

Open-source personal AI assistant that connects to WhatsApp, Telegram, Discord, Slack, and more. Runs 24/7. Choose cloud AI or run locally with Ollama.

  • Entry: Free
  • Data: Flexible
  • Best for: Multi-channel, self-host option

Use cases

  • Task automation (email, calendar, flights)
  • Multi-channel assistant (WhatsApp, Telegram, Discord, Slack)
  • Code execution and Codex integration
  • Persistent memory across sessions
  • 5,700+ ClawHub skills
  • Remote control and automation

Deployment options

OpenClaw can run with cloud AI or with Ollama on your own equipment. Choose based on cost, data sensitivity, and control.

Cloud AI

Connect to Anthropic, OpenAI, or other cloud APIs

Pricing: OpenClaw is free (MIT-licensed). You pay only for the cloud API you connect to (Anthropic, OpenAI, etc.); per-token pricing applies. Use our Cost Calculator to estimate.
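Per-token billing is easy to estimate by hand. The sketch below shows the arithmetic behind a cost calculator; the prices and token volumes are illustrative placeholders, not current rates for any provider.

```python
# Rough monthly cost estimate for per-token cloud-API billing.
# All prices here are illustrative assumptions -- check your
# provider's pricing page for real rates.

def monthly_cost(input_tokens: int, output_tokens: int,
                 in_price_per_m: float, out_price_per_m: float) -> float:
    """Estimated monthly spend for per-token billing (same currency as prices)."""
    return (input_tokens / 1_000_000) * in_price_per_m \
         + (output_tokens / 1_000_000) * out_price_per_m

# Example: 2M input / 0.5M output tokens per month
# at hypothetical rates of $3 / $15 per 1M tokens
print(round(monthly_cost(2_000_000, 500_000, 3.00, 15.00), 2))  # 13.5
```

Input and output tokens are priced separately because most providers charge more for generated output than for prompt input.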

Pros:

  • Best model quality (Claude, GPT-4)
  • No hardware to manage
  • Fast to get started
  • Same agent features: memory, skills, multi-channel

Cons:

  • API costs scale with usage
  • Data sent to cloud provider
  • Ongoing per-token bills

Data handling: Data flows to your chosen cloud provider. Check provider terms (Anthropic, OpenAI) for training and residency.

Ollama (on your equipment)

Run models locally with Ollama. No API bills.

Pricing: OpenClaw is free. Ollama is free. Cost is hardware (GPU) and power. See our Self-hosted GPU Comparisons for £/1M token estimates.
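For self-hosting, the equivalent of a per-token price is amortised hardware plus electricity divided by lifetime token throughput. The sketch below shows that back-of-envelope calculation; every number in the example (GPU price, lifespan, power draw, throughput, duty cycle) is an assumption to replace with your own figures, not a benchmark.

```python
# Back-of-envelope GBP-per-1M-token estimate for a self-hosted GPU.
# All inputs are illustrative assumptions, not measured benchmarks.

def cost_per_million_tokens(hw_cost_gbp: float, lifespan_years: float,
                            watts: float, price_per_kwh: float,
                            tokens_per_sec: float,
                            utilisation: float = 0.25) -> float:
    """Amortised hardware + electricity cost per 1M generated tokens."""
    hours = lifespan_years * 365 * 24 * utilisation        # active hours over lifespan
    tokens = tokens_per_sec * 3600 * hours                 # lifetime token output
    energy_cost = (watts / 1000) * hours * price_per_kwh   # electricity bill
    return (hw_cost_gbp + energy_cost) / (tokens / 1_000_000)

# Example: GBP 1,500 GPU, 3-year life, 300 W, GBP 0.30/kWh,
# 40 tokens/s, running 25% of the time
print(round(cost_per_million_tokens(1500, 3, 300, 0.30, 40), 2))
```

Note how sensitive the result is to utilisation: a GPU that sits idle most of the day amortises its purchase price over far fewer tokens, which is why the cloud/self-host break-even depends heavily on your usage pattern.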

Pros:

  • No per-token costs
  • Data never leaves your machine
  • Offline capable
  • Full control
  • Same 5,700+ ClawHub skills

Cons:

  • Upfront hardware cost
  • You manage updates and security
  • Model quality depends on GPU
  • Requires 5GB+ VRAM (8B) to 40GB+ (70B)
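The VRAM figures above follow from a simple rule of thumb: weight memory is parameter count times bits per weight, plus overhead for the KV cache and runtime buffers. The sketch below applies that rule; the 4-bit quantisation and ~20% overhead are assumptions, and real usage varies with context length and quantisation format.

```python
# Rule-of-thumb VRAM estimate for a quantised local model.
# The 4-bit default and 20% overhead factor are assumptions;
# actual usage depends on context length and quant format.

def vram_gb(params_billions: float, bits_per_weight: int = 4,
            overhead: float = 1.2) -> float:
    """Approximate VRAM in GB: quantised weights plus ~20% overhead."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

print(round(vram_gb(8), 1))    # roughly 4.8 GB for an 8B model at 4-bit
print(round(vram_gb(70), 1))   # roughly 42.0 GB for a 70B model at 4-bit
```

These estimates line up with the ranges quoted above: an 8B model fits comfortably in 5GB+ of VRAM, while a 70B model needs 40GB+ (often split across multiple GPUs).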

Data handling: All data stays on your hardware. Complete data sovereignty.

Business fit

Pros

  • Open source
  • Self-hostable or cloud
  • One assistant for many channels
  • Extensible
  • 24/7 availability

Cons

  • Requires setup (Node 22+, config)
  • Cloud mode: API costs
  • Self-hosted: hardware + maintenance

When it makes sense: choose cloud when you want the best models without managing hardware; choose Ollama when you want zero API bills, data sovereignty, or offline use.