
Why OpenClaw?

The Problem

  • Each AI provider has different APIs
  • Switching providers means rewriting code
  • API outages break your app
  • No standard way to build agents
  • LangChain is complex and heavy

The Solution

  • One API for 20+ providers
  • Switch providers in 1 line
  • Automatic fallbacks on failure
  • Skills system for reusable agents
  • Lightweight, TypeScript-first
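The "one API, switch in one line" idea can be sketched with a minimal provider-agnostic interface. This is a conceptual illustration only, not OpenClaw's actual API; the `Provider` interface and the stand-in providers below are invented for the example.

```typescript
// Conceptual sketch (not OpenClaw's real API): every provider implements
// the same `complete` signature, so swapping providers is a one-line change.
interface Provider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Two stand-in providers with identical interfaces.
const openai: Provider = {
  name: "openai",
  complete: async (prompt) => `[openai] ${prompt}`,
};
const anthropic: Provider = {
  name: "anthropic",
  complete: async (prompt) => `[anthropic] ${prompt}`,
};

// Switching providers is this one line:
const provider: Provider = anthropic; // was: openai

provider.complete("Hello").then(console.log); // [anthropic] Hello
```

Because every provider satisfies the same interface, the rest of the application never has to know which vendor is behind `provider`.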

Supported Providers

Works with all major AI providers. Route between them seamlessly.

  • 🟢 OpenAI: GPT-5.4, GPT-5.3 Codex
  • 🟠 Anthropic: Claude Opus 4.6, Sonnet 4.6
  • 🔵 Google: Gemini 3.1 Pro
  • 🔴 Zhipu AI: GLM-5
  • 🟣 Meta: Llama 4
  • 🟡 Mistral: Mistral Large 3

Key Features

Multi-Provider Support

Route requests across 20+ providers: GPT-5 for coding, Claude for reasoning, Gemini for long-context tasks, all in one app.

Automatic Fallbacks

If OpenAI goes down, automatically switch to Claude. If Claude hits rate limits, try Gemini. Keep your app running.
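The fallback behavior described above can be sketched as a simple try-in-order chain. This is an illustrative implementation of the pattern, not OpenClaw's internals; the provider objects are mocks invented for the example.

```typescript
// Illustrative fallback chain: try each provider in order; if one
// throws (outage, rate limit), move on to the next.
type Complete = (prompt: string) => Promise<string>;

async function withFallback(
  providers: { name: string; complete: Complete }[],
  prompt: string,
): Promise<string> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      return await p.complete(prompt);
    } catch (err) {
      lastError = err; // this provider failed; fall through to the next
    }
  }
  throw new Error(`All providers failed: ${String(lastError)}`);
}

// Example: the first provider is "down", the second succeeds.
const flaky = {
  name: "openai",
  complete: async () => { throw new Error("503 Service Unavailable"); },
};
const backup = {
  name: "anthropic",
  complete: async (p: string) => `ok: ${p}`,
};

withFallback([flaky, backup], "Hi").then(console.log); // ok: Hi
```

The caller sees a single successful response; the failure of the first provider is invisible to the rest of the app.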

Skills System

Build reusable agent skills. Share them on ClawHub. Install skills from the community with one command.
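A reusable skill can be pictured as a named, self-describing unit of behavior that agents look up from a registry. The `Skill` interface and registry below are hypothetical, invented for illustration; OpenClaw's real skill format and ClawHub packaging may differ.

```typescript
// Hypothetical shape of a reusable skill (illustrative only).
interface Skill {
  name: string;
  description: string;
  run(input: string): Promise<string>;
}

// A trivial skill: uppercase the input.
const shout: Skill = {
  name: "shout",
  description: "Uppercases its input",
  run: async (input) => input.toUpperCase(),
};

// A minimal registry, so agents can resolve skills by name.
const registry = new Map<string, Skill>([[shout.name, shout]]);

registry.get("shout")!.run("hello").then(console.log); // HELLO
```

Because skills are plain named objects, sharing one is just publishing the object and installing it into another app's registry.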

Local Model Support

Run Ollama, LM Studio, or vLLM locally. OpenClaw treats local models as first-class citizens.
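Local runtimes such as Ollama expose an OpenAI-compatible HTTP API (by default at http://localhost:11434/v1), which is what lets a gateway treat a local model like any cloud provider. A sketch of building such a request (the request shape follows the OpenAI chat-completions format; the model name is an example of a locally pulled model):

```typescript
// Build a chat-completions request for a local OpenAI-compatible
// server such as Ollama (default base URL http://localhost:11434/v1).
function buildChatRequest(baseUrl: string, model: string, content: string) {
  return {
    url: `${baseUrl}/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content }],
      }),
    },
  };
}

const req = buildChatRequest("http://localhost:11434/v1", "llama3", "Hello");
console.log(req.url); // http://localhost:11434/v1/chat/completions
// To actually send it: await fetch(req.url, req.options)
```

Since the endpoint and payload match the cloud providers' format, switching from a hosted model to a local one is just a different base URL and model name.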

Quick Start

npm install -g openclaw
openclaw chat

That's it. OpenClaw will guide you through provider setup.

Compare to Alternatives

Feature              OpenClaw         LangChain              Vercel AI SDK
Providers            20+              100+                   10
Setup complexity     Low              High                   Low
Agent framework      Built-in         LangGraph (separate)   No
Fallbacks            Automatic        Manual                 Manual
Skills marketplace   Yes (ClawHub)    No                     No
CLI included         Yes              No                     No