LLM router

LLM router is an OpenClaw plugin that classifies requests and routes them to the most suitable model through a local OpenAI-compatible proxy.

# Overview

This guide is specifically for OpenClaw users. The plugin classifies requests when needed and forwards them upstream to Rungate. It also handles x402 payment retries when the selected upstream model returns 402.

  • Local OpenAI-compatible proxy
  • Automatic request classification and model routing
  • One x402 payment retry on upstream 402
  • Wallet introspection via /wallet

# Installation

For production OpenClaw installs, use the published package:

openclaw plugins install @rungate/llmrouter
openclaw gateway restart

# Models

Pass one of the following logical model names as the model field in your request.

  • llmrouter/auto — classifies the prompt and picks the best model automatically
  • llmrouter/simple — routes directly to a fast, low-cost model
  • llmrouter/coding — routes directly to a code-optimized model
  • llmrouter/reasoning — routes directly to a reasoning-optimized model

llmrouter/auto is the recommended default. Use the others to skip classification when you already know the task type.
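Because the proxy is OpenAI-compatible, a request is just a standard chat-completions body with a logical model name. The sketch below only builds that body; the base URL and port of your local proxy are not specified in this guide, so check your own configuration before sending it.

```python
def build_chat_request(model: str, prompt: str) -> dict:
    """Return an OpenAI-compatible /v1/chat/completions request body."""
    return {
        # Any of the logical names works here, e.g. "llmrouter/auto",
        # "llmrouter/simple", "llmrouter/coding", "llmrouter/reasoning".
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_chat_request("llmrouter/auto", "Summarize this changelog.")
```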

# Wallet

The paying wallet belongs to the running proxy instance, not to the upstream service. Fund it with USDC on Base; the wallet address is returned by GET /wallet. The wallet resolution order is:

  • Use LLM_ROUTER_WALLET_KEY if set
  • Otherwise use ~/.openclaw/llmrouter/wallet.key
  • Otherwise generate a new wallet and save it
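The resolution order above can be sketched as a small function. This is an illustration, not the plugin's actual code: generate_wallet_key is a hypothetical stand-in for real EVM key generation, and the key-file format is assumed to be a single hex string.

```python
import os
from pathlib import Path

def generate_wallet_key() -> str:
    # Placeholder: a real implementation would derive a proper EVM private key.
    return "0x" + os.urandom(32).hex()

def resolve_wallet_key(env: dict, key_path: Path) -> str:
    # 1. An explicit environment variable wins.
    if env.get("LLM_ROUTER_WALLET_KEY"):
        return env["LLM_ROUTER_WALLET_KEY"]
    # 2. Otherwise fall back to the on-disk key file.
    if key_path.exists():
        return key_path.read_text().strip()
    # 3. Otherwise generate a new key and persist it for next time.
    key = generate_wallet_key()
    key_path.parent.mkdir(parents=True, exist_ok=True)
    key_path.write_text(key)
    return key
```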

You can inspect the active wallet through GET /wallet or the native /wallet command in OpenClaw.

# Request flow

When using llmrouter/auto, the proxy classifies the latest user message into one of four categories — simple, coding, reasoning, or vision — and maps it to a concrete upstream model. If you pass a specific logical model like llmrouter/coding, classification is skipped entirely.
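The two paths (classify-then-map vs. direct mapping) amount to a small routing table. The category names below come from this guide, but the concrete upstream model names are made-up placeholders, not the real mapping.

```python
# Category -> concrete upstream model (placeholder names).
ROUTES = {
    "simple": "example/fast-small",
    "coding": "example/code-large",
    "reasoning": "example/reasoner",
    "vision": "example/multimodal",
}

# Direct logical models map straight to a category, skipping classification.
LOGICAL_MODELS = {
    "llmrouter/simple": "simple",
    "llmrouter/coding": "coding",
    "llmrouter/reasoning": "reasoning",
}

def resolve_model(requested: str, classify) -> str:
    if requested == "llmrouter/auto":
        # classify() inspects the latest user message and returns a category.
        return ROUTES[classify()]
    return ROUTES[LOGICAL_MODELS[requested]]
```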

The proxy only intervenes if the upstream returns 402. In that case it:

  • loads the local paying wallet
  • signs and pays the x402 challenge
  • retries the same request once

If the upstream returns a retryable error (404, 429, 5xx), the proxy falls back to the next model candidate for that category before giving up.
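Putting the 402 handling and the fallback behavior together, the error loop can be sketched as below. This assumes send(model, request) returns a (status, body) pair and pay_x402(challenge) settles the payment; both are hypothetical stand-ins for the proxy's internals.

```python
# Statuses that trigger fallback to the next candidate model.
RETRYABLE = {404, 429} | set(range(500, 600))

def route_with_fallback(candidates, request, send, pay_x402):
    last = None
    for model in candidates:
        status, body = send(model, request)
        if status == 402:
            # Pay the x402 challenge, then retry the same request exactly once.
            pay_x402(body)
            status, body = send(model, request)
        if status not in RETRYABLE:
            return status, body
        # Retryable error: remember it and move on to the next candidate.
        last = (status, body)
    return last
```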