
OpenClaw

OpenClaw is an open-source personal AI assistant that supports WhatsApp, Discord, Telegram, and more. Configure Kouri Ai models to power your OpenClaw instance with high-quality AI at low cost.

About OpenClaw

OpenClaw is a popular open-source personal AI assistant that runs on your own devices and connects to messaging platforms like WhatsApp, Discord, Telegram, WeCom, Feishu, and more, enabling automated AI conversations.

OpenClaw supports multiple model providers. Kouri Ai is compatible with the OpenAI API format and can be directly integrated as a custom provider.

Deploy OpenClaw

Server deployment is recommended for this project. Our partner Rainyun supports one-click deployment: register via our exclusive link https://www.rainyun.com/kouri_ and receive a coupon after binding your WeChat account.

You can also follow the commands below to deploy manually.

System Requirements: Node.js 22 or higher

macOS / Linux:

curl -fsSL https://clawd.org.cn/install.sh | bash

Windows (PowerShell):

iwr -useb https://clawd.org.cn/install.ps1 | iex

After installation, run the onboarding wizard to complete the initial setup:

openclaw-cn onboard --install-daemon

During the model configuration step of the onboarding wizard, you can also choose a custom OpenAI-compatible provider and enter our endpoint (https://api.kourichat.com/v1), a model (recommended: glm-5, kimi-k2.5, minimax-m2.5, glm-4.7, or longcat-flash-chat), and your API token to complete the setup.
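Before completing onboarding, you can sanity-check the endpoint and token with a direct chat-completions call. The sketch below uses only the Python standard library; the API key is a placeholder, and the actual network call is left commented out:

```python
import json
import urllib.request

BASE_URL = "https://api.kourichat.com/v1"   # Kouri Ai OpenAI-compatible endpoint
API_KEY = "sk-kouri-your-api-key-here"      # placeholder: replace with your token

# Minimal OpenAI-style chat-completions payload.
payload = {
    "model": "glm-5",  # any model from the recommended list
    "messages": [{"role": "user", "content": "Hello!"}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

print(request.full_url)
# Uncomment to actually send the request once a real key is set:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If the commented-out request returns a normal chat response, the same endpoint, model, and token values will work in the onboarding wizard.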

Verify the gateway status:

openclaw-cn gateway status

Open the dashboard:

openclaw-cn dashboard

Then visit http://127.0.0.1:18789/ in your browser.

For more deployment details, refer to the OpenClaw Deployment Documentation.

Configure Kouri Ai Models

Via Configuration File

In the OpenClaw configuration file ~/.openclaw/openclaw.json, add the following models configuration:

{
  "models": {
    "mode": "merge",
    "providers": {
      "kouri": {
        "baseUrl": "https://api.kourichat.com/v1",
        "apiKey": "sk-kouri-your-api-key-here",
        "api": "openai-completions",
        "models": [
          {
            "id": "longcat-flash-chat",
            "name": "longcat-flash-chat",
            "reasoning": false,
            "input": ["text"],
            "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
            "contextWindow": 128000,
            "maxTokens": 4096
          },
          {
            "id": "glm-5",
            "name": "glm-5",
            "reasoning": false,
            "input": ["text"],
            "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
            "contextWindow": 128000,
            "maxTokens": 4096
          },
          {
            "id": "kimi-k2.5",
            "name": "kimi-k2.5",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
            "contextWindow": 262144,
            "maxTokens": 32768
          },
          {
            "id": "minimax-m2.5",
            "name": "minimax-m2.5",
            "reasoning": false,
            "input": ["text"],
            "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
            "contextWindow": 196608,
            "maxTokens": 32768
          },
          {
            "id": "glm-4.7",
            "name": "glm-4.7",
            "reasoning": false,
            "input": ["text"],
            "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
            "contextWindow": 202752,
            "maxTokens": 16384
          }
        ]
      }
    }
  }
}

Replace sk-kouri-your-api-key-here with your actual Kouri Ai API token. If you don't have one yet, visit the Kouri Ai Console to get one.
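Hand-editing openclaw.json makes it easy to leave the file malformed or to forget the placeholder key. A quick structural check can catch that before restarting the gateway. This is a minimal sketch in standard-library Python; the checks mirror the fields shown in the snippet above and are assumptions, not OpenClaw's actual schema:

```python
import json

def check_kouri_provider(config_text: str) -> list[str]:
    """Return a list of problems found in a models-config snippet."""
    problems = []
    config = json.loads(config_text)  # raises ValueError on malformed JSON
    provider = config.get("models", {}).get("providers", {}).get("kouri", {})
    for key in ("baseUrl", "apiKey", "api", "models"):
        if key not in provider:
            problems.append(f"missing key: {key}")
    if provider.get("apiKey", "").endswith("your-api-key-here"):
        problems.append("apiKey is still the placeholder")
    for model in provider.get("models", []):
        if model.get("contextWindow", 0) < model.get("maxTokens", 0):
            problems.append(f"{model.get('id')}: maxTokens exceeds contextWindow")
    return problems

# Example: the placeholder key from the snippet above should be flagged.
snippet = """{"models": {"providers": {"kouri": {
    "baseUrl": "https://api.kourichat.com/v1",
    "apiKey": "sk-kouri-your-api-key-here",
    "api": "openai-completions",
    "models": [{"id": "glm-5", "contextWindow": 128000, "maxTokens": 4096}]
}}}}"""
print(check_kouri_provider(snippet))  # → ['apiKey is still the placeholder']
```

Running it against the real file (for example with `json.loads(open(path).read())` on ~/.openclaw/openclaw.json) gives the same checks before the gateway ever sees the config.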

Model Details

| Model | Context Window | Max Output | Image Support | Description |
| --- | --- | --- | --- | --- |
| longcat-flash-chat | 128K | 4096 | No | Fast chat model |
| glm-5 | 128K | 4096 | No | Zhipu GLM-5 |
| kimi-k2.5 | 256K | 32768 | Yes | Kimi multimodal model with ultra-long context |
| minimax-m2.5 | 192K | 32768 | No | MiniMax large language model |
| glm-4.7 | 200K | 16384 | No | Zhipu GLM-4.7 |
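When routing different chats to different models, the figures above boil down to a small lookup table. The sketch below mirrors the numbers from the configuration; the `pick_model` helper is hypothetical, not part of OpenClaw:

```python
# Context-window and output limits from the configuration above (tokens).
MODELS = {
    "longcat-flash-chat": {"context": 128_000, "max_output": 4_096,  "image": False},
    "glm-5":              {"context": 128_000, "max_output": 4_096,  "image": False},
    "kimi-k2.5":          {"context": 262_144, "max_output": 32_768, "image": True},
    "minimax-m2.5":       {"context": 196_608, "max_output": 32_768, "image": False},
    "glm-4.7":            {"context": 202_752, "max_output": 16_384, "image": False},
}

def pick_model(prompt_tokens: int, needs_image: bool = False) -> str:
    """Pick the smallest-context model that still fits the prompt (hypothetical helper)."""
    candidates = [
        (spec["context"], name)
        for name, spec in MODELS.items()
        if spec["context"] >= prompt_tokens and (spec["image"] or not needs_image)
    ]
    if not candidates:
        raise ValueError("prompt too large for every configured model")
    return min(candidates)[1]

print(pick_model(100_000))                    # any 128K model fits
print(pick_model(200_000))                    # needs at least ~200K of context
print(pick_model(50_000, needs_image=True))   # only kimi-k2.5 accepts images
```

Preferring the smallest sufficient context window keeps short chats on the faster models while reserving kimi-k2.5 and glm-4.7 for long transcripts and image input.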
