Claude-Compatible Proxy

Most developers know how to call Claude. The harder question is integration: how do you bring Claude into an existing project without rewriting your SDK layer, can you keep your current OpenAI SDK, and how do you unify Claude Code with your backend calls?

To bring Claude into a unified gateway, use api.llapi.net. One key and one compatible endpoint put Claude, Claude Code, and later GPT and Gemini into the same workflow.

TL;DR

  • The goal isn't to replace Claude; it's to route it through a unified entry point
  • OpenAI SDK projects: change only apiKey, baseURL, and model
  • Claude Code: set ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY

Calling Claude via OpenAI SDK

If your project already uses the OpenAI SDK, this is the lowest-cost path. Change only three things:

TypeScript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.LLAPI_API_KEY,    // ← your LingluoAPI key
  baseURL: "https://api.llapi.net/v1"   // ← the LingluoAPI endpoint
});

const completion = await client.chat.completions.create({
  model: "claude-sonnet",               // ← a Claude model name
  messages: [
    { role: "system", content: "You are a concise engineering assistant." },
    { role: "user", content: "Explain why a proxy layer reduces migration cost." }
  ]
});

console.log(completion.choices[0]?.message?.content);

Your business-layer message structure and call patterns stay identical; only the entry point and model name change.
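
One way to make that property durable is to resolve the endpoint and model from configuration in a single place, so business code never hardcodes either. A minimal sketch in Python; the variable names `LLAPI_BASE_URL` and `LLAPI_MODEL` are illustrative (only `LLAPI_API_KEY` appears in the examples here):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class LLMConfig:
    """Single source of truth for the gateway entry point and model."""
    api_key: str
    base_url: str
    model: str


def load_config() -> LLMConfig:
    # Hypothetical env var names for the endpoint and model; adjust to taste.
    return LLMConfig(
        api_key=os.environ["LLAPI_API_KEY"],
        base_url=os.environ.get("LLAPI_BASE_URL", "https://api.llapi.net/v1"),
        model=os.environ.get("LLAPI_MODEL", "claude-sonnet"),
    )
```

Business code asks `load_config()` for the model name, so switching from `claude-sonnet` to another model becomes a deployment-time change rather than a code change.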

Native Anthropic SDK

If you prefer the native Anthropic SDK and want to call Claude's API format directly, just swap the base_url:

Python
import anthropic

client = anthropic.Anthropic(
    api_key="your-llapi-key",
    base_url="https://api.llapi.net"    # Note: no /v1 suffix here
)

message = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, Claude"}]
)
print(message.content)
TypeScript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: process.env.LLAPI_API_KEY,
  baseURL: "https://api.llapi.net"
});

const msg = await client.messages.create({
  model: "claude-sonnet-4-5",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello, Claude" }]
});

console.log(msg.content);
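
In both snippets above, `message.content` is a list of content blocks rather than a plain string, so printing it shows block objects. Production code usually joins the text blocks; a small helper sketch, assuming blocks expose `type` and `text` attributes as the SDK's text blocks do:

```python
from typing import Any, Iterable


def content_text(blocks: Iterable[Any]) -> str:
    """Concatenate the text of all text-type content blocks in a response."""
    parts = []
    for block in blocks:
        # Responses can mix block types (e.g. text, tool_use);
        # only text blocks carry a .text payload.
        if getattr(block, "type", None) == "text":
            parts.append(block.text)
    return "".join(parts)
```

With this, `print(content_text(message.content))` prints the reply as one plain string.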

Claude Code

Using Claude Code? Set two environment variables to route all Claude Code requests through LingluoAPI, unifying it with your other API calls:

bash
export ANTHROPIC_API_KEY="your-llapi-key"
export ANTHROPIC_BASE_URL="https://api.llapi.net"

# Permanent (write to shell config)
echo 'export ANTHROPIC_API_KEY="your-llapi-key"' >> ~/.zshrc
echo 'export ANTHROPIC_BASE_URL="https://api.llapi.net"' >> ~/.zshrc
source ~/.zshrc

With this setup, Claude Code and your backend services share the same key and unified entry point. Whether you add Codex, Gemini, or other models next, the foundation is already in place.
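
Your backend can reuse the very same two variables, so rotating the key in one place updates Claude Code and backend services together. A sketch of that resolution, assuming the fallback when ANTHROPIC_BASE_URL is unset is the official endpoint (which matches the Anthropic SDK's default behavior):

```python
import os

OFFICIAL_BASE_URL = "https://api.anthropic.com"


def anthropic_settings() -> tuple[str, str]:
    """Resolve key and base URL from the same env vars Claude Code uses,
    falling back to the official endpoint when ANTHROPIC_BASE_URL is unset."""
    key = os.environ["ANTHROPIC_API_KEY"]
    base_url = os.environ.get("ANTHROPIC_BASE_URL", OFFICIAL_BASE_URL)
    return key, base_url
```

Pass the result to `anthropic.Anthropic(api_key=key, base_url=base_url)` in the backend, and both consumers stay in sync through a single shell export.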

Python Streaming

Streaming is supported, and the syntax is identical to the official SDK:

Python
import anthropic

client = anthropic.Anthropic(
    api_key="your-llapi-key",
    base_url="https://api.llapi.net"
)

with client.messages.stream(
    model="claude-sonnet-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku"}]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)

Who this is for

Scenario                              Why it fits
OpenAI SDK project adding Claude      Keep the SDK, change only endpoint and model
Claude Code + backend services        Unified key management, simpler config
Testing multiple Claude versions      Switch model in config, not in business code
Centralizing AI call monitoring       Single entry point simplifies logging and alerts
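
The monitoring row is easy to sketch: with a single entry point, one thin wrapper around the call site logs every request uniformly, regardless of which SDK makes it. A minimal example with a stubbed call function (the names are illustrative; the wrapper, not the gateway, does the logging here):

```python
import logging
import time
from typing import Callable

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("llm-gateway")


def logged_call(model: str, prompt: str, call: Callable[[str, str], str]) -> str:
    """Wrap any client call so every request through the gateway is logged."""
    start = time.perf_counter()
    try:
        reply = call(model, prompt)
        log.info("model=%s latency=%.3fs ok", model, time.perf_counter() - start)
        return reply
    except Exception:
        log.exception("model=%s failed", model)
        raise
```

In practice `call` would be a closure over your OpenAI or Anthropic client; because every service funnels through the same wrapper and endpoint, logs and alerts stay uniform.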

FAQ

Does this conflict with the official Claude API?

No. Many teams run both: daily workloads go through the unified gateway, while modules requiring platform-specific capabilities connect directly to the official API.

Does Claude Code have to use the official endpoint?

No. By setting ANTHROPIC_BASE_URL, Claude Code routes through LingluoAPI alongside your other services, with no need to maintain two separate sets of keys.

What's the difference between calling Claude via the OpenAI SDK vs. the native Anthropic SDK?

They use different API formats. The OpenAI SDK uses /v1/chat/completions; the native Anthropic SDK uses /v1/messages. For existing OpenAI SDK codebases, stick with the OpenAI SDK for the lowest migration cost. For new projects, either works.
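
The format difference is easiest to see side by side. Both payloads below ask the same question; note that the Messages API requires max_tokens and takes the system prompt as a top-level field, while Chat Completions folds it into the messages array:

```python
# /v1/chat/completions (OpenAI SDK) request body
chat_completions_body = {
    "model": "claude-sonnet",
    "messages": [
        {"role": "system", "content": "You are concise."},
        {"role": "user", "content": "Hello, Claude"},
    ],
}

# /v1/messages (Anthropic SDK) request body
messages_body = {
    "model": "claude-sonnet-4-5",
    "max_tokens": 1024,            # required by the Messages API
    "system": "You are concise.",  # system prompt is a top-level field
    "messages": [
        {"role": "user", "content": "Hello, Claude"},
    ],
}
```

A gateway that exposes both routes simply translates between these two shapes, which is why switching SDKs does not require changing providers.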

← Quickstart    FAQ →