OpenAI-Compatible Proxy

If your project already uses the OpenAI SDK, the real question isn't whether you can keep using it. It's how to route calls through a unified gateway, keep the existing SDK intact, and avoid rewriting anything when you later add Claude or Gemini.

If you don't want to maintain separate integrations for OpenAI, Claude, and Gemini, use api.llapi.net. With one key and one unified baseURL, your existing OpenAI-style project is already compatible, and adding more models later stays cheap.

TL;DR

  • The goal isn't to replace the OpenAI SDK; it's to unify your API entry point
  • For most projects, the lightest migration usually touches only apiKey, baseURL, and model
  • With api.llapi.net, you can keep the OpenAI call style and decide later whether to expand to Claude or Gemini

What is the OpenAI-Compatible Proxy

In short, a unified gateway sits between your OpenAI-style code and the actual upstream models. The goal isn't to hide the models; it's to keep vendor differences at the edge layer, away from every business module in your project.

The value for most teams is straightforward:

  • Keep your well-tested OpenAI SDK, message structures, and error-handling habits
  • No need to maintain separate vendor-specific integration code in your business layer
  • When you add Claude, Gemini, or other models later, the architecture stays clean

When to use this approach

  • You have an existing OpenAI SDK codebase and don't want to rewrite infrastructure just to switch models
  • You're testing GPT, Claude, and Gemini side by side and want model switching to be cheap
  • Multiple projects share one AI layer and you want unified key, endpoint, and config management
  • You want to ship first and decide on deeper model abstractions later

Why keep the OpenAI SDK

The reason is practical. In most production projects, the most battle-tested layer is usually the OpenAI-style call: the SDK is installed, the message structure works, error handling and retry logic have been tuned through many iterations, and plenty of business code already wraps this interface.

The safer path is to keep the OpenAI SDK and move only the parts that actually change into the entry layer. Let LingluoAPI absorb the vendor differences, and decide later whether deeper architectural changes are worth it at all.

Step 1: Get an API Key

Go to api.llapi.net and create an API key, then store it as an environment variable:

bash
# Temporary (current shell only)
export LLAPI_API_KEY="your-api-key"

# Permanent (zsh)
echo 'export LLAPI_API_KEY="your-api-key"' >> ~/.zshrc && source ~/.zshrc
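On the application side, it helps to fail fast at startup when the key is missing, rather than on the first API call. A minimal sketch; requireEnv is our own helper name, not an SDK function:

```typescript
// Read a required variable from an env-like record, throwing a clear
// error at startup instead of failing later on the first request.
export function requireEnv(
  env: Record<string, string | undefined>,
  name: string
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Typical usage would be `const apiKey = requireEnv(process.env, "LLAPI_API_KEY");` before constructing the client.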

Step 2: Point baseURL at the unified gateway

Replace your project's OpenAI endpoint with https://api.llapi.net/v1:

TypeScript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.LLAPI_API_KEY,
  baseURL: "https://api.llapi.net/v1"
});

Step 3: Run a minimal test request

TypeScript
const result = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "system", content: "You are a concise coding assistant." },
    { role: "user", content: "Summarize the latest incident report in five bullets." }
  ]
});
console.log(result.choices[0]?.message?.content);

Confirm that the key is readable, the baseURL has been switched, and the request returns a normal response.
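The optional chaining in the snippet above can be wrapped in one small helper, so business code never touches the raw choices array. A sketch; firstText and the minimal response types are ours, not SDK exports:

```typescript
// Minimal shape of the fields we actually read from a chat completion.
interface ChatChoice {
  message?: { content?: string | null };
}
interface ChatResult {
  choices: ChatChoice[];
}

// Return the first message's text, or a fallback for empty responses.
export function firstText(result: ChatResult, fallback = ""): string {
  return result.choices[0]?.message?.content ?? fallback;
}
```

This keeps the "response might be empty" edge case in one place instead of repeating the optional chaining at every call site.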

Step 4: Keep model switching at the config layer

Many teams adopt a unified gateway not just for today's GPT calls but to make testing Claude, Gemini, or other models cheap later. The safest habit is to keep the model name in config rather than scattering it across business code:

TypeScript
// config.ts: centralize the model name in config
export const MODEL = process.env.LLM_MODEL ?? "gpt-4o";

// To switch models, change a single environment variable:
// LLM_MODEL=claude-sonnet
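The same idea extends to routing different tasks to different models through one config function. A sketch under the assumption of a two-task setup; the task names, env var names, and model ids below are illustrative, so check your gateway's model list for real ids:

```typescript
type Task = "chat" | "summarize";

// Resolve the model for a task from env-style config, falling back to
// one project-wide default. Nothing outside this function names a model.
export function modelFor(
  task: Task,
  env: Record<string, string | undefined>
): string {
  const fallback = env.LLM_MODEL ?? "gpt-4o";
  if (task === "summarize") {
    return env.LLM_MODEL_SUMMARIZE ?? fallback;
  }
  return fallback;
}
```

A call site then reads `model: modelFor("summarize", process.env)`, and an A/B test becomes a matter of changing environment variables per deployment.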

How to verify you're on the unified gateway

  • apiKey comes from LLAPI_API_KEY
  • baseURL is set to https://api.llapi.net/v1
  • Existing OpenAI SDK calls return results without major changes
  • Switching models later doesn't require building a whole new vendor adapter layer

Who this approach is for

  • Existing OpenAI codebase: minimal changes, fastest path to production
  • Multi-model A/B testing: model switching stays in one place
  • Multiple projects sharing one AI layer: unified key, endpoint, and docs
  • Ship first, refactor later: compatible now, extensible later

FAQ

Will this break my existing project?

Usually not. For most projects, the changes are limited to apiKey, baseURL, and model. If your code already makes OpenAI-style calls, the transition is usually smooth.

Do I still need the OpenAI SDK?

In most cases yes, and you should keep it. Many teams choose a unified gateway precisely so they don't have to rewrite the SDK layer.

When is this approach not the right fit?

If you've decided to commit to a single provider long-term and plan to build exclusively around its native capabilities, connecting directly to the official API is a fine choice.
