Use the `anthropic` backend with a normal [Anthropic API key](https://docs.anthropic.com/en/api/getting-started). Do not route production traffic through unofficial OAuth-token or Claude-Code credential paths; those conflict with Anthropic’s terms and are not documented here.
`docs/user_guide/backends/overview.md`
The LLM Interactive Proxy supports multiple backend providers.
## Supported Backends
Backend IDs are the `type:` values in YAML and the `backend_type` carried on requests. **Core connectors** live in this repository and are always import-registered. **OAuth plugin connectors** ship in the sibling package **`llm-interactive-proxy-oauth-connectors`** and register when you install the optional extra, for example `pip install "llm-interactive-proxy[oauth]"` (see `pyproject.toml` optional dependency `oauth`).
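As a sketch of how the `type:` values are used (the surrounding structure and key names here are illustrative assumptions, not the proxy's documented schema):

```yaml
# Illustrative only: backend IDs from the tables below go in the `type:` field;
# the `backends:` wrapper and other key names are assumptions.
backends:
  openai:
    type: openai              # core connector, API-key auth
    api_key: ${OPENAI_API_KEY}
  claude:
    type: anthropic
    api_key: ${ANTHROPIC_API_KEY}
```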
### Core connectors (this repository)
| Backend ID | Provider | Authentication | Best For |
|------------|----------|----------------|----------|
| `openai` | OpenAI | API Key | Production applications, standard OpenAI models |
| `openai-responses` | OpenAI | API Key | Same credentials as OpenAI; targets `/v1/responses` for structured outputs (see [OpenAI backend](openai.md#openai-responses-backend)) |
| `openai-codex` | OpenAI (ChatGPT / Codex CLI) | Local OAuth token | ChatGPT login instead of an API key |
| `anthropic` | Anthropic | API Key | Claude via the standard Anthropic API |
| `gemini` | Google Gemini | API Key | Metered API usage, production apps |
| `gemini-cli-acp` | Google Gemini (ACP via Gemini CLI) | Local OAuth token | Sub-agents and tooling via Gemini CLI |
| `cursor-cli-acp` | Cursor (ACP via Cursor CLI `agent acp`) | Local Cursor login (`agent login`) | Cursor-hosted models through the official CLI; requires `agent` on PATH or `CURSOR_AGENT_BIN` |
| `gemini-cli-cloud-project` | Google Gemini (GCP) | OAuth + GCP project | Enterprise / team billing on Vertex-style flows |
| `openrouter` | OpenRouter | API Key | Many third-party hosted models behind one API |
| `nvidia` | NVIDIA (NIM / OpenAI-compatible) | API Key (`NVIDIA_API_KEY`) | NVIDIA integrator or self-hosted NIM |
| `zenmux` | ZenMux | API Key | OpenAI-compatible ZenMux router |
| `zai` | ZAI | API Key | Zhipu / Z.ai |
| `zai-coding-plan` | ZAI Coding Plan | API Key | Coding-plan SKU / workflows |
| `kimi-code` | Kimi | API Key | Kimi For Coding (OpenAI-compatible) |
### OAuth plugin connectors (sibling package)

These entry points are defined in the sibling repo’s `pyproject.toml` under `[project.entry-points."llm_proxy_backends"]`. They are **not** present unless the optional package is installed.

| Backend ID | Provider | Authentication | Best For |
|------------|----------|----------------|----------|
For detailed configuration and usage information for each backend, see:
**Core**
- [OpenAI and OpenAI Responses](openai.md) (`openai`, `openai-responses`)
- [OpenAI Codex](openai-codex.md) (`openai-codex`)
- [Anthropic](anthropic.md)
- [Gemini](gemini.md) (API keys, CLI OAuth variants, `gemini-cli-acp`, and `gemini-cli-cloud-project`)
- **Cursor CLI ACP** (`cursor-cli-acp`): same idea as Gemini CLI ACP, but via Cursor’s `agent acp` CLI; install and log in with Cursor’s agent tooling, and ensure `agent` is on `PATH` or set `CURSOR_AGENT_BIN`. There is no separate backend guide page yet.
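The binary lookup described for `cursor-cli-acp` (prefer `CURSOR_AGENT_BIN`, fall back to `agent` on `PATH`) can be sketched as follows; the helper name is hypothetical, only the env var and binary name come from the table above:

```python
# Hypothetical helper mirroring the lookup described above:
# use CURSOR_AGENT_BIN if set, otherwise search PATH for `agent`.
import os
import shutil

def resolve_cursor_agent():
    return os.environ.get("CURSOR_AGENT_BIN") or shutil.which("agent")

print(resolve_cursor_agent())  # None if neither is available
```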