
Commit 11a705a

Author: Mateusz

Remove anthropic-oauth connector and align docs with supported backends

- Drop Anthropic OAuth plugin surface from core: CLI flag, applicator mapping, access-mode validator, resilience personal list, and OAuth discovery tests.
- Remove user-guide and dev-guide references; delete anthropic-oauth.md; add TOS-aligned Anthropic API key guidance in anthropic backend and error_handler.
- Refresh backend overview (core vs oauth plugin), access modes, and configuration/resilience docs; update sample.env example.

Made-with: Cursor

1 parent 471f442 · commit 11a705a

35 files changed: 211 additions & 275 deletions

config/sample.env — 1 addition, 1 deletion

```diff
@@ -68,7 +68,7 @@ DISABLE_INTERACTIVE_COMMANDS=false
 # MEMORY_PROJECT_DISCOVERY_MODE=any
 
 # Resilience scoping overrides (comma-separated backend types)
-# RESILIENCE_PERSONAL_BACKEND_TYPES=anthropic-oauth,openai-codex
+# RESILIENCE_PERSONAL_BACKEND_TYPES=qwen-oauth,openai-codex
 # RESILIENCE_SHARED_BACKEND_TYPES=openai,openrouter
 
 # Tool Call Repair (auto-convert plain-text tool calls)
```
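The resilience scoping variables above are comma-separated lists of backend types. A minimal sketch of how such a value might be parsed (the helper name is hypothetical, not the proxy's actual code):

```python
import os

def parse_backend_types(var_name: str, default: tuple[str, ...] = ()) -> list[str]:
    # Hypothetical helper: split a comma-separated env var into backend type IDs,
    # trimming whitespace and dropping empty entries; fall back to a default
    # list when the variable is unset. Not the proxy's real implementation.
    raw = os.environ.get(var_name, "")
    values = [item.strip() for item in raw.split(",") if item.strip()]
    return values or list(default)

os.environ["RESILIENCE_PERSONAL_BACKEND_TYPES"] = "qwen-oauth, openai-codex"
print(parse_backend_types("RESILIENCE_PERSONAL_BACKEND_TYPES"))
# → ['qwen-oauth', 'openai-codex']
```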

docs/development_guide/code-organization.md — 0 additions, 1 deletion

```diff
@@ -136,7 +136,6 @@ src/connectors/
 ├── openai_codex.py # OpenAI Codex (OAuth) connector
 ├── openai_responses.py # OpenAI Responses API connector
 ├── anthropic.py # [Anthropic](../user_guide/backends/anthropic.md) connector
-├── anthropic_oauth.py # Anthropic OAuth connector
 ├── gemini.py # [Gemini](../user_guide/backends/gemini.md) API key connector
 ├── gemini_oauth_base.py # Base for Gemini OAuth connectors
 ├── gemini_oauth_free.py # Gemini OAuth free tier
```

docs/user_guide/access-modes.md — 4 additions, 6 deletions

````diff
@@ -25,7 +25,7 @@ Access modes prevent common misconfigurations that could:
 
 ### Allowed Behaviors
 
-- **OAuth Connectors**: Full access to OAuth-based connectors (e.g., `gemini-oauth-auto`, `anthropic-oauth`, `qwen-oauth`, `openai-codex`)
+- **OAuth Connectors**: Full access to OAuth-based connectors (e.g., `gemini-oauth-auto`, `qwen-oauth`, `openai-codex`)
 - **Host Binding**: Must bind to `127.0.0.1` only (localhost)
 - **Authentication**: Optional - can be disabled for convenience
 - **OAuth Debugging Flags**: Allowed (e.g., `--enable-gemini-oauth-auto-backend-debugging-override`)
@@ -247,7 +247,7 @@ In Multi User Mode, the following connectors are automatically filtered during s
 
 **By Naming Pattern:**
 - Connectors containing `-oauth-` (e.g., `gemini-oauth-auto`, `gemini-oauth-free`, `gemini-oauth-plan`)
-- Connectors ending with `-oauth` (e.g., `anthropic-oauth`, `qwen-oauth`)
+- Connectors ending with `-oauth` (e.g., `qwen-oauth`)
 
 **By Property:**
 - Connectors with `has_static_credentials = False`
@@ -256,7 +256,6 @@ In Multi User Mode, the following connectors are automatically filtered during s
 - `gemini-oauth-auto`
 - `gemini-oauth-free`
 - `gemini-oauth-plan`
-- `anthropic-oauth`
 - `qwen-oauth`
 - `openai-codex` (uses OAuth via auth.json)
 
@@ -265,11 +264,11 @@ In Multi User Mode, the following connectors are automatically filtered during s
 ```
 # Single User Mode startup logs
 INFO: Starting LLM Proxy in Single User Mode (default)
-DEBUG: Loaded OAuth connectors: gemini-oauth-auto, anthropic-oauth, qwen-oauth, openai-codex
+DEBUG: Loaded OAuth connectors: gemini-oauth-auto, qwen-oauth, openai-codex
 
 # Multi User Mode startup logs
 INFO: Starting LLM Proxy in Multi User Mode
-INFO: Skipped 4 OAuth connectors in Multi User Mode (OAuth not allowed in production)
+INFO: Skipped OAuth connectors in Multi User Mode (OAuth not allowed in production)
 ```
 
 ### Backend Registry Impact
@@ -325,7 +324,6 @@ Replace OAuth backends with API key-based alternatives:
 | OAuth Connector | Static Credential Alternative |
 |----------------|-------------------------------|
 | `gemini-oauth-auto` | `gemini` (requires `GEMINI_API_KEY`) |
-| `anthropic-oauth` | `anthropic` (requires `ANTHROPIC_API_KEY`) |
 | `qwen-oauth` | `qwen` (requires API key configuration) |
 | `openai-codex` | `openai` (requires `OPENAI_API_KEY`) |
````
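The Multi User Mode filter described in the access-modes doc matches connectors by naming pattern (containing `-oauth-` or ending with `-oauth`) or by property (`has_static_credentials = False`). That rule can be sketched as a predicate; this is an illustration of the documented behavior, not the proxy's actual validator code:

```python
def is_oauth_connector(backend_id: str, has_static_credentials: bool = True) -> bool:
    # Documented Multi User Mode filter rules:
    #   1. Naming pattern: contains "-oauth-" or ends with "-oauth".
    #   2. Property: no static credentials (covers e.g. openai-codex,
    #      whose ID matches neither pattern but uses OAuth via auth.json).
    # Illustrative sketch only.
    if "-oauth-" in backend_id or backend_id.endswith("-oauth"):
        return True
    return not has_static_credentials

# Connectors kept out of a Multi User Mode registry:
filtered = [b for b in ["gemini-oauth-auto", "qwen-oauth", "anthropic", "openai"]
            if is_oauth_connector(b)]
print(filtered)  # → ['gemini-oauth-auto', 'qwen-oauth']
```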

docs/user_guide/backends/anthropic-oauth.md — 0 additions, 59 deletions

This file was deleted.

docs/user_guide/backends/anthropic.md — 2 additions, 9 deletions

````diff
@@ -369,16 +369,9 @@ Use Claude models for:
 - Validating instruction following
 - Long context window testing
 
-## Anthropic OAuth Backend
+## Anthropic accounts and API keys
 
-The proxy also supports an `anthropic-oauth` backend that uses OAuth tokens instead of API keys:
-
-```bash
-# Configure OAuth token location
-python -m src.core.cli --default-backend anthropic-oauth
-```
-
-This is useful for using personal Anthropic accounts without API keys.
+Use the `anthropic` backend with a normal [Anthropic API key](https://docs.anthropic.com/en/api/getting-started). Do not route production traffic through unofficial OAuth-token or Claude-Code credential paths; those conflict with Anthropic’s terms and are not documented here.
 
 ## Dedicated Anthropic Port
````
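Following the replacement guidance in this hunk, a minimal API-key setup might look like the sketch below. The `--default-backend` flag is taken from the removed OAuth example; treat the exact invocation and key format as assumptions, not documented usage:

```bash
# Hypothetical sketch: supply the standard Anthropic API key and select
# the core `anthropic` backend instead of any OAuth credential path.
export ANTHROPIC_API_KEY="sk-ant-..."   # placeholder key, not a real value
python -m src.core.cli --default-backend anthropic
```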

docs/user_guide/backends/overview.md — 62 additions, 34 deletions

```diff
@@ -4,35 +4,46 @@ The LLM Interactive Proxy supports multiple backend providers, allowing you to r
 
 ## Supported Backends
 
-The proxy supports the following backend providers out of the box:
+Backend IDs are the `type:` values in YAML and the `backend_type` carried on requests. **Core connectors** live in this repository and are always import-registered. **OAuth plugin connectors** ship in the sibling package **`llm-interactive-proxy-oauth-connectors`** and register when you install the optional extra, for example `pip install "llm-interactive-proxy[oauth]"` (see `pyproject.toml` optional dependency `oauth`).
+
+### Core connectors (this repository)
 
 | Backend ID | Provider | Authentication | Best For |
 |------------|----------|----------------|----------|
 | `openai` | OpenAI | API Key | Production applications, standard OpenAI models |
-| `openai-codex` | OpenAI (ChatGPT/Codex OAuth) | Local OAuth token | Using ChatGPT login instead of API key |
-| `anthropic` | Anthropic | API Key | Claude models via standard API |
-| `anthropic-oauth` | Anthropic (OAuth) | Local OAuth token | Claude via OAuth credential flow |
-| `cline` | Cline | Local OAuth token | Internal development & debugging |
+| `openai-responses` | OpenAI | API Key | Same credentials as OpenAI; targets `/v1/responses` for structured outputs (see [OpenAI backend](openai.md#openai-responses-backend)) |
+| `openai-codex` | OpenAI (ChatGPT / Codex CLI) | Local OAuth token | ChatGPT login instead of an API key |
+| `anthropic` | Anthropic | API Key | Claude via the standard Anthropic API |
 | `gemini` | Google Gemini | API Key | Metered API usage, production apps |
-| `gemini-oauth-plan` | Google Gemini (CLI) | OAuth | Users with Google One subscription |
-| `gemini-oauth-free` | Google Gemini (CLI) | OAuth | Free tier users |
-| `gemini-cli-acp` | Google Gemini (ACP via Gemini CLI) | Local OAuth token | Quality verifier agents, file-search sub-agents, web-search sub-agents using Google Search |
+| `gemini-cli-acp` | Google Gemini (ACP via Gemini CLI) | Local OAuth token | Sub-agents and tooling via Gemini CLI |
 | `cursor-cli-acp` | Cursor (ACP via Cursor CLI `agent acp`) | Local Cursor login (`agent login`) | Cursor-hosted models through the official CLI; requires `agent` on PATH or `CURSOR_AGENT_BIN` |
-| `gemini-cli-cloud-project` | Google Gemini (GCP) | OAuth + GCP Project | Enterprise, team workflows, central billing |
-| `openrouter` | OpenRouter | API Key | Access to many hosted models |
-| `nvidia` | NVIDIA (NIM / OpenAI-compatible) | API Key (`NVIDIA_API_KEY`) | Hosted NVIDIA integrator or self-hosted NIM |
+| `gemini-cli-cloud-project` | Google Gemini (GCP) | OAuth + GCP project | Enterprise / team billing on Vertex-style flows |
+| `openrouter` | OpenRouter | API Key | Many third-party hosted models behind one API |
+| `nvidia` | NVIDIA (NIM / OpenAI-compatible) | API Key (`NVIDIA_API_KEY`) | NVIDIA integrator or self-hosted NIM |
 | `zenmux` | ZenMux | API Key | OpenAI-compatible ZenMux router |
-| `zai` | ZAI | API Key | Zhipu/Z.ai access |
-| `zai-coding-plan` | ZAI Coding Plan | API Key | Coding-specific workflows |
+| `zai` | ZAI | API Key | Zhipu / Z.ai |
+| `zai-coding-plan` | ZAI Coding Plan | API Key | Coding-plan SKU / workflows |
 | `kimi-code` | Kimi | API Key | Kimi For Coding (OpenAI-compatible) |
-| `opencode-go` | OpenCode Go | API Key | OpenCode Go models with internal OpenAI/Anthropic protocol routing |
-| `minimax` | Minimax | API Key | Minimax AI models |
-| `qwen-oauth` | Alibaba Qwen | Local OAuth token | Qwen CLI OAuth |
-| `qwen-oauth` | Alibaba Qwen | Local OAuth token | Qwen CLI OAuth |
-| `internlm` | InternLM AI | API Key | InternLM models with key rotation |
-| `ollama` | Ollama (Local) | None (local server) | Locally-hosted models + cloud via Ollama app |
-| `hybrid` | Virtual (orchestrates two models) | Inherits from sub-backends | Two-phase reasoning + execution |
-| `antigravity-oauth` | Google Gemini (Antigravity) | Antigravity Token | Internal debugging (Gemini models) |
+| `opencode-go` | OpenCode Go | API Key | OpenCode Go with internal OpenAI/Anthropic-style routing |
+| `minimax` | Minimax | API Key | Minimax models |
+| `internlm` | InternLM | API Key (rotation supported) | InternLM with optional key rotation |
+| `ollama` | Ollama | None (local) | Local and remote models via Ollama |
+| `hybrid` | Virtual (two backends) | Inherits from sub-backends | Two-phase reasoning + execution |
+
+### OAuth plugin connectors (`llm-interactive-proxy-oauth-connectors`)
+
+These entry points are defined in the sibling repo’s `pyproject.toml` under `[project.entry-points."llm_proxy_backends"]`. They are **not** present unless the optional package is installed.
+
+| Backend ID | Provider | Authentication | Best For |
+|------------|----------|----------------|----------|
+| `antigravity-oauth` | Google Gemini (Antigravity) | Antigravity token | Internal / debugging (Gemini-shaped traffic) |
+| `cline` | Cline | Local OAuth token | Internal development and compatibility testing |
+| `gemini-oauth-auto` | Google Gemini (CLI) | Multi-account OAuth | Automatic account rotation across Google logins |
+| `gemini-oauth-plan` | Google Gemini (CLI) | OAuth | Google One / paid CLI tier |
+| `gemini-oauth-free` | Google Gemini (CLI) | OAuth | Free-tier CLI usage |
+| `kiro-oauth-auto` | Amazon Kiro / Q Developer | Self-managed OAuth | Kiro streaming via local OAuth tokens |
+| `opencode-zen` | OpenCode Zen | OAuth | OpenCode Zen API (distinct from `opencode-go`) |
+| `qwen-oauth` | Alibaba Qwen (CLI) | Local OAuth token | Qwen CLI OAuth |
 
 ## Frontend APIs
 
@@ -123,19 +134,36 @@ Or use one-off commands for a single request:
 
 For detailed configuration and usage information for each backend, see:
 
-- [OpenAI Backend](openai.md)
-- [Anthropic Backend](anthropic.md)
-- [Gemini Backends](gemini.md)
-- [Cline Backend](cline.md)
-- [OpenRouter Backend](openrouter.md)
-- [Nvidia Backend](nvidia.md)
-- [ZAI Backend](zai.md)
-- [Qwen Backend](qwen.md)
-- [MiniMax Backend](minimax.md)
-- [ZenMux Backend](zenmux.md)
-- [Kimi Code Backend](kimi-code.md)
-- [OpenCode Go Backend](opencode-go.md)
-- [Ollama Backend (Local)](ollama.md)
+**Core**
+
+- [OpenAI and OpenAI Responses](openai.md) (`openai`, `openai-responses`)
+- [OpenAI Codex](openai-codex.md) (`openai-codex`)
+- [Anthropic](anthropic.md)
+- [Gemini](gemini.md) (API keys, CLI OAuth variants, `gemini-cli-acp`, and `gemini-cli-cloud-project`)
+- **Cursor CLI ACP** (`cursor-cli-acp`): same idea as Gemini CLI ACP but via Cursor’s `agent acp` CLI; install and log in with Cursor’s agent tooling, ensure `agent` is on `PATH` or set `CURSOR_AGENT_BIN`. There is no separate backend guide page yet.
+- [OpenRouter](openrouter.md)
+- [NVIDIA](nvidia.md)
+- [ZAI](zai.md)
+- [Kimi Code](kimi-code.md)
+- [OpenCode Go](opencode-go.md)
+- [Ollama](ollama.md)
+- [InternLM](internlm.md)
+- [MiniMax](minimax.md)
+- [ZenMux](zenmux.md)
+- [Hybrid backend](../features/hybrid-backend.md) (`hybrid`)
+
+**OAuth plugin (`llm-interactive-proxy-oauth-connectors`)**
+
+- [Antigravity OAuth](antigravity-oauth.md)
+- [Cline](cline.md)
+- [Gemini OAuth Auto](gemini-oauth-auto.md) (`gemini-oauth-auto`; overview also in [Gemini backends](gemini.md))
+- [Kiro OAuth Auto](kiro-oauth-auto.md)
+- [OpenCode Zen](opencode-zen.md)
+- [Qwen OAuth](qwen.md)
+- [Gemini OAuth plan / free](gemini.md) (`gemini-oauth-plan`, `gemini-oauth-free`)
+
+**Extensibility**
+
 - [Custom Backends](custom-backends.md)
 
 ## Related Features
```
