- Status: Accepted
- Date: 2026-03-05
CodexSharpSDK wraps the Codex CLI with a bespoke API surface (`CodexClient`/`CodexThread`/`RunResult`). The .NET ecosystem has standardized on the Microsoft.Extensions.AI abstractions (`IChatClient`) for provider-agnostic AI integration with composable middleware pipelines.

Implement `IChatClient` from Microsoft.Extensions.AI.Abstractions in a separate NuGet package (`ManagedCode.CodexSharpSDK.Extensions.AI`) that adapts the existing SDK types without modifying the core SDK.
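From the consumer's perspective, the adapter makes Codex look like any other `IChatClient` provider. A minimal sketch, assuming the adapter exposes a `CodexChatClient` wrapping a `CodexClient` (the exact constructor shape is an assumption; `GetResponseAsync(string)` is the standard M.E.AI convenience extension):

```csharp
using Microsoft.Extensions.AI;
using ManagedCode.CodexSharpSDK;               // core SDK (assumed namespace)
using ManagedCode.CodexSharpSDK.Extensions.AI; // the adapter package from this ADR

// Consumer code programs against IChatClient only; CodexClient/CodexThread
// stay behind the adapter boundary.
IChatClient client = new CodexChatClient(new CodexClient());

ChatResponse response = await client.GetResponseAsync(
    "Summarize the failing tests in this repo.");
Console.WriteLine(response.Text);
```

Because the consumer depends only on `IChatClient`, swapping Codex for another provider is a one-line change at the composition root.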
- **Separate package** — Core SDK remains M.E.AI-free. The adapter is opt-in, following the pattern of `Microsoft.Extensions.AI.OpenAI` being separate from `OpenAI`.
- **Custom AIContent types** — Rich Codex items (command execution, file changes, MCP tool calls, web searches, multi-agent collaboration) are surfaced as typed `AIContent` subclasses rather than being flattened to text. This preserves full fidelity of Codex output.
- **Codex-specific options via AdditionalProperties** — Standard `ChatOptions` properties (`ModelId`, `ConversationId`) map directly. Codex-unique features use `codex:*`-prefixed keys in `ChatOptions.AdditionalProperties` (e.g., `codex:sandbox_mode`, `codex:reasoning_effort`).
- **Thread-per-call with ConversationId resume** — Each `GetResponseAsync` call creates or resumes a `CodexThread`. The thread ID flows via `ChatResponse.ConversationId` for multi-turn continuity.
- **No AITool support** — Codex CLI manages tools internally (commands, file changes, MCP). Consumer-registered `ChatOptions.Tools` are ignored; tool results surface as custom `AIContent` types instead.
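The options mapping and the thread-resume flow above can be sketched together. This is illustrative only: the `codex:*` key values shown are assumptions, and `client` is an `IChatClient` produced by the adapter; `AdditionalPropertiesDictionary` and `ChatOptions.ConversationId` are standard M.E.AI types.

```csharp
using Microsoft.Extensions.AI;

var options = new ChatOptions
{
    ModelId = "gpt-5-codex", // hypothetical model name
    AdditionalProperties = new AdditionalPropertiesDictionary
    {
        // Codex-unique features travel as codex:*-prefixed keys.
        ["codex:sandbox_mode"] = "workspace-write",
        ["codex:reasoning_effort"] = "high",
    },
};

// First turn: the adapter creates a CodexThread under the hood.
ChatResponse first = await client.GetResponseAsync(
    "Add a null check to Parse()", options);

// Subsequent turns resume the same thread via ConversationId.
options.ConversationId = first.ConversationId;
ChatResponse second = await client.GetResponseAsync(
    "Now add a unit test for it", options);
```

Note that multi-turn continuity rides on the conversation ID, not on replaying message history, which matches the thread-resume model of the Codex CLI.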
```mermaid
flowchart LR
    Consumer["Consumer code\n(IChatClient)"]
    Adapter["CodexChatClient\n(Extensions.AI)"]
    Core["CodexClient\n(Core SDK)"]
    CLI["codex exec --json"]
    Consumer --> Adapter
    Adapter --> Core
    Core --> CLI
    subgraph "M.E.AI Middleware (free)"
        Logging["UseLogging()"]
        Cache["UseDistributedCache()"]
        Telemetry["UseOpenTelemetry()"]
    end
    Consumer -.-> Logging
    Logging -.-> Cache
    Cache -.-> Telemetry
    Telemetry -.-> Adapter
```
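The middleware chain in the diagram composes with the stock `ChatClientBuilder` from Microsoft.Extensions.AI. A sketch, assuming `loggerFactory` and `cache` are supplied by the host (e.g., via DI) and `CodexChatClient` is the adapter from this ADR:

```csharp
using Microsoft.Extensions.AI;

// Decorators wrap the Codex adapter outermost-first: logging sees the
// call before the cache, which sees it before telemetry.
IChatClient pipeline = new ChatClientBuilder(
        new CodexChatClient(new CodexClient()))
    .UseLogging(loggerFactory)       // ILoggerFactory from the host
    .UseDistributedCache(cache)      // IDistributedCache from the host
    .UseOpenTelemetry()
    .Build();
```

None of these decorators require any Codex-specific code, which is the "for free" benefit listed below.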
- SDK participates in .NET AI ecosystem: DI registration, middleware pipelines, provider swapping.
- Consumers get logging, caching, and telemetry for free via M.E.AI middleware.
- Rich Codex items preserved as typed content, not lost.
- Impedance mismatch: Codex is an agentic coding tool, not a simple chat API. Multi-turn via message history doesn't map cleanly (uses thread resume instead).
- No `Temperature`/`TopP`/`TopK` support (Codex uses `ModelReasoningEffort` instead).
- Streaming is item-level, not token-level.
- Additional NuGet package to maintain.
- `ChatOptions.Tools` is a no-op; documented as a limitation.
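Item-level streaming means each `ChatResponseUpdate` carries one completed Codex item rather than a token fragment. A consumption sketch; the non-text content-type name in the comment is illustrative, since the actual `AIContent` subclass names depend on the adapter:

```csharp
using Microsoft.Extensions.AI;

await foreach (ChatResponseUpdate update in
    client.GetStreamingResponseAsync("Fix the flaky test"))
{
    foreach (AIContent content in update.Contents)
    {
        if (content is TextContent text)
        {
            Console.Write(text.Text);
        }
        else
        {
            // A typed Codex item, e.g. a command-execution content type.
            Console.WriteLine($"[{content.GetType().Name}]");
        }
    }
}
```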
- Implement `IChatClient` directly in the core SDK: rejected to avoid a mandatory M.E.AI dependency.
- Flatten all Codex items to `TextContent`: rejected to preserve rich output fidelity.
- Map Codex commands/file changes as `FunctionCallContent`: rejected because tools are internal to the CLI, not consumer-invocable.