
fix: do not enforce temperature of 0 for OpenAI compatible provider#12043

Draft
roomote-v0[bot] wants to merge 1 commit into main from fix/openai-compatible-no-default-temperature

Conversation


@roomote-v0 roomote-v0 bot commented Apr 1, 2026

Related GitHub Issue

Closes: #12042

Description

This PR attempts to address Issue #12042. When using the OpenAI compatible provider without a custom temperature set, the handler was hardcoding temperature: 0 in the API request. This overrode whatever server-side default the provider may have configured.

Changes:

  1. src/api/providers/openai.ts (createMessage) - Changed the temperature fallback from 0 to undefined for the non-DeepSeek case, so the field is omitted from the request when no custom temperature is set.

  2. src/api/providers/openai.ts (getModel) - Changed defaultTemperature: 0 to defaultTemperature: undefined so getModelParams also returns undefined when no temperature source provides a value.

  3. src/api/transform/model-params.ts - Updated the GetModelParamsOptions type to allow defaultTemperature to be number | undefined.

  4. Tests updated - Updated the Azure AI Inference test expectation and added a new test case verifying that undefined defaultTemperature produces undefined temperature output.
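As a rough TypeScript sketch of the fallback change (identifiers here are illustrative and do not match the actual code in src/api/providers/openai.ts verbatim; the DeepSeek default value is assumed):

```typescript
interface ChatParams {
	model: string
	temperature?: number
}

const DEEPSEEK_DEFAULT_TEMPERATURE = 0.6 // assumed value, for illustration only

function buildParams(modelId: string, customTemperature?: number, isDeepSeekReasoner = false): ChatParams {
	// Before this PR: `customTemperature ?? 0` forced temperature=0 on every request.
	// After: a non-DeepSeek model with no custom temperature yields undefined,
	// and the field is omitted entirely so the server-side default applies.
	const temperature = isDeepSeekReasoner
		? (customTemperature ?? DEEPSEEK_DEFAULT_TEMPERATURE)
		: customTemperature
	return temperature === undefined ? { model: modelId } : { model: modelId, temperature }
}

console.log(JSON.stringify(buildParams("my-model"))) // {"model":"my-model"}
console.log(JSON.stringify(buildParams("my-model", 0.2))) // {"model":"my-model","temperature":0.2}
```

JSON.stringify would already drop an undefined-valued property, but omitting the key makes the intent explicit and keeps the serialized request identical either way.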

Scope and safety:

  • Only affects the OpenAiHandler used for "OpenAI Compatible" provider type
  • OpenAiNativeHandler (actual OpenAI) is in a separate file and unaffected
  • DeepSeekHandler extends OpenAiHandler but overrides createMessage with its own temperature handling
  • Other providers each have their own temperature logic and are unaffected
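A minimal sketch of why DeepSeek is insulated from this change (class names mirror the description above, but the method bodies are hypothetical simplifications; the real handlers build full API requests):

```typescript
class OpenAiHandler {
	createMessage(customTemperature?: number): { temperature?: number } {
		// New behavior: no hardcoded 0; undefined means "omit the field".
		return customTemperature === undefined ? {} : { temperature: customTemperature }
	}
}

class DeepSeekHandler extends OpenAiHandler {
	// DeepSeek overrides createMessage with its own temperature handling,
	// so the parent's fallback change does not affect it.
	override createMessage(customTemperature?: number): { temperature?: number } {
		return { temperature: customTemperature ?? 0.6 } // 0.6 is illustrative
	}
}

console.log(new OpenAiHandler().createMessage()) // {}
console.log(new DeepSeekHandler().createMessage()) // { temperature: 0.6 }
```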

Feedback and guidance are welcome.

Test Procedure

  • Ran npx vitest run api/transform/__tests__/model-params.spec.ts - all 58 tests pass
  • Ran npx vitest run api/providers/__tests__/openai.spec.ts - all 48 tests pass
  • All lint and type checks pass via pre-push hooks

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue.
  • Scope: Changes are focused on the linked issue.
  • Self-Review: Performed a thorough self-review.
  • Testing: Updated existing test and added new test for undefined temperature behavior.
  • Documentation Impact: No documentation updates required.
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Documentation Updates

  • No documentation updates are required.

Additional Notes

This fix aligns with the intent of Issue #7187, which previously addressed a similar problem.


When no custom temperature is set and the model is not a DeepSeek
reasoner, the OpenAI compatible provider was hardcoding temperature=0.
This overrode whatever server-side default the provider may have.

Changed the fallback to undefined so the temperature field is omitted
from the API request, letting the remote provider use its own default.

Fixes #12042


Development

Successfully merging this pull request may close these issues.

[BUG] OpenAI compatible provider setting should not enforce a temperature of 0
