
fix: compaction fallback to lowest model context limit #2379

Open
shaj13 wants to merge 1 commit into docker:main from shaj13:compact-customp

Conversation

Contributor

shaj13 commented Apr 11, 2026

Problem

When a model's provider is not found in the models store, compaction cannot determine the correct context limit for that model. For example, with a custom provider:

```yaml
providers:
  xxx:
    api_type: openai_chatcompletions
    base_url: xxxxx

models:
  claude-opus-4-5:
    provider: xxx
    model: claude-opus-4-5
  # .....
```

Fix

Added a fallback: when the configured provider is not found, search every provider for the model and use the lowest context limit found. Taking the minimum ensures compaction triggers conservatively regardless of which provider actually serves the model at runtime.


@shaj13 shaj13 requested a review from a team as a code owner April 11, 2026 02:05
