
Fix SSE stream corruption by buffering raw bytes #17

Open
insign wants to merge 1 commit into main from fix/sse-streaming-corruption-14698576312227409449

Conversation

@insign
Contributor

@insign insign commented May 14, 2026

Previously, the AI provider streams (OpenAI, Anthropic, Gemini) converted individual network chunks directly to strings using `String::from_utf8_lossy`. This caused silent corruption when multi-byte UTF-8 characters or JSON lines were split across network boundaries, dropping text or causing deserialization failures. This commit fixes the issue by buffering raw `Vec<u8>` bytes and splitting on newlines (`\n`) before converting to UTF-8 and processing, ensuring payloads are always complete and intact.


PR created automatically by Jules for task 14698576312227409449 started by @insign


Co-authored-by: insign <1113045+insign@users.noreply.github.com>
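The buffering approach described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the function name `drain_complete_lines` and the standalone `main` are invented for the example. The key idea is that chunks are appended to a raw byte buffer and only complete, newline-terminated lines are decoded, so a multi-byte character split across two network chunks is never mangled by a premature `from_utf8_lossy` call.

```rust
// Illustrative sketch of byte-buffering with newline splitting (names are
// hypothetical, not taken from the PR). Incoming chunks accumulate in
// `raw_buf`; only complete lines ending in '\n' are decoded to UTF-8.
fn drain_complete_lines(raw_buf: &mut Vec<u8>) -> Vec<String> {
    let mut lines = Vec::new();
    while let Some(newline_pos) = raw_buf.iter().position(|&b| b == b'\n') {
        // Remove the line (including its '\n') from the front of the buffer.
        let line: Vec<u8> = raw_buf.drain(..=newline_pos).collect();
        let mut end = line.len() - 1; // drop the trailing '\n'
        if end > 0 && line[end - 1] == b'\r' {
            end -= 1; // drop an optional '\r' (CRLF-terminated streams)
        }
        // Decoding happens only on a complete line, so multi-byte UTF-8
        // sequences split across chunks are reassembled before conversion.
        lines.push(String::from_utf8_lossy(&line[..end]).into_owned());
    }
    lines
}

fn main() {
    let mut raw_buf: Vec<u8> = Vec::new();
    // "é" is two bytes (0xC3 0xA9); split it across two network chunks.
    raw_buf.extend_from_slice(b"data: caf\xC3"); // chunk 1: incomplete line
    assert!(drain_complete_lines(&mut raw_buf).is_empty()); // nothing decoded yet
    raw_buf.extend_from_slice(b"\xA9\n"); // chunk 2 completes the line
    let lines = drain_complete_lines(&mut raw_buf);
    assert_eq!(lines, vec!["data: café"]); // payload intact, no lossy corruption
}
```

Decoding per chunk, by contrast, would have turned the orphaned `0xC3` byte into a replacement character before its continuation byte arrived.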
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.


@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: bf91d5e4dd

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment thread: src/providers/openai.rs
- let text = String::from_utf8_lossy(&chunk);
+ raw_buf.extend_from_slice(&chunk);
+
+ while let Some(newline_pos) = raw_buf.iter().position(|&b| b == b'\n') {

P2: Flush trailing OpenAI SSE bytes at EOF

The new parser in stream_with_options only processes frames when raw_buf contains a \n, so a final data: {...} line without a trailing newline is dropped when the HTTP stream closes. This can happen with EOF-terminated last lines (or intermediaries that trim the final newline), causing the last streamed tokens to be silently lost. The previous text.lines() behavior would still yield an unterminated final line, so this is a regression introduced by the newline-gated loop.

Useful? React with 👍 / 👎.
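The fix the review is asking for can be sketched as a small EOF flush step. This is a hedged illustration, not the repository's code: `flush_trailing` is an invented name, and where it would be called (after the chunk loop in `stream_with_options` exits) is an assumption based on the review text. The point is that bytes left in `raw_buf` when the stream closes form a final, unterminated line that must be decoded rather than dropped.

```rust
// Hypothetical EOF flush for the newline-gated parser (illustrative name).
// After the stream ends, any leftover bytes in `raw_buf` are a final line
// that never received its '\n' and would otherwise be silently lost.
fn flush_trailing(raw_buf: &mut Vec<u8>) -> Option<String> {
    if raw_buf.is_empty() {
        return None;
    }
    let line = String::from_utf8_lossy(&raw_buf[..]).into_owned();
    raw_buf.clear();
    Some(line)
}

fn main() {
    // The last SSE record arrives without a trailing newline.
    let mut raw_buf: Vec<u8> = b"data: {\"done\":true}".to_vec();
    // The newline-gated loop finds no '\n' and leaves this buffered...
    assert!(raw_buf.iter().position(|&b| b == b'\n').is_none());
    // ...so an explicit flush at EOF recovers the final payload.
    assert_eq!(
        flush_trailing(&mut raw_buf).as_deref(),
        Some("data: {\"done\":true}")
    );
    assert!(raw_buf.is_empty());
}
```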

let text = String::from_utf8_lossy(&chunk);
raw_buf.extend_from_slice(&chunk);

while let Some(newline_pos) = raw_buf.iter().position(|&b| b == b'\n') {

P2: Flush trailing Anthropic SSE bytes at EOF

Like the OpenAI path, this loop now parses only newline-terminated records and never drains raw_buf after the stream ends. If the last Anthropic data: event arrives without a terminating newline, it remains buffered and is never deserialized, which drops the final text chunk. This behavior is newly introduced by the raw-byte buffering change and can truncate outputs in EOF-terminated streams.

Useful? React with 👍 / 👎.

