fix(telemetry): ensure all model callbacks observe the same call_llm span #5041
Draft
caohy1988 wants to merge 2 commits into google:main from
Conversation
…span

Move `before_model_callback` inside the `call_llm` span, wrap `after_model_callback` and `on_model_error_callback` with `trace.use_span(span, end_on_exit=False)` to rebind to the `call_llm` span. Thread `call_llm_span` into `_run_and_handle_error` so error callbacks also run under the correct span context.

Fixes google#4851

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
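The core of the fix can be illustrated with a minimal, self-contained sketch. The tracer, span, and callback below are simplified stand-ins, not the real ADK or OpenTelemetry types:

```python
# Minimal stand-in tracer: start_as_current_span makes a span "current"
# for the duration of a with-block, mirroring the OpenTelemetry shape.
import contextlib

class FakeSpan:
    def __init__(self, name):
        self.name = name

class FakeTracer:
    def __init__(self):
        self.current = None  # the currently active span, if any

    @contextlib.contextmanager
    def start_as_current_span(self, name):
        prev, self.current = self.current, FakeSpan(name)
        try:
            yield self.current
        finally:
            self.current = prev

tracer = FakeTracer()
observed = {}

def before_model_callback():
    # A plugin callback records whichever span is current when it runs.
    observed["before"] = tracer.current

# After the fix, the callback runs *inside* the call_llm span block,
# so it observes a valid, active span instead of nothing.
with tracer.start_as_current_span("call_llm") as span:
    before_model_callback()

assert observed["before"] is span and span.name == "call_llm"
```

Before the fix, calling `before_model_callback()` above the `with` block would record `None`, which is the failure mode this PR addresses.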
Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). View this failed invocation of the CLA check for more information. For the most up-to-date status, view the checks section at the bottom of the pull request.
Collaborator
Response from ADK Triaging Agent

Hello @caohy1988, thank you for creating this PR! It looks like you haven't signed a Contributor License Agreement (CLA) yet. You'll need to sign one before we can review your pull request. Please visit https://cla.developers.google.com/ to sign a new one. Thanks!
Adds two tests exercising the CFC (`support_cfc=True`) code path:

- `test_cfc_before_and_after_callbacks_share_same_span`
- `test_cfc_error_callback_shares_span`

Uses a `BaseLlmFlow` subclass with a stubbed `run_live` to exercise the CFC branch in `_call_llm_async` without a real live connection.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
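The stubbing approach can be sketched as follows. `BaseFlow`, `call_llm`, and the response class are hypothetical stand-ins for the real ADK types, used only to show the pattern of overriding `run_live` with canned responses:

```python
# Sketch: subclass the flow and replace run_live with a canned async
# generator so the CFC branch runs without opening a real live connection.
import asyncio

class FakeResponse:
    def __init__(self, text):
        self.text = text

class BaseFlow:  # stand-in for BaseLlmFlow
    async def run_live(self):
        raise NotImplementedError("would open a real live connection")

    async def call_llm(self, support_cfc=False):
        if support_cfc:  # the CFC branch delegates to run_live
            return [r async for r in self.run_live()]
        return [FakeResponse("non-cfc")]

class StubbedFlow(BaseFlow):
    async def run_live(self):  # canned responses instead of a live session
        yield FakeResponse("hello")
        yield FakeResponse("world")

responses = asyncio.run(StubbedFlow().call_llm(support_cfc=True))
assert [r.text for r in responses] == ["hello", "world"]
```

The tests can then assert on span identity inside the stubbed path without any network dependency.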
Summary
- Moves `before_model_callback` inside the `call_llm` span so plugins see a valid span (previously it ran before the span started)
- Wraps `after_model_callback` with `trace.use_span(span, end_on_exit=False)` to rebind to the `call_llm` span (previously it could see a child inference span)
- Threads `call_llm_span` into `_run_and_handle_error` and wraps `on_model_error_callback` the same way, so all callbacks observe the same `call_llm` span ID

Fixes #4851
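The rebinding idea can be sketched in pure Python, using `contextvars` as a stand-in for OpenTelemetry's context machinery (`Span`, `use_span`, and `_current_span` below are illustrative, not the real `opentelemetry.trace` API):

```python
# Simplified stand-in for trace.use_span(span, end_on_exit=False):
# re-activate an existing span as "current" without ending it on exit.
import contextlib
import contextvars

_current_span = contextvars.ContextVar("current_span", default=None)

class Span:
    def __init__(self, name):
        self.name = name
        self.ended = False

    def end(self):
        self.ended = True

@contextlib.contextmanager
def use_span(span, end_on_exit=False):
    token = _current_span.set(span)
    try:
        yield span
    finally:
        _current_span.reset(token)
        if end_on_exit:
            span.end()

call_llm_span = Span("call_llm")

# A child inference span may be current when the model responds...
child = Span("inference")
_current_span.set(child)

# ...so the after_model_callback call site is wrapped to rebind:
with use_span(call_llm_span, end_on_exit=False):
    seen = _current_span.get()

assert seen is call_llm_span    # callback observed the call_llm span
assert not call_llm_span.ended  # end_on_exit=False left it open
```

`end_on_exit=False` matters because the real span's lifetime is owned by the enclosing `start_as_current_span` context manager; `use_span` only re-activates it.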
Changes
`src/google/adk/flows/llm_flows/base_llm_flow.py`:

- `_call_llm_async()`: moved `before_model_callback`, label setup, and `__get_llm()` inside the `with tracer.start_as_current_span('call_llm') as span:` block
- `after_model_callback` call sites wrapped with `trace.use_span(span, end_on_exit=False)`
- `_run_and_handle_error()`: new optional `call_llm_span` parameter; error callbacks run under `use_span(call_llm_span)` when provided; call sites now pass `call_llm_span`

`tests/unittests/flows/llm_flows/test_llm_callback_span_consistency.py` (new):

- `test_before_and_after_callbacks_share_same_span` (success path)
- `test_callbacks_same_trace_id` (same trace)
- `test_before_and_error_callbacks_share_same_span` (error path)
- `test_short_circuit_before_callback_sees_valid_span` (short-circuit path)
- `test_all_three_callbacks_share_span_on_error` (error recovery path, all three callbacks)

Design decisions
- `end_on_exit=False`: the `call_llm` span is already managed by the `with tracer.start_as_current_span(...)` context manager; `use_span` is only used to re-activate the span context, not to end it.
- `call_llm_span` defaults to `None`; when absent, behavior is unchanged.
- `BigQueryAgentAnalyticsPlugin` is not changed here; that should be a separate follow-up PR after this fix is proven in production.
Test plan

- `llm_flows` tests pass (0 regressions)

🤖 Generated with Claude Code