
fix(openai): add context manager support to traced stream wrappers #120

Open

Abhijeet Prasad (AbhiPrasad) wants to merge 1 commit into main from abhi-fix-openai-stream-context-manager-http2

Conversation

Abhijeet Prasad (AbhiPrasad) commented Mar 23, 2026

AI Summary

Fix OpenAI HTTP/2 streaming compatibility in the Braintrust wrapper by preserving the SDK stream interface on wrapped streaming responses.

Root Cause
When the OpenAI SDK goes through the LegacyAPIResponse.parse() streaming path, Braintrust was returning a bare traced generator instead of a stream-like object. That dropped SDK stream internals such as _iterator and response, which broke downstream consumers on HTTP/2.

Primary Fix
The main fix is to return _TracedStream / _AsyncTracedStream from the wrapped streaming paths instead of returning a bare generator. This preserves the OpenAI stream interface while still routing iteration through Braintrust tracing.

Context Manager Update
As part of that change, the traced stream wrappers now also implement the context-manager protocol:

  • _TracedStream.__enter__ / __exit__
  • _AsyncTracedStream.__aenter__ / __aexit__

This ensures with stream as s: and async with stream as s: continue to behave correctly while keeping iteration on the traced wrapper rather than falling back to the raw OpenAI stream.

Tests
Added a dedicated HTTP/2 OpenAI regression session and VCR-backed tests covering:

  • sync HTTP/2 streaming interface preservation
  • sync HTTP/2 context-manager behavior
  • async HTTP/2 context-manager behavior

h2 is installed only in the dedicated HTTP/2 nox session, not in the general OpenAI test session.
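That isolation could look roughly like the following noxfile.py fragment; the session name matches the PR, but the dependency list and test path are assumptions:

```python
import nox


@nox.session
def test_openai_http2_streaming(session):
    # h2 (pulled in via httpx[http2]) is installed only in this session,
    # not in the general OpenAI test session, so HTTP/2 coverage stays
    # opt-in and the default test matrix is unaffected.
    session.install("openai", "httpx[http2]", "pytest", "vcrpy")
    session.run("pytest", "tests/test_openai_http2_streaming.py")
```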

_TracedStream and _AsyncTracedStream lacked __enter__/__exit__ and
__aenter__/__aexit__ methods, causing failures when callers used the
OpenAI SDK's HTTP/2 streaming path which goes through
LegacyAPIResponse.parse() and expects a context-manager-compatible
stream.

Also replace AsyncResponseWrapper and bare generator returns with
_AsyncTracedStream/_TracedStream on all streaming paths so the wrapper
type is consistent regardless of HTTP version.

Adds a dedicated nox session (test_openai_http2_streaming) and
regression tests covering sync/async context manager usage with h2.
