OpenAI stream mode not delivering chunks in real time #5

Description

@Kezino

First of all, thank you for the great work on this package — it's been incredibly helpful and well-designed.

I'm currently working with the stream mode in the OpenAI API and noticed that the response seems to be fully buffered before being delivered, rather than streaming chunks as they arrive. From what I can tell, this might be related to how the Swoole HTTP client handles responses.

Has anyone encountered this behavior? Is there a known workaround or configuration that could help ensure the response is truly streamed as data becomes available?

Any insights or shared experience would be greatly appreciated — thanks in advance!
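For anyone trying to narrow this down: one way to tell client-side buffering apart from server-side behavior is to consume the response as raw byte chunks and parse the Server-Sent Events incrementally, yielding each `data:` payload the moment its terminating blank line arrives. If events only appear after the connection closes, the client (or a proxy) is buffering. Below is a minimal, hedged sketch of such an incremental SSE parser in Python; it is an illustration of the expected wire format (`data: {...}` lines ending in a blank line, with a `[DONE]` sentinel), not code from this package, and the simulated chunks are made-up examples.

```python
import json

def iter_sse_events(chunks):
    """Incrementally parse Server-Sent Events from an iterable of byte chunks.

    Yields each `data:` payload as soon as its terminating blank line
    arrives, without waiting for the whole response body.
    """
    buf = b""
    for chunk in chunks:
        buf += chunk
        # An SSE event ends with a blank line (\n\n).
        while b"\n\n" in buf:
            raw, buf = buf.split(b"\n\n", 1)
            for line in raw.split(b"\n"):
                if line.startswith(b"data: "):
                    payload = line[len(b"data: "):]
                    if payload == b"[DONE]":
                        return  # end-of-stream sentinel used by the OpenAI API
                    yield json.loads(payload)

# Simulated network chunks, deliberately split mid-event to mimic how
# bytes actually arrive on the socket.
chunks = [
    b'data: {"choices": [{"delta": {"content": "Hel"}}]}\n\n',
    b'data: {"choices": [{"delta": {"con',
    b'tent": "lo"}}]}\n\ndata: [DONE]\n\n',
]
parts = [e["choices"][0]["delta"]["content"] for e in iter_sse_events(chunks)]
print("".join(parts))  # → Hello
```

Feeding this from whatever low-level chunk callback the HTTP client exposes (rather than from a fully-read body) should make it obvious where the buffering happens.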
