vLLM integration hangs due to blocking ray.get in AsyncvLLMServer #55

@Yemonade

Description

I'm trying to use the vLLM backend for serving/rollout, but the service hangs during initialization.

Problem Description
While the AsyncvLLMServer actor is initializing, the process gets stuck and never proceeds. The logs contain the following warning, which points to a potential deadlock:

Using blocking ray.get inside async actor. This blocks the event loop. Please use await on object ref with asyncio.gather if you want to yield execution to the event loop instead.

By the way, is vLLM currently unsupported?
