
Commit f93ef37

[Frontend] run-batch supports V1 (vllm-project#21541)
Signed-off-by: DarkLight1337 <[email protected]>
1 parent: 9045853

File tree

1 file changed: +3 −1 lines

throughput.py

Lines changed: 3 additions & 1 deletion

@@ -148,7 +148,9 @@ async def run_vllm_async(
     from vllm import SamplingParams

     async with build_async_engine_client_from_engine_args(
-            engine_args, disable_frontend_multiprocessing) as llm:
+            engine_args,
+            disable_frontend_multiprocessing=disable_frontend_multiprocessing,
+    ) as llm:
         model_config = await llm.get_model_config()
         assert all(
             model_config.max_model_len >= (request.prompt_len +
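
For context, a minimal sketch of how the updated call site is exercised: the flag is now passed by keyword to build_async_engine_client_from_engine_args. The import paths, the use of AsyncEngineArgs, and the model name below are assumptions for illustration and are not part of this commit.

# Minimal sketch (not from this commit): open the async engine client with
# disable_frontend_multiprocessing passed by keyword, as in the diff above.
# Import paths and AsyncEngineArgs usage are assumptions for illustration.
import asyncio

from vllm.engine.arg_utils import AsyncEngineArgs
from vllm.entrypoints.openai.api_server import (
    build_async_engine_client_from_engine_args)


async def main() -> None:
    # Hypothetical small model for the sketch.
    engine_args = AsyncEngineArgs(model="facebook/opt-125m")
    async with build_async_engine_client_from_engine_args(
        engine_args,
        disable_frontend_multiprocessing=False,  # keyword, matching the new call site
    ) as llm:
        model_config = await llm.get_model_config()
        print(model_config.max_model_len)


if __name__ == "__main__":
    asyncio.run(main())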
