Commit ced212f

Fix vllm version and remove formatting parentheses
Signed-off-by: Pooya Davoodi <[email protected]>
1 parent e9570ec commit ced212f

File tree

2 files changed: +2 −2 lines changed

examples/offline_inference/openai/openai_batch.md

Lines changed: 1 addition & 1 deletion

@@ -208,7 +208,7 @@ $ cat results.jsonl
 
 ### Additional prerequisites
 
-* Ensure you are using `vllm >= 0.6.7`.
+* Ensure you are using `vllm >= 0.7.0`.
 
 ### Step 1: Create your batch file
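The doc change above raises the minimum supported version to `vllm >= 0.7.0`. As a rough illustration of what that requirement means, here is a minimal sketch of a version check; the `meets_minimum` helper and its naive dotted-number comparison are assumptions for illustration only (real tooling should use PEP 440 semantics, e.g. the `packaging` library).

```python
# Hypothetical helper (not part of vllm) comparing simple dotted versions.
def parse(version: str) -> tuple[int, ...]:
    """Split '0.7.0' into a comparable tuple (0, 7, 0)."""
    return tuple(int(part) for part in version.split("."))

def meets_minimum(installed: str, minimum: str = "0.7.0") -> bool:
    """True if the installed version satisfies the new doc requirement."""
    return parse(installed) >= parse(minimum)

print(meets_minimum("0.7.0"))  # satisfies the new minimum
print(meets_minimum("0.6.7"))  # the old documented version no longer does
```

Tuple comparison is lexicographic, so `(0, 6, 7) < (0, 7, 0)` holds componentwise; this sketch ignores pre-release and post-release suffixes that PEP 440 handles.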

vllm/entrypoints/openai/protocol.py

Lines changed: 1 addition & 1 deletion

@@ -1231,7 +1231,7 @@ class BatchResponseData(OpenAIBaseModel):
 
     # The body of the response.
     body: Optional[Union[ChatCompletionResponse, EmbeddingResponse,
-                         ScoreResponse]] = (None)
+                         ScoreResponse]] = None
 
 
 class BatchRequestOutput(OpenAIBaseModel):
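The second change is purely stylistic: in Python, parentheses around a lone expression are grouping only, so `= (None)` and `= None` produce the identical default. A minimal sketch (these classes are illustrative stand-ins, not the real vllm models) demonstrates the equivalence:

```python
from typing import Optional, Union

class WithParens:
    # Default written as (None), mirroring the pre-commit style.
    body: Optional[Union[str, int]] = (None)

class WithoutParens:
    # Default written as None, as in the committed fix.
    body: Optional[Union[str, int]] = None

# Both class attributes are the very same singleton object.
print(WithParens.body is WithoutParens.body is None)
```

Because `(None)` is not a tuple (that would be `(None,)`), removing the parentheses changes nothing at runtime; it only removes visual noise from the field declaration.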

0 commit comments