Closed
Labels
feature request (New feature or request), good first issue (Good for newcomers)
Description
🚀 The feature, motivation and pitch
Currently the vllm command supports serve|chat|complete, while the benchmark scripts live in https://github.com/vllm-project/vllm/tree/main/benchmarks.
It would be great to have commands such as the following:
vllm benchmark-throughput
vllm benchmark-latency
vllm benchmark-serving
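As a rough illustration of the pitch, the proposed subcommands could be registered next to the existing serve|chat|complete entries via argparse subparsers. This is a hypothetical sketch, not vLLM's actual CLI code; the subcommand names come from the issue, while the --model flag and function names here are placeholders.

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch of a `vllm` CLI that exposes the benchmark
    # scripts as first-class subcommands alongside serve|chat|complete.
    parser = argparse.ArgumentParser(prog="vllm")
    sub = parser.add_subparsers(dest="command", required=True)

    # Existing commands mentioned in the issue.
    for name in ("serve", "chat", "complete"):
        sub.add_parser(name)

    # Proposed benchmark entry points, mirroring benchmarks/ scripts.
    for name in ("benchmark-throughput", "benchmark-latency", "benchmark-serving"):
        bench = sub.add_parser(name)
        # Placeholder flag for illustration only.
        bench.add_argument("--model", help="Model to benchmark")

    return parser


if __name__ == "__main__":
    args = build_parser().parse_args(["benchmark-latency", "--model", "facebook/opt-125m"])
    print(args.command, args.model)
```

Each benchmark subparser would then dispatch to the corresponding script's main function, so users run one installed `vllm` binary instead of invoking scripts from a source checkout.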
Alternatives
Status quo.
Additional context
No response
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.