
Conversation

WoosukKwon (Collaborator)

This PR adds a default option to dtype, which selects FP16 for FP16 and FP32 models and BF16 for BF16 models. While this option is used by default, users can still specify the data type explicitly, for example to run an FP32 model in BF16.
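A minimal sketch of that selection rule, assuming the checkpoint's torch dtype is already known; the helper name and error handling are illustrative, not the PR's actual code:

```python
import torch

def resolve_default_dtype(config_dtype: torch.dtype) -> torch.dtype:
    """Map a checkpoint's dtype to the runtime dtype under the default option."""
    # FP16 and FP32 checkpoints run in FP16; BF16 checkpoints stay in BF16.
    if config_dtype in (torch.float16, torch.float32):
        return torch.float16
    if config_dtype == torch.bfloat16:
        return torch.bfloat16
    raise ValueError(f"unsupported model dtype: {config_dtype}")
```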

In addition, the PR integrates Dolly V2, a recent LLM built on the GPT-NeoX architecture. The model is trained and saved in BF16.
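As a usage sketch with the current vLLM entrypoint (which postdates this PR, and which spells the default option "auto"); the model names are illustrative:

```python
from vllm import LLM

# BF16 checkpoint (e.g. Dolly V2): the default dtype resolves to BF16.
llm = LLM(model="databricks/dolly-v2-3b")

# Explicit override, e.g. to run an FP32 checkpoint in BF16
# (the model name below is a placeholder, not a real repository).
llm_bf16 = LLM(model="your-org/fp32-model", dtype="bfloat16")
```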

WoosukKwon requested a review from zhuohan123 May 3, 2023 23:28
zhuohan123 (Member) left a comment:

LGTM!

WoosukKwon merged commit 189ae23 into main May 4, 2023
WoosukKwon deleted the dolly-v2 branch May 4, 2023 10:05
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
yukavio pushed a commit to yukavio/vllm that referenced this pull request Jul 3, 2024
SUMMARY:
* delete NOTICE.txt file

TEST PLAN:
none

Co-authored-by: andy-neuma <[email protected]>
dllehr-amd pushed a commit to dllehr-amd/vllm that referenced this pull request Jul 22, 2024
dtrifiro pushed a commit to dtrifiro/vllm that referenced this pull request Jul 30, 2024
sync release with main @ v0.5.0.post1-99-g8720c92e
JHLEE17 pushed a commit to JHLEE17/vllm that referenced this pull request Aug 1, 2024
* Add more detailed event names to profiler

* Add more profiler stats

* separate prompt and decode batch utilization

* Add more metrics

* revert engine/metrics.py changes

* un-singletonify (what a funny word) habana profiler

* formatting

* add batch block utilization metric

* fix division by zero

* fix batch_block_utilization formula

* minor refactors
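
The bullets above name a batch block utilization metric and a later division-by-zero fix; a hypothetical sketch of such a metric, under the assumption that it is the ratio of KV-cache blocks used by the batch to the blocks available:

```python
def batch_block_utilization(used_blocks: int, total_blocks: int) -> float:
    """Fraction of available KV-cache blocks occupied by the current batch."""
    if total_blocks == 0:
        # Guard against division by zero when no blocks are configured.
        return 0.0
    return used_blocks / total_blocks
```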
alixiaodi mentioned this pull request Aug 2, 2024
wuhuikx pushed a commit to wuhuikx/vllm that referenced this pull request Mar 27, 2025
### What this PR does / why we need it?
Switch to the latest CANN version.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
tlrmchlsmth pushed a commit to tlrmchlsmth/vllm that referenced this pull request May 3, 2025
* updated
* mypy
* updated
* stash
* updated (×7)
* update typing

Signed-off-by: [email protected] <[email protected]>
heheda12345 added a commit to heheda12345/vllm that referenced this pull request Sep 29, 2025