docs/source/getting_started/installation/gpu

vLLM supports AMD GPUs with ROCm 6.2.

Currently, there are no pre-built ROCm wheels.

However, the [AMD Infinity hub for vLLM](https://hub.docker.com/r/rocm/vllm/tags) offers a prebuilt, optimized
docker image designed for validating inference performance on the AMD Instinct™ MI300X accelerator.

```{tip}
Please check [LLM inference performance validation on AMD Instinct MI300X](https://rocm.docs.amd.com/en/latest/how-to/performance-validation/mi300x/vllm-benchmark.html)
for instructions on how to use this prebuilt docker image.
```
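As a rough sketch of how a prebuilt image from that registry is typically used, the commands below pull a tag from `rocm/vllm` and start a container with the device mappings ROCm containers generally require. The tag name and run flags are illustrative assumptions, not the officially recommended invocation; see the AMD documentation linked in the tip above for that.

```console
# Pull a prebuilt vLLM image (the tag is a placeholder; choose one from
# https://hub.docker.com/r/rocm/vllm/tags).
$ docker pull rocm/vllm:latest

# Start the container with the GPU device nodes and settings commonly
# needed for ROCm workloads; adjust the flags to your environment.
$ docker run -it --rm \
    --device=/dev/kfd \
    --device=/dev/dri \
    --group-add video \
    --ipc=host \
    --shm-size 16G \
    rocm/vllm:latest
```
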
### Build wheel from source
0. Install prerequisites (skip if you are already in an environment/docker with the following installed):