From ac755f7c82062ad7a48f7cda0d10ceab6a6637a6 Mon Sep 17 00:00:00 2001
From: WangErXiao <863579016@qq.com>
Date: Sat, 8 Mar 2025 14:23:49 +0800
Subject: [PATCH] [Doc] Added QwQ-32B to the supported models list in the reasoning outputs documentation

Signed-off-by: WangErXiao <863579016@qq.com>
---
 docs/source/features/reasoning_outputs.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/source/features/reasoning_outputs.md b/docs/source/features/reasoning_outputs.md
index e5c03793f755..b5fad26368bd 100644
--- a/docs/source/features/reasoning_outputs.md
+++ b/docs/source/features/reasoning_outputs.md
@@ -13,6 +13,7 @@ vLLM currently supports the following reasoning models:
 | Model Series | Parser Name | Structured Output Support |
 |--------------|-------------|------------------|
 | [DeepSeek R1 series](https://huggingface.co/collections/deepseek-ai/deepseek-r1-678e1e131c0169c0bc89728d) | `deepseek_r1` | `guided_json`, `guided_regex` |
+| [QwQ-32B](https://huggingface.co/Qwen/QwQ-32B) | `deepseek_r1` | `guided_json`, `guided_regex` |
 
 ## Quickstart
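The table row above maps QwQ-32B to the `deepseek_r1` reasoning parser. As a minimal usage sketch (assuming a vLLM OpenAI-compatible server launched with the `--reasoning-parser` flag and the `reasoning_content` response field described elsewhere in the reasoning outputs doc; verify both against your vLLM version), a client request against QwQ-32B might look like this:

```python
# Hedged sketch: assumes a vLLM server is already running, e.g.
#   vllm serve Qwen/QwQ-32B --reasoning-parser deepseek_r1
# (flag name assumed from the surrounding reasoning-outputs docs).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/QwQ-32B",
    messages=[{"role": "user", "content": "How many r's are in 'strawberry'?"}],
)

message = response.choices[0].message
# The reasoning parser splits the model output into a reasoning trace and the
# final answer; the `reasoning_content` field name is assumed from the docs.
print("reasoning:", getattr(message, "reasoning_content", None))
print("answer:", message.content)
```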