
Conversation

@yucai-intel (Contributor) commented on Oct 10, 2025:

To solve #2207, this PR adds FP8 data type support for `torch.cat` and `torch.where` on the XPU backend.
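
As a rough usage illustration (not code from this PR), the snippet below exercises the newly enabled paths; it assumes an XPU-enabled PyTorch build and the availability of the `torch.float8_e4m3fn` dtype.

```python
import torch

# Illustrative only: exercises torch.cat and torch.where with an FP8 dtype on XPU.
# Assumes a PyTorch build with XPU support and torch.float8_e4m3fn available.
device = "xpu"

a = torch.randn(4, 8).to(torch.float8_e4m3fn).to(device)
b = torch.randn(4, 8).to(torch.float8_e4m3fn).to(device)

# Concatenate two FP8 tensors along the first dimension.
c = torch.cat([a, b], dim=0)

# Select elements from FP8 tensors based on a boolean mask.
mask = torch.rand(4, 8, device=device) > 0.5
w = torch.where(mask, a, b)

print(c.shape, c.dtype)  # torch.Size([8, 8]) torch.float8_e4m3fn
print(w.shape, w.dtype)  # torch.Size([4, 8]) torch.float8_e4m3fn
```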

@yucai-intel changed the title from "Add FP8 data types to some ops" to "Add FP8 data types to concat_xpu and where_xpu" on Oct 10, 2025
@yucai-intel changed the title from "Add FP8 data types to concat_xpu and where_xpu" to "Enable FP8 concat_xpu and where_xpu" on Oct 10, 2025
@yucai-intel (Contributor, Author) commented:

torch.cat() float8 UT (screenshot of unit test results)

@yucai-intel (Contributor, Author) commented:

torch.where() float8 UT (screenshot of unit test results)

Copilot AI (Contributor) left a comment:

Pull Request Overview

This PR enables FP8 (float8) dtype support for the concat_xpu and where_xpu operations by updating the kernel dispatchers and adding comprehensive regression tests.

  • Updates where_kernel to use AT_DISPATCH_V2 macro with FP8 dtype support
  • Adds FP8 dtype support to cat_out_kernel dispatcher
  • Adds regression tests for torch.where() and torch.cat() with FP8 dtypes (see the test sketch after this list)
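
The PR's actual tests live in test/regressions/test_where.py and test/regressions/test_cat.py; the snippet below is only a sketch of the typical regression pattern (run the op on XPU and compare against a CPU reference after upcasting to float32). It assumes an XPU-enabled build, the `torch.float8_e4m3fn` dtype, and CPU support for these ops at that dtype; the `check_fp8_cat_and_where` helper is a hypothetical name, not part of the test files.

```python
import torch

# Sketch of an FP8 regression check; not the PR's actual test code.
def check_fp8_cat_and_where(dtype=torch.float8_e4m3fn):
    a_cpu = torch.randn(3, 5).to(dtype)
    b_cpu = torch.randn(3, 5).to(dtype)
    cond = torch.rand(3, 5) > 0.5

    a_xpu, b_xpu, cond_xpu = a_cpu.to("xpu"), b_cpu.to("xpu"), cond.to("xpu")

    # torch.cat: the XPU result should match the CPU reference.
    ref_cat = torch.cat([a_cpu, b_cpu], dim=1)
    out_cat = torch.cat([a_xpu, b_xpu], dim=1)
    torch.testing.assert_close(out_cat.cpu().to(torch.float32),
                               ref_cat.to(torch.float32))

    # torch.where: element selection should be backend-independent.
    ref_where = torch.where(cond, a_cpu, b_cpu)
    out_where = torch.where(cond_xpu, a_xpu, b_xpu)
    torch.testing.assert_close(out_where.cpu().to(torch.float32),
                               ref_where.to(torch.float32))

check_fp8_cat_and_where()
```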

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| src/ATen/native/xpu/sycl/TensorCompareKernels.cpp | Updates where_kernel to use the AT_DISPATCH_V2 macro and adds FP8 dtype support |
| src/ATen/native/xpu/sycl/Shape.cpp | Adds FP8 dtype support to the cat_out_kernel dispatcher |
| test/regressions/test_where.py | Adds comprehensive regression tests for torch.where() with FP8 dtypes |
| test/regressions/test_cat.py | Adds regression tests for torch.cat() with FP8 dtypes |


@CuiYifeng (Contributor) left a comment:

The main part of this PR looks good to me.

@CuiYifeng requested a review from liangan1 on October 31, 2025 at 06:38
@CuiYifeng added this to the PT2.10 milestone on Oct 31, 2025
@CuiYifeng added this pull request to the merge queue on Nov 3, 2025
Merged via the queue into main with commit a3efbb3 on Nov 3, 2025 (49 of 50 checks passed)
@CuiYifeng deleted the yucai/fp8 branch on November 3, 2025 at 02:17
RUIJIEZHONG66166 pushed a commit that referenced this pull request Nov 4, 2025
To solve #2207 
This PR adds FP8 data types support for `torch.cat` and `torch.where` on
XPU backend.

---------

Co-authored-by: Cui, Yifeng <[email protected]>
Co-authored-by: Copilot <[email protected]>
wincent8 pushed a commit that referenced this pull request Nov 5, 2025