Conversation

@Athe-kunal
Contributor

  1. Fixes the softmax operation (`torch.max` is not supported)
  2. For the convolution kernel, I fixed the shape and device issues, but the result does not match `conv2d_spec`

I was getting an error, which was resolved after enabling gradient tracking for `y`:

```
RuntimeError: One of the differentiated Tensors does not require grad
```
Enable gradient tracking for `y` in `mul_relu_block_back_spec`
Change the accumulator initialization to use the input tensor's device, and use `torch.zeros` instead of `torch.zeros_like`, since the latter takes a template tensor rather than a shape
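The gradient-tracking fix follows the usual `torch.autograd.grad` pattern. A minimal sketch (hypothetical function name and shapes; not the actual puzzle spec):

```python
import torch

def mul_relu_block_back_sketch(x, y, dz):
    # torch.autograd.grad raises "One of the differentiated Tensors does not
    # require grad" unless every tensor we differentiate with respect to has
    # gradient tracking enabled.
    x = x.clone().requires_grad_(True)
    y = y.clone().requires_grad_(True)  # the fix: y must also track gradients
    z = torch.relu(x * y)
    dx, dy = torch.autograd.grad(z, [x, y], grad_outputs=dz)
    return dx, dy
```

Without the `requires_grad_(True)` call on `y`, differentiating with respect to `[x, y]` reproduces the RuntimeError above.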
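The accumulator change might look like this (a sketch, assuming a simple row reduction; the actual kernel code differs):

```python
import torch

def row_sum(x: torch.Tensor) -> torch.Tensor:
    # torch.zeros takes a shape, so the device and dtype must be passed
    # explicitly; torch.zeros_like would instead take a template tensor,
    # which is not always available at the accumulator's shape.
    acc = torch.zeros(x.shape[0], device=x.device, dtype=x.dtype)
    for j in range(x.shape[1]):
        acc += x[:, j]
    return acc
```

Passing `device=x.device` keeps the accumulator on the same device as the input, avoiding cross-device errors when the kernel runs on GPU.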
`torch.max` is not supported in Helion kernels
Updated tensor operations to ensure compatibility with the input tensor's device.
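For the softmax fix, the max-subtraction step can be written without `torch.max`. A plain-PyTorch sketch of the numerically stable softmax (whether `torch.amax` is the replacement Helion accepts is an assumption here):

```python
import torch

def stable_softmax(x: torch.Tensor) -> torch.Tensor:
    # Subtract the row max before exponentiating for numerical stability.
    # torch.amax returns only the values, unlike torch.max with dim=,
    # which returns a (values, indices) tuple.
    m = torch.amax(x, dim=-1, keepdim=True)
    e = torch.exp(x - m)
    return e / e.sum(dim=-1, keepdim=True)
```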
@meta-cla meta-cla bot added the CLA Signed label Nov 8, 2025
Contributor

@jansel jansel left a comment

Thanks for the fix, just a minor nit.

@Athe-kunal
Contributor Author

Athe-kunal commented Nov 8, 2025

@jansel
The test cases for the Conv2D puzzle are failing, though. I am trying to debug, but I am not sure which part is going wrong:

```
AssertionError: Tensor-likes are not close!

Mismatched elements: 256 / 256 (100.0%)
Greatest absolute difference: 13.31396770477295 at index (0, 2, 3) (up to 1e-05 allowed)
Greatest relative difference: inf at index (1, 0, 7) (up to 1.3e-06 allowed)
```

@jansel jansel merged commit 5d0cd02 into pytorch:main Nov 9, 2025
14 of 15 checks passed
@jansel
Contributor

jansel commented Nov 9, 2025

It looks like the test passed in CI.

@Athe-kunal
Contributor Author

Hmm, I see; probably some issue with the dtypes.
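A dtype mismatch between the kernel output and the spec would indeed trip `torch.testing.assert_close`, which checks dtypes by default. An illustrative example (not the actual test code):

```python
import torch

a = torch.ones(3, dtype=torch.float32)
b = torch.ones(3, dtype=torch.float16)

try:
    # Fails even though the values match, because the dtypes differ.
    torch.testing.assert_close(a, b)
except AssertionError as e:
    print("dtype mismatch:", e)

# Relaxing the dtype check compares values after promotion, so this passes.
torch.testing.assert_close(a, b, check_dtype=False)
```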