[Example] Layer Norm Forward #170
Conversation
```python
for n in nodes:
    if "output_index" in n.meta:
        output_nodes[n.meta["output_index"]] = n.name
import pdb; pdb.set_trace()
```
Suggested change:
```diff
-import pdb; pdb.set_trace()
+breakpoint()
```
same thing but shorter.
```python
if __name__ == "__main__":
    main()
```
BTW, we probably need to add a unit test to test_examples.py similar to the other examples.
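A minimal sketch of what such a test might look like, checking the example against torch.nn.functional.layer_norm. The import path, kernel name, and call signature are assumptions, and the real test_examples.py harness may use its own helpers:

```python
import torch
import torch.nn.functional as F

# Assumed module path and kernel name; the actual example may differ.
from examples.layer_norm import layer_norm_fwd


def test_layer_norm_forward() -> None:
    batch, dim = 32, 64
    x = torch.randn(batch, dim, device="cuda", dtype=torch.float16)
    weight = torch.randn(dim, device="cuda", dtype=torch.float16)
    bias = torch.randn(dim, device="cuda", dtype=torch.float16)

    # Assumes the kernel mirrors F.layer_norm's signature.
    out = layer_norm_fwd(x, [dim], weight, bias)
    expected = F.layer_norm(x, [dim], weight, bias)
    torch.testing.assert_close(out, expected, rtol=1e-2, atol=1e-2)
```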
yf225 left a comment:
thanks a lot!
jansel left a comment:
ghstack doesn't work in this repo; https://github.com/modular/stack-pr does, though.
```python
    return out


def helion_layer_norm_wrapper(
```
Why do we need a wrapper?
PyTorch's layer_norm (https://docs.pytorch.org/docs/stable/generated/torch.nn.functional.layer_norm.html) takes normalized_shape as its second argument, a list[int]. The wrapper works around that by matching the same signature without having to pass normalized_shape into the kernel.
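For context, a minimal sketch of the kind of wrapper being discussed, assuming the kernel infers the normalized dimension from weight; helion_layer_norm_wrapper appears in the diff, but the body below and the kernel name layer_norm_kernel are placeholders rather than the exact code in this PR:

```python
import torch


def helion_layer_norm_wrapper(
    x: torch.Tensor,
    normalized_shape: list[int],
    weight: torch.Tensor,
    bias: torch.Tensor,
    eps: float = 1e-5,
) -> torch.Tensor:
    # Matches torch.nn.functional.layer_norm's signature but drops
    # normalized_shape: the kernel can infer it from weight.shape.
    assert list(weight.shape) == list(normalized_shape)
    return layer_norm_kernel(x, weight, bias, eps)  # hypothetical kernel name
```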
We should be able to pass it into the kernel; I think we can remove this.
```python
import helion.language as hl


# TODO(PaulZhang12): Support autotuning, setting reduction_loops currently errors
```
What is the error you are getting? Do you need help fixing this one? For benchmarking we need to run the autotuner.
I filed issue #345; I haven't had time to look into it yet, but I can address it before merging.
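For reference, a rough sketch of what pinning reduction_loops might look like once that issue is fixed, assuming helion.Config exposes block_sizes and reduction_loops as the TODO suggests; the values and kernel signature are illustrative only, not taken from this PR:

```python
import torch
import helion

# Illustrative config only; the concrete values would normally come from autotuning.
@helion.kernel(
    config=helion.Config(
        block_sizes=[1],
        reduction_loops=[8],  # tile the reduction over the normalized dimension
    )
)
def layer_norm_fwd(
    x: torch.Tensor,
    weight: torch.Tensor,
    bias: torch.Tensor,
    eps: float = 1e-5,
) -> torch.Tensor:
    ...  # kernel body as in the example
```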