Support layernorm without bias #585
This pull request was exported from Phabricator. Differential Revision: D82171738

Summary: The current layernorm kernel only supports the bias case, so this PR adds support for the no-bias case.
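For context, here is what the two cases look like at the plain PyTorch API level. This is a minimal illustration only; it uses torch.nn.functional.layer_norm rather than the kernel changed by this PR:

    import torch
    import torch.nn.functional as F

    x = torch.randn(32, 64)
    weight = torch.randn(64)

    # Bias case (the only one supported before this PR):
    bias = torch.randn(64)
    y = F.layer_norm(x, [64], weight, bias)

    # No-bias case (what this PR adds): passing bias=None skips the
    # final "+ bias" after scaling by weight.
    y_no_bias = F.layer_norm(x, [64], weight, None)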
| baseline_name="torch", | ||
| rtol=1e-3, | ||
| atol=1e-3, | ||
| ) |
@mengluy0125 thanks for the PR! Would you like to add the no-bias test in test_examples.py too?
Lines 614 to 628 in 247be92:
    def test_layernorm(self):
        x = torch.randn([32, 64], device=DEVICE, dtype=torch.float16)
        weight = torch.randn([64], device=DEVICE, dtype=torch.float16)
        bias = torch.randn([64], device=DEVICE, dtype=torch.float16)
        args = (x, [64], weight, bias)
        self.assertExpectedJournal(
            check_example(
                "layer_norm",
                args,
                torch.nn.functional.layer_norm(*args),
                fn_name="layer_norm_fwd",
            )
        )
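A no-bias companion test could mirror the one above. The sketch below is an assumption rather than code from this PR: it presumes check_example forwards a None bias unchanged and that the example's layer_norm_fwd accepts it.

    def test_layernorm_no_bias(self):
        # Hypothetical no-bias variant of test_layernorm above; assumes the
        # example kernel treats bias=None as "skip the + bias add".
        x = torch.randn([32, 64], device=DEVICE, dtype=torch.float16)
        weight = torch.randn([64], device=DEVICE, dtype=torch.float16)
        args = (x, [64], weight, None)
        self.assertExpectedJournal(
            check_example(
                "layer_norm",
                args,
                torch.nn.functional.layer_norm(*args),
                fn_name="layer_norm_fwd",
            )
        )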
yf225 left a comment:
thanks @mengluy0125!
Pull Request resolved: pytorch#585
Differential Revision: D82171738