[Rewriter]: fuse successive Relu/Clip nodes #2410
base: main
Conversation
❌ 13 Tests Failed (1 ❄️ flaky).
Pull Request Overview
This PR adds graph rewrite rules to fuse consecutive Relu and Clip operations, updates the test harness to control ONNX Runtime’s optimization level, and provides unit tests to validate the new transformations.
- Introduce four fusion rules (`Relu(Relu)`, `Relu(Clip)`, `Clip(Relu)`, `Clip(Clip)`) in `fuse_relus_clips.py` (sketched below)
- Extend `assert_numerically_equal` in `testing.py` to accept an `ort_optimization_level` argument
- Add comprehensive tests in `fuse_relus_clips_test.py` to cover valid and invalid fusion scenarios
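For illustration, here is a minimal sketch of the function-style pattern API that onnxscript's rewriter exposes for rules like these. The names below are illustrative only; per the review comments, the PR itself defines class-based rules such as `FuseSuccessiveReluClip` and instantiates them via `.rule()`:

```python
from onnxscript.rewriter import pattern

# Target pattern: two Relu nodes applied back-to-back.
def successive_relus(op, x):
    return op.Relu(op.Relu(x))

# Replacement: a single Relu. Relu is idempotent, so this is exact.
def single_relu(op, x):
    return op.Relu(x)

# Illustrative free-function form of one of the four fusion rules.
fuse_successive_relus_rule = pattern.RewriteRule(successive_relus, single_relu)
```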
Reviewed Changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| onnxscript/rewriter/fuse_relus_clips.py | Implement new RewriteRule classes and assemble them into a set. |
| onnxscript/rewriter/testing.py | Update test helper to pass through ONNX Runtime optimization level. |
| onnxscript/rewriter/fuse_relus_clips_test.py | Add unit tests for each fusion pattern and edge-case validations. |
Comments suppressed due to low confidence (2)
onnxscript/rewriter/fuse_relus_clips.py:161
- The variable name `fuse_sucessive_relu_clip_rule` has a typo (`sucessive` vs. `successive`). Rename it to `fuse_successive_relu_clip_rule` for consistency with the other rules, and update any references.

```python
fuse_sucessive_relu_clip_rule = FuseSuccessiveReluClip().rule()
```
onnxscript/rewriter/testing.py:27
- [nitpick] The `Args:` section in the docstring does not match the parameter order of the function signature. Consider reordering the entries so they follow `(original_model_proto, rewritten_model_proto, args, ort_optimization_level, rtol, atol)`.

```python
ort_optimization_level: Onnxruntime optimization level.
```
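For reference, a sketch of the signature order the comment proposes for `assert_numerically_equal`; the default values shown here are assumptions, not necessarily the PR's:

```python
import onnx
import onnxruntime as ort

def assert_numerically_equal(
    original_model_proto: onnx.ModelProto,
    rewritten_model_proto: onnx.ModelProto,
    args,
    ort_optimization_level=ort.GraphOptimizationLevel.ORT_ENABLE_ALL,
    rtol: float = 1e-7,
    atol: float = 1e-7,
) -> None:
    """Assert that the original and rewritten models are numerically equal.

    Args:
        original_model_proto: The model before rewriting.
        rewritten_model_proto: The model after rewriting.
        args: Input arguments fed to both models.
        ort_optimization_level: Onnxruntime optimization level.
        rtol: Relative tolerance.
        atol: Absolute tolerance.
    """
```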
```python
    expected_op_type: str,
    dtype: str = "float",
):
    base_model = ir.serde.deserialize_model(base_model)
```
Suggested change:

```diff
- base_model = ir.serde.deserialize_model(base_model)
```
```python
def run_test(
    self,
    base_model: onnx.ModelProto | ir.Model,
```
Suggested change:

```diff
- base_model: onnx.ModelProto | ir.Model,
+ base_model: ir.Model,
```
```python
model_proto = onnx.parser.parse_model("""
    < ir_version: 10, opset_import: ["" : 20] >
    test_model (float[N, 32, 14] X) => (float [N, ?, ?] Y)
    {
        x1 = Relu(X)
        x2 = Relu(x1)
        Y = Relu(x2)
    }
""")
self.run_test(model_proto, expected_op_type="Relu")
```
Suggested change:

```diff
- model_proto = onnx.parser.parse_model("""
+ model = ir.from_onnx_text("""
      < ir_version: 10, opset_import: ["" : 20] >
      test_model (float[N, 32, 14] X) => (float [N, ?, ?] Y)
      {
          x1 = Relu(X)
          x2 = Relu(x1)
          Y = Relu(x2)
      }
  """)
- self.run_test(model_proto, expected_op_type="Relu")
+ self.run_test(model, expected_op_type="Relu")
```
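Presumably the point of the suggestion is that `ir.from_onnx_text` yields an `ir.Model` directly, so the tests never construct an `onnx.ModelProto` that has to be round-tripped through `ir.serde.deserialize_model`; this pairs with the earlier suggestions to drop that call and to narrow `run_test`'s `base_model` parameter to `ir.Model`.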
Here and below, thanks!
Thank you - I think this can be part of the default rewrite rules
cc @gramalingam
Force-pushed from ab90aaf to e0f6332.

This PR adds the following transformations:

- Relu(Relu(X)) -> Relu
- Relu(Clip(X)) -> Clip
- Clip(Relu(X)) -> Clip
- Clip(Clip(X)) -> Clip
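To see why these fusions are exact, note that Relu is Clip with a lower bound of 0 and no upper bound, so composing the two nodes just composes their bounds. A quick numpy check for the `Clip(Relu(X))` case (values here are arbitrary):

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 0.5, 3.0], dtype=np.float32)

# Clip(Relu(x), min=-1, max=2): Relu is Clip(min=0), so the pair
# collapses to one Clip with min = max(0, -1) = 0 and max = 2.
unfused = np.clip(np.maximum(x, 0.0), -1.0, 2.0)
fused = np.clip(x, max(0.0, -1.0), 2.0)
assert np.array_equal(unfused, fused)
```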