
🐛 [Bug] Torch-TRT's prelu is slower than ONNX-TRT #3715

@zewenli98

Bug Description

In MONAI's UNet, I observed that Torch-TRT's prelu converter is slower than ONNX-TRT, and, worse, using the prelu converter is even slower than decomposing the op. The reason seems to be that additional layers are inserted into the graph:

ONNX-TRT:

```json
{ "name" : "PWN(val_9, PWN(node_prelu))", "timeMs" : 1.10378, "averageMs" : 0.00951531, "medianMs" : 0.009216, "percentage" : 0.0934733 }
```

Torch-TRT w/o decomposition:

```json
{ "name" : "__myl_Cast_myl3_1", "timeMs" : 0.95728, "averageMs" : 0.00878239, "medianMs" : 0.009216, "percentage" : 0.0743985 }
{ "name" : "model.0.conv.unit0.adn.A/_prelu_kernel_constant_1", "timeMs" : 0, "averageMs" : 0, "medianMs" : 0, "percentage" : 0 }
{ "name" : "PWN([PARAMETRIC_RELU]-[aten_ops._prelu_kernel.default]-[model.0.conv.unit0.adn.A/_prelu_kernel])", "timeMs" : 2.87757, "averageMs" : 0.0263997, "medianMs" : 0.026624, "percentage" : 0.223641 }
```

Torch-TRT w/ decomposition:

```json
{ "name" : "__myl_MulGtrSele_myl3_1", "timeMs" : 0.946529, "averageMs" : 0.00884606, "medianMs" : 0.009216, "percentage" : 0.0707629 }
```
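For context, a minimal sketch of what the decomposition path computes (assumption on my part: the `__myl_MulGtrSele` kernel name above suggests `aten._prelu_kernel` is being lowered to mul/greater/select elementwise ops, which fuse into a single Myelin kernel instead of hitting the dedicated `PARAMETRIC_RELU` converter):

```python
import torch
import torch.nn.functional as F

def prelu_decomposed(x: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
    # prelu(x) = x where x > 0, else weight * x
    # (mul + greater + select, matching the fused kernel's name)
    return torch.where(x > 0, x, weight * x)

x = torch.randn(2, 3, 4, 4)
w = torch.tensor([0.25])  # single shared slope for simplicity

# The decomposition is numerically equivalent to the prelu op itself.
assert torch.allclose(F.prelu(x, w), prelu_decomposed(x, w))
```

Since the decomposed form runs at roughly the same speed as the ONNX-TRT pointwise fusion, falling back to a decomposition for this op may be the simpler fix than improving the converter.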

Labels: bug