Description
aten.gelu + aten.tanh
- Function Schema:
  torch.ops.aten.gelu.default: ((torch.float32,), {}),
  torch.ops.aten.tanh.default: ((torch.float32,), {})
- Original PyTorch API: torch.nn.functional.gelu, torch.tanh
- Relevant TensorRT Documentation: IActivationLayer (a minimal tanh sketch follows this list)
- The implementation can potentially take inspiration from the existing TorchScript GeLU lowering pass (see the decomposition sketch at the end)
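For tanh the mapping is direct, since IActivationLayer natively supports a TANH mode. Below is a minimal sketch using the raw TensorRT Python API, assuming the surrounding converter machinery hands over the network and input tensor; the function name `convert_aten_tanh` is illustrative, not the actual Torch-TensorRT registration API:

```python
import tensorrt as trt

def convert_aten_tanh(network: trt.INetworkDefinition, input_tensor: trt.ITensor) -> trt.ITensor:
    # aten.tanh.default maps one-to-one onto a single IActivationLayer.
    layer = network.add_activation(input_tensor, trt.ActivationType.TANH)
    layer.name = "aten.tanh.default"
    return layer.get_output(0)
```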
Add support for gelu and tanh as aten converters.
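GELU is less direct: if the target TensorRT version exposes no dedicated GELU activation type, the converter has to decompose the op into elementwise primitives, which is what the TorchScript lowering pass does. As a reference for that decomposition, here is the standard tanh approximation of GELU in plain PyTorch; in the emitted TensorRT network each elementwise op would become an IElementWiseLayer and the tanh an IActivationLayer. Whether the existing pass uses this form or the exact erf form is not stated here, so treat this as one possible decomposition:

```python
import math
import torch

def gelu_tanh_approx(x: torch.Tensor) -> torch.Tensor:
    # GELU(x) ~= 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3)))
    inner = math.sqrt(2.0 / math.pi) * (x + 0.044715 * x.pow(3))
    return 0.5 * x * (1.0 + torch.tanh(inner))
```

As a quick sanity check, this should match torch.nn.functional.gelu(x, approximate="tanh") to within floating-point tolerance.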