Error trying to export models with timm encoder to onnx format #589

@rgsousa88

Description

Hi,

Is there any known issue with ONNX export when using timm models as encoders? I'm trying to export a model with timm-MobileNetV3 as the encoder and FPN as the decoder.

I'm running the script in a conda environment with python=3.7, pytorch=1.8, segmentation_models_pytorch=0.2.1.

Building the network:

import torch
import segmentation_models_pytorch as smp

def build_fpn_mobv3(input_shape):
    model = smp.FPN(encoder_name="timm-mobilenetv3_large_100",
                    encoder_weights=None,
                    in_channels=3,
                    classes=1,
                    activation="sigmoid")

    # Run a dummy forward pass to verify the model builds correctly
    shape = (1, 3) + input_shape
    x = torch.zeros(shape, dtype=torch.float32, device=torch.device('cpu'))
    model.eval()
    model(x)

    return model

Exporting the trained model:

import os

import torch

# ckpt_path, model_name, args, net, and device come from the surrounding script
onnxpath = os.path.join(ckpt_path, f"{model_name}.onnx")
shape = (1, 3) + args.shape
dummy_input = torch.zeros(shape, requires_grad=True).float().to(device)
net.eval()

torch.onnx.export(net, dummy_input, onnxpath,
                  export_params=True, verbose=True,
                  input_names=['image'], output_names=['maps'],
                  keep_initializers_as_inputs=False,
                  dynamic_axes={'image': {0: 'batch'}, 'maps': {0: 'batch'}},
                  opset_version=10)

The error is:
RuntimeError: Unsupported: ONNX export of Pad in opset 9. The sizes of the padding must be constant. Please try opset version 11.

Is there any workaround to export using opset version 10? I'm able to export models with other encoders, but none of the timm ones work.

Thanks for your attention and time.
