Describe the bug
When trying to convert a TensorFlow SavedModel that contains several Bidirectional ConvLSTM2D layers with python -m tf2onnx.convert --saved-model ... I get the following error:
ValueError: node BiasAdd output needs to be rank 4, is 3
Urgency
Not urgent.
System information
- OS: Arch Linux
- TensorFlow: 2.2 & 2.5 (tried both)
- Python: 3.8
Model
This is the model I used (pastebin.com)
Terminal Output
Output from terminal (pastebin.com)
To Reproduce
Run
python -m tf2onnx.convert --saved-model /saved_model_dir --tag=serve --opset 10 --output /output_dir/model.onnx --continue_on_error
from a terminal on a SavedModel containing Bidirectional ConvLSTM2D layers.
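For reference, a minimal model along these lines should exercise the same Bidirectional ConvLSTM2D path. This is a hypothetical sketch, not my actual architecture (that one is in the pastebin link above):

```python
from tensorflow.keras import layers, models

# Hypothetical minimal example: two stacked Bidirectional ConvLSTM2D
# layers followed by a plain Conv2D head.
inputs = layers.Input(shape=(5, 32, 32, 3))  # (time, height, width, channels)
x = layers.Bidirectional(
    layers.ConvLSTM2D(8, kernel_size=3, padding="same", return_sequences=True)
)(inputs)
x = layers.Bidirectional(
    layers.ConvLSTM2D(8, kernel_size=3, padding="same", return_sequences=False)
)(x)
outputs = layers.Conv2D(1, kernel_size=1, activation="sigmoid")(x)
model = models.Model(inputs, outputs)

# Save in SavedModel format, then run the tf2onnx command above on it.
model.save("/saved_model_dir")
```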
Expected behavior
The SavedModel converts to ONNX without errors.
Additional context
I also tried opsets 11 and 12, as well as the --fold_const option, but neither helped.
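In case it helps with debugging, here is the equivalent conversion driven from the Python API instead of the CLI. This is only a sketch: it assumes tf2onnx.convert.from_keras, which is available in recent tf2onnx releases, and I have not checked whether it behaves any differently from the CLI call above (which is what I actually ran):

```python
import tensorflow as tf
import tf2onnx
import onnx

# Load the SavedModel and convert it with the same opset as the CLI call.
model = tf.keras.models.load_model("/saved_model_dir")
model_proto, _ = tf2onnx.convert.from_keras(model, opset=10)
onnx.save(model_proto, "/output_dir/model.onnx")
```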