Hi,
We tried to reproduce this issue, but the ONNX model works fine in our environment.
Here are our steps for your reference:
1. test.py
import tensorflow as tf
import keras2onnx
import onnx

batch_size = 1
input_height = 224
input_width = 224

# NCHW (channels-first) input with a fixed batch size
input_layer = tf.keras.layers.Input(batch_shape=[batch_size, 3, input_height, input_width], dtype=tf.float32)

min_val = 0.5  # constant divisor for the sigmoid branch (renamed from "min" to avoid shadowing the Python builtin)
x1 = tf.keras.layers.Conv2D(16, 5, padding="same", data_format="channels_first")(input_layer)
x2 = tf.keras.layers.Conv2D(1, 3, padding="same", data_format="channels_first", activation="sigmoid")(x1) / min_val
concat = tf.concat([x1, x2], axis=1)
x3 = tf.keras.layers.Conv2D(16, 5, padding="same", data_format="channels_first")(concat)

# Model with two outputs, exported to ONNX
model = tf.keras.Model([input_layer], [x2, x3])
onnx_model = keras2onnx.convert_keras(model, model.name)
onnx.save_model(onnx_model, "output.onnx")
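As a side note, before feeding the exported file to TensorRT you can run a quick structural check with the onnx package that is already imported above. This is only a minimal sketch using the standard onnx checker API; it assumes the output.onnx produced by the script above:

import onnx

model = onnx.load("output.onnx")
onnx.checker.check_model(model)                   # raises if the graph or opset is malformed
print(onnx.helper.printable_graph(model.graph))   # prints the node list for inspection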
2. Test with trtexec:
$ python3 test.py
$ /usr/src/tensorrt/bin/trtexec --onnx=output.onnx
The model works fine with trtexec.
We recommend running your model through trtexec as well, to check whether the failure reproduces outside your application.
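If trtexec fails to parse your model, the TensorRT Python API can print the individual parser errors, which usually points at the problematic node. This is only a minimal parse-only sketch, assuming TensorRT 7.x with the ONNX parser bindings installed and the output.onnx generated above:

import tensorrt as trt

# Parse-only check: mirrors what trtexec does before building an engine
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(TRT_LOGGER)
# keras2onnx exports an explicit-batch graph, so create the network with the EXPLICIT_BATCH flag
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("output.onnx", "rb") as f:
    if parser.parse(f.read()):
        print("ONNX model parsed successfully")
    else:
        for i in range(parser.num_errors):
            print(parser.get_error(i))   # each error names the node/op the parser rejected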
3. Here are the library versions in our environment:
$ pip3 freeze
Keras-Applications==1.0.8
Keras-Preprocessing==1.1.2
keras2onnx==1.7.0
onnx==1.7.0
onnxconverter-common==1.7.0
tensorflow==1.15.3+nv20.7
Thanks.