trtexec fails on flipped image with "slice size must be positive"

I created a simple network in TensorFlow that contains a horizontal flip of an input image:

import tensorflow as tf

ph = tf.placeholder(tf.float32, (1, 16, 32, 1), name='ph')
x = tf.concat([ph, ph[:, :, ::-1]], axis=0)  # ph[:, :, ::-1] reverses axis 2 (the width)
y = tf.add(x, 1, name='y')

I converted it to ONNX format using tf2onnx. However, trtexec failed on the generated ONNX model with:

[E] [TRT] strided_slice: slice size must be positive, size = [1,16,-32,1]
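
For reference, a hedged sketch of how the export and conversion might look, continuing from the snippet above (the GraphDef export, the file names, and the opset choice are assumptions, not details given in this post):

# Continuing from the graph built above: write the GraphDef to disk,
# then convert with tf2onnx and run trtexec on the result.
with tf.Session() as sess:
    tf.train.write_graph(sess.graph_def, '.', 'model.pb', as_text=False)

# From a shell (negative Slice steps require ONNX opset >= 10):
#   python -m tf2onnx.convert --graphdef model.pb --inputs ph:0 --outputs y:0 --output model.onnx --opset 10
#   trtexec --onnx=model.onnx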

Is tensor flipping supported in TensorRT? If so, how should it be handled?

Hi,

It looks like a negative stride step should be handled here: onnx-tensorrt/builtin_op_importers.cpp at main · onnx/onnx-tensorrt · GitHub

But I tried building the most up-to-date OSS from source, and it still fails with the same error:

[02/06/2020-20:51:25] [E] [TRT] strided_slice: slice size must be positive, size = [1,16,-32,1]

I’ll look into this. Thanks for bringing it up, and thanks for the minimal repro!

Hi,

Attached is a screenshot of the ONNX graph. On axis[2] = 2, the slice attempts to go from starts[2] = 0 to ends[2] = 10000 while stepping backwards with steps[2] = -1, which doesn’t make sense.

Can you double-check your model? Does the slice make sense in the TensorFlow model? If it does, this may be an error in the tf2onnx conversion, in which case you might want to post an issue at https://github.com/onnx/tensorflow-onnx/issues.

[Screenshot of the ONNX graph]
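
One way to double-check what the exporter produced is to print the Slice node’s inputs directly (a minimal sketch, assuming the exported file is named model.onnx and that starts/ends/axes/steps are stored as initializers):

import onnx
from onnx import numpy_helper

model = onnx.load("model.onnx")
# starts/ends/axes/steps may be stored as initializers (constant tensors).
inits = {t.name: numpy_helper.to_array(t) for t in model.graph.initializer}

for node in model.graph.node:
    if node.op_type == "Slice":
        # Opset-10 Slice takes starts, ends, axes, steps as extra inputs.
        print(node.name, [inits.get(name, name) for name in node.input[1:]])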

I don’t know the details of how ONNX works, but I think the exported model is working as intended. Although the default values for the begin/end indices are indeed rather strange, when I run

shape_inference.infer_shapes(model)

on my ONNX model, I get the correct shapes:

[name: "strided_slice:0"
type {
  tensor_type {
    elem_type: 1
    shape {
      dim {
        dim_value: 1
      }
      dim {
        dim_value: 16
      }
      dim {
        dim_value: 32
      }
      dim {
        dim_value: 1
      }
    }
  }
}
, name: "concat:0"
type {
  tensor_type {
    elem_type: 1
    shape {
      dim {
        dim_value: 2
      }
      dim {
        dim_value: 16
      }
      dim {
        dim_value: 32
      }
      dim {
        dim_value: 1
      }
    }
  }
}
]

I tried feeding this model with the inferred shapes to trtexec, but got the same error.
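
In case it helps, the round trip above looks roughly like this (a sketch; the file names are assumptions):

import onnx
from onnx import shape_inference

model = onnx.load("model.onnx")
inferred = shape_inference.infer_shapes(model)
print(inferred.graph.value_info)            # prints the value_info shown above
onnx.save(inferred, "model_inferred.onnx")  # then: trtexec --onnx=model_inferred.onnx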

The ONNX model may be corrupt after all, since I cannot use it for inference with ONNX Runtime. I filed an issue on the tf2onnx repository and will update here if needed. Thanks.
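
For reference, a minimal ONNX Runtime check along these lines (the file name and the input name 'ph:0', taken from TensorFlow's tensor naming, are assumptions):

import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
out = sess.run(None, {"ph:0": np.zeros((1, 16, 32, 1), dtype=np.float32)})
print(out[0].shape)  # expected (2, 16, 32, 1): the original batch plus the flipped copy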

Thanks for the update. Can you link to the issue as well just for future reference?

Sure, it can be found here:

https://github.com/onnx/tensorflow-onnx/issues/810