ONNX parser

I exported a PyTorch network to ONNX with success (tested with ONNX Runtime, which gives the correct answer). When parsing it in TensorRT, I get this message (no more info, even when logging is set to “verbose”):
[01/27/2021-10:19:44] [V] [TRT] ModelImporter.cpp:129: If_22 [If] inputs: [255 → (1)],
While parsing node number 22 [If]:
ERROR: ModelImporter.cpp:134 In function parseGraph:
[8] No importer registered for op: If
This error occurs in a subnetwork’s forward() at the following line:
output = output.transpose(4, 2).squeeze(2)
The squeeze(2) is what causes the problem. The tensor has shape (1, 38, 128, 17, 1), which becomes (1, 38, 1, 17, 128) after the transpose, and I want to get rid of the third dimension (dim = 2).
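
For clarity, here is a minimal standalone sketch of that step with the shapes above (a dummy tensor, not the actual network code):

```python
import torch

# Dummy tensor with the shape mentioned above (not the real network output)
output = torch.randn(1, 38, 128, 17, 1)

out = output.transpose(4, 2)  # -> (1, 38, 1, 17, 128)
out = out.squeeze(2)          # -> (1, 38, 17, 128); this squeeze is where the If node shows up in the export
```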
I tried an alternative to squeeze(2), output = torch.mean(output, 2) on the transposed output (which also drops dim 2), but then I get a new error:
(input0)=[1,1,128,17,17][NONE] dims(input1)=[1,38,128,17,1][NONE].
[01/27/2021-10:33:41] [E] [TRT] Layer: Where_102’s output can not be used as shape tensor.
[01/27/2021-10:33:41] [E] [TRT] Network validation failed.
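
For completeness, the same dummy-tensor sketch with the torch.mean variant I tried:

```python
import torch

# Same dummy tensor as above; torch.mean over dim 2 instead of squeeze(2)
output = torch.randn(1, 38, 128, 17, 1)

out = output.transpose(4, 2)  # -> (1, 38, 1, 17, 128)
out = torch.mean(out, 2)      # -> (1, 38, 17, 128); also drops dim 2, but leads to the Where_102 error above
```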
It fails at a “Where” node, which is defined as:
Return elements, either from X or Y, depending on condition (with Numpy-style broadcasting support). Where behaves like numpy.where with three parameters: numpy.where — NumPy v1.23 Manual
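
Just to illustrate those semantics with the NumPy equivalent (a toy example, not my actual graph):

```python
import numpy as np

# Where(condition, X, Y): pick from X where condition is True, else from Y,
# with NumPy-style broadcasting of all three inputs
condition = np.array([True, False, True])
x = np.array([1, 2, 3])
y = np.array([10, 20, 30])

print(np.where(condition, x, y))  # [ 1 20  3]
```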

Any idea how to solve this?