Assertion failed: (axes.allValuesKnown())

Please provide complete information as applicable to your setup.

• Hardware Platform (GPU)
• DeepStream Version: 6.1
• TensorRT Version: 8.2
• NVIDIA GPU Driver Version: 515

Dear professor:
I am working with YOLOv4-tiny. I have converted the PyTorch model to ONNX, but when I use TensorRT to build the engine, it fails.

The ONNX model and cfg are below:
yolov4_-1_3_416_416_dynamic.onnx (22.5 MB)

yolov4-tiny-person.cfg (2.9 KB)

(1) I checked the ONNX model with:

import onnx
filename = "yolov4_1_3_416_416_static.onnx"
model = onnx.load(filename)
onnx.checker.check_model(model)
print(onnx.helper.printable_graph(model.graph))

It reported no errors.

(2) I used this command to build the engine:
/usr/src/tensorrt/bin/trtexec --onnx=yolov4_-1_3_416_416_dynamic.onnx --minShapes=input:1x3x416x416 --optShapes=input:8x3x416x416 --maxShapes=input:8x3x416x4160 --workspace=4096 --saveEngine=my.engine --fp16 --verbose

I got this error:
...
08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:103: Parsing node: Constant_79 [Constant]
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:125: Constant_79 [Constant] inputs:
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:179: Constant_79 [Constant] outputs: [onnx::Slice_179 → (1)],
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:103: Parsing node: Slice_80 [Slice]
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Shape_161
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_176
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_177
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_1101
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_179
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:125: Slice_80 [Slice] inputs: [onnx::Shape_161 → (-1, 128, 52, 52)], [onnx::Slice_176 → (1)], [onnx::Slice_177 → (1)], [onnx::Slice_1101 → (1)], [onnx::Slice_179 → (1)],
ERROR: builtin_op_importers.cpp:3122 In function importSlice:
[8] Assertion failed: axes.allValuesKnown()
[08/08/2022-10:07:03] [E] Failed to parse onnx file
[08/08/2022-10:07:03] [E] Parsing model failed
[08/08/2022-10:07:03] [E] Engine creation failed
[08/08/2022-10:07:03] [E] Engine set up failed
...

Here I found that TensorRT tries to find "onnx::Shape_161", "onnx::Slice_176", "onnx::Slice_177", "onnx::Slice_1101", and "onnx::Slice_179".
However, none of them appear in the output of print(onnx.helper.printable_graph(model.graph)).

What is wrong? Could you help me?


This looks TensorRT-related. We are moving this post to the TensorRT forum for better help.

Thank you very much

Hi,

TensorRT does not currently support dynamic axes for Slice.
We recommend running the Polygraphy tool to fold constants in the model:

polygraphy surgeon sanitize model.onnx --fold-constants --output model_folded.onnx

Also, it looks like you mistyped a dimension in the command (--maxShapes has 416x4160 instead of 416x416). Please use the command as follows.

&&&& PASSED TensorRT.trtexec [TensorRT v8402] # /usr/src/tensorrt/bin/trtexec --onnx=model_folded.onnx --minShapes=input:1x3x416x416 --optShapes=input:8x3x416x416 --maxShapes=input:8x3x416x416 --workspace=4096 --saveEngine=my.engine --fp16 --verbose

Thank you.

Hi,
Please share the ONNX model and the script, if not already shared, so that we can assist you better.
Meanwhile, you can try a few things:

1. Validate your model with the snippet below:

check_model.py

import onnx
filename = "your_model.onnx"  # replace with the path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)

2. Try running your model with the trtexec command.

If you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!

Thank you very much. I will try.