Please provide complete information as applicable to your setup.
• Hardware Platform (GPU)
• DeepStream Version: 6.1
• TensorRT Version: 8.2
• NVIDIA GPU Driver Version: 515
Dear professor:
I am working with YOLOv4-tiny. I have converted the PyTorch model to ONNX, but when I use TensorRT to build the engine, it fails.
The ONNX model and cfg are attached below:
yolov4_-1_3_416_416_dynamic.onnx (22.5 MB)
yolov4-tiny-person.cfg (2.9 KB)
(1) I checked the ONNX model with:
import onnx
filename = "yolov4_-1_3_416_416_dynamic.onnx"
model = onnx.load(filename)
onnx.checker.check_model(model)
print(onnx.helper.printable_graph(model.graph))
The checker reports no errors.
(2) I use the following command to build the engine:
/usr/src/tensorrt/bin/trtexec --onnx=yolov4_-1_3_416_416_dynamic.onnx --minShapes=input:1x3x416x416 --optShapes=input:8x3x416x416 --maxShapes=input:8x3x416x416 --workspace=4096 --saveEngine=my.engine --fp16 --verbose
I got this error:
...
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:103: Parsing node: Constant_79 [Constant]
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:125: Constant_79 [Constant] inputs:
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:179: Constant_79 [Constant] outputs: [onnx::Slice_179 → (1)],
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:103: Parsing node: Slice_80 [Slice]
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Shape_161
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_176
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_177
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_1101
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:119: Searching for input: onnx::Slice_179
[08/08/2022-10:07:03] [V] [TRT] ModelImporter.cpp:125: Slice_80 [Slice] inputs: [onnx::Shape_161 → (-1, 128, 52, 52)], [onnx::Slice_176 → (1)], [onnx::Slice_177 → (1)], [onnx::Slice_1101 → (1)], [onnx::Slice_179 → (1)],
ERROR: builtin_op_importers.cpp:3122 In function importSlice:
[8] Assertion failed: axes.allValuesKnown()
[08/08/2022-10:07:03] [E] Failed to parse onnx file
[08/08/2022-10:07:03] [E] Parsing model failed
[08/08/2022-10:07:03] [E] Engine creation failed
[08/08/2022-10:07:03] [E] Engine set up failed
...
Here I found that TensorRT searches for "onnx::Shape_161", "onnx::Slice_176", "onnx::Slice_177", "onnx::Slice_1101", and "onnx::Slice_179".
However, none of these names appear in the output of print(onnx.helper.printable_graph(model.graph)).
What is the problem? Could you help me?