Description
I currently have an ONNX file for object detection. It's based on MobileNet-v2 with an SSDlite head. I have linked the file below as a Google Drive link. I want to generate the corresponding TensorRT engine file with dynamic batch support using explicit batch mode. To achieve this, I first edited the ONNX file so that the batch dimension of the inputs and outputs is -1.
But when I generate the engine file from this updated ONNX file using the trtexec CLI, I get the following error:
[08/11/2022-10:28:07] [E] Error[2]: [graphShapeAnalyzer.cpp::throwIfError::1306] Error Code 2: Internal Error (Mul_184: dimensions not compatible for elementwise )
[08/11/2022-10:28:07] [E] Error[2]: [builder.cpp::buildSerializedNetwork::417] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed.)
Segmentation fault (core dumped)
The trtexec command I run is:
/usr/src/tensorrt/bin/trtexec --onnx=mobilenet-v2-ssdlite.onnx --saveEngine=mobilenet-v2-ssdlite.trt --fp16 --inputIOFormats=fp32:chw --outputIOFormats=fp32:chw --workspace=4096 --minShapes=input:1x3x300x300 --maxShapes=input:10x3x300x300 --optShapes=input:10x3x300x300 --explicitBatch
Could you please guide me on the next steps to fix this? Thanks!
Environment
TensorRT Version: 8.0.1.6
GPU Type: GTX 1660Ti
Nvidia Driver Version: 470.141.03
CUDA Version: 11.4
CUDNN Version: 10.4.0
Operating System + Version: Ubuntu 18.04
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):
Relevant Files
Google Drive link: mobilenet-v2-ssdlite.onnx
Steps To Reproduce
/usr/src/tensorrt/bin/trtexec --onnx=mobilenet-v2-ssdlite.onnx --saveEngine=mobilenet-v2-ssdlite.trt --fp16 --inputIOFormats=fp32:chw --outputIOFormats=fp32:chw --workspace=4096 --minShapes=input:1x3x300x300 --maxShapes=input:10x3x300x300 --optShapes=input:10x3x300x300 --explicitBatch