Yolov4 tensorrt PluginV2Layer must be V2DynamicExt when there are runtime input dimensions

Hi,
I want to run YOLOv4 with TensorRT on a Xavier NX.
I followed this forum and tried to build the program.
Everything went well when I built the environment and the TensorRT OSS toolkit.

But when I run ../bin/yolov4 to convert the model from .onnx to .engine, I get the following error:

&&&& RUNNING TensorRT.sample_yolo # ../bin/yolov4 --fp16
There are 0 coco images to process
[12/04/2020-15:59:06] [I] Building and running a GPU inference engine for Yolo
[12/04/2020-15:59:08] [I] Parsing ONNX file: ../data/yolov4.onnx
[12/04/2020-15:59:09] [W] [TRT] onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[12/04/2020-15:59:09] [W] [TRT] onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
[12/04/2020-15:59:09] [W] [TRT] onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
[12/04/2020-15:59:09] [W] [TRT] onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
[12/04/2020-15:59:09] [W] [TRT] onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
[12/04/2020-15:59:09] [W] [TRT] onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
[12/04/2020-15:59:09] [W] [TRT] onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
[12/04/2020-15:59:10] [I] [TRT] ModelImporter.cpp:135: No importer registered for op: BatchedNMS_TRT. Attempting to import as plugin.
[12/04/2020-15:59:10] [I] [TRT] builtin_op_importers.cpp:3659: Searching for plugin: BatchedNMS_TRT, plugin_version: 1, plugin_namespace: 
[12/04/2020-15:59:10] [I] [TRT] builtin_op_importers.cpp:3676: Successfully created plugin: BatchedNMS_TRT
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [I] Building TensorRT engine../data/yolov4.engine
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] (Unnamed Layer* 3429) [PluginV2Ext]: PluginV2Layer must be V2DynamicExt when there are runtime input dimensions.
[12/04/2020-15:59:10] [E] [TRT] Layer (Unnamed Layer* 3429) [PluginV2Ext] failed validation
[12/04/2020-15:59:10] [E] [TRT] Network validation failed.
&&&& FAILED TensorRT.sample_yolo # ../bin/yolov4 --fp16

Does anyone know how to solve this problem?
Any suggestion is appreciated, thanks.

Hi, does anyone have any idea about this problem?
I really need help, thanks.

Hi,

The error indicates that your model uses a dynamic shape, but the plugin only supports the static version.
Please provide a batch size parameter when generating the ONNX model to get a static version.
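To double-check whether an exported model is actually static, note that in ONNX a dimension is dynamic when it is symbolic (a `dim_param` string such as "batch") or left unset, and any such input dimension makes TensorRT demand an IPluginV2DynamicExt plugin. A minimal sketch of that check (the shape is written here as a plain list, not read from a real model file):

```python
# Sketch only: an int stands for a fixed dimension, a string stands for a
# symbolic (dynamic) dim_param such as "batch", mirroring how ONNX marks
# dynamic axes in a tensor's shape.
def has_dynamic_dims(shape):
    """Return True if any dimension is symbolic or unset, i.e. dynamic."""
    return any(not isinstance(d, int) or d <= 0 for d in shape)

# A model exported with a symbolic batch axis triggers the V2DynamicExt error:
print(has_dynamic_dims(["batch", 3, 416, 416]))  # True  -> dynamic
# A model exported with an explicit batch size (e.g. 1) is static:
print(has_dynamic_dims([1, 3, 416, 416]))        # False -> static
```

With a real model you would apply the same test to each entry of the input tensor's `type.tensor_type.shape.dim` list.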

Thanks.

Hi, @AastaLLL, thanks for your reply.
I followed this GitHub repository to build the ONNX model:


Here is my command, which generated a static ONNX model successfully:
python demo_darknet2onnx.py yolov4.cfg yolov4.weights test.jpg 1

I also followed this guideline to generate the YOLOv4 ONNX model with the BatchedNMSPlugin node included.

But after generating the BatchedNMSPlugin node, when I run ../bin/yolov4 the problem still shows "PluginV2Layer must be V2DynamicExt when there are runtime input dimensions."

If I skip the step that adds the BatchedNMSPlugin node and just run ../bin/yolov4, I get another error, shown below.

[12/07/2020-11:02:13] [I] [TRT] Detected 1 inputs and 8 output network tensors.
[12/07/2020-11:02:24] [I] TRT Engine file saved to: ../data/yolov4.engine
4
[12/07/2020-11:02:24] [I] Loading or building yolo model done
[12/07/2020-11:02:26] [E] [TRT] INVALID_ARGUMENT: Cannot find binding of given name: num_detections
[12/07/2020-11:02:26] [E] [TRT] INVALID_ARGUMENT: Cannot find binding of given name: nmsed_boxes
[12/07/2020-11:02:26] [E] [TRT] INVALID_ARGUMENT: Cannot find binding of given name: nmsed_scores
[12/07/2020-11:02:26] [E] [TRT] INVALID_ARGUMENT: Cannot find binding of given name: nmsed_classes
NULL value output detected!
Segmentation fault (core dumped)
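For what it's worth, those binding errors are consistent with the engine having been built without the NMS plugin: the sample looks up the four BatchedNMS output names, but the engine only exposes the raw YOLO heads (hence "1 inputs and 8 output network tensors" above). A hedged sketch of that lookup logic (the output names come from the error log; the helper itself is hypothetical, not the sample's code):

```python
# Hypothetical helper: given the binding names an engine actually exposes,
# report which of the outputs the yolov4 sample expects are missing. The
# expected names are the BatchedNMS_TRT plugin's outputs, as in the log above.
EXPECTED_NMS_OUTPUTS = ["num_detections", "nmsed_boxes",
                        "nmsed_scores", "nmsed_classes"]

def missing_bindings(engine_bindings, expected=EXPECTED_NMS_OUTPUTS):
    """Return the expected output names absent from the engine's bindings."""
    present = set(engine_bindings)
    return [name for name in expected if name not in present]

# Engine built WITHOUT the BatchedNMSPlugin node: only raw head outputs exist,
# so every NMS binding is reported missing -- matching the four errors above.
print(missing_bindings(["input", "boxes", "confs"]))
```

The subsequent "NULL value output detected!" and segmentation fault then follow naturally, since the sample dereferences buffers for bindings that were never found.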

Besides, I found I used the wrong tag: my device is a Xavier NX instead of an AGX Xavier.

Is there any mistake or wrong step in how I generate the TensorRT engine?
Thanks.

Hi,

Please generate the ONNX model with this section:

We have checked the repository, and it can work correctly with the model.
Thanks.

After testing with Tianxiaomo/pytorch-YOLOv4 - 4. Pytorch2ONNX, the problem is the same; maybe there is some environment problem.

However, I followed this GitHub repository and it ran successfully.

Good to know this.
Thanks for the update.

Successful conversion but with zero detection: https://github.com/NVIDIA-AI-IOT/yolov4_deepstream/issues/3#issuecomment-753421935

Issue resolved: https://github.com/NVIDIA-AI-IOT/yolov4_deepstream/issues/3#issuecomment-757589640