Random Bounding Box in FasterRCNN etlt model in Xavier 30W Mode

I also ran INT8 mode under 30W power mode, using your trained model.
Here is the output; it seems I cannot reproduce your issue under INT8 mode at 30W power mode either. Please confirm.
INT8, BS 8, 30W power:

INT8, BS 6, 30W power:

I am glad to hear the good news from you. I will check further. Would you mind outlining and sharing the procedure you followed? For example, did you compile the TensorRT OSS? Which config file are you using? Thanks.

I am using the originally built TRT OSS library:
sudo cp /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.0 ~/libnvinfer_plugin.so.7.1.0.orig
sudo cp TRT-OSS/Jetson/libnvinfer_plugin.so. /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.0
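After swapping the library as above, it can help to confirm which build is actually installed. A minimal sketch, assuming the backup was made with the cp commands above; `check_plugin` is a hypothetical helper, not part of TensorRT or DeepStream:

```shell
# check_plugin: prints "stock" if the installed plugin is byte-identical to the
# backed-up copy, "oss" otherwise (hypothetical helper).
check_plugin() {
  if cmp -s "$1" "$2"; then echo stock; else echo oss; fi
}

# Paths match the cp commands above; the check only runs if both files exist.
orig=~/libnvinfer_plugin.so.7.1.0.orig
installed=/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.0
if [ -f "$orig" ] && [ -f "$installed" ]; then
  check_plugin "$orig" "$installed"
fi
```

If this prints "stock", the OSS plugin never actually replaced the original, which is a common cause of mismatched results between setups.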

The configs I used are yours; I only changed the file locations. They are pasted here:
config_infer_primary_kitti_fp16_b8.txt (1.4 KB)
config_infer_primary_kitti_int8.txt (1.3 KB)

./deepstream-custom -c config_infer_primary_kitti_fp16_b8.txt -i /opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264 -b 8

./deepstream-custom -c config_infer_primary_kitti_int8.txt -i /opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264 -b 8
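To run both shared configs back to back and keep a log of each run for comparison, a small loop like the following could be used. This is a sketch: `log_name` is a hypothetical helper, and the loop only executes if the binary is present in the current directory:

```shell
# log_name: derive a per-run log file name from a config file name
# (hypothetical helper, not part of DeepStream).
log_name() { echo "${1%.txt}.log"; }

STREAM=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264

# Run each config with batch size 8, teeing output to its own log file.
if [ -x ./deepstream-custom ]; then
  for cfg in config_infer_primary_kitti_fp16_b8.txt config_infer_primary_kitti_int8.txt; do
    ./deepstream-custom -c "$cfg" -i "$STREAM" -b 8 2>&1 | tee "$(log_name "$cfg")"
  done
fi
```

Diffing the two logs can make it easier to spot whether the bad bounding boxes appear only in one precision mode.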

Thanks. Previously, I tested with JetPack 4.4 Developer Preview. I just found that the official JetPack 4.4 has been released, so I will test again with the official release and update you here.

@amycao @mchi I have tested again with the JetPack 4.4 official release on the same Jetson Xavier AGX, with the same config file I shared and deepstream-app 5.0. I only replaced libnvinfer_plugin.so.7.1.3 with the TRT OSS libnvinfer_plugin.so. The same problem occurred. @amycao In your testing, does it work fine with deepstream-app as well? Thanks.

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.

Can you use the JetPack 4.4 DP version? DS 5.0 depends on the JetPack 4.4 DP version.