Detection result is different between Xavier and Orin for the same model and weights

Hi @junshengy,
Here is the output data based on the patch you provided, including the data produced by the patch linked in DeepStream SDK FAQ - #9 by mchi.
Detection_result_different_io_data_with_plugin.7z (2.8 MB)

From the dumped data of each plugin:

1. There is no problem with nvv4l2decoder; its output is OK.

2. nvstreammux scales the width and height from 512x432 to 512x416.
This plugin causes a big difference.
If GPU scaling is also used, as you describe, this is a little hard to explain.
Can you share the property settings of nvstreammux?
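For reference, the nvstreammux properties that affect scaling look roughly like the sketch below. The 512x416 values are taken from the dump above; the batch-size value is an assumption, so substitute whatever your app actually sets:

```c
/* Sketch: nvstreammux properties relevant to output scaling.
 * compute-hw: 0 = default, 1 = GPU, 2 = VIC (Jetson). */
g_object_set (G_OBJECT (streammux),
    "width", 512,        /* scaled output width  */
    "height", 416,       /* scaled output height */
    "batch-size", 1,     /* assumption: single-source pipeline */
    "compute-hw", 1,     /* force GPU scaling */
    NULL);
```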

3. On Xavier, it looks like nvvideoconvert converts the data to RGB.
I think we need to confirm the caps negotiated between nvvideoconvert and nvinfer.
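One way to check this (a sketch, assuming the standard GStreamer C API; the NV12 format here is an assumption, so use whatever format the working board negotiates) is to pin the caps explicitly when linking the two elements, so both boards are forced to negotiate the same format:

```c
/* Sketch: fix the memory type and pixel format between nvvideoconvert
 * and nvinfer so the negotiation is identical on Xavier and Orin. */
GstCaps *caps = gst_caps_from_string (
    "video/x-raw(memory:NVMM), format=NV12");
if (!gst_element_link_filtered (nvvideoconv, nvinfer, caps))
  g_printerr ("Failed to link nvvideoconv -> nvinfer with fixed caps\n");
gst_caps_unref (caps);
```

Alternatively, running the app with GST_DEBUG caps logging enabled will show which caps each board actually negotiated.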

Can you update the JetPack and DeepStream versions on Xavier?
Differences caused by different versions are difficult to explain.


I think I’m using VIC for scaling; is that the default?

I have provided the modified source code and config file in the document below. The code is modified based on the patch file.

There will be two folders to distinguish files for different machines, namely “Xavier” and “Orin”.

The difference in their code is that I modified the “Xavier” app to enable it to save image files, while the “Orin” app remains unchanged in this respect. Both apps output inference results.

Sorry, I cannot update the JetPack and DeepStream versions on Xavier.

detection_result_diff-relative_data.7z (37.6 KB)

I checked your patch. There are two places worth noting:

1. On both Orin and Xavier, make sure nvstreammux scaling uses the GPU:

g_object_set (G_OBJECT (streammux), "compute-hw", 1, NULL);

2. On Orin, remove the nvvideoconv before nvinfer:

// delete this line
nvvideoconv = gst_element_factory_make ("nvvideoconvert", "nvvideo-converter1");

nvstreammux output can be fed directly into nvinfer, which reduces some preprocessing.

Make sure the pipeline is the same on both Orin and Xavier.

Hey @junshengy,

I modified the code to make nvstreammux and nvvideoconvert use the GPU.
I found that the GPU inference results are strange, while the VIC inference results are normal.

In the “orin” folder, you can find the results, along with the corresponding input and output data, of VIC inference. The inference results for test17.jpg and test21.jpg are included.

In the “orin-gpu” folder, you will see test0~4.jpg, in which no objects are detected even though objects were originally present. Additionally, test22.jpg has a strange inference result: it appears to merge two images. These are all inference results obtained using the GPU.

The code used for this test is also included in the compressed file, “0523_detection_result_different_io_data.7z (2.7 MB)”.

I also tested this feature. When I deleted the code related to nvvideoconv, it seems the element cannot be removed; the pipeline fails with the following error message:

ERROR from element source: Internal data stream error.
Error details: gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:dstest-image-decode-pipeline/GstBin:source-bin-00/GstMultiFileSrc:source:
streaming stopped, reason not-negotiated (-4)
Returned, stopping playback

Sorry for the long delay!

1. jpegenc uses software buffers, which are not compatible with GPU buffers.

Remove this line and it will be OK:

g_object_set (G_OBJECT (nvvideoconvert), "compute-hw", 1, NULL);
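If GPU scaling is still wanted on the inference path, one possible workaround (a sketch, not a verified fix; the element and variable names `conv_to_sys` and `jpegenc` are assumptions) is to add a second nvvideoconvert that copies frames back to plain system memory right before jpegenc:

```c
/* Sketch: keep the inference path on GPU buffers, and add a converter
 * that outputs system-memory I420 frames for the software jpegenc. */
GstElement *conv_to_sys =
    gst_element_factory_make ("nvvideoconvert", "conv-to-sys");
GstCaps *sys_caps = gst_caps_from_string ("video/x-raw, format=I420");
/* ... add conv_to_sys to the bin and link it after the tee/probe point, then: */
gst_element_link_filtered (conv_to_sys, jpegenc, sys_caps);
gst_caps_unref (sys_caps);
```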

2. Sorry for the mistake in my previous reply.
Due to version differences, removing this plugin breaks the pipeline, so please ignore that patch.

3. I set the compute-hw property so that streammux and videoconvert use GPU scaling.
With test17.jpg and test21.jpg as input images, the inference results are empty.
This is a bug. We will look into this issue and get back once there is any progress.

g_object_set (G_OBJECT (streammux), "compute-hw", 1, NULL);
g_object_set (G_OBJECT (nvvideoconv), "compute-hw", 1, NULL);


Please update once you have any findings.

Need to wait for new release.

At present, it seems that 6.0 and 6.2 are not fully compatible. We recommend using the same version in the production environment.