Jetson Xavier AGX inference error

• Hardware Platform (Jetson / GPU): Jetson Xavier AGX
• DeepStream Version: 5.1
• JetPack Version (valid for Jetson only): 4.5.1
• TensorRT Version: 7.1.3
Hello, I am trying to run the deepstream-segmentation Python code on the AGX.
Here is the GitHub link: (deepstream_python_apps/apps/deepstream-segmentation at master · NVIDIA-AI-IOT/deepstream_python_apps · GitHub)
I dropped in the model I trained and also edited “dstest_segmentation_config_semantic.txt”.
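The kind of fields I changed look roughly like this (the paths and values below are placeholders, not my exact settings, and the model-file key depends on whether the export is UFF or ONNX):

```
[property]
gpu-id=0
## preprocessing must match training (value here is a placeholder)
net-scale-factor=0.00392156862745098
## model path: use uff-file= or onnx-file= depending on the export (placeholder name)
onnx-file=my_segmentation_model.onnx
batch-size=1
## 0=FP32, 1=INT8, 2=FP16
network-mode=0
## my model predicts 19 classes
num-detected-classes=19
gie-unique-id=1
## 2 = segmentation network
network-type=2
## output node reported by the export
output-blob-names=Softmax_1
segmentation-threshold=0.0
```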
But I ran into a problem, shown below. How can I solve it? Thanks.


P.S. I tried the model downloaded from the GitHub repo and it ran successfully, as shown below.

Can you revert your changes one by one to narrow down which change causes the failure, since it works fine without your changes?

I have highlighted the information I changed for my model, shown below.


The error is shown below, along with the files in the folder.

Is there anything in the Python file that needs to be changed?

It seems the error is reported in the Python code. Can you debug it?

I found something strange in the error message.
The thing is that the output node of the model I exported is “Softmax_1 (Softmax)” and its output shape is “(None, 720, 1280, 19)”, as shown below.


The code in “deepstream_segmentation.py” is “bgr = np.zeros((shp[0], shp[1], 3))”. When I print shp[0] and shp[1] I get 1280 and 19, but I think the correct values should be 720 and 1280.

I guess that’s why there are errors.
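To double-check what the app actually receives before map_mask_as_display_bgr is called, I can also print the parsed segmentation meta inside the buffer probe. Here is a rough sketch based on the sample’s probe (the classes/height/width attribute names are my assumption from NvDsInferSegmentationMeta; the rest follows deepstream_segmentation.py):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import numpy as np
import pyds

def seg_src_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_user = frame_meta.frame_user_meta_list
        while l_user is not None:
            user_meta = pyds.NvDsUserMeta.cast(l_user.data)
            if user_meta.base_meta.meta_type == pyds.NVDSINFER_SEGMENTATION_META:
                seg_meta = pyds.NvDsInferSegmentationMeta.cast(user_meta.user_meta_data)
                # What nvinfer thinks the parsed output layout is.
                print("classes:", seg_meta.classes,
                      "height:", seg_meta.height,
                      "width:", seg_meta.width)
                masks = np.array(pyds.get_segmentation_masks(seg_meta),
                                 copy=True, order="C")
                # For my (None, 720, 1280, 19) export I would expect (720, 1280) here.
                print("mask shape:", masks.shape)
            l_user = l_user.next
        l_frame = l_frame.next
    return Gst.PadProbeReturn.OK
```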


I directly changed it to “bgr = np.zeros((720, 1280, 3))”, but it still does not work.
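I suspect hardcoding the output array is not enough, because the mask used to index it still has the wrong shape. Here is a small NumPy illustration of what I mean (the mask shape is taken from the values I printed above, and the indexing line mirrors map_mask_as_display_bgr in the sample):

```python
import numpy as np

# As printed above, the mask's first two dimensions come out as 1280 and 19
# instead of the expected 720 and 1280.
mask = np.zeros((1280, 19), dtype=np.int32)
bgr = np.zeros((720, 1280, 3))

# This is what the sample does per class index; with the mismatched mask
# the boolean index no longer fits bgr, so NumPy raises an IndexError.
idx = 0
bgr[mask == idx] = [128, 128, 64]
```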

How can I solve it?
Thanks.

Did you dump the model output and parse the model output in Python?
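If not, one way to dump the raw model output in Python is to set output-tensor-meta=1 in the nvinfer config and walk the attached tensor meta from the probe. A rough sketch, with the pattern borrowed from the deepstream-ssd-parser sample (adapt it to your pipeline; only the layer names are printed here):

```python
import pyds

def dump_raw_output_layers(frame_meta):
    """Print the raw nvinfer output layer names attached to this frame.

    Requires output-tensor-meta=1 under [property] in the nvinfer config,
    so that nvinfer attaches NvDsInferTensorMeta to the frame user meta.
    Call this from the probe above with the frame_meta it already has.
    """
    l_user = frame_meta.frame_user_meta_list
    while l_user is not None:
        user_meta = pyds.NvDsUserMeta.cast(l_user.data)
        if user_meta.base_meta.meta_type == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META:
            tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
            for i in range(tensor_meta.num_output_layers):
                layer = pyds.get_nvds_LayerInfo(tensor_meta, i)
                # Compare this with the "Softmax_1" node reported by the export.
                print("raw output layer:", layer.layerName)
        l_user = l_user.next
```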