Invalid PAF inputs

Hi there, I just trained a Bodypose2D model, and while replacing the default model with it I am running into issues. The engine file is created, but when inference is run it shows this error:


Please let me know why this issue is occurring and what a possible solution could be.
Lastly, please note that I have fine-tuned the 2D body pose model on the COCO 2017 dataset.

Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file content, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)
• The pipeline being used

Fine-tuning does not usually cause these problems.
Some users have reported that restarting the system or regenerating the engine file can solve the problem.

I am working on a Jetson Xavier, but I don't think that is the issue. Please look into this and let me know a possible solution.
I have fine-tuned my bodypose2d model on COCO 2017 keypoints using the TAO Toolkit.


The above is my config file, and the error log is below:

Please review it and let me know the exact problem and its solution.

I need to know the versions of JetPack and DeepStream you are using; otherwise I cannot locate the problem. For Xavier, the highest supported version is DS-6.3.
In addition, you seem to have modified the output of the model, which does not match the release/tao4.0_ds6.3ga branch. You need to modify the corresponding code.

Hi there, I saw the branch link you mentioned above; however, let me clarify:
1. I have cloned deepstream_tao_apps/apps/tao_others/deepstream-bodypose2d-app/deepstream_bodypose2d_app.cpp at release/tao5.3_ds7.0ga · NVIDIA-AI-IOT/deepstream_tao_apps · GitHub and I am working with this branch.
2. My output layers are paf and heatmap_out, as I mentioned above. I have updated the config files, but I haven't updated them inside the code yet. Please let me know in detail: should I update only the layer names in the code, or do the dimensions etc. also need to be changed?
3. If you still need them, I will let you know the JetPack and DS versions of the Xavier I am working on.

This needs to be consistent with your DeepStream SDK version, which is why I need to know your JetPack and DeepStream SDK versions.
Xavier should use the DS-6.3 branch or below.

Yes, the link I gave above points to the corresponding lines of code that need to be modified. The names of these layers need to be consistent with your model so that the output tensors can be parsed from it. Please decide whether to modify the dimensions according to your needs.
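As an illustration only, here is a minimal sketch of how the layer names might appear in the nvinfer model config. The two tensor names are assumptions; replace them with the exact output names your fine-tuned ONNX model actually exports:

```yaml
# Fragment of the nvinfer model config (sketch, not a complete file).
# The two names below are assumptions; replace them with the exact
# output tensor names reported for your fine-tuned model.
property:
  output-blob-names: paf_out/BiasAdd:0;heatmap_out/BiasAdd:0
```

The same strings must then be used wherever the app code looks up the output layers by name.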

1. "Your Jetson Xavier NX is running JetPack 5.1.2 with L4T 35.4.1." This is my JetPack version.
2. I have updated the layer names for my model. The output layers are output-blob-names: paf_out/BiasAdd:0;heatmap_out/BiasAdd:0. I have updated them in my config file and also named them inside the .cpp file.
Please let me know what could be causing this issue.

Please check the output of the model first. Fine-tuning should not change the dimensions of the model’s output tensor.

This is the default ONNX model's output; it doesn't match the log you provided.

I see you have also shown me the output layers in the Netron app.
Yes, I have verified it, and my output layers are different from those.
What is the issue? Why have they changed? I didn't see any step during fine-tuning where these layers were being defined. I have also attached my model below; please have a look at it and let me know what output layers you get, just for confirmation.
bpnet_model.zip (60.1 MB)

Fine-tuning should only modify the weights, but here the structure of BodyPoseNet has been changed. You may need to consult the TAO forum to find out why this happened; this is not something DeepStream can handle.

Hi there, I have finalized the model and now I need help with setting up /home/orinnx1/2dBodypose_sampleapp/deepstream_tao_apps/configs/nvinfer/bodypose2d_tao/sample_bodypose2d_model_config.yml.
I need to know how I should set this file for my model.
I have tried different things, but the issue stays the same; please provide updated content that I can use in this file.
Please also note that I have fine-tuned the model on COCO 2017 keypoints, so the model will definitely work with 17 keypoints, not the 18 the default model has.
Lastly, my input dimensions and values are set as below:

Sorry for the long delay. The definition and explanation of the BodyPose2DPostProcessor parameters are in the project's README.

You can try modifying the numJoints/jointEdges parameters. We have not tried 17 keypoints. If it does not work, then, since we have removed support for bodypose2D in DS-7.1 and later versions, you can consider adding your own post-processing instead of BodyPose2DPostProcessor.
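As a sketch only: assuming the postprocessor config takes a numJoints value and a jointEdges list as described in the README, a COCO 17-keypoint setup might look like the fragment below. The edge list is the standard COCO skeleton (0-indexed); whether the parameter names and index base match BodyPose2DPostProcessor exactly is an assumption you should verify against the README.

```yaml
# Hypothetical fragment for a 17-keypoint (COCO) skeleton.
# numJoints/jointEdges follow the parameter names mentioned in this thread;
# verify names and index base (0- vs 1-based) against the project README.
numJoints: 17
jointEdges:   # standard COCO skeleton, 0-indexed keypoint pairs
  - [15, 13]  # left ankle - left knee
  - [13, 11]  # left knee - left hip
  - [16, 14]  # right ankle - right knee
  - [14, 12]  # right knee - right hip
  - [11, 12]  # left hip - right hip
  - [5, 11]   # left shoulder - left hip
  - [6, 12]   # right shoulder - right hip
  - [5, 6]    # left shoulder - right shoulder
  - [5, 7]    # left shoulder - left elbow
  - [6, 8]    # right shoulder - right elbow
  - [7, 9]    # left elbow - left wrist
  - [8, 10]   # right elbow - right wrist
  - [1, 2]    # left eye - right eye
  - [0, 1]    # nose - left eye
  - [0, 2]    # nose - right eye
  - [1, 3]    # left eye - left ear
  - [2, 4]    # right eye - right ear
  - [3, 5]    # left ear - left shoulder
  - [4, 6]    # right ear - right shoulder
```

In OpenPose-style models the PAF tensor typically carries two channels per edge, so the PAF channel count of your model must also be consistent with the number of edges listed here.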

I know you have removed support in DS-7.1.
Please confirm that I am working with the earlier branch you mentioned, shown below:

In this branch I don't see any BodyPose2DPostProcessor.
Please let me know how this will work now.
Lastly, have a look at the attached logs below:


I have been able to fix the PAF dimension issue and the pipeline is running, but no detections are being performed.

This is my model config file, and I am sure this is what needs to be updated. Please let me know what you suggest.