How to get coordinates and labels in the pose estimation app

• **Hardware Platform** - Jetson Nano
• **DeepStream Version** - 5.0
• **TensorRT Version** - 7.0
• **Issue Type** - questions

I referred to this example [GitHub - NVIDIA-AI-IOT/deepstream_pose_estimation: This is a sample DeepStream application to demonstrate a human pose estimation pipeline.] and am trying to get the coordinates along with the label or index position of each keypoint through the DeepStream app.

For example:
[nose has index 0 and some x, y coordinates]
Reference:
[NVIDIA Jetson: JetsonNano - NVIDIA AI IOT - Human Pose estimation using TensorRT]

Which variables do I have to access to print the coordinates with the label or index position in the same DeepStream app?

Thanks

Hey, I don't quite get what you are trying to do. Could you explain in more detail?
BTW, for the post-processing in deepstream_pose_estimation_app, you can refer to https://github.com/NVIDIA-AI-IOT/deepstream_pose_estimation/blob/master/post_process.cpp

Hi

I am trying to get the x, y coordinates along with the index position of the detected keypoints of the human body.

Reference: [trt_pose/human_pose.json at master · NVIDIA-AI-IOT/trt_pose · GitHub]
In this json file the keypoints are listed as nose, left_eye, right_eye, and so on.
I want to find where these keypoints are located (i.e. their x and y coordinates) with respect to the frame width and height.

The coordinates are the same ones used for drawing the circles on the frame, but I cannot tell which coordinates belong to which keypoint.

Hey, so you can get the x, y coordinates using this repo, right? Can you point me to where the post-processing is in the repo, so I can avoid going through all of the source code?

Hey

I am using the DeepStream pose estimation app itself [GitHub - NVIDIA-AI-IOT/deepstream_pose_estimation: This is a sample DeepStream application to demonstrate a human pose estimation pipeline.]

I got the x, y coordinates by accessing these variables in the same DeepStream app repo:
cparams.xc = x;
cparams.yc = y;
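For reference, those two assignments sit inside create_display_meta(); the fragment below is a rough sketch of the surrounding code, assuming it matches the upstream repo (normalized_peaks, dmeta, and the MUXER_OUTPUT_WIDTH/HEIGHT constants come from that file and may be named differently in your copy):

```cpp
// Sketch of the circle-drawing code in create_display_meta()
// (based on the upstream deepstream_pose_estimation repo; names may differ).
auto &peak = normalized_peaks[j][k];      // normalized (row, col) of one detected peak
int x = peak[1] * MUXER_OUTPUT_WIDTH;     // normalized column -> pixel x
int y = peak[0] * MUXER_OUTPUT_HEIGHT;    // normalized row    -> pixel y

NvOSD_CircleParams &cparams = dmeta->circle_params[dmeta->num_circles];
cparams.xc = x;                           // circle centre = keypoint pixel position
cparams.yc = y;
```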

Please help me figure out which keypoints these coordinates belong to.

OK, you can refer to the Post-processing section in https://developer.nvidia.com/blog/creating-a-human-pose-estimation-application-with-deepstream-sdk/?ncid=so-link-52952-vt24&sfdcid=EM08#cid=em08_so-link_en-us

Yes, I have gone through the post-processing part.

I just got some index values for the body parts by accessing
int c_b and int c_a in the connect_parts function.

But I have not properly understood how they relate to the x, y coordinates.


Have you solved it? I'm at the same point. Where are the labels? Any pointer will be very much appreciated. Thanks.

Yes, but I am not sure whether it is the right way or not.

I did it by accessing the c_a and c_b variables in create_display_meta() of deepstream_pose_estimation_app.cpp.

Here the c_a and c_b variables give integer values, which are nothing but the labels.

For example, c_b == 6 indicates the right shoulder.
For the other label indices, refer to [NVIDIA Jetson and Raspberry Pi: JetsonNano - NVIDIA AI IOT - Human Pose estimation using TensorRT].
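
To turn those integer labels into readable names, you can keep a lookup table in the same order as the keypoints array of trt_pose's human_pose.json. The sketch below assumes the standard 18-keypoint topology (where index 6 is right_shoulder) and that the pixel coordinates of the two joined parts are available as x0, y0 and x1, y1 inside the topology loop of create_display_meta(), as in the upstream repo; adjust the names to match your copy:

```cpp
// Keypoint names in the order of trt_pose's human_pose.json
// (assumed standard 18-part topology; verify against your json file).
static const char *kKeypointNames[] = {
  "nose", "left_eye", "right_eye", "left_ear", "right_ear",
  "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
  "left_wrist", "right_wrist", "left_hip", "right_hip",
  "left_knee", "right_knee", "left_ankle", "right_ankle", "neck"
};

// Inside the topology loop: c_a and c_b are the two part indices joined by
// topology entry k, and (x0, y0) / (x1, y1) are their pixel coordinates.
g_print("%s: (%d, %d)  --  %s: (%d, %d)\n",
        kKeypointNames[c_a], x0, y0,
        kKeypointNames[c_b], x1, y1);
```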
