Tf-pose-estimation FPS issues

Hello, I'm using my Jetson AGX Xavier for fall detection with OpenCV and the tf-pose-estimation library, but my frame rate is incredibly low, below 2 FPS. I tried it on the Jetson Nano as well and the FPS was the same. Any idea why? I'll link my GitHub and upload the code there, but I'm not sure what the issue is. I don't think I should be getting 2 FPS or less.


First, please try to maximize the device performance with the commands below:

$ sudo nvpmodel -m 0
$ sudo jetson_clocks

If the performance doesn't improve, please share the tegrastats output with us.
You can find the GPU utilization ratio in the GR3D X%@Y field.

$ sudo tegrastats
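If you want to log the GPU load over time rather than eyeballing the console, the GR3D field can also be pulled out of a tegrastats line programmatically. A minimal Python sketch (the function name and the abridged sample line are illustrative, not part of tegrastats itself; it also tolerates logs where `@` was replaced with `at`):

```python
import re

def gr3d_utilization(tegrastats_line):
    """Extract the GR3D (GPU) utilization percentage from one tegrastats line.

    Handles both the usual 'GR3D_FREQ 0%@1377' form and pasted logs where
    '@' was replaced with ' at '. Returns None if the field is absent.
    """
    match = re.search(r"GR3D_FREQ\s+(\d+)%\s*(?:@|at)", tegrastats_line)
    return int(match.group(1)) if match else None

# Example with an abridged tegrastats line:
line = "RAM 2452/14899MB ... GR3D_FREQ 0% at 1377 VIC_FREQ 115 APE 150"
print(gr3d_utilization(line))  # → 0
```

You could feed this the output of `sudo tegrastats` line by line while your pipeline runs to see whether the GPU is ever busy.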



So I did what you asked and the situation is still pretty much the same. The FPS has only increased by about 1.

sudo tegrastats gave this result (note: I replaced @ with "at" so that I could post the whole message):

“11-14-2022 01:27:11 RAM 2452/14899MB (lfb 2467x4MB) SWAP 0/7449MB (cached 0MB) CPU [1%at2264,0% at 2268,0% at 2265,0% at 2265,0% at 2265,0% at 2265,0% at 2266,0% at 2258] EMC_FREQ 0% at 2133 GR3D_FREQ 0% at 1377 VIC_FREQ 115 APE 150 AUX at 38.5C CPU at 39.5C thermal at 39.25C Tboard at 39C AO at 39.5C GPU at 39.5C Tdiode at 42C PMIC at 50C GPU 1083mW/1083mW CPU 773mW/773mW SOC 2630mW/2630mW CV 0mW/0mW VDDRQ 154mW/154mW SYS5V 2495mW/2495mW”



Based on your log, the GPU utilization is 0%.
This indicates that the GPU is idle, waiting for input to execute.

... GR3D_FREQ 0% at 1377 ...

Is TensorRT an option for you?
If yes, please give it a try, since it optimizes inference performance for Jetson.
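As a rough sketch of what that migration could look like: if the pose model can be exported as a TF2 SavedModel, TensorFlow's own TensorRT integration (TF-TRT) can optimize it without leaving TensorFlow entirely. The paths below are placeholders, and this assumes a TensorFlow build with TensorRT support (as in NVIDIA's Jetson packages); it is a hedged sketch, not the tf-pose-estimation project's own workflow:

```python
# Hypothetical TF-TRT conversion sketch. Requires TensorFlow built with
# TensorRT support; guarded so the module still loads without it.
try:
    from tensorflow.python.compiler.tensorrt import trt_convert as trt
    HAVE_TFTRT = True
except ImportError:
    HAVE_TFTRT = False

def convert_to_trt(saved_model_dir, output_dir):
    """Convert a TF2 SavedModel to a TF-TRT optimized SavedModel (FP16)."""
    if not HAVE_TFTRT:
        raise RuntimeError("TensorFlow with TensorRT support is not installed")
    params = trt.TrtConversionParams(precision_mode="FP16")
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir=saved_model_dir,
        conversion_params=params)
    converter.convert()
    converter.save(output_dir)

if __name__ == "__main__":
    print("TF-TRT available:", HAVE_TFTTRT if False else HAVE_TFTRT)
```

The converted SavedModel is loaded and run like any other TensorFlow model, so the surrounding OpenCV capture loop can stay unchanged.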


Is there any way I can make this work with tf-pose-estimation? Maybe I can get the GPU utilization working somehow, because otherwise I will have to redo my whole project.


How did you install TensorFlow for Jetson?
Could you check first whether the package was built with CUDA support?

It's recommended to use our prebuilt package; below is the installation guide for your reference:
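A quick way to run that check on the device is the snippet below. It uses standard TensorFlow 2.x APIs (`tf.test.is_built_with_cuda`, `tf.config.list_physical_devices`); on TF 1.x the latter lives under `tf.config.experimental`. The helper name is just for illustration, and it is guarded so it also reports cleanly when TensorFlow isn't installed:

```python
def tf_gpu_status():
    """Report whether TensorFlow is installed, CUDA-built, and sees a GPU."""
    try:
        import tensorflow as tf
    except ImportError:
        # TensorFlow not installed at all
        return {"installed": False, "cuda_build": False, "gpus": []}
    return {
        "installed": True,
        "cuda_build": tf.test.is_built_with_cuda(),
        "gpus": [d.name for d in tf.config.list_physical_devices("GPU")],
    }

print(tf_gpu_status())
```

If `cuda_build` is False or `gpus` comes back empty, inference falls back to the CPU, which would explain the 0% GR3D utilization and the 2 FPS ceiling.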

