Hello,
Jetpack 7.0
L4T: 38.2.2
jetson_release output:
We ran GPU performance tests on the Thor AGX devkit and noticed that the results are highly dependent on GPU power consumption.
When the tests are run for the first time, the performance is “poor”. Tegrastats shows VDD_GPU 64497mW/64347mW.
However, after the tests have been running for a while, GPU power consumption suddenly increases by about 10W (VDD_GPU 64497mW/64347mW → VDD_GPU 74692mW/64693mW) and the performance becomes “good”.
Our tests don’t configure anything in the OS; they just run heavy-load operations on the GPU.
We tried running jetson_clocks, disabling DVFS, and all the other “knobs” described in the Jetson Thor Product Family — NVIDIA Jetson Linux Developer Guide, but it doesn’t help.
After digging deeper into the issue, we learned how to reliably trigger the increase in GPU power consumption and to compare performance before and after it.
We use two applications:
- benchmark: a GPU performance-testing application
- warmupapp: a GPU warm-up application that continuously performs heavy-load operations on the GPU
Steps to reproduce:
- Run benchmark and tegrastats, watching VDD_GPU. The current/average values are almost the same (VDD_GPU 64497mW/64347mW). Benchmark finishes with poor results.
- Run warmupapp, watching VDD_GPU. Wait 20–60 sec until the current value increases (VDD_GPU 64497mW/64347mW → VDD_GPU 74692mW/64693mW).
- Stop warmupapp and run benchmark. Benchmark finishes with good results.
- Wait about 5 min.
- Run benchmark again, watching VDD_GPU. The current/average values are almost the same again (VDD_GPU 64497mW/64347mW); there is no increase in GPU power consumption. Benchmark finishes with poor results.
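For anyone reproducing this, here is a minimal sketch of how the jump can be spotted automatically in tegrastats output. The helper names and the 5W threshold are our own choices (not part of benchmark or warmupapp); the “VDD_GPU <current>mW/<average>mW” field format is assumed from the values shown above:

```python
import re

# Assumed tegrastats field format: "VDD_GPU 64497mW/64347mW" (current/average).
VDD_GPU_RE = re.compile(r"VDD_GPU (\d+)mW/(\d+)mW")

def parse_vdd_gpu(line: str):
    """Return (current_mW, average_mW) from a tegrastats line, or None if absent."""
    m = VDD_GPU_RE.search(line)
    if m is None:
        return None
    return int(m.group(1)), int(m.group(2))

def jump_detected(before_mw: int, after_mw: int, threshold_mw: int = 5000) -> bool:
    """True if current draw rose by more than threshold_mw (we observe a ~10W jump)."""
    return after_mw - before_mw > threshold_mw
```

This can be fed line-by-line from a pipe, e.g. `tegrastats | python watch_vdd.py`, to log the moment the GPU enters the “good” state.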
So GPU performance increases after a warm-up and decreases again after the GPU has been idle for some time.
How can we make the GPU run at maximum performance without warming it up first?
If you are not already aware of this issue, we are ready to provide more details.
