I am looking into jpeg encoding on the Jetson AGX Orin, and I saw that – according to the Orin datasheet – there are actually two nvjpeg hardware engines embedded on the Orin SOM. I was curious if there was a way via the multimedia API to parallelize them such that they can encode two different frames at the same time. I’ve looked through the existing 05_jpeg_encode sample but it seems to only encode frames serially. Any help or guidance would be much appreciated, and thank you!
You can create multiple NvJPEGEncoder instances to encode frames in parallel. With both engines active, sudo tegrastats should show two NVJPG engines.
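A minimal sketch of that multi-instance approach, assuming the NvJpegEncoder.h and NvBuffer classes shipped with the Jetson Multimedia API samples. The frame dimensions, output paths, and buffer contents here are placeholders (a real program would fill the NvBuffers with YUV data as in 05_jpeg_encode), and whether the driver actually dispatches the two encoders onto different NVJPG engines is not guaranteed by this code:

```cpp
#include <fstream>
#include <thread>

#include "NvBuffer.h"       // from jetson_multimedia_api samples/common
#include "NvJpegEncoder.h"

static void encode_worker(const char *name, NvBuffer *frame,
                          const char *out_path)
{
    // Each thread owns its own encoder instance.
    NvJPEGEncoder *enc = NvJPEGEncoder::createJPEGEncoder(name);

    unsigned long out_size = 2 * 1024 * 1024; // generous upper bound
    unsigned char *out_buf = new unsigned char[out_size];

    // encodeFromBuffer() consumes a YUV420 NvBuffer and writes the
    // JPEG bitstream into out_buf, updating out_size to the real length.
    enc->encodeFromBuffer(*frame, JCS_YCbCr, &out_buf, out_size, 95);

    std::ofstream f(out_path, std::ios::binary);
    f.write(reinterpret_cast<char *>(out_buf), out_size);

    delete[] out_buf;
    delete enc;
}

int main()
{
    // Placeholder frames; in practice, read raw YUV into these planes
    // the same way the 05_jpeg_encode sample does.
    NvBuffer frame0(V4L2_PIX_FMT_YUV420M, 1920, 1080, 0);
    NvBuffer frame1(V4L2_PIX_FMT_YUV420M, 1920, 1080, 0);
    frame0.allocateMemory();
    frame1.allocateMemory();

    // One encoder instance per thread, two frames encoded concurrently.
    std::thread t0(encode_worker, "jpenc0", &frame0, "out0.jpg");
    std::thread t1(encode_worker, "jpenc1", &frame1, "out1.jpg");
    t0.join();
    t1.join();
    return 0;
}
```

Running tegrastats while this executes is the way to check whether one or both NVJPG engines report activity.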
I started 8 processes of jpeg_encode in parallel, but only see NVJPG1 used in jtop.
tegrastats also only shows NVJPG1:
07-21-2023 13:56:39 RAM 5853/30536MB (lfb 5473x4MB) SWAP 0/15268MB (cached 0MB) CPU [5%@2188,6%@2188,5%@2188,6%@2188,51%@2188,2%@2514,1%@2188,0%@2188] EMC_FREQ 3%@3199 GR3D_FREQ 0%@929 GR3D2_FREQ 0%@929 NVJPG1 729 VIC_FREQ 68%@524 APE 233 CV0@-256C CPU@58.125C Tboard@47C SOC2@55C Tdiode@48.25C SOC0@54.718C CV1@-256C GPU@53.468C email@example.comC SOC1@55.937C CV2@-256C VDD_GPU_SOC 5164mW/5164mW VDD_CPU_CV 1192mW/1192mW VIN_SYS_5V0 6379mW/6379mW NC 0mW/0mW VDDQ_VDD2_1V8AO 1492mW/1492mW NC 0mW/0mW
But when I start jpeg_decode, it shows NVJPG being used. Could you please tell me why?