I have an application which captures frames from a Basler camera at 25 FPS, encodes them as H.265 video using the NvV4L2 encoder, and transmits the video over UDP sockets.
Each frame's resolution is 1920x1080.
I measured the time it takes to encode each frame, and I can see that when the target encoding bitrate is raised from 2 Mbps towards 10 Mbps, the encoding time starts to fluctuate and reaches extreme values.
I ran the test on both a Jetson AGX and a Jetson NX, with similar results:
When the resolution was 1920x1080 and the target bitrate was around 2 Mbps, the results were fairly stable (though much higher than expected):
Jetson NX: 20-40ms per frame
Jetson AGX: 20-25ms per frame
When the resolution was 1920x1080 but the target bitrate was raised to 10 Mbps, the times were far higher and fluctuated heavily on both Jetsons: most frames took 35-40 ms to encode, but some frames took as much as 100 ms or longer, and the values jumped all over the place.
For comparison, I ran the same application (but using the NvEnc encoder) on an x86 laptop with a GeForce GTX 1650:
With the encoder's target bitrate set to 10 Mbps, it consistently takes 5-10 ms to encode each frame, with no fluctuations (I measured the encoding time for several thousand frames and all values were 5-10 ms).
me@jetson:~$ uname -a
Linux jetson 4.9.140-tegra #1 SMP PREEMPT Thu Sep 24 16:09:59 PDT 2020 aarch64 aarch64 aarch64 GNU/Linux
Please see the attached snippet of our C++ code, showing the parameters we configure on the encoder.
Any input or suggestions would be much appreciated.
NvV4L_encoder_code.cpp (1.9 KB)
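In case the attachment is hard to open, the setup looks roughly like the following sketch. The method names are from the jetson_multimedia_api `NvVideoEncoder` class; the numeric values here are illustrative placeholders (the exact values we use are in the attached file), and error handling is omitted:

```cpp
// Configuration sketch only -- values are placeholders, see the attachment
// for our actual settings.
#include "NvVideoEncoder.h"

NvVideoEncoder *setup_encoder()
{
    NvVideoEncoder *enc = NvVideoEncoder::createVideoEncoder("enc0");
    if (!enc)
        return nullptr;

    // Capture plane carries the compressed H.265 bitstream;
    // the output plane receives the raw YUV420M frames.
    enc->setCapturePlaneFormat(V4L2_PIX_FMT_H265, 1920, 1080, 2 * 1024 * 1024);
    enc->setOutputPlaneFormat(V4L2_PIX_FMT_YUV420M, 1920, 1080);

    enc->setBitrate(10 * 1000 * 1000);                         // target bitrate
    enc->setRateControlMode(V4L2_MPEG_VIDEO_BITRATE_MODE_CBR); // rate control
    enc->setFrameRate(25, 1);                                  // 25 FPS
    enc->setIFrameInterval(25);                                // one I-frame/sec
    enc->setHWPresetType(V4L2_ENC_HW_PRESET_ULTRAFAST);        // speed preset
    enc->setMaxPerfMode(1);                                    // max-perf clocks
    return enc;
}
```

This is a configuration fragment (it only compiles against the Jetson multimedia headers), shown so the discussion of rate control mode, preset, and max-perf mode has something concrete to refer to.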