Dear NVIDIA Support Team,
I am writing to seek assistance with an issue we have encountered in our video encoding process using NVIDIA's encoding technologies. Despite configuring the encoder with Variable Bitrate (VBR) settings, the actual bitrate consistently exceeds the configured target. Additionally, our configuration sets both the vbvBufferSize and vbvInitialDelay parameters to zero.
Issue Details:
- Expected Behavior: Achieve target bitrate as specified in VBR settings.
- Observed Behavior: Actual bitrate is consistently higher than the target.
- Configuration Parameters:
  RateControlMode: VBR
  vbvBufferSize: 0
  vbvInitialDelay: 0
  avgBitrate: 25 Mbps
  maxBitrate: 30 Mbps
- Actual bitrate received: 49 Mbps
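For reference, here is a minimal sketch of how these parameters map onto the NVENC API's NV_ENC_RC_PARAMS structure. This is a fragment only (session creation, preset selection, and the rest of the NV_ENC_CONFIG setup are omitted); the field names come from nvEncodeAPI.h:

```c
/* Sketch only: assumes an initialized encode session and an
 * NV_ENC_CONFIG already populated from a preset. */
NV_ENC_CONFIG encodeConfig = { 0 };
encodeConfig.rcParams.rateControlMode = NV_ENC_PARAMS_RC_VBR;
encodeConfig.rcParams.averageBitRate  = 25 * 1000 * 1000;  /* 25 Mbps */
encodeConfig.rcParams.maxBitRate      = 30 * 1000 * 1000;  /* 30 Mbps */
/* As discussed below: zero here means "use the driver default",
 * not "no VBV buffering". */
encodeConfig.rcParams.vbvBufferSize   = 0;
encodeConfig.rcParams.vbvInitialDelay = 0;
```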
System Information:
- Operating System: Windows 11
- NVIDIA GPU Model: NVIDIA RTX A2000 8GB Laptop GPU
- Driver Version: 553.62
- Encoding SDK Version: NVENCAPI_MAJOR_VERSION 11, NVENCAPI_MINOR_VERSION 1
Steps Taken:
- Configured the encoder with VBR settings and set vbvBufferSize and vbvInitialDelay to zero.
- Initiated the encoding process on various video samples.
- Monitored the output bitrates, noting consistent elevation above the target.
We would appreciate any guidance or recommendations you can provide to help us achieve the desired bitrate control in our encoding process.
Thank you for your attention to this matter.
Best regards,
Mehul
Hi @mehul.kumar, I am not an expert in this space, but I remember seeing somewhere in the past that setting vbvBufferSize and vbvInitialDelay to 0 means they use their default values, rather than using 0 as the buffer size and the delay.
I’ll reach out to the computer vision team and ask them to confirm or deny!
Best,
Sophie
Thanks for the clarification @sophwats,
Yes please, if you can confirm the same behavior with the computer vision team it would be of great help.
Thanks,
Mehul
Hey @mehul.kumar,
from the vis team:
How long the stream was and how the bitrate is measured are important factors here.
Setting vbvBufferSize and vbvInitialFullness to zero means they take their default values (which can be rather large, easily corresponding to a few seconds at the maximum bitrate).
A buffer of that size, spanning a few seconds, can easily distort bitrate measurements.
Best,
Sophie
Thanks for the confirmation @sophwats!
Thanks,
Mehul
I know about vbvInitialDelay and vbvBufferSize, but what is vbvInitialFullness?
Best,
Mehul
Depending on the API, vbvInitialDelay can be expressed in time (a delay) or in bits (decoder buffer fullness), with the VBV buffer size representing the maximum VBV delay.
If the bitrate is measured as a moving average over an N-second window and the VBV buffer size is M seconds, the maximum measured bitrate is roughly maxBitrate * (N + M) / N.
I believe setting the VBV buffer size to zero results in a default buffer size that is typically a few seconds (and typically, the larger the buffer, the better from a quality point of view).