Encoder changing bitrate when told to change resolution

Hello,

I have an encoder created with NV_ENC_PARAMS_RC_CBR_LOWDELAY_HQ, and it maintains a fairly stable bitrate at the value I set with NV_ENC_RC_PARAMS::averageBitRate. So far everything is good.
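For context, the rate-control setup at creation looks roughly like this (a minimal sketch, not my full initialization code; it assumes an already-open NVENC session handle "encoder" and a populated NV_ENCODE_API_FUNCTION_LIST "nvenc", and the codec GUID, dimensions, frame rate, and bitrate are placeholders):

#include "nvEncodeAPI.h"

NV_ENC_CONFIG encodeConfig = { NV_ENC_CONFIG_VER };
encodeConfig.rcParams.rateControlMode = NV_ENC_PARAMS_RC_CBR_LOWDELAY_HQ;
encodeConfig.rcParams.averageBitRate  = 8 * 1000 * 1000;   // placeholder: 8 Mbps

NV_ENC_INITIALIZE_PARAMS initParams = { NV_ENC_INITIALIZE_PARAMS_VER };
initParams.encodeGUID      = NV_ENC_CODEC_H264_GUID;  // placeholder codec
initParams.encodeWidth     = 1920;   // placeholder: the 100% resolution
initParams.encodeHeight    = 1080;
initParams.maxEncodeWidth  = 1920;   // must cover the largest size a later reconfigure uses
initParams.maxEncodeHeight = 1080;
initParams.frameRateNum    = 60;
initParams.frameRateDen    = 1;
initParams.encodeConfig    = &encodeConfig;

NVENCSTATUS status = nvenc.nvEncInitializeEncoder(encoder, &initParams);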

But when I change the resolution of the encoder, even by just one percent, the bitrate drops to less than half. So some decision seems to happen inside the encoder that the bitrate should also be reduced when the resolution is reduced. This would be fine if the same decision were made at 100% resolution, but it isn't, so I get wildly different bitrates that I can't control across different resolutions.

When the resolution is reduced, the bitrate is cut in half.

Manually increasing the bitrate just causes a spike, after which the bitrate drops back down again.
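The manual increase is roughly a reconfigure with only averageBitRate changed (a minimal sketch under the same assumptions as the setup above; higherBitrate is a placeholder for the new target):

NV_ENC_RECONFIGURE_PARAMS bump = { NV_ENC_RECONFIGURE_PARAMS_VER };
bump.reInitEncodeParams = initParams;                  // resolution unchanged
encodeConfig.rcParams.averageBitRate = higherBitrate;  // placeholder new target
bump.resetEncoder = 0;                                 // keep rate-control state
bump.forceIDR = 0;
NVENCSTATUS status = nvenc.nvEncReconfigureEncoder(encoder, &bump);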

When changing the resolution I use:

resetEncoder = 1
forceIDR = 1
encodeWidth = myNewWidth
encodeHeight = myNewHeight
encodeConfig->rcParams.averageBitRate = myBitrate
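Put together, the reconfigure call looks roughly like this (a minimal sketch assuming the same encoder handle, nvenc function list, and encodeConfig/initParams as in the setup above; myNewWidth, myNewHeight, and myBitrate stand in for my actual values):

NV_ENC_RECONFIGURE_PARAMS reconfig = { NV_ENC_RECONFIGURE_PARAMS_VER };
reconfig.reInitEncodeParams = initParams;                // start from the creation params
reconfig.reInitEncodeParams.encodeWidth  = myNewWidth;
reconfig.reInitEncodeParams.encodeHeight = myNewHeight;
encodeConfig.rcParams.averageBitRate = myBitrate;        // re-assert the target bitrate
reconfig.resetEncoder = 1;                               // full reset of encoder state
reconfig.forceIDR = 1;                                   // start the new resolution on an IDR
NVENCSTATUS status = nvenc.nvEncReconfigureEncoder(encoder, &reconfig);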

TLDR: My encoder honors NV_ENC_RC_PARAMS::averageBitRate only at 100% resolution. When a lower resolution is reconfigured, the bitrate is decided by some internal function I can't see. I want to either enable this function in the 100% resolution case as well, or be able to disable it for reduced resolutions.