If I allocate input frames using nvEncCreateInputBuffer, do I need to allocate a bitstream buffer for each one? FFmpeg's code seems to allocate them together, n of each, but the documentation isn't at all clear on this. I'm not using async encoding, since it isn't supported on datacentre cards (or at least not on the one I'm using), so I assumed a single bitstream buffer would be enough.
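For reference, here's roughly the paired-allocation pattern I mean, as a minimal sketch. The `FramePair` struct and `alloc_pairs` helper are my own illustrative names (not from FFmpeg), and I'm assuming `enc` is an already-opened encoder session and `nvenc` is the function list filled in by `NvEncodeAPICreateInstance`:

```c
#include <stdint.h>
#include <nvEncodeAPI.h>

/* Hypothetical pool entry pairing one input buffer with one
 * bitstream buffer, mirroring what FFmpeg appears to do n times. */
typedef struct {
    NV_ENC_INPUT_PTR  input;
    NV_ENC_OUTPUT_PTR bitstream;
} FramePair;

/* enc:   encoder session handle from nvEncOpenEncodeSessionEx
 * nvenc: NV_ENC_API_FUNCTION_LIST from NvEncodeAPICreateInstance */
static NVENCSTATUS alloc_pairs(void *enc, NV_ENC_API_FUNCTION_LIST *nvenc,
                               FramePair *pairs, int n,
                               uint32_t width, uint32_t height)
{
    for (int i = 0; i < n; i++) {
        /* One input buffer per in-flight frame. */
        NV_ENC_CREATE_INPUT_BUFFER in =
            { .version = NV_ENC_CREATE_INPUT_BUFFER_VER };
        in.width     = width;
        in.height    = height;
        in.bufferFmt = NV_ENC_BUFFER_FORMAT_NV12;

        NVENCSTATUS st = nvenc->nvEncCreateInputBuffer(enc, &in);
        if (st != NV_ENC_SUCCESS)
            return st;
        pairs[i].input = in.inputBuffer;

        /* ...and a matching bitstream buffer, allocated alongside it. */
        NV_ENC_CREATE_BITSTREAM_BUFFER out =
            { .version = NV_ENC_CREATE_BITSTREAM_BUFFER_VER };
        st = nvenc->nvEncCreateBitstreamBuffer(enc, &out);
        if (st != NV_ENC_SUCCESS)
            return st;
        pairs[i].bitstream = out.bitstreamBuffer;
    }
    return NV_ENC_SUCCESS;
}
```

My question is whether this one-to-one pairing is actually required in synchronous mode, or whether a single bitstream buffer, locked and drained after every nvEncEncodePicture call, would do.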