I have trouble finding a definitive answer to this question:
Is it possible to encode and decode with zero latency using nvenc/nvdec with h.264?
By zero latency I mean that no frames are buffered anywhere in the pipeline.
There are a number of settings that imply this is possible, and to my understanding I have tried them all, but there is still a delay. Then there are other answers implying that at least one frame of latency is unavoidable, so I'm not sure.
Has anyone managed to get this working?
I'm trying this with the samples "EncLowLatency.cpp" and "DecLowLatency.cpp", with some modifications.
I see that the encoder has to produce 3 frames before the decoder outputs its first one. I use the following settings:
bLowLatency = 1,
zeroReorderDelay = 1,
vbvInitialDelay = 0,
vbvBufferSize = 0,
frameIntervalP = 1,
gopLength = NVENC_INFINITE_GOPLENGTH,
idrPeriod = NVENC_INFINITE_GOPLENGTH
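For reference, here is roughly where I believe these knobs live in the NVENC API structs (based on my reading of nvEncodeAPI.h; member paths may differ between SDK versions, so treat this as a sketch, not authoritative):

```cpp
// Sketch: low-latency encoder settings mapped onto the NVENC structs.
// Member names taken from nvEncodeAPI.h as I understand them.
NV_ENC_INITIALIZE_PARAMS initParams = { NV_ENC_INITIALIZE_PARAMS_VER };
NV_ENC_CONFIG encCfg = { NV_ENC_CONFIG_VER };
initParams.encodeConfig = &encCfg;

encCfg.frameIntervalP = 1;                        // IPPP..., no B-frames
encCfg.gopLength = NVENC_INFINITE_GOPLENGTH;      // never close the GOP
encCfg.rcParams.zeroReorderDelay = 1;             // no reordering delay
encCfg.rcParams.vbvInitialDelay = 0;
encCfg.rcParams.vbvBufferSize = 0;
encCfg.encodeCodecConfig.h264Config.idrPeriod = NVENC_INFINITE_GOPLENGTH;
```

As far as I can tell, bLowLatency in the samples is not an encoder field at all but a flag passed to the decoder-side NvDecoder constructor.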
I found that it is actually the creation of the FFmpegDemuxer that consumes at least one frame, so the encoder must output a new one to get things going. How can I avoid this?
I changed the following in FFmpegDemuxer, but it didn't solve the issue:
ctx->probesize = 32;
ctx->flags |= AVFMT_FLAG_NOBUFFER | AVFMT_FLAG_FLUSH_PACKETS;
ctx->max_analyze_duration = 0;
ctx->flush_packets = 1;
I now feed the first packet twice through the demuxer, and as far as the codec is concerned this seems to make things work as intended: I get one decoded frame for each encoded frame. Unfortunately there is still a delay, but I guess it originates from some buffering in CUDA or OpenGL, since zero latency appears to work at the codec level.
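Concretely, the priming step looks roughly like this (Demux/Decode are the helper methods from the SDK samples' FFmpegDemuxer and NvDecoder classes; this is a sketch of my workaround, not an official API usage):

```cpp
// Sketch: prime the pipeline by submitting the very first packet twice,
// so the one access unit held back by the demuxer/parser is released
// immediately instead of waiting for the next encoded frame.
uint8_t *pVideo = nullptr;
int nVideoBytes = 0;

demuxer.Demux(&pVideo, &nVideoBytes);
int nFrames = dec.Decode(pVideo, nVideoBytes);  // typically 0 frames out here
nFrames = dec.Decode(pVideo, nVideoBytes);      // resubmit: first frame comes out
```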
Never mind: there is still a one-frame delay caused by the demuxer or decoder. I am getting frame zero twice from the decoder, and from there on a one-frame lag. So the question still stands.
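One knob that might explain a residual one-frame lag on the decode side is the parser's display delay. As far as I can tell from the samples, the NvDecoder constructor's low-latency flag maps to CUVIDPARSERPARAMS::ulMaxDisplayDelay, which must be 0 for the parser to emit each frame as soon as it is decoded. A minimal sketch, assuming the nvcuvid.h field names (callbacks omitted for brevity):

```cpp
// Sketch: a cuvid video parser configured for zero display delay.
CUVIDPARSERPARAMS parserParams = {};
parserParams.CodecType = cudaVideoCodec_H264;
parserParams.ulMaxNumDecodeSurfaces = 1;
parserParams.ulMaxDisplayDelay = 0;   // 0 = no frames held back by the parser
// parserParams.pfnSequenceCallback / pfnDecodePicture / pfnDisplayPicture = ...;
CUvideoparser parser = nullptr;
cuvidCreateVideoParser(&parser, &parserParams);
```

If the samples already set this to 0 when bLowLatency = 1, then the remaining frame must be held somewhere else, e.g. in the demuxer.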