However, the log file created by the “MeasureEncoderLatency” option gives per-frame encoding times that average out to 17.5 ms.
The video file was 7 seconds long and contained 420 frames. 420 frames × 17.5 ms is 7.35 seconds, which is much longer than the 3.26 seconds reported by GStreamer at the end of execution.
What is happening here? Which encoding time is correct?
You did not answer my question. Once GStreamer finishes encoding, I get two different computation times. One line tells me:
“Execution ended after 0:00:03.267317578”
AND I also get a log file (which I have attached). If I add up all of the encoding times in the log file, I get a computation time of 9.526 seconds. gst_v4l2_enc_latency_9882.log (24.4 KB)
So my question is, which encoding time is the correct one?
The Execution ended value is accurate for the total encoding time. The values from latency profiling include delay due to reference frames: when encoding an IDR P P P P… stream, the encoder has to keep one reference frame. The 1st frame is sent to the encoder and kept; after the 2nd frame is sent to the encoder, the 1st frame is returned and the 2nd frame is kept. For a more complicated use case such as I P B B P B B P…, the latency value can vary over a range.
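The effect described above can be sketched as a toy simulation (hypothetical frame timings, not the actual encoder): because each frame's measured latency runs from its submission until the next frame releases it, the per-frame latency windows overlap, and their sum exceeds the wall-clock encoding time even though the encoder works on one frame at a time.

```python
# Toy model: serial encoder, one reference frame held (illustrative
# numbers only, not measured from the real hardware).
# Frame i's output is released only after frame i+1 has been encoded,
# so each measured latency covers roughly 2 encode slots.

ENCODE_MS = 8      # hypothetical time to encode one frame
NUM_FRAMES = 420

submit = [i * ENCODE_MS for i in range(NUM_FRAMES)]             # frame i enters encoder
release = [(i + 2) * ENCODE_MS for i in range(NUM_FRAMES - 1)]  # freed once frame i+1 is done
release.append(NUM_FRAMES * ENCODE_MS)                          # last frame flushed at EOS

latencies = [r - s for s, r in zip(submit, release)]
wall_clock_ms = release[-1] - submit[0]

print(f"sum of per-frame latencies: {sum(latencies)} ms")
print(f"wall-clock encode time:     {wall_clock_ms} ms")
```

In this sketch the sum of latencies comes out at roughly twice the wall-clock time, which is the same kind of mismatch seen between the log file total and the Execution ended print.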
For the total encoding time, please check the Execution ended print. Or you may use fpsdisplaysink to show the runtime framerate.
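As a cross-check, the overall encode rate implied by the Execution ended print can be computed directly from the numbers earlier in this thread:

```python
# Numbers taken from this thread: 420 frames encoded during the
# "Execution ended after 0:00:03.267317578" wall-clock time.
frames = 420
wall_clock_s = 3.267317578

fps = frames / wall_clock_s                    # effective throughput
mean_wall_ms = 1000.0 * wall_clock_s / frames  # wall-clock time per frame

print(f"effective encode rate: {fps:.1f} fps")
print(f"mean wall-clock time per frame: {mean_wall_ms:.2f} ms")
```

The mean wall-clock time per frame (about 7.8 ms) is well below the ~17.5 ms average latency from the log file, which again indicates that the per-frame latency windows overlap rather than run back to back.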
Thank you for the answer. I still don’t fully understand the latency times given inside the log file (from my previous post in this thread). I understand it shows the latency per frame, but the sum of the latencies in the log file does not match the total execution time.
Does the hardware encoder encode frames in series (one after the other) or in parallel (i.e., multiple frames encoded simultaneously)?