Hi,
Currently I capture video from a GMSL2 camera, do H.264 encoding, and stream it over TCP/IP from the Xavier acting as client. On the Windows server side, I split the stream into 5-minute segments. For my application, each segment must be exactly 5 minutes long, but it does not need to start with an I-frame or end with a full cluster (I:P = 1:29 in my case).
My C code is similar to these two pipelines:
For sender (Xavier):
gst-launch-1.0 videotestsrc do-timestamp=true ! "video/x-raw,width=(int)1920,height=(int)1080,framerate=30/1,format=(string)I420" ! omxh264enc profile=2 insert-sps-pps=true iframeinterval=30 ! h264parse ! matroskamux ! queue ! tcpclientsink host=192.168.1.2 port=5000
For receiver (Windows):
gst-launch-1.0 tcpserversrc host=192.168.1.2 port=5000 ! matroskademux ! h264parse ! queue ! splitmuxsink location=test%02d.mkv send-keyframe-requests=true max-size-time=300000000000 muxer=matroskamux
Because splitmuxsink splits at the closest keyframe, the segments can come out shorter than 5 minutes instead of exactly 5 minutes. As mentioned above, I do not need the MKV segments to start with a keyframe or end with a full cluster (I:P = 1:29). That is why I set send-keyframe-requests=true, hoping to force a keyframe from omxh264enc at the split points.
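As far as I understand, send-keyframe-requests makes splitmuxsink send a GstForceKeyUnit event upstream; in this receiver pipeline the upstream element is tcpserversrc, so the request stops there and never reaches the encoder on the Xavier. For reference, a request like the one splitmuxsink emits could be pushed from application code roughly like this (a sketch only; "parse" is assumed to be the h264parse element on the receiver):

#include <gst/gst.h>
#include <gst/video/video.h>

static void
request_keyframe_upstream (GstElement *parse)
{
  GstPad *sinkpad = gst_element_get_static_pad (parse, "sink");
  /* GST_CLOCK_TIME_NONE = "as soon as possible"; TRUE = resend SPS/PPS with the keyframe */
  GstEvent *event =
      gst_video_event_new_upstream_force_key_unit (GST_CLOCK_TIME_NONE, TRUE, 0);
  gst_pad_push_event (sinkpad, event);   /* travels upstream towards the source */
  gst_object_unref (sinkpad);
}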
I have managed to achieve this on the Xavier alone (dropping the TCP/IP streaming and saving the segments directly on the Xavier). However, I cannot force a keyframe when streaming over TCP/IP.
My question is: how can I send keyframe requests to omxh264enc on the Xavier over TCP/IP from Windows? Or is there another way to achieve exactly timed MKV video segments?
BR
Steven
Hi,
We have deprecated the omx plugins. Please use the v4l2 plugins, such as nvv4l2h264enc.
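For reference, the sender pipeline above with the omx encoder swapped for nvv4l2h264enc might look roughly like this (the nvvidconv stage and the insert-sps-pps/iframeinterval/idrinterval properties are assumptions based on gst-inspect-1.0 nvv4l2h264enc; please verify them on your L4T release):

gst-launch-1.0 videotestsrc do-timestamp=true ! "video/x-raw,width=(int)1920,height=(int)1080,framerate=30/1,format=(string)I420" ! nvvidconv ! "video/x-raw(memory:NVMM),format=(string)NV12" ! nvv4l2h264enc insert-sps-pps=true iframeinterval=30 idrinterval=30 ! h264parse ! matroskamux ! queue ! tcpclientsink host=192.168.1.2 port=5000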
A quick solution is to set the IDR frame interval to 15 and set max-size-time to 305 seconds. If this is not good enough, you can download the source code of gst-v4l2 and add the logic for send-keyframe-requests=true yourself. The existing implementation is to issue the force-IDR signal:
"force-IDR" : void user_function (GstElement* object);
The source code is in
https://developer.nvidia.com/embedded/l4t/r32_release_v5.1/r32_release_v5.1/sources/t186/public_sources.tbz2
Hi DaneLLL,
Thank you for the quick reply. I cannot change the IDR interval for other reasons; it has to stay at 30.
For the "force-IDR" signal, are you suggesting that, regardless of splitmuxsink on the Windows side, I should keep track of time in the sender code on the Xavier?
For example, maybe I can add an "identity" element after v4l2src and inspect each buffer to check its PTS. When the PTS reaches a 5-minute boundary, I would call user_function(data->omxh264enc) to force a keyframe?
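Something along those lines could be sketched as below (assumptions: an identity element with signal-handoffs=true placed between the camera source and the encoder, and an encoder that exposes the force-IDR action signal discussed above):

#include <gst/gst.h>

/* Number of 5-minute periods already started; lets us fire exactly once per boundary. */
static guint64 periods_started = 1;

/* "handoff" callback of the identity element (signal-handoffs=true).
 * user_data is the encoder element. */
static void
on_identity_handoff (GstElement *identity, GstBuffer *buffer, gpointer user_data)
{
  GstElement *encoder = GST_ELEMENT (user_data);
  GstClockTime pts = GST_BUFFER_PTS (buffer);

  if (!GST_CLOCK_TIME_IS_VALID (pts))
    return;

  /* When the PTS enters a new 5-minute period, request an IDR frame so the
   * receiver can cut the file at (almost) exactly that point. */
  if (pts >= periods_started * 5 * 60 * GST_SECOND) {
    periods_started++;
    g_signal_emit_by_name (encoder, "force-IDR");
  }
}

/* Wiring it up when building the pipeline:
 *   g_signal_connect (identity, "handoff",
 *                     G_CALLBACK (on_identity_handoff), encoder);
 */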
Hi,
It does not seem easy to split at exactly 5 minutes with a gst-launch command. If you use jetson_multimedia_api, you can count frames, and frame number 9000 marks exactly 5 minutes (the frame rate is 30 fps). You can get IDR frames with this patch:
Xavier AGX : Video encoding crash - #15 by DaneLLL
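To make the counting concrete: at 30 fps, 9000 frames correspond to exactly 300 s. A minimal sketch of that bookkeeping (only the counter logic; the actual jetson_multimedia_api encode calls are omitted):

#include <stdbool.h>

#define FPS            30
#define SEGMENT_SECS   300
#define FRAMES_PER_SEG (FPS * SEGMENT_SECS)   /* 9000 frames = exactly 5 minutes */

/* Returns true when frame_index is the first frame of a new 5-minute segment,
 * i.e. when the encoder should be forced to produce an IDR frame and a new
 * output file should be opened. */
static bool
is_segment_boundary (unsigned long frame_index)
{
  return (frame_index % FRAMES_PER_SEG) == 0;
}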
However, when using jetson_multimedia_api, you cannot utilize the existing matroskamux and tcpclientsink plugins.
Is it not possible to set max-size-time to 305 seconds in your use case? Maybe give it a try?
Hi DaneLLL,
The reason I need the H.264 video segments to be exactly 5 minutes long is that I need to align the video data with another type of sensor data, which is split into exact 5-minute segments, for a downstream sensor fusion algorithm.
Also, the camera sensor (30 fps) itself can randomly drop frames. So with the IDR interval fixed at 30, a 5-minute video segment could end up 1 to 3 seconds shorter than 5 minutes, depending on how many frames were dropped. During long continuous data logging, the video segments and the other sensors' data segments then drift apart. It is hard to guarantee the alignment of each segment if the video cannot be split at exactly 5 minutes.