I have developed a TC358748 parallel-to-MIPI driver, and I can capture 1080p frames using yavta.
I can also preview and record H.264 video using GStreamer 1.0.
I used the following command for preview (the sensor output is in UYVY format):
gst-launch-1.0 v4l2src ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=30/1' ! videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=30/1 ! autovideosink
With GStreamer 1.2.4 I get no noticeable latency when previewing 1080p 30/60 fps video.
But with GStreamer 1.8.1 or 1.6.0 and the same command, I get a latency of around 500-1000 ms, along with the following debug output:
Additional debug info:
gstbasesink.c(2846): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstNvOverlaySink-nvoverlaysink:autovideosink0-actual-sink-nvoverlay:
There may be a timestamping problem, or this computer is too slow.
If I append sync=false to the sink, the debug message goes away and the latency drops to around 300-500 ms, but that is still worse than v1.2.4.
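For reference, the sync=false variant I tested is the same preview pipeline with the property added on the sink:

```shell
# Same preview pipeline as above, with sync disabled on the sink so late
# buffers are no longer dropped; latency improves but is still ~300-500 ms.
gst-launch-1.0 v4l2src ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=30/1' ! videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=30/1 ! autovideosink sync=false
```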
I installed GStreamer 1.6.0 using "gst-install --prefix=/home/ubuntu/gst-1.6.0 --version=1.6.0" and then exported:
export LD_LIBRARY_PATH=/home/ubuntu/gst-1.6.0/lib/arm-linux-gnueabihf
export PATH=/home/ubuntu/gst-1.6.0/bin:$PATH
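To confirm the shell actually picks up the 1.6.0 build rather than the system install, I check the resolved binary and its reported version:

```shell
# Verify that the exported PATH and LD_LIBRARY_PATH take effect:
which gst-launch-1.0      # should print /home/ubuntu/gst-1.6.0/bin/gst-launch-1.0
gst-launch-1.0 --version  # should report GStreamer 1.6.0
```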
So what is the reason for this latency?
Also, with GStreamer 1.6.0 and 1.8.1 I can encode H.264 and record an .mp4 video using:
gst-launch-1.0 v4l2src ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=30/1' ! videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=30/1 ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink location=test.mp4 -e
But I get the same latency in the recorded video.
I can't test recording with GStreamer 1.2.4 because h264parse is missing in v1.2.4. Is there an alternative way to encode H.264 in v1.2.4?
Also, how can I use nvvidconv for preview and recording? Does it improve latency?
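For context, what I have in mind is replacing videoconvert with nvvidconv, something like the pipeline below. I am not sure the NVMM caps are correct for this board, so treat it as a sketch rather than a tested command:

```shell
# Hypothetical preview pipeline using nvvidconv instead of videoconvert;
# the (memory:NVMM) caps and the nvoverlaysink choice are assumptions.
gst-launch-1.0 v4l2src ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink sync=false
```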