Hi guys,
We have added a new camera sensor on the TX1 platform. When we use v4l2src to capture raw data and display it, the latency is severe. The command is as follows:
gst-launch-1.0 v4l2src device=/dev/video0 do-timestamp=true ! 'video/x-raw, format=UYVY, framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420, framerate=30/1' ! nvoverlaysink sync=false async=false
I have measured the end-to-end delay, that is, the time to transfer an image from the camera sensor to the HDMI monitor, and it is about 100~120 ms. Could anyone give some advice on how to reduce this latency?
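For reference, here is a minimal sketch of one way to measure just the in-pipeline part of the delay (an illustration, not our actual test program; the file name latency_probe.c, the element name "sink" and the lack of full error handling are only for the example). With do-timestamp=true, v4l2src stamps each buffer with the pipeline running time roughly when it reaches userspace, so a buffer probe on the sink pad of nvoverlaysink can print how much latency nvvidconv and the queuing in front of the sink add:

/* latency_probe.c (hypothetical name) */
#include <gst/gst.h>

static GstPadProbeReturn
buffer_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  GstClock *clock = gst_element_get_clock (pipeline);

  if (clock != NULL) {
    if (GST_BUFFER_PTS_IS_VALID (buf)) {
      /* current running time minus the PTS stamped by v4l2src
         ~= latency added downstream of v4l2src for this buffer */
      GstClockTime running = gst_clock_get_time (clock) -
          gst_element_get_base_time (pipeline);
      if (running > GST_BUFFER_PTS (buf))
        g_print ("latency at display sink: %" G_GUINT64_FORMAT " ms\n",
            (running - GST_BUFFER_PTS (buf)) / GST_MSECOND);
    }
    gst_object_unref (clock);
  }
  return GST_PAD_PROBE_OK;
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* same elements as the gst-launch command above, with the sink named
     so the probe can be attached to its sink pad */
  GstElement *pipeline = gst_parse_launch (
      "v4l2src device=/dev/video0 do-timestamp=true ! "
      "video/x-raw,format=UYVY,framerate=30/1 ! nvvidconv ! "
      "video/x-raw(memory:NVMM),format=I420,framerate=30/1 ! "
      "nvoverlaysink name=sink sync=false async=false", NULL);
  if (pipeline == NULL)
    return -1;

  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");
  GstPad *sinkpad = gst_element_get_static_pad (sink, "sink");
  gst_pad_add_probe (sinkpad, GST_PAD_PROBE_TYPE_BUFFER,
      buffer_probe, pipeline, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}

It can be built with gcc latency_probe.c -o latency_probe $(pkg-config --cflags --libs gstreamer-1.0). The remaining part of the 100~120 ms, from the sensor to userspace, has to be measured separately.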
thanks
Hi cloundliu,
We don’t observe the issue when running
ubuntu@tegra-ubuntu:~$ gst-launch-1.0 videotestsrc is-live=true do-timestamp=true pattern=18 num-buffers=300 ! 'video/x-raw, width=1920,height=1080,format=UYVY, framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12, framerate=30/1' ! nvoverlaysink sync=false async=false
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:11.419520223
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
For 300 frames at 30 fps (10 seconds of video), the execution takes ~11 seconds, which meets the expectation.
Please check the execution time when running
gst-launch-1.0 v4l2src device=/dev/video0 do-timestamp=true num-buffers=300 ! 'video/x-raw, format=UYVY, framerate=30/1' ! fakesink
And what is the width x height of the v4l2src output?
Hi DaneLLL,
I think you have misunderstood what I meant; please refer to the question at [url]https://devtalk.nvidia.com/default/topic/934387/?offset=16#5059577[/url]
We have experienced the same problem described at the link above, and our test results coincide with it. The v4l2_buffer timestamp is two frames earlier than the time we get the buffer in userspace, which is about 66 ms. That is to say, the transfer time of an image from the v4l2 buffer to userspace is about 66 ms, but in our test the time from the sensor to the v4l2 buffer is more than 66 ms.
On the whole, the delay from the sensor to userspace is about 120 ms. I think it is too severe and does not meet the application requirements.
Could you give me some advice on how to reduce it? Does the v4l2-core code need to be updated?
Our sensor format is 8-bit UYVY, 1280x1080 @ 30 fps.
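For reference, a minimal sketch of how the v4l2-buffer-to-userspace delay can be measured (an illustration, not our exact test program; the file name, the buffer count of 4 and the 300-frame loop are arbitrary, and the comparison is only meaningful if the driver stamps v4l2_buffer.timestamp with the monotonic clock, see V4L2_BUF_FLAG_TIMESTAMP_MONOTONIC in buf.flags): dequeue frames with VIDIOC_DQBUF and compare the driver timestamp with CLOCK_MONOTONIC at the moment DQBUF returns.

/* v4l2_delay.c (hypothetical name) */
#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <time.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* 8-bit UYVY 1280x1080, matching the sensor mode described above */
    struct v4l2_format fmt = {0};
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 1280;
    fmt.fmt.pix.height = 1080;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;
    fmt.fmt.pix.field = V4L2_FIELD_NONE;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

    struct v4l2_requestbuffers req = {0};
    req.count = 4;                      /* arbitrary buffer count */
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) { perror("VIDIOC_REQBUFS"); return 1; }

    /* the pixel data is never read, so the buffers are queued without mmap() */
    for (unsigned i = 0; i < req.count; i++) {
        struct v4l2_buffer buf = {0};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); return 1; }
    }

    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_STREAMON, &type) < 0) { perror("VIDIOC_STREAMON"); return 1; }

    for (int n = 0; n < 300; n++) {     /* 300 frames = 10 s at 30 fps */
        struct v4l2_buffer buf = {0};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0) { perror("VIDIOC_DQBUF"); return 1; }

        struct timespec now;
        clock_gettime(CLOCK_MONOTONIC, &now);
        double filled   = buf.timestamp.tv_sec + buf.timestamp.tv_usec / 1e6;
        double dequeued = now.tv_sec + now.tv_nsec / 1e9;
        printf("seq %u: driver timestamp -> userspace: %.1f ms\n",
               buf.sequence, (dequeued - filled) * 1e3);

        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); return 1; }
    }

    ioctl(fd, VIDIOC_STREAMOFF, &type);
    close(fd);
    return 0;
}

In our setup this printed difference corresponds to the ~66 ms (two frame periods) mentioned above; the time from sensor exposure to the driver timestamp comes on top of that and cannot be observed from userspace alone.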
thanks