GStreamer pipeline can't play video stream

Hi,

I'm trying to stream my video camera (an IMX290) via a GStreamer pipeline.

The following pipeline works:
./test-launch -p 8554 "nvarguscamerasrc do-timestamp=1 sensor-id=0 ! video/x-raw(memory:NVMM),width=1024, height=768, framerate=10/1, format=NV12 ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"
This works just fine.

But when I try to bypass the ISP with the following pipeline:
./test-launch -p 8554 "v4l2src device=/dev/video0 ! video/x-raw,width=1024, height=768, framerate=10/1 ! nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"

and connect with VLC, the pipeline hangs and I get the following debug output from GStreamer:

stream ready at rtsp://127.0.0.1:8554/test
Opening in BLOCKING MODE 
0:00:09.040527601  8412   0x55a40f0d90 ERROR             rtspclient rtsp-client.c:1054:find_media: client 0x55a4211120: can't prepare media
0:00:09.041060726  8412   0x55a40f0d90 ERROR             rtspclient rtsp-client.c:2910:handle_describe_request: client 0x55a4211120: no media

Also, when I replace v4l2src with the videotestsrc plugin in the latter pipeline, I can see the test screen in VLC.

Does anyone have an idea why v4l2src doesn't work?

My L4T release is:
# R32 (release), REVISION: 2.1, GCID: 16294929, BOARD: t210ref, EABI: aarch64, DATE: Tue Aug 13 04:28:29 UTC 2019

Thanks in advance for any help on this issue.

Kind regards, Steve

The IMX219 is a Bayer sensor, so it may not provide any raw (non-Bayer) video format. Check with:

v4l2-ctl -d0 --list-formats-ext

GStreamer only supports video/x-bayer for 8- or 16-bit Bayer formats, not 10- or 12-bit Bayer formats.

Anyway, debayering with GStreamer would probably be slow on Jetson, so it is better to use Argus, which debayers with the ISP.

Hi,
Due to performance concerns, we don't capture raw Bayer data in GStreamer commands. Performance is better using nvarguscamerasrc, since the hardware ISP engine is then used for debayering.

For constructing the GStreamer pipeline, please set video/x-bayer caps (width, height, format) matching the source, and try the bayer2rgb plugin:
v4l2src
bayer2rgb: GStreamer Bad Plugins 1.0 Plugins Reference Manual
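A minimal sketch of such a pipeline could look like this. Note this is an untested assumption: the format=bggr, the resolution, and the framerate are placeholders and must be replaced with whatever `v4l2-ctl -d0 --list-formats-ext` actually reports for your sensor, and the pipeline will only run if the sensor exposes an 8-bit Bayer format at all.

```shell
# Hypothetical example: capture 8-bit Bayer from v4l2, debayer on the CPU
# with bayer2rgb, convert to NV12 in NVMM memory, then encode and payload.
# format/width/height/framerate are assumptions; match them to your sensor.
./test-launch -p 8554 "v4l2src device=/dev/video0 ! video/x-bayer,format=bggr,width=1024,height=768,framerate=10/1 ! bayer2rgb ! videoconvert ! nvvidconv ! video/x-raw(memory:NVMM),format=NV12 ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"
```

bayer2rgb outputs an RGB-style raw format, so a videoconvert/nvvidconv stage is needed before the NV12 encoder input.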

Hello everyone,

Alright, I see. Thanks for all the information about the camera's output format. I think I'll rather stay with nvarguscamerasrc then. Using v4l2src was just an experiment anyway, since I wanted to see what the RTP timestamps look like when I use v4l2src instead of nvarguscamerasrc. Somehow I'm experiencing large delays with the RTP timestamps (around 300 ms between the filmed timestamp and the timestamp received via RTP), but maybe that's a topic for a different thread :-)

Thanks again for the information, Steve

Part of the latency may be due to VLC. Try using GStreamer on the receiver side:

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test latency=0 ! rtph264depay ! decodebin ! autovideosink

Note that latency=0 is OK for localhost, but you may need to increase it when streaming over a network.

Hi Honey_Patouceul,

thanks for your answer. The problem is that the delay I'm experiencing is not on the receiver side. I've written a decoder program based on FFmpeg that extracts the RTP timestamp while decoding the current frame from the stream, and displays both. Additionally, I've written a small CLI program for my board that outputs CLOCK_REALTIME every 10 ms on UART. I film this output, and when I compare the two timestamps (the one filmed within the frame vs. the received RTP timestamp), there is a delay of around 300 ms.
I also noticed that if I extend the GStreamer sending pipeline (e.g. by replacing nvv4l2h264enc with x264enc), the delay gets much bigger. I always thought that the RTP timestamp comes from the nvarguscamerasrc plugin, but somehow this doesn't seem to be the case, because otherwise the delay would always be 300 ms, independent of the pipeline latency. I don't really have a problem with the 300 ms latency itself, but rather with the fact that this delay depends on the pipeline latency.
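For reference, my current (possibly wrong) understanding is that rtph264pay derives the RTP timestamp from the buffer's running time, scaled to the 90 kHz H.264 RTP clock, plus a random base offset. A quick sanity check of that arithmetic:

```shell
# Assumed mapping (my reading of the payloader behaviour, not verified):
#   rtp_ts = ts_base + running_time_ns * 90000 / 1000000000
# Sanity check: at 10 fps, consecutive frames are 100 ms apart,
# which should give 9000 RTP clock units between frames.
frame_interval_ns=100000000
echo $(( frame_interval_ns * 90000 / 1000000000 ))   # prints 9000
```

If that mapping is right, the timestamp would reflect when the buffer entered the pipeline, not when it left the encoder, so I don't understand why the encoder choice changes the measured delay.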

Please also see my other post regarding this:

Do you have an explanation for this, or can you maybe point me to resources where I can learn how RTP timestamping works within the GStreamer pipeline?

Thanks in advance,
Steve

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.