GStreamer RTP H.265 stream from ZED camera only sends a single frame, then crashes

Hello,

I’m using a Jetson Nano to stream video from a ZED stereo camera. I’m using GStreamer with the plugins provided by NVIDIA for hardware-accelerated H.265 encoding.

If I use “videotestsrc” in GStreamer instead of the ZED, everything works correctly. But when I switch the source to the ZED, it sends a single frame and then gives this error message (on the sender side):

ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not read from resource.
Additional debug info:
gstv4l2bufferpool.c(1040): gst_v4l2_buffer_pool_poll (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
poll error 1: Success (0)

The receiver continues without any problems.

These are the commands that I’m using:

Sender:

gst-launch-1.0 v4l2src device=/dev/video0 ! \
"video/x-raw, width=2560, height=720, format=(string)YUY2, framerate=(fraction)30/1" ! \
nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! \
nvv4l2h265enc bitrate=8000000 ! video/x-h265 ! rtph265pay pt=96 ! \
udpsink host=192.168.1.203 port=5000 sync=false

Receiver:

gst-launch-1.0 udpsrc port=5000 ! \
application/x-rtp,clock-rate=90000,payload=96 ! rtph265depay ! avdec_h265 ! \
xvimagesink sync=false

If I change the first line of the sender pipeline from “gst-launch-1.0 v4l2src device=/dev/video0” to “gst-launch-1.0 videotestsrc”, everything works correctly.
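In other words, this variant of the sender pipeline (the one above with only the source element swapped) runs without the error:

gst-launch-1.0 videotestsrc ! \
"video/x-raw, width=2560, height=720, format=(string)YUY2, framerate=(fraction)30/1" ! \
nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! \
nvv4l2h265enc bitrate=8000000 ! video/x-h265 ! rtph265pay pt=96 ! \
udpsink host=192.168.1.203 port=5000 sync=false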

Any help would be appreciated, thank you.

Hi,
You may check whether your v4l2src supports YUYV (called YUY2 in GStreamer).
Logitech C930e on Jetson TX1 very slow and choppy video - Jetson TX1 - NVIDIA Developer Forums
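For example, you can list the formats, frame sizes and frame rates the camera exposes with v4l2-ctl (from the v4l-utils package):

v4l2-ctl -d /dev/video0 --list-formats-ext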

Yes, that is the only format available on my camera.

ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'YUYV'
	Name        : YUYV 4:2:2
		Size: Discrete 2560x720
			Interval: Discrete 0.017s (60.000 fps)
			Interval: Discrete 0.033s (30.000 fps)
			Interval: Discrete 0.067s (15.000 fps)
		Size: Discrete 1344x376
			Interval: Discrete 0.010s (100.000 fps)
			Interval: Discrete 0.017s (60.000 fps)
			Interval: Discrete 0.033s (30.000 fps)
			Interval: Discrete 0.067s (15.000 fps)
		Size: Discrete 3840x1080
			Interval: Discrete 0.033s (30.000 fps)
			Interval: Discrete 0.067s (15.000 fps)
		Size: Discrete 4416x1242
			Interval: Discrete 0.067s (15.000 fps)

Is it possible that it is just too much for the Nano to process? It is a very large video (2560x720). But if that were true, why does it work with “videotestsrc” and not with my camera? Do you have any other suggestions on how I can make this encoding more efficient? Thank you.
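One way to lighten the encode load, if full resolution is not required, is to let nvvidconv scale the frames down before the encoder. The 1280x360 output size and lower bitrate below are only illustrative values, not modes from the camera's list:

gst-launch-1.0 v4l2src device=/dev/video0 ! \
"video/x-raw, width=2560, height=720, format=(string)YUY2, framerate=(fraction)30/1" ! \
nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420, width=1280, height=360' ! \
nvv4l2h265enc bitrate=4000000 ! video/x-h265 ! rtph265pay pt=96 ! \
udpsink host=192.168.1.203 port=5000 sync=false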

Figured it out: I needed to add “queue” and “videorate” on the sender side, and to use rtpbin on both sender and receiver, which brings in things like the rtpjitterbuffer for playback. These examples were a lot of help:

https://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/tests/examples/rtp/client-H264.sh
https://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/tests/examples/rtp/server-v4l2-H264-alsasrc-PCMA.sh

Updated commands, for anyone who might find this useful:

Server:

gst-launch-1.0 -v rtpbin name=rtpbin \
v4l2src device=/dev/video0 ! \
"video/x-raw, width=2560, height=720, format=(string)YUY2, framerate=(fraction)60/1" ! \
queue ! videorate ! nvvidconv ! \
"video/x-raw(memory:NVMM),format=(string)I420,width=2560,height=720,framerate=60/1" ! \
nvv4l2h265enc bitrate=6000000 maxperf-enable=true ! rtph265pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=<host_ip> ts-offset=0

Client:

gst-launch-1.0 -v rtpbin name=rtpbin latency=50 \
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H265" port=5000 ! \
rtpbin.recv_rtp_sink_0 rtpbin. ! \
rtph265depay ! avdec_h265 output-corrupt=false ! videoconvert ! ximagesink sync=false
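Note that the linked examples also wire RTCP through rtpbin, which the commands above leave out. If you want sender/receiver reports as well, the same pipelines with the RTCP legs added look roughly like this (the 5001 and 5005 ports and the <host_ip>/<server_ip> placeholders are just example values):

Server:

gst-launch-1.0 -v rtpbin name=rtpbin \
v4l2src device=/dev/video0 ! \
"video/x-raw, width=2560, height=720, format=(string)YUY2, framerate=(fraction)60/1" ! \
queue ! videorate ! nvvidconv ! \
"video/x-raw(memory:NVMM),format=(string)I420,width=2560,height=720,framerate=60/1" ! \
nvv4l2h265enc bitrate=6000000 maxperf-enable=true ! rtph265pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=<host_ip> ts-offset=0 \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=<host_ip> sync=false async=false \
udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0

Client:

gst-launch-1.0 -v rtpbin name=rtpbin latency=50 \
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H265" port=5000 ! \
rtpbin.recv_rtp_sink_0 \
udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
rtpbin.send_rtcp_src_0 ! udpsink port=5005 host=<server_ip> sync=false async=false \
rtpbin. ! rtph265depay ! avdec_h265 output-corrupt=false ! videoconvert ! ximagesink sync=false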