What is the equivalent GStreamer pipeline for this raspivid UDP stream?

When using this stream:

raspivid -t 0 -w 1280 -h 720 -fps 30 -3d sbs -cd MJPEG -o - | nc 3001 -u

I’m using some code in Unity that looks for the JPEG start and end markers (SOI/EOI) in the MJPEG stream, and I can see my stream.

When trying to use a GStreamer pipeline to stream MJPEG, the stream is encoded or framed differently. Normally, my pipeline would look like this:

gst-launch-1.0 videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host= port=3001

The real issue was with rtpjpegpay: the RTP payloading was altering the byte stream. It works fine using:

gst-launch-1.0 videotestsrc ! jpegenc ! udpsink host= port=3001
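For completeness, a matching receive side for that plain-JPEG stream could be sketched as follows (an assumption on my part — the caps and the jpegparse element, which ships in gst-plugins-bad, may need adjusting for your setup):

```shell
# Hypothetical receive side for the plain-JPEG UDP stream above.
# udpsrc hands each datagram to jpegparse, which reassembles
# complete JPEG frames by scanning for the SOI/EOI markers.
gst-launch-1.0 udpsrc port=3001 caps="image/jpeg" \
    ! jpegparse ! jpegdec ! autovideosink
```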

However, when I try to stream at a larger resolution using the caps “video/x-raw,width=1280,height=720”, I get an error:

Error sending message: A message sent on a datagram socket was larger than the internal message buffer or some other network limit, or the buffer used to receive a datagram into was smaller than the datagram itself.
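That error is the IPv4 datagram size ceiling: udpsink pushes each incoming buffer out as a single datagram, and a whole 720p JPEG frame can easily exceed the maximum UDP payload. A quick back-of-the-envelope check (the ~1 bit/pixel figure is only an illustrative assumption for MJPEG quality):

```shell
# Largest possible UDP payload over IPv4: a 65535-byte total
# datagram minus the 8-byte UDP header and 20-byte IP header.
max_udp_payload=$(( 65535 - 8 - 20 ))
echo "$max_udp_payload"    # 65507

# Rough size of one 1280x720 JPEG frame, assuming ~1 bit/pixel:
frame_bytes=$(( 1280 * 720 / 8 ))
echo "$frame_bytes"        # 115200 -- larger than one datagram
```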

Are you able to use H.264/H.265 encoding? We generally use H.264/H.265 for RTSP/UDP streaming on Jetson platforms. You can use the hardware encoder to get better performance.

That is what I would normally do. However, the video input coming into my Jetson Nano is already MJPEG-encoded, and the receiver in Unity is built to decode the MJPEG stream. If I hardware-encode the MJPEG to H.264, I have to build a custom receiver that can depay the H.264 and then convert it back to MJPEG for a second broadcast over WebSockets.
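The extra receive-side step I’d need would look something like this (a sketch only — the port, the payload type 96, and the software decoder avdec_h264 are assumptions; on a Jetson a hardware decoder element could be substituted):

```shell
# Hypothetical: depay RTP/H.264 from UDP, decode, and re-encode
# each frame as JPEG to feed the second (WebSocket) broadcast,
# represented here by a fakesink placeholder.
gst-launch-1.0 udpsrc port=3001 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! avdec_h264 \
  ! jpegenc ! fakesink
```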

FFmpeg doesn’t seem to have any issue with larger resolutions (up to 1080p) when transmitting the data as MJPEG, so I was wondering what the difference is when transmitting from GStreamer. Is there a way to break up the chunks sent over UDP?

Please adjust the MTU size on rtpjpegpay and give it a try. It is suggested in this post:
java - Limiting send rate of gstreamer's udpsink - Stack Overflow
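Applying that suggestion to the earlier pipeline might look like this (HOST is a placeholder for the receiver’s address; 1400 is the payloader’s default, so you may need a smaller value for your network):

```shell
# rtpjpegpay fragments each JPEG frame into RTP packets of at
# most `mtu` bytes, so no single UDP datagram exceeds the limit.
gst-launch-1.0 videotestsrc \
  ! video/x-raw,width=1280,height=720 \
  ! jpegenc ! rtpjpegpay mtu=1400 \
  ! udpsink host=HOST port=3001
```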
