I want to know how to send MJPEG fast

I’m trying to send Full HD MJPEG over RTP from a Jetson Nano.

I use this command on the Jetson:

gst-launch-1.0 nvarguscamerasrc ! "video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12,framerate=30/1" ! nvjpegenc ! rtpjpegpay ! udpsink host=x.x.x.x port=9999

and this command on the receiver PC (a desktop with a Core i7-10750 and an RTX 2060):
gst-launch-1.0 udpsrc port=9999 caps="application/x-rtp,encoding-name=JPEG,payload=26" ! rtpjpegdepay ! jpegdec ! fpsdisplaysink

And I get about 5 fps, which is far too slow.

The Jetson Nano and the receiver PC are on the same LAN.
Network throughput from the Jetson Nano to the receiver PC is over 200 Mbps (measured with iperf3, as shown below).
I get 30 fps with H.264, but MJPEG is very slow.
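
For reference, a typical iperf3 run for this kind of check looks like the following (x.x.x.x is the receiver's address; the duration and UDP bitrate are just examples, and the UDP mode is closer to how the RTP stream actually behaves):

# on the receiver PC
iperf3 -s
# on the Jetson Nano
iperf3 -c x.x.x.x -t 10
# optional UDP test at roughly the stream's bitrate
iperf3 -c x.x.x.x -u -b 200M -t 10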

I want to know what is wrong.

Hi,
Please run this pipeline to check the JPEG encoding rate:

gst-launch-1.0 nvarguscamerasrc ! "video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12,framerate=30/1" ! nvjpegenc ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v

Thank you for your support.

I tried your command and got the result below.

$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! "video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12,framerate=30/1" ! nvjpegenc  ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstNvJpegEnc:nvjpegenc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 4032 x 3040 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 2592 x 1944 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 2560 x 1440 FR = 40.000000 fps Duration = 25000000 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 1 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 59.999999 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
/GstPipeline:pipeline0/GstNvJpegEnc:nvjpegenc0.GstPad:src: caps = image/jpeg, sof-marker=(int)4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = image/jpeg, sof-marker=(int)4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = image/jpeg, sof-marker=(int)4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = image/jpeg, sof-marker=(int)4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 17, dropped: 0, current: 32.59, average: 32.59
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 32, dropped: 0, current: 29.92, average: 31.28
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 47, dropped: 0, current: 29.92, average: 30.83
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 62, dropped: 0, current: 30.00, average: 30.63
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 78, dropped: 0, current: 30.02, average: 30.50
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 93, dropped: 0, current: 30.00, average: 30.42
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 109, dropped: 0, current: 30.00, average: 30.36
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 125, dropped: 0, current: 30.02, average: 30.31
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 140, dropped: 0, current: 30.00, average: 30.28

It seems the encoder is producing frames at 30 fps.
So is the slow transfer caused by the network?
I changed the Wi-Fi adapter and the LAN is now up to 500 Mbps, but it's still slow.

I ran an experiment that may provide a clue.
Connected to 5 GHz Wi-Fi, I get 5 fps.
Connected to 2.4 GHz Wi-Fi, I get about 20-25 fps.

The 5 GHz Wi-Fi measures 500 Mbps and the 2.4 GHz Wi-Fi measures 90 Mbps.
So the link gets slower, yet the frame rate of the video transfer goes up.

Assuming JPEG compresses the raw data to about 10% of its original size (roughly 90% compression), the required bandwidth is 1920 (w) x 1080 (h) x 24 (bit) x 30 (fps) x 0.1 (compression) ≈ 150 Mbps.
So getting about 20 fps on the 90 Mbps 2.4 GHz link does not look like a strange result.
On the 500 Mbps 5 GHz link, however, getting only 5 fps makes me think something is wrong with the pipeline settings.
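
As a rough check of that ~10% assumption, the actual encoded frame size could be measured on the Jetson with something like this (num-buffers and the filename pattern are only illustrative):

# capture 150 encoded frames (about 5 s at 30 fps) to files
gst-launch-1.0 nvarguscamerasrc num-buffers=150 ! "video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12,framerate=30/1" ! nvjpegenc ! multifilesink location=frame_%05d.jpg
# average file size (bytes) x 8 x 30 ≈ actual encoded bitrate
ls -l frame_*.jpg
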
If you have any suggestions, I’d be happy to hear them.

Hi,
You may run the pipeline like this:

... ! rtpjpegpay ! queue ! udpsink

And tune buffer-related properties, such as the MTU size. You may refer to this similar topic:
deepstream4.0 with rtsp of sink makes the picture distortion with high bitrate - #9 by DaneLLL
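
For example, the sender with a queue inserted and larger buffers could look like this (a sketch; the mtu, queue limits, and buffer-size values are only starting points to tune):

gst-launch-1.0 nvarguscamerasrc ! "video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12,framerate=30/1" ! nvjpegenc ! rtpjpegpay mtu=1400 ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! udpsink host=x.x.x.x port=9999 buffer-size=33554432 sync=false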

You may try this (the buffer size may be oversized, but it does need to be increased for resolutions above 640x480@30):

  • Jetson sender:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12,framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! jpegparse ! rtpjpegpay ! identity drop-allocation=1 ! udpsink host=127.0.0.1 port=5000 buffer-size=33554432
  • Jetson receiver:
gst-launch-1.0 -ev udpsrc port=5000 buffer-size=33554432 ! application/x-rtp,payload=26 ! rtpjpegdepay ! jpegparse ! nvjpegdec ! 'video/x-raw(memory:NVMM),format=I420' ! nvvidconv ! xvimagesink

# Or
gst-launch-1.0 -ev udpsrc port=5000 buffer-size=33554432 ! application/x-rtp,payload=26 ! rtpjpegdepay ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvvidconv ! xvimagesink
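
If the receiver is an x86 PC without the Jetson-specific decoders (as in the original question), an equivalent sketch using the stock software JPEG decoder would be:

gst-launch-1.0 -ev udpsrc port=5000 buffer-size=33554432 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegparse ! jpegdec ! videoconvert ! fpsdisplaysink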