Stream CSI Camera to HTTP in MJPEG format

Hi. I would like to stream a CSI camera to HTTP in MJPEG format. After playing around with GStreamer for quite some time, I ended up with the following command:

gst-launch-1.0 nvarguscamerasrc sensor_id=0 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=21/1,format=NV12' ! nvvidconv flip-method=2 ! nvjpegenc ! hlssink playlist-root=http://192.168.3.4:1234

The IP address is the local address of my Jetson. GStreamer appears to actually start the stream without errors. Here's the log:

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 2
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 29.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.

However, when I try to view the video stream, I cannot connect to the site and port. This works neither through a browser nor through OctoPrint, which is the actual end client of the stream.

Any hint on why this doesn’t work is highly welcome!!

Cheers
Flopana77

Hi,
Not sure if your method can work. A working case is to prepare video files and an m3u8 playlist, and to start an HTTP server.

On a target device such as an iPhone, you can open the playlist to play the files. Please refer to
HLS Live streaming - #4 by DaneLLL

You may check if you can apply this working case to your use-case.

Thanks for the quick reply. I’ll try this out and see how it goes. Will post once I have more info.

You may try the following. I use SimpleHTTPServer, Python's built-in HTTP server module:

# Create a new folder for this
mkdir /home/nvidia/hlstest
cd /home/nvidia/hlstest

# Start the http server in background
python -m SimpleHTTPServer 8080 &

# Launch the pipeline
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! jpegparse ! multipartmux ! hlssink playlist-root=http://192.168.0.40:8080 location=/home/nvidia/hlstest/segment_%05d.ts max-files=10 target-duration=1
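
Note that SimpleHTTPServer is the Python 2 module name; if you are on Python 3, the built-in equivalent is:

# Same server, Python 3 module name
python3 -m http.server 8080 &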

You would have to clean up before restarting the pipeline:

rm -f *.ts
rm -f playlist.m3u8

Now, it is important that you set the framerate on the receiver side; otherwise the frames may be decoded as they arrive, which may be too fast.

gst-launch-1.0 souphttpsrc location=http://192.168.0.40:8080/playlist.m3u8 is-live=true do-timestamp=true ! application/x-hls ! hlsdemux ! multipartdemux ! image/jpeg,framerate=30/1 ! decodebin ! autovideosink

Ok. Now I got two ways to tackle the issue. I’ll try them out and let you know. Thanks for the great support. You guys rock!

Hi again. I experimented for a while and wanted to give an update and ask for advice on what I am doing wrong:

The good news: when I ran your command on the Jetson and checked the URL, the m3u8 file was missing. I added a playlist-location property to the command, which fixed this issue. When checking the URL in a browser window, the segment and playlist files are now there.
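
For reference, the adjusted pipeline looks roughly like this (the same as yours, plus the playlist-location property pointing into the served folder):

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! jpegparse ! multipartmux ! hlssink playlist-root=http://192.168.0.40:8080 playlist-location=/home/nvidia/hlstest/playlist.m3u8 location=/home/nvidia/hlstest/segment_%05d.ts max-files=10 target-duration=1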

The bad news: on the receiver end, I want to run VLC player. However, when I open the URL (full path: http://:1234/playlist.m3u8), the stream plays for a few seconds and then errors out, looking for a segment which has already been erased.

I guess that the m3u8 file is not being re-read…

In the meantime I saw that other people were successful in converting RTSP streams to HTTP MJPEG. Is there a way to get an RTSP stream out of GStreamer? I only found RTP, but that doesn't work… Again, any help is appreciated.

Cheers

This may be a symptom of no framerate being set (0/1) and the frames being displayed as they are received, which is way too fast.
Try using GStreamer on the receiver side so that you can set the decoding framerate, as in my previous post. If using Linux, you may decode this way and send the result to a v4l2loopback node (maybe using YUY2 format) so that you can open it from VLC as a v4l source.
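
For example, something along these lines (an untested sketch; it assumes the v4l2loopback module is loaded and exposes /dev/video10):

# Decode the HLS MJPEG stream at a fixed framerate and feed it into a v4l2loopback device
gst-launch-1.0 souphttpsrc location=http://192.168.0.40:8080/playlist.m3u8 is-live=true do-timestamp=true ! application/x-hls ! hlsdemux ! multipartdemux ! image/jpeg,framerate=30/1 ! jpegdec ! videoconvert ! video/x-raw,format=YUY2 ! v4l2sink device=/dev/video10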

Maybe you can use RTSP for JPEG streaming, but I have not tried that.
You would install the gst-rtsp-server package from apt and build the test-launch example, then try rtpjpegpay in it.
However, I'm not sure this will solve the decoding framerate issue, and I'm unable to try it for now.
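
Building test-launch would look something like this (assuming Ubuntu package names; test-launch.c comes from the gst-rtsp-server sources matching your GStreamer version):

# Install the RTSP server library and headers
sudo apt install libgstrtspserver-1.0-dev
# Build the test-launch example from the gst-rtsp-server examples folder
gcc test-launch.c -o test-launch $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)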

Ok, this is where I am now:

  1. I cloned the gst-rtsp-server repo and started test-launch: ./test-launch "( videotestsrc ! nvjpegenc ! rtpjpegpay name=pay0 pt=26 )"
  2. I started ffserver to broadcast over HTTP (a sketch of my config is below)
  3. I ran ffmpeg to convert the RTSP stream to HTTP: ffmpeg -rtsp_transport udp -i 'rtsp://127.0.0.1:8554/test' http://192.168.3.4:8080/camera1.ffm -async 1 -vsync 1

This works for about 450 frames (or pretty much exactly 30 seconds) and then ffmpeg quits without further error.
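
For completeness, the ffserver.conf for this is roughly the following (a minimal sketch; the feed name and port match the commands above, the rest is reconstructed from memory):

# Hypothetical minimal ffserver.conf for relaying the feed as MJPEG
HTTPPort 8080
<Feed camera1.ffm>
  File /tmp/camera1.ffm
  FileMaxSize 5M
</Feed>
<Stream camera1.mjpg>
  Feed camera1.ffm
  Format mpjpeg
  VideoFrameRate 15
  VideoSize 640x360
  NoAudio
</Stream>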

If I change the pipeline to stream from my camera, I launch gst-rtsp-server with this command (adjusted from what you proposed):

./test-launch "( nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=640,height=480,framerate=15/1' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! rtpjpegpay name=pay0 pt=26 )"

gst-rtsp-server opens the pipeline without issue. When I then launch ffmpeg (same command as above), gst-rtsp-server immediately throws this error twice:

(test-launch:30496): GStreamer-CRITICAL **: 18:20:29.187: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed

ffmpeg does pretty much nothing and throws the error below after 30 seconds:
[rtsp @ 0x55c1339600] method DESCRIBE failed: 503 Service Unavailable
rtsp://127.0.0.1:8554/test: Server returned 5XX Server Error reply

Now comes the interesting part. If I launch the pipeline bare bones (./test-launch "( nvarguscamerasrc ! nvvidconv ! nvjpegenc ! rtpjpegpay name=pay0 pt=26 )"), ffmpeg actually picks the stream up. Obviously, the stream and ffmpeg (configured to receive 15 fps) are not synchronized, and therefore there are 1001 error messages about dropped frames. Again, exactly 30 seconds later, ffmpeg quits without error…

Finally, I tried DeepStream for producing an RTSP stream, which worked fine. ffmpeg and the DeepStream RTSP stream work great together, with very little latency and very stable behavior (it was streaming the whole night without issue). The only drawback is that DeepStream apparently only supports H.264 and not MJPEG. Hence, ffmpeg does the conversion, which costs around 20% CPU load…

If the MJPEG RTSP stream from the camera cooperated, CPU load would drop to below 5%… Any suggestions are welcome, and apologies if the error is obvious (I am really very new to this stuff…).

Cheers
Florian

You would remove the single quotes inside the pipeline string and retry. The quotes are only needed on a shell command line to prevent the parentheses in the caps from being interpreted by the shell; inside the already-quoted string passed to test-launch they just break the pipeline parsing:

./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM),width=640,height=480,framerate=15/1 ! nvvidconv ! video/x-raw(memory:NVMM),format=I420 ! nvjpegenc ! rtpjpegpay name=pay0 pt=26 "

Ok. That made sense =) Thanks a ton!
The pipeline works now, and this is the solution to how MJPEG can be streamed via gst-rtsp-server.

The second part, ffmpeg, is still not quite working. ffmpeg starts without issues but stops after 30 seconds. This is what it looks like:

Input #0, rtsp, from 'rtsp://127.0.0.1:8554/test':
  Metadata:
title           : Session streamed with GStreamer
comment         : rtsp-server
  Duration: N/A, start: 0.064033, bitrate: N/A
Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 640x480 [SAR 1:1 DAR 4:3], 15 tbr, 90k tbn, 90k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x55ab7a0f90] deprecated pixel format used, make sure you did set range correctly
Output #0, ffm, to 'http://192.168.3.4:8080/camera1.ffm':
  Metadata:
title           : Session streamed with GStreamer
comment         : rtsp-server
creation_time   : now
encoder         : Lavf57.83.100
Stream #0:0: Video: mjpeg, yuvj420p(pc), 640x360 [SAR 3:4 DAR 4:3], q=2-31, 4048 kb/s, 15 fps, 1000k tbn, 15 tbc
Metadata:
  encoder         : Lavc57.107.100 mjpeg
Side data:
  cpb: bitrate max/min/avg: 8096000/0/4048000 buffer size: 8096000 vbv_delay: -1
[rtsp @ 0x55ab623600] max delay reached. need to consume packet
[rtsp @ 0x55ab623600] RTP: missed 144 packets
frame=   29 fps=0.0 q=1.6 size=     924kB time=00:00:01.86 bitrate=4055.0kbits/s dup=5 drop=0 spe
[rtsp @ 0x55ab623600] max delay reached. need to consume packet
[rtsp @ 0x55ab623600] RTP: missed 9 packets
[rtsp @ 0x55ab623600] max delay reached. need to consume packet
[rtsp @ 0x55ab623600] RTP: missed 3 packets
[rtsp @ 0x55ab623600] max delay reached. need to consume packet
[rtsp @ 0x55ab623600] RTP: missed 12 packets
frame=   38 fps= 31 q=1.6 size=    1280kB time=00:00:02.46 bitrate=4251.0kbits/s dup=8 drop=0 spe
.
.
.
frame=  451 fps= 16 q=2.6 Lsize=   14948kB time=00:00:30.00 bitrate=4081.8kbits/s dup=8 drop=0 speed=1.05x
video:14884kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.431310%

I know this is not necessarily the place to ask this, but maybe you have an idea. I didn't have this issue when using DeepStream, so it may still be something with GStreamer.

Cheers and again thanks so much for the help

Ok. Figured it out. I had udp instead of tcp in the ffmpeg command. Now it is working flawlessly, with almost zero latency and approx. 10% CPU load. That is around 5% less compared to the DeepStream approach and 20% less than the python-opencv approaches which I tried out before. Thank you for the support!!! Solution found. Topic closed =)
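
For reference, the working ffmpeg command is the same as above with the transport switched:

ffmpeg -rtsp_transport tcp -i 'rtsp://127.0.0.1:8554/test' http://192.168.3.4:8080/camera1.ffm -async 1 -vsync 1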

Also note that this would probably not work. For nvjpegenc, the video input should be in I420 format in NVMM memory:

./test-launch "videotestsrc ! nvvidconv ! video/x-raw(memory:NVMM),format=I420 ! nvjpegenc ! rtpjpegpay name=pay0 pt=26 "

I'm not sure I can advise much more from your ffmpeg logs, but you may want to better detail your use case: what kind of host, and requirements such as whether a browser is required or just used for testing.

Hi again. The pipeline you refer to was just for testing. The “final” one which works is this one:

./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM),width=640,height=480,framerate=15/1 ! nvvidconv ! video/x-raw(memory:NVMM),format=I420 ! nvjpegenc ! rtpjpegpay name=pay0 pt=26 "

I just noticed a tiny detail which is confusing. When ffmpeg connects to the stream, GStreamer puts out this message:

stream ready at rtsp://127.0.0.1:8554/test
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode  = 5
   Output Stream W = 1280 H = 720
   seconds to Run    = 0
   Frame Rate = 120.000005
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.

What strikes me is that the framerate is 120 and not 15 as requested. Is there any way to fix this, or am I interpreting the message the wrong way?

Cheers

Glad to see you've moved forward. From your resolution, probably the 4:3 frame aspect ratio led to the last sensor mode being chosen, but the framerate should then be adjusted to 15 fps. You may check with:

 v4l2-ctl -d0 --get-ctrl=frame_rate

and not worry too much ;-)
Have fun

Thanks. I’ll leave it as is! Ton of thanks again.

Hi,
Looks like it selects the 720p120 sensor mode. You can set the sensor-mode property of nvarguscamerasrc:

  sensor-mode         : Set the camera sensor mode to use. Default -1 (Select the best match)
                        flags: readable, writable
                        Integer. Range: -1 - 255 Default: -1

You may set sensor-mode=2 to run in 1080p30.
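
For example, added to your working pipeline:

./test-launch "nvarguscamerasrc sensor-mode=2 ! video/x-raw(memory:NVMM),width=640,height=480,framerate=15/1 ! nvvidconv ! video/x-raw(memory:NVMM),format=I420 ! nvjpegenc ! rtpjpegpay name=pay0 pt=26"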

Thanks for the input. I will add it as an argument to the nvarguscamerasrc call and try this out later.