Xavier AGX RTSP MJPEG Stream and Decode

Hi,

My overall use case is that I am trying to create a 24/7 RTSP stream via MJPEG from a Xavier AGX to a server. Ideally, I would like to launch and receive via Python applications and OpenCV.

I have OpenCV 4.6, CUDA 11.4, JetPack 5.2, a Jetson Xavier dev kit, and 4 AGX production boards.

I am relatively new to Gstreamer so a lot of the issues probably come from inexperience. So, bear with me and any help is greatly appreciated.

Some caveats before I get started. To open my camera, I can use the command:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! image/jpeg, width=640, height=480, framerate=30/1, format=MJPG ! jpegdec ! xvimagesink

The framerate looks great, everything is good. My camera is a USB camera at /dev/video0.

I can create an RTSP stream with:

./test-launch "v4l2src device=/dev/video0 use-damage=0 ! nvvidconv ! nvv4l2h265enc ! h265parse ! video/x-h265, stream-format=byte-stream ! rtph265pay name=pay0 pt=96 "

I can also view it on the same device via:

gst-launch-1.0 -v rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, media=video, encoding-name=H265 ! rtph265depay ! nvv4l2decoder ! nvvidconv ! xvimagesink

Video is there, but the framerate is a little slow, so I am hoping to switch to MJPEG instead.

I can run this Python script:

import gi

gi.require_version("Gst", "1.0")
gi.require_version("GstVideo", "1.0")
gi.require_version("GstRtspServer", "1.0")
from gi.repository import GLib, Gst, GstVideo, GstRtspServer

Gst.init(None)

mainloop = GLib.MainLoop()
server = GstRtspServer.RTSPServer()
mounts = server.get_mount_points()
factory = GstRtspServer.RTSPMediaFactory()
factory.set_launch('( videotestsrc is-live=1 ! video/x-raw, width=320, height=240, framerate=30/1 ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! rtph264pay name=pay0 pt=96 )')
mounts.add_factory("/test", factory)
server.attach(None)

print("stream ready at rtsp://127.0.0.1:8554/test")
mainloop.run()

and I can view the test stream via:
gst-play-1.0 rtsp://127.0.0.1:8554/test

But as soon as I try to switch to MJPEG, I can't get anything to work. I know it has to do with encoding/decoding; I just don't know the right sequence.

My end goal is to launch via a Python app, decode via a Python app, and write the frames to specific files. At this point, just launching and viewing via Python would be amazing; I can figure it out from there.

The other question I have is whether I need to worry about system memory. Does the RTSP stream write to memory? I am using these 4 Xaviers to collect my training data for about 1-3 months.

Best,

Miles

Also, the camera's video formats are:

[0]: 'MJPG' (Motion-JPEG, compressed)
        Size: Discrete 4656x3496
                Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 4160x3120
                Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 3264x2448
                Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 2592x1944
                Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1920x1080
                Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1600x1200
                Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1280x720
                Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 640x480
                Interval: Discrete 0.033s (30.000 fps)
[1]: 'YUYV' (YUYV 4:2:2)
        Size: Discrete 800x600
                Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 640x480
                Interval: Discrete 0.100s (10.000 fps)

Going for 30 fps in MJPEG. 1280x720 is fine.

If you just want to stream the webcam without processing, you may try:

# Test with videotestsrc
factory.set_launch('( videotestsrc is-live=1 ! jpegenc ! image/jpeg,width=640,height=480,framerate=30/1,format=MJPG ! rtpjpegpay name=pay0 )')

# Or with your camera also using 640x480@30fps:
factory.set_launch('( v4l2src device=/dev/video0 ! image/jpeg, width=640, height=480, framerate=30/1, format=MJPG ! rtpjpegpay name=pay0 )')

# Verify on Jetson:
gst-play-1.0 rtsp://127.0.0.1:8554/test

The videotestsrc/gst-play-1.0 test works.

factory.set_launch('( v4l2src device=/dev/video0 ! image/jpeg, width=640, height=480, framerate=30/1, format=MJPG ! rtpjpegpay name=pay0 )')

works… it says the stream is ready, but when I try to view it in the terminal with:

gst-play-1.0 rtsp://127.0.0.1:8554/test

I get:

Press 'k' to see a list of keyboard shortcuts.
Now playing rtsp://127.0.0.1:8554/test
Pipeline is live.
ERROR Internal data stream error. for rtsp://127.0.0.1:8554/test
ERROR debug information: gstbasesrc.c(3072): gst_base_src_loop (): /GstPlayBin:playbin/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc0:
streaming stopped, reason not-negotiated (-4)
Reached end of play list.

I do want to process it on the server side.

Is there a decode string that would work for the MJPEG RTSP launch?

Two problems: first, I can't view it with the third command despite it looking like it works; second (more of a request), I need the decode string for Python.

Additionally, it works on VLC…so thank you for that!

Still want to decode in python though :)

Not sure if I correctly understand what works with VLC.
If it only works with videotestsrc, then the next step is being able to stream your webcam.

Sorry, I don't have an MJPG cam so I can only emulate… I tried with a V4L USB cam and encoded without problem.
You may post some details of what GStreamer finds from it (with the webcam not being used by other software):

gst-discoverer-1.0 -v v4l2:///dev/video0

and also post the output of gst-launch-1.0 with the working pipeline, enabling the verbose flag:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! image/jpeg, width=640, height=480, framerate=30/1, format=MJPG ! jpegdec ! xvimagesink

Not sure at all, but as a workaround you might try:

factory.set_launch('( v4l2src device=/dev/video0 ! image/jpeg, width=640, height=480, framerate=30/1, format=MJPG ! jpegparse ! multipartmux ! multipartdemux single-stream=1 ! image/jpeg, parsed=(boolean)true, width=(int)640, height=(int)480, colorimetry=(string)2:4:7:1, framerate=(fraction)30/1,sof-marker=(int)0 ! rtpjpegpay name=pay0 )')

For decoding into OpenCV with Python, you should first make sure that your Python environment has an OpenCV version built with GStreamer support:

import cv2
print(cv2.getBuildInformation())
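In the printed build information, check the GStreamer entry under Video I/O; it should say YES, otherwise cv2.VideoCapture will not accept the GStreamer pipeline strings used below.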

Hey @Honey_Patouceul, sorry if I wasn't clear.

So, for the test:

factory.set_launch('( videotestsrc is-live=1 ! jpegenc ! image/jpeg,width=640,height=480,framerate=30/1,format=MJPG ! rtpjpegpay name=pay0 )')

gst-play-1.0 rtsp://127.0.0.1:8554/test

both work.

So, testing-wise from the test video source, everything is good.

Moving to the actual USB cam.

This works:
factory.set_launch('( v4l2src device=/dev/video0 ! image/jpeg, width=640, height=480, framerate=30/1, format=MJPG ! rtpjpegpay name=pay0 )')

but trying to view from the command line with
gst-play-1.0 rtsp://127.0.0.1:8554/test

does not work. However, I am able to view my USB cam with the above factory.set_launch via VLC. So I know it's working; it's just that some decoding step from the command line isn't working when trying to view.

The only thing I need at this point is the Python/command-line decoder for

factory.set_launch('( v4l2src device=/dev/video0 ! image/jpeg, width=640, height=480, framerate=30/1, format=MJPG ! rtpjpegpay name=pay0 )')

because that command is working via VLC. What I mean by working is that I can view the stream via rtsp://127.0.0.1:8554/test on the same Xavier.

So, I think your suggestion of gst-discoverer-1.0 -v v4l2:///dev/video0 is okay to skip because I can verify that it is working.

I am doing the testing from my Xavier dev kit, both sending and receiving, so I know cv2 has GStreamer support.

I can get you the build info; I am just not in front of my Xavier right now. But I think it might be moot, and I can get it for you once I am back.

Hopefully, this works in clarifying.

Also, big shoutout to how helpful you are across this forum, I know it can be thankless sometimes, but I am super appreciative of your time. Happy to donate :)

So it may be an issue with gst-play-1.0. I'm using JP 5.0.2 with a newer GStreamer version.

You may try :

gst-launch-1.0 uridecodebin uri=rtsp://127.0.0.1:8554/test ! nvvidconv ! autovideosink

If this works, you would just use an OpenCV VideoCapture:

import cv2

cap = cv2.VideoCapture('uridecodebin uri=rtsp://127.0.0.1:8554/test ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1', cv2.CAP_GSTREAMER)
if not cap.isOpened():
    print('Failed to open RTSP stream')
    exit(-1)

while True:
    ret, frame = cap.read()
    ...

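Since the stated end goal is writing frames to specific files, the loop above could be filled in along these lines; the output directory and naming scheme here are hypothetical, just a sketch:

import os

os.makedirs('frames', exist_ok=True)  # hypothetical output directory

frame_idx = 0
while True:
    ret, frame = cap.read()  # cap is the VideoCapture opened above
    if not ret:
        break  # read failed or stream ended
    # write each frame as a JPEG file; adjust the path/naming as needed
    cv2.imwrite(os.path.join('frames', 'frame_%06d.jpg' % frame_idx), frame)
    frame_idx += 1
cap.release()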

Hi,
Thanks Honey Patouceul for the suggestion.

Additionally, you may also try this pipeline in cv2.VideoCapture():

rtspsrc location=rtsp://127.0.0.1:8554/test ! rtpjpegdepay ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1
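For reference, dropped into the same VideoCapture pattern as above, that pipeline would look something like this (a sketch; nvv4l2decoder and nvvidconv are Jetson-only, so this receiver must also be a Jetson):

import cv2

# HW-decoded MJPEG receiver pipeline (Jetson only)
pipeline = ('rtspsrc location=rtsp://127.0.0.1:8554/test ! rtpjpegdepay ! jpegparse ! '
            'nvv4l2decoder mjpeg=1 ! nvvidconv ! video/x-raw,format=BGRx ! '
            'videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1')
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)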

Thank you so much, both of you, @DaneLLL and @Honey_Patouceul. Both solutions work…

I have been scratching my head at this for a week.

@DaneLLL, three things: 1) I don't know how to mark this as solved. 2) Could you explain the difference between the string you shared and the one @Honey_Patouceul shared? 3) I know this isn't a GStreamer forum, but do you know of any resources for decoding this RTSP stream in Python on a Windows and/or Linux machine?

Again, the main problem is solved! Thanks again.

The differences between the two pipelines are:

  1. The second pipeline uses the HW decoder. This is efficient on Jetson, as it doesn't load the CPU, but it only works on a Jetson.
    The first pipeline uses uridecodebin, which selects a decoder at runtime from the available ones. On Jetson it would select the HW decoder, but it would also work on another platform such as x86/Windows [EDIT: you may need to remove nvvidconv into BGRx and directly use videoconvert into BGR; see the sketch after this list]. In fact, this one would work with any supported encoding, not only JPEG.

  2. The second difference is that the second pipeline uses jpegparse. This may ignore the framerate and turn into async mode (framerate=0/1).
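As a rough illustration of the EDIT note in point 1, a portable x86/Windows capture string might look like this (an untested sketch with the Jetson-specific elements removed):

import cv2

# Portable variant of the first pipeline: uridecodebin picks whatever decoder
# is available, and videoconvert alone produces the BGR frames OpenCV expects.
pipeline = ('uridecodebin uri=rtsp://127.0.0.1:8554/test ! '
            'videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1')
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)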

For decoding, have you tried the OpenCV code I shared above?
You may find some more details here (you would just replace the capture pipeline with the one I shared in my previous post).

Yeah! I tried the OpenCV code above; it works great.

Appreciate the clarification.

The reason I asked again about OpenCV is that the part I am stuck on now is viewing that RTSP stream over Ethernet from another computer. If you have any insight into how to set that up, that would be great. But it might need a different topic.

For reading from another host on the LAN, you would replace the localhost address (127.0.0.1) with the streaming Jetson's address:

# If the receiver is a Jetson
gst-launch-1.0 uridecodebin uri=rtsp://<Jetson_IP>:8554/test ! nvvidconv ! autovideosink

# Otherwise
gst-launch-1.0 uridecodebin uri=rtsp://<Jetson_IP>:8554/test ! autovideosink
# Or:
gst-launch-1.0 uridecodebin uri=rtsp://<Jetson_IP>:8554/test ! videoconvert ! autovideosink
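The same substitution works for an OpenCV capture on the remote host; for example, a software-decode MJPEG variant (a sketch, assuming the host's OpenCV build has GStreamer support):

import cv2

# SW JPEG decode, so this runs on any GStreamer-enabled host; replace <Jetson_IP>
pipeline = ('rtspsrc location=rtsp://<Jetson_IP>:8554/test ! rtpjpegdepay ! jpegdec ! '
            'videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1')
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)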

Hey @Honey_Patouceul,

Thanks for that. My question is a little more specific to network configurations.

So, just to be a little more specific: I am trying to test the LAN connection. I have a Mac, my Ubuntu host computer, and a Windows server.

Right now I am just trying to do some testing to make sure I can see the RTSP stream from my Mac via VLC.

With nm-connection-editor I changed the IPv4 settings from "DHCP" to shared with other computers. I got a streaming IP of 10.42.0.1. I can ping that IP from my Mac, so that seems okay. But I can't view the stream via VLC on my Mac.

Hopefully that's clear. Again, this might be another topic… but do you know of any settings or configurations for setting up the LAN locally, i.e. with the Mac and Xavier connected directly?

A common issue is a firewall blocking… Be sure that the Mac can reach the Jetson on port 8554 (the RTSP control connection is TCP) and can receive the RTP data, which typically flows over UDP.
Though, you are correct that this may be another topic.
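If UDP is being dropped, one quick check (an assumption on my part, not verified on this setup) is forcing RTSP over TCP in VLC, which keeps all traffic on the single TCP connection:

vlc --rtsp-tcp rtsp://10.42.0.1:8554/test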

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.