RTSP/RTMP with gstreamer on TX2


I’m trying to stream from a camera to my TX2. My camera can only push a stream to an external address. From my understanding the procedure would be:

  1. set up an RTSP server on the TX2 that waits for a connection
  2. point the camera at this port and start streaming
  3. read the port with gstreamer and OpenCV, e.g. ‘rtspsrc location= ! … ! appsink’

All the materials I have found read from an external stream into the TX2 and then feed it wherever they like, but in my case I do not have an address for the camera stream, and I’m not sure how to point it to the right place on the TX2. Any suggestions would be much appreciated.
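For step 3, this is roughly the kind of pipeline string I mean. A sketch, assuming an H.264 stream and an OpenCV build with GStreamer support; the URL is a placeholder, not a real address:

```python
# Sketch of step 3: build a GStreamer pipeline string that decodes an RTSP
# H.264 stream on the CPU and hands BGR frames to appsink, so OpenCV
# (built with GStreamer support) can open it via cv2.VideoCapture.

def rtsp_appsink_pipeline(url: str) -> str:
    """Return an rtspsrc -> appsink pipeline string for OpenCV capture."""
    return (
        f"rtspsrc location={url} latency=0 "
        "! rtph264depay ! h264parse ! avdec_h264 "
        "! videoconvert ! video/x-raw,format=BGR "
        "! appsink drop=1"
    )

# Hypothetical local URL -- replace with wherever the camera stream lands.
pipeline = rtsp_appsink_pipeline("rtsp://127.0.0.1:8554/test")
print(pipeline)
# Usage with OpenCV (requires a GStreamer-enabled build):
#   cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
```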

Meanwhile, can anyone help me with reading an RTMP stream using gstreamer? e.g. a pipeline like ‘rtmpsrc location= ! … ! appsink’
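The RTMP read would look similar. A sketch, assuming the stream carries H.264 in an FLV container (which is what rtmpsrc delivers) and a placeholder URL:

```python
# Sketch of an rtmpsrc -> appsink pipeline string. RTMP carries an FLV
# container, so flvdemux is needed before parsing and decoding.

def rtmp_appsink_pipeline(url: str) -> str:
    """Return an rtmpsrc -> appsink pipeline string for OpenCV capture."""
    return (
        f"rtmpsrc location={url} "
        "! flvdemux ! h264parse ! avdec_h264 "
        "! videoconvert ! video/x-raw,format=BGR "
        "! appsink drop=1"
    )

# Hypothetical URL for illustration only.
rtmp_pipeline = rtmp_appsink_pipeline("rtmp://127.0.0.1/live/stream")
print(rtmp_pipeline)
```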


We have some posts about RTSP streaming, for your reference.


Thanks for your reply. I’ve seen these posts, but I don’t think they cover setting up an RTSP server, only the steps after that.

We usually set up an RTSP server via test-launch.c. You may check the source code for more information. Other users may also share their experience with different methods.

Please correct me if I’m wrong. From my understanding, if I do

./test-launch "videotestsrc ! omxh265enc ! rtph265pay name=pay0 pt=96"

it will generate a test stream, which makes the TX2 the source of the stream rather than a receiver?

Also I cannot do

./test-launch "rtspsrc location=<camera address> ! ..."

because the camera address is unknown?

Do you know how to open your camera via a gstreamer pipeline? You may go to http://gstreamer-devel.966125.n4.nabble.com/ to get support.

On TX2, we support nvcamerasrc for on-board cameras (Bayer sensors) and v4l2src for USB cameras (and YUV sensors). Your camera appears to be an IP camera. Please go to the gstreamer forum; users there are more experienced with gstreamer use cases and can give you better suggestions.

You can probably get a working pipeline like

$ gst-launch-1.0 rtspsrc location="$RTSP_PATH" ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink

And we can suggest a pipeline with HW acceleration:

$ gst-launch-1.0 rtspsrc location="$RTSP_PATH" ! rtph264depay ! h264parse ! omxh264dec ! nvoverlaysink

Please get a working pipeline first, and then we can give suggestions on leveraging TX2 HW functions.
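For feeding the HW-decoded frames into OpenCV rather than a display sink, the pipeline string would look something like this. A sketch, assuming the r28 element names (omxh264dec, nvvidconv) and a placeholder URL:

```python
# Sketch of a TX2 HW-accelerated capture pipeline string for OpenCV.
# omxh264dec decodes on the HW engine; nvvidconv copies the decoded
# NVMM buffer into system memory as BGRx, then videoconvert produces
# the BGR layout OpenCV expects at appsink.

def tx2_hw_appsink_pipeline(url: str) -> str:
    """Return an rtspsrc -> omxh264dec -> appsink pipeline string."""
    return (
        f"rtspsrc location={url} latency=0 "
        "! rtph264depay ! h264parse ! omxh264dec "
        "! nvvidconv ! video/x-raw,format=BGRx "
        "! videoconvert ! video/x-raw,format=BGR "
        "! appsink drop=1"
    )

# Hypothetical URL for illustration only.
hw_pipeline = tx2_hw_appsink_pipeline("rtsp://127.0.0.1:8554/test")
print(hw_pipeline)
```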

I can run the xvimagesink pipeline successfully, but not omxh264dec with nvoverlaysink.

I get the following messages:

Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (request) SETUP stream 2
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request

(gst-launch-1.0:15225): GStreamer-CRITICAL **: 16:43:49.192: gst_caps_is_empty: assertion ‘GST_IS_CAPS (caps)’ failed
(gst-launch-1.0:15225): GStreamer-CRITICAL **: 16:43:49.192: gst_caps_truncate: assertion ‘GST_IS_CAPS (caps)’ failed
(gst-launch-1.0:15225): GStreamer-CRITICAL **: 16:43:49.192: gst_caps_fixate: assertion ‘GST_IS_CAPS (caps)’ failed
(gst-launch-1.0:15225): GStreamer-CRITICAL **: 16:43:49.192: gst_caps_get_structure: assertion ‘GST_IS_CAPS (caps)’ failed
(gst-launch-1.0:15225): GStreamer-CRITICAL **: 16:43:49.192: gst_structure_get_string: assertion ‘structure != NULL’ failed
(gst-launch-1.0:15225): GStreamer-CRITICAL **: 16:43:49.193: gst_mini_object_unref: assertion ‘mini_object != NULL’ failed


Hi @cwlinghk
This topic is for the r28 release. Please start a new topic for clarity.