[RTSP Stream][Jetson-Inference] Cannot stream from one device to another Jetson NX board

Hi,

I have a problem where an RTSP stream served with jetson-inference's video-viewer is not displayed on the client.

Environment:

  • Server: Jetson Orin NX (JetPack 6.2, L4T 36.4.4, TensorRT 10.3)

  • Client: Jetson NX (Ubuntu 20.04)

  • Jetson Inference: Latest build

  • Camera: USB camera supporting H264/MJPEG/YUYV (1280x720@30fps)

Problem Description:

I’m trying to stream video from one Jetson to another using RTSP with jetson-inference’s video-viewer, but the client doesn’t display the video stream properly.

Server command:

```bash
./video-viewer /dev/video2 rtsp://@:8554/mystream
```

Client command:

```bash
ffplay rtsp://192.168.1.180:8554/mystream
```

Issue:

  • The RTSP server starts successfully and captures frames (confirmed by server output showing “captured X frames (1280x720)”)

  • Port 8554 is listening (confirmed with netstat -tuln | grep 8554)

  • RTSP server responds to requests (curl -I rtsp://localhost:8554/mystream returns RTSP/1.0 200 OK)

  • Client connects successfully and receives stream metadata (SDP negotiation successful, H264 codec detected)

  • BUT: no video is displayed on the client; the window only shows `nan M-V: nan fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0`

  • Strangely, when I stop the server (Ctrl+C), the client window briefly appears with a single frame before closing

Server Warnings (with GST_DEBUG=3):

```
[gstreamer] GST_LEVEL_WARNING GstAppSrc basesrc
              streaming stopped, reason not-linked (-1)
[gstreamer] GST_LEVEL_WARNING rtspmedia
              got error Internal data stream error
[gstreamer] GST_LEVEL_WARNING GstAppSink basesink
              Pipeline construction is invalid, please add queues
```
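The `not-linked` warning suggests some branch of the server pipeline has no downstream consumer. As a hedged sanity check (the device path and MJPEG caps are assumptions based on the camera described above), the capture-and-encode path can be exercised locally, independent of RTSP:

```shell
# Sketch: verify capture + hardware H.264 encode locally, with no RTSP involved.
# /dev/video2 and the 1280x720 MJPEG caps are assumptions from the setup above.
gst-launch-1.0 -v v4l2src device=/dev/video2 num-buffers=100 \
  ! image/jpeg,width=1280,height=720,framerate=30/1 \
  ! jpegdec ! nvvidconv \
  ! 'video/x-raw(memory:NVMM),format=NV12' \
  ! nvv4l2h264enc ! h264parse ! fakesink
```

If this runs to EOS without errors, the capture/encode side is fine and the problem is in how video-viewer wires the RTSP output.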

What I’ve Tried:

  1. Different ffplay options: -fflags nobuffer -flags low_delay -framedrop

  2. Different transport protocols: -rtsp_transport tcp/udp

  3. Various video-viewer options: --output-codec=h264 --bitrate=4000000

  4. Different input codecs: --input-codec=mjpeg

  5. VLC and gst-launch-1.0 as clients (same issue)
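For reference, a minimal gst-launch-1.0 client of the kind tried in item 5 might look like this (the URL comes from the commands above; the latency value is an arbitrary assumption):

```shell
# Sketch: explicit GStreamer RTSP client; URL from the thread, latency is a guess.
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.180:8554/mystream latency=200 \
  ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```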

Client Debug Output:

```
[h264 @ 0xffff5c003ea0] Format yuv420p chosen by get_format().
[h264 @ 0xffff5c003ea0] Reinit context to 1280x720, pix_fmt: yuv420p
Input #0, rtsp, from 'rtsp://192.168.1.180:8554/mystream':
  Metadata:
    title           : Session streamed with GStreamer
    comment         : rtsp-server
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (Constrained Baseline), yuv420p, 1280x720, 30 tbr
```

Questions:

  1. How can I fix this issue?

  2. Is there a working example of RTSP streaming between two Jetson devices using jetson-inference tools?

Any help or guidance would be greatly appreciated.

Thank you very much!

Hi,
Please try
Jetson AGX Orin FAQ

Please set up the RTSP server through test-launch.

Hi,

Thank you for your suggestion. It can now stream with the following commands on my devices:

```bash
# server
./test-launch "v4l2src device=/dev/video2 do-timestamp=true ! video/x-raw,width=2560,height=1440,framerate=30/1 ! nvvidconv ! nvv4l2h264enc bitrate=2000000 ! h264parse ! rtph264pay name=pay0 pt=96 config-interval=1"

# client
ffplay rtsp://192.168.1.180:8554/test
```

As you can see, the stream runs at 2560x1440@30 FPS, but when I try 4K@30 FPS, the client shows this error:

```
[rtsp @ 0xffff6c000ba0] method DESCRIBE failed: 503 Service Unavailable
rtsp://192.168.1.180:8554/test: Server returned 5XX Server Error reply
```
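As a side note before tuning anything (a heuristic sketch, not something from this thread, and assuming 4K means 3840x2160): scaling the 2 Mbps that works at 2560x1440 linearly with pixel count gives a rough floor for a 4K bitrate:

```python
# Heuristic: scale an H.264 bitrate linearly with pixel count.
# The resolutions and the 2 Mbps base come from the pipeline above;
# linear scaling itself is only a rule of thumb, not a guarantee.
def scale_bitrate(base_bps: int, base_w: int, base_h: int,
                  target_w: int, target_h: int) -> int:
    return int(base_bps * (target_w * target_h) / (base_w * base_h))

print(scale_bitrate(2_000_000, 2560, 1440, 3840, 2160))  # 4500000
```

So at 3840x2160 the encoder would likely need roughly 4.5 Mbps or more to keep comparable quality.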

Is there any way to optimize the pipeline to stream at 4K@30 FPS?
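A 503 at DESCRIBE often means the server pipeline failed to start, e.g. because the USB camera cannot deliver raw 4K@30 FPS. One hedged sketch (the MJPEG caps, element properties, and 8 Mbps bitrate are all untested assumptions about this camera, not a verified pipeline) is to pull MJPEG from the camera and decode it in hardware before re-encoding:

```shell
# Sketch: request MJPEG at 4K (assuming 3840x2160) instead of raw video,
# decode on the hardware JPEG engine, then re-encode to H.264.
# All caps, bitrates, and properties here are untested assumptions.
./test-launch "v4l2src device=/dev/video2 do-timestamp=true \
  ! image/jpeg,width=3840,height=2160,framerate=30/1 \
  ! nvv4l2decoder mjpeg=1 ! nvvidconv \
  ! nvv4l2h264enc bitrate=8000000 insert-sps-pps=true \
  ! h264parse ! rtph264pay name=pay0 pt=96 config-interval=1"
```

Whether 30 FPS is sustainable at 4K also depends on the camera's own mode list (`v4l2-ctl --list-formats-ext`) and the USB bandwidth available.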

Thank you