UDP video from ffmpeg to gstreamer

Hardware: AGX ORIN
Software: Jetpack 5.0.2

I have been attempting to send a video file locally via UDP using ffmpeg:

ffmpeg -stream_loop -1 -re -i test.ts -map 0 -c copy -preset ultrafast -f mpegts "udp://127.0.0.1:5000"

And receiving the same stream via UDP using gstreamer:

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264 ! rtph264depay ! decodebin ! videoconvert ! aasink

But I get an error on the receiving gstreamer end:

/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: Could not decode stream.
Additional debug info:
gstrtpbasedepayload.c(505): gst_rtp_base_depayload_handle_buffer (): 
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
Received invalid RTP payload, dropping
ERROR: from element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
The stream is in the wrong format.
Additional debug info:
gstrtph264depay.c(1298): gst_rtp_h264_depay_process ():
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
NAL unit type 27 not supported yet

More detailed information on the video file:

Original ID: 1002
Codec: H264 - MPEG-4 AVC (part 10) (h264)
Type: Video
Video resolution: 1920x1080
Buffer dimensions: 1920x1088
Frame rate: 30
Decoded format: 
Orientation: Top left
Chroma location: left

When I listen with gst-launch-1.0 -v udpsrc port=5000 ! fakesink dump=1, it is quite apparent that the packets from ffmpeg are being received.
I am not sure why GStreamer's rtph264depay reports that the stream is in the wrong format.

Do I have to check some details on the ffmpeg side?
This is the information ffmpeg shows by default while running:

Input #0, mpegts, from 'test.ts':
  Duration: 00:00:57.36, start: 20902.827056, bitrate: 2504 kb/s
  Program 1
    Stream #0:0[0x3ea]: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
Output #0, mpegts, to 'udp://127.0.0.1:5000':
  Metadata:
    encoder         : Lavf58.29.100
    Stream #0:0: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 29.97 fps, 29.97 tbr, 90k tbn, 90k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame=  611 fps= 30 q=-1.0 Lsize=    5847kB time=00:00:20.62 bitrate=2323.0kbits/s speed=   1x
video:5350kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 9.301388%
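
If more per-stream detail would help, I could also run ffprobe on the file and post the output; I assume something like this would be the right invocation:

ffprobe -hide_banner -select_streams v:0 -show_streams test.ts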

Any advice would be appreciated.

Hi,
Please try:

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264 ! rtph264depay ! fakesink

to check whether rtph264depay can extract the stream from the packets.
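
If it still fails, you can also raise the debug level for the depayloader to see why the payload is rejected (the category names may differ slightly between GStreamer versions):

GST_DEBUG=rtpbasedepayload:6,rtph264depay:6 gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264 ! rtph264depay ! fakesink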

This yields the same errors.

WARNING: from element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: Could not decode stream.
Additional debug info:
gstrtpbasedepayload.c(505): gst_rtp_base_depayload_handle_buffer (): /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
Received invalid RTP payload, dropping
ERROR: from element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: The stream is in the wrong format.
Additional debug info:
gstrtph264depay.c(1298): gst_rtp_h264_depay_process (): /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
NAL unit type 27 not supported yet

Hi,

I am not sure what is wrong with the sender, but I have seen that your receiving pipeline works correctly if GStreamer generates the video stream:

For example:

gst-launch-1.0 videotestsrc  ! video/x-raw,width=640,height=480 ! videoconvert ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, clock-rate=90000,payload=96   ! rtph264depay ! video/x-h264 ! queue ! h264parse ! decodebin ! videoconvert ! xvimagesink

Maybe you can compare the video stream produced by GStreamer with the stream produced by ffmpeg.
Use the -v option in the pipeline to see the caps produced by rtph264pay.
I also recommend sending test.ts using GStreamer, so you can compare it easily with the videotestsrc stream.
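
As a sketch (I have not tested it with your exact file), the GStreamer sender for test.ts could look something like this:

gst-launch-1.0 -v filesrc location=test.ts ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000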

Hi ManuelLeiva,

Thank you for your suggestions.
I ran your suggested sending pipeline with a verbose option.
The receiving pipeline was modified to use aasink, which shows the feed as animated ASCII characters directly in the terminal, since I don’t have a graphical interface available.
This worked fine.

gst-launch-1.0 -vvv videotestsrc ! video/x-raw,width=640,height=480 ! videoconvert ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, clock-rate=90000,payload=96 ! rtph264depay ! video/x-h264 ! queue ! h264parse ! avdec_h264 ! aasink

The verbose option showed that rtph264pay negotiated these caps with its preceding element (x264enc):
video/x-h264, codec_data=(buffer)0164001effe1001d6764001eacd940a03db016a0c020b4a0000003002000000791e2c5b2c001000568ebecb22c, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)high, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, chroma-site=(string)jpeg, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono

…and negotiated these caps with the downstream element (udpsink):
application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001e, sprop-parameter-sets=(string)"Z2QAHqzZQKA9sBagwCC0oAAAAwAgAAAHkeLFssA\=\,aOvssiw\=", payload=(int)96, ssrc=(uint)142441043, timestamp-offset=(uint)3735297446, seqnum-offset=(uint)8096, a-framerate=(string)30

I am not sure how I should go about comparing these GStreamer caps with what ffmpeg does.
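
One idea I have (not tried yet) is to point ffprobe at the port while ffmpeg is sending, on the assumption that it can read the UDP URL the same way ffmpeg does; that should at least tell me whether the payload is bare MPEG-TS or RTP:

ffprobe -hide_banner udp://127.0.0.1:5000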


You also suggested trying GStreamer pipelines to stream the video file in question:
gst-launch-1.0 -vvv filesrc location=test.ts ! tsdemux ! h264parse ! avdec_h264 ! queue ! videoconvert ! videoconvert ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, clock-rate=90000,payload=96 ! rtph264depay ! video/x-h264 ! queue ! h264parse ! avdec_h264 ! aasink
This worked fine.


I also tried sending the file via UDP where both sides of the pipeline are ffmpeg:

ffmpeg -re -i test.ts -map 0 -c copy -preset ultrafast -f mpegts "udp://127.0.0.1:5000"

ffmpeg -i udp://localhost:5000 -vcodec copy output.mp4
This worked fine and the resulting output.mp4 is viewable.

The idea was to check whether there is some error in the encoding of test.ts, but since you can use GStreamer as the sender with that file, it seems correct.

I tried a quick test reproducing your problem using a video test source:

ffmpeg -f lavfi -i testsrc=duration=1000:size=1920x1080:rate=30 -map 0 -c copy -f mpegts "udp://127.0.0.1:5000"

I was checking the GST_DEBUG=3 logs, but they don’t provide more information.

Hi,
Please try uridecodebin and see if it can be run successfully:

$ gst-launch-1.0 uridecodebin uri="udp://127.0.0.1:5000" ! fakesink

If it runs well, you can set this environment variable:

$ export GST_DEBUG=*FACTORY*:4

to see which elements are picked by uridecodebin. This should give some information.

Hi DaneLLL,

Your suggestion did it for me.

I started the sending pipeline in a terminal:
ffmpeg -stream_loop -1 -re -i test.ts -map 0 -c copy -preset ultrafast -f mpegts "udp://127.0.0.1:5000"

The rest of the commands are run in a second terminal.
I set the environment variable to get the desired debug information:
export GST_DEBUG=*FACTORY*:4

I then ran the following command to let GStreamer choose which elements to use to receive the stream:
gst-launch-1.0 uridecodebin uri="udp://127.0.0.1:5000" ! fakesink

This is the debug output I got:

0:00:00.022519225   199 0xaaab17afd470 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "uridecodebin"
0:00:00.023246370   199 0xaaab17afd470 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "fakesink"
0:00:00.023386884   199 0xaaab17afd470 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "pipeline"
Setting pipeline to PAUSED ...
0:00:00.026723213   199 0xaaab17afd470 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:360:gst_element_factory_create: creating element "udpsrc" named "source"
0:00:00.027014513   199 0xaaab17afd470 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "decodebin"
0:00:00.027131954   199 0xaaab17afd470 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:360:gst_element_factory_create: creating element "typefind" named "typefind"
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.070875475   199 0xaaab17b10180 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "tsdemux"
0:00:00.071591708   199 0xaaab17b10180 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "multiqueue"
0:00:00.072655209   199 0xaaab17b10180 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "h264parse"
0:00:00.072980302   199 0xaaab17b10180 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "capsfilter"
No EGL Display
nvbufsurftransform: Could not get EGL display connection
0:00:00.257979470   199 0xaaab17b105e0 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:363:gst_element_factory_create: creating element "nvv4l2decoder"

This indicated that uridecodebin was resolving to a pipeline that looks something like this:
gst-launch-1.0 udpsrc port=5000 ! tsdemux ! multiqueue ! h264parse ! nvv4l2decoder ! fakesink
It was interesting to see that nowhere did it choose to use the plugin rtph264depay, which I had been trying to use earlier.

I then modified the pipeline to better suit my needs:
gst-launch-1.0 udpsrc uri="udp://127.0.0.1:5000" ! tsdemux ! queue ! h264parse ! avdec_h264 ! videoconvert ! aasink
This gave a promising ASCII output in the terminal.

There was still some stutter in the video, and I remembered from earlier experimentation that setting a clock rate in the caps could help with this:
gst-launch-1.0 udpsrc uri="udp://127.0.0.1:5000" ! video/mpegts, systemstream=true, clock-rate=90000 ! tsdemux ! queue ! h264parse ! avdec_h264 ! videoconvert ! aasink
And voilà, this pipeline receives the ffmpeg stream perfectly! :)
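
Since the earlier debug log showed nvv4l2decoder being selected, I assume a hardware-decoded variant along these lines would also work on the Orin (untested sketch; nvvidconv is there to copy the decoded frames out of NVMM memory before videoconvert/aasink):

gst-launch-1.0 udpsrc port=5000 ! video/mpegts, systemstream=true ! tsdemux ! queue ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw ! videoconvert ! aasink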

My problem all along was that I had been insisting on using rtph264depay right after udpsrc, even though ffmpeg with -f mpegts sends bare MPEG-TS packets over UDP rather than RTP, so there was never any RTP payload to depacketize.
Thank you for your help. I had been stuck on this for quite a while.
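
As a footnote, for anyone who actually wants to keep rtph264depay on the receiving side: as far as I understand, ffmpeg would then have to emit real RTP instead of bare MPEG-TS. Something along these lines should be close (a sketch I have not verified here; ffmpeg's RTP muxer takes a single stream, hence the -map, and it prints an SDP whose values the receiver caps below only approximate):

ffmpeg -stream_loop -1 -re -i test.ts -map 0:v:0 -c copy -f rtp "rtp://127.0.0.1:5000"

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! aasink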
