Receive RTP (Pipeline demo)

Setup:
• Hardware Platform: Jetson Xavier NX
• DeepStream Version: 6.1
• JetPack Version: 5.0.1 DP
• TensorRT Version: 8.4.0.11
• Issue Type: Question

Hi folks,
I work with DeepStream and the Python bindings. I have a pipeline that captures a UDP H264 stream and passes it to YOLO object detection and tracking; everything works great.
Now I need to handle something new, so before integrating my network I tried building a simple pipeline on the command line.
Someone sends me an RTP stream using this pipeline (sender):

gst-launch-1.0 v4l2src device="/dev/video" ! jpegdec ! omxh264enc ! mpegtsmux ! rtpmp2tpay ! udpsink host=234.0.0.0 port=46002

I built a simple receive pipeline:

gst-launch-1.0 udpsrc port=46002 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33" ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvideoconvert ! autovideosink

Unfortunately I don't see video on the screen. I think something is wrong with the rtpmp2tdepay element, or maybe I am missing something; I would be happy for any help.
I have attached a graph view of the pipeline.


In addition, I have attached a graph of this pipeline:

gst-launch-1.0 udpsrc port=46002 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33" ! rtpmp2tdepay ! fakesink

You may try adding rtpjitterbuffer (latency is in ms):

gst-launch-1.0 udpsrc port=46002 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=MP2T, payload=33 ! rtpjitterbuffer latency=1000 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvideoconvert ! autovideosink

Also note that the omx plugins are deprecated. For the sender, you may try this instead:

gst-launch-1.0 v4l2src device=/dev/video ! nvjpegdec ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 insert-vui=1 ! mpegtsmux ! rtpmp2tpay ! udpsink host=234.0.0.0 port=46002

Hi, thank you for your reply. Unfortunately, I added the element and still have the same problem; I have attached an image of the graph.
Maybe you have another idea?

I have attached a Wireshark packet capture for more details.

The following two pipelines can work:

gst-launch-1.0 udpsrc uri=udp://localhost:5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33" ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvideoconvert ! nv3dsink
gst-launch-1.0 v4l2src device="/dev/video0" ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! jpegdec ! nvvideoconvert ! nvv4l2h264enc ! mpegtsmux ! rtpmp2tpay ! udpsink

The key is the receiver pipeline should run first.
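If you want to verify that the UDP packets actually arrive before debugging the pipeline itself, a quick check may help (a sketch only; eth0 is an assumed interface name, replace with yours):

sudo tcpdump -i eth0 -n udp port 46002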

Hi,
I tried using your pipeline:

gst-launch-1.0 udpsrc uri=udp://234.0.0.0:46002 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33" ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvideoconvert ! autovideosink

Unfortunately, I still have the same problem: I don't see the video on the screen. Note that I can't change the sender pipeline (maybe I should have mentioned this before).

I tried replaying a PCAP file that I transmit, and with your pipeline I see some progress, but now I get this error:

gstrtpjitterbuffer.c(3286): gst_rtp_jitter_buffer_chain (): /GstPipeline:pipeline0/GstRtpJitterBuffer:rtpjitterbuffer0:
Received invalid RTP payload, dropping

Just checked now with this sender pipeline on localhost:

gst-launch-1.0 videotestsrc ! jpegenc ! queue ! nvv4l2decoder mjpeg=1 ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! mpegtsmux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=46002

and then, on the same Jetson, received and displayed it with:

gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! autovideosink

Can you try this? (Note that I don't have DeepStream installed on that Jetson.)

The pipelines you posted can work.

The key is the receiver pipeline should run first.
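For illustration, with the localhost test pipelines quoted above, the startup order would be:

# Terminal 1 on the Jetson (receiver): start this first
gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! autovideosink

# Terminal 2 (sender): start this second
gst-launch-1.0 videotestsrc ! jpegenc ! queue ! nvv4l2decoder mjpeg=1 ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! mpegtsmux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=46002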

Re-reading that now, maybe the issue is that you are trying to decode RTP/H264 as RTP/MP2T.
RTP streaming with jetson-utils would probably use RTP/H264 by default.
For RTP/H264, you would use this as the receiver:

# Receiving on Jetson (sender sends to 127.0.0.1)
gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,encoding-name=H264 ! rtpjitterbuffer latency=100 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! autovideosink

# Or
gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,encoding-name=H264 ! rtpjitterbuffer latency=100 ! rtph264depay ! h264parse ! video/x-h264,width=<width>,height=<height>,framerate=<fps>/1 ! nvv4l2decoder ! nvvidconv ! autovideosink

For jetson-utils sending RTP/MP2T, which would be convenient for receiving with VLC or FFMPEG without writing an SDP file, there is an undocumented option. See the commit “added support for rtpmp2ts protocol” (dusty-nv/jetson-utils@9d37024 on GitHub).
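As an illustration of that convenience: payload type 33 is statically assigned to MP2T, so VLC can usually open such a stream directly without an SDP file (address and port as used in this thread):

vlc rtp://@234.0.0.0:46002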

I tried your pipeline and see nothing; the sender pipeline is MPEG:

gst-launch-1.0 v4l2src device="/dev/video" ! jpegdec ! omxh264enc ! mpegtsmux ! rtpmp2tpay ! udpsink host=234.0.0.0 port=46002

Unfortunately I can't change this, but I can tell you that I have a “black box” player that calls mpv and succeeds in playing the video. However, I don't know what pipeline this player uses.

I changed the udpsink to 234.0.0.0 and your example works on my Jetson.

I tried running the command line with the debug flag on my stream and I get:

GST_DEBUG=3 gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder enable-max-performance=1 ! nvvidconv ! autovideosink
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
0:00:00.251617216 86324 0xaaaab51d74d0 WARN                    v4l2 gstv4l2object.c:4477:gst_v4l2_object_probe_caps:<nvv4l2decoder0:src> Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: Unknown error -1
0:00:00.251738368 86324 0xaaaab51d74d0 WARN                    v4l2 gstv4l2object.c:2394:gst_v4l2_object_add_interlace_mode:0xaaaab5193960 Failed to determine interlace mode
0:00:00.251848160 86324 0xaaaab51d74d0 WARN                    v4l2 gstv4l2object.c:2394:gst_v4l2_object_add_interlace_mode:0xaaaab5193960 Failed to determine interlace mode
0:00:00.251925280 86324 0xaaaab51d74d0 WARN                    v4l2 gstv4l2object.c:2394:gst_v4l2_object_add_interlace_mode:0xaaaab5193960 Failed to determine interlace mode
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

So I’d suggest that you try this as sender:

# Note that /dev/video is an old convention where a symbolic link pointed to the default camera video node.
# You may also try a direct video node path such as /dev/video0.
gst-launch-1.0 v4l2src device=/dev/video ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 idrinterval=15 ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=234.0.0.0 port=46002

# Or:
gst-launch-1.0 v4l2src device=/dev/video ! jpegparse ! nvjpegdec ! queue ! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 idrinterval=15 ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=234.0.0.0 port=46002

# Or, if the sender is not a Jetson and has to use omxh264enc, you may try adding h264parse after the encoder:
gst-launch-1.0 v4l2src device=/dev/video ! jpegparse ! jpegdec ! omxh264enc insert-sps-pps=1 insert-vui=1 ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=234.0.0.0 port=46002

with that Jetson receiver:

gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! autovideosink

# Or simpler:
gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! decodebin ! nvvidconv ! autovideosink

# Or general CPU-based H264 decoding (for remote host try increasing latency):
gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

Also note that you may stream JPEG from the sender with RTP/JPEG over UDP:

gst-launch-1.0 v4l2src device=/dev/video0 ! jpegparse ! identity ! rtpjpegpay ! udpsink host=127.0.0.1 port=46004

You would then receive, decode the JPEG and display on the receiver with:

gst-launch-1.0 udpsrc port=46004 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG,payload=26 ! rtpjitterbuffer latency=100 ! rtpjpegdepay ! jpegparse ! jpegdec ! videoconvert ! autovideosink

You could also H264-encode on the receiver side, as sketched below.
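A minimal sketch of that idea, re-encoding the received JPEG stream to H264 on the receiver (the muxer and output file name are assumptions):

gst-launch-1.0 udpsrc port=46004 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG,payload=26 ! rtpjitterbuffer latency=100 ! rtpjpegdepay ! jpegdec ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! matroskamux ! filesink location=reencoded.mkv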

It would help if you described your use case in more detail, for more accurate advice.

Hi, thank you for offering this solution, but again, I can't change the sender side, only the receiver side.
What element can I add on the receiver side to solve my problem? Maybe there is a difference because the sender side uses the omx encoder?

I think maybe there is some difference at the UDP level. I investigated the Wireshark packets and see only one difference, in the UDP field that Wireshark calls the stream index: when I open the video, the stream index equals 0; when I play the pcap file, the stream index equals 2. Maybe this is the problem?


Did you get the warnings with all 3 proposed receiver pipelines?

Note that these are just warnings in your logs; the receiver pipeline did start. Maybe it failed to get the correct video info because the sender doesn't send it properly; you may specify the video format in the receiver pipeline caps, as in the sketch below.
If the receiver doesn't start displaying, you may have to wait for the sender to send an IDR frame.
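For example, forcing the H264 caps after the parser (a sketch only; 1920x1080 at 30 fps is an assumption, use your stream's actual values):

gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! tsdemux ! h264parse ! 'video/x-h264,width=1920,height=1080,framerate=30/1' ! nvv4l2decoder ! nvvidconv ! autovideosink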

When I try your second example:

gst-launch-1.0 udpsrc port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! decodebin ! nvvidconv ! autovideosink

I did not get the warning message, but still can't see the video on the screen. In addition, I used jtop and looked at the HW engines field for NVDEC, and did not see any activity (the DEC field did not turn green).

When you write "specify video format", what do you mean? In general I already specify application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33.

Maybe there is a problem in my experiment: I am playing a pcap file; maybe this influences the UDP packets?

Hi folks,
I have made big progress (I think) and I am trying to understand what is missing.
I succeeded in transmitting the video source and showing the video on the screen. I need to mention something important that I did not say before: the video I am trying to receive comes from a pcap file, so I play the pcap file with Colasoft Packet Player from my Windows machine and try to capture the multicast on the Jetson (they are connected through a hub).
I made a new experiment: I saved the video source to a pcap file (the video that I succeeded in displaying on the Jetson), then I played it back, and surprisingly I did not succeed in playing the video on the Jetson.
I think maybe there is a problem with timing or something similar, because when I try to open the RTP stream from the pcap file, I fail. I would be happy for any advice on how to tackle this problem.

Here is what I tried:

  1. On another computer than the Jetson, on the same LAN, I simulated a camera feed streaming to unicast-prefix-based multicast. I used address 234.0.0.1 (234.0.0.0 didn't work in my case):
gst-launch-1.0 videotestsrc ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 idrinterval=15 ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=234.0.0.1 port=46002
  2. Then I captured (you can do this either on the sender or the Jetson receiver):
sudo tcpdump -w test.pcap -i eth0

I let it run for 30 s and stopped it with Ctrl-C. At this point I also stopped the sender.

  3. On the Jetson, play the capture with:
gst-launch-1.0 filesrc location=test.pcap ! pcapparse dst-ip=234.0.0.1 dst-port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! autovideosink -v

Or you can restream with:

gst-launch-1.0 filesrc location=test.pcap ! pcapparse dst-ip=234.0.0.1 dst-port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! udpsink host=234.0.0.1 port=46002

And receive with:

gst-launch-1.0 udpsrc address=234.0.0.1 port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=100 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! autovideosink -v

Hope this helps. If not working, post the output from:

sudo tcpdump -r test.pcap

I don't really understand what you do differently from my experiment. I play a pcap file from a Windows computer and try to capture the video on the Jetson Xavier; all the devices are connected to a hub, and the video stream from the pcap file goes to 234.0.0.0:46002.

Some additional information:

Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33
/GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:src: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true
/GstPipeline:pipeline0/GstTSDemux:tsdemux0.GstPad:sink: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true
/GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)320, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)constrained-baseline, level=(string)2.1
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
/GstPipeline:pipeline0/nvv4l2decoder:nvv4l2decoder0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)320, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)constrained-baseline, level=(string)2.1
/GstPipeline:pipeline0/nvv4l2decoder:nvv4l2decoder0.GstPad:src: caps = video/x-raw(memory:NVMM), format=(string)NV12, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/Gstnvvideoconvert:nvvideoconvert0.GstPad:src: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:src: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/Gstnvvideoconvert:nvvideoconvert0.GstPad:sink: caps = video/x-raw(memory:NVMM), format=(string)NV12, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:src: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, format=(string)YUY2, block-linear=(boolean)false
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, format=(string)YUY2, block-linear=(boolean)false
WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(3003): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage:
There may be a timestamping problem, or this computer is too slow.
(the same warning repeats many more times)

You may try increasing the latency to 1000 ms (the default is 200) and adding a queue after the decoder (also adjust your multicast address):

sudo jetson_clocks

gst-launch-1.0 udpsrc address=234.0.0.1 port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=1000 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! autovideosink -v

# Or
gst-launch-1.0 udpsrc address=234.0.0.1 port=46002 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33 ! rtpjitterbuffer latency=1000 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! nv3dsink -v

Why do I need to add so much delay to the pipeline? That is not acceptable in my system; to underline this, my system needs to operate in real time. In any case, I tried adding your elements and it is still not working.

I will try to describe my problem again. I play a PCAP file from Windows; the PCAP file contains an RTP stream to multicast UDP 234.0.0.0:46002. I do not succeed in seeing the video on screen, and when I replace autovideosink with fakesink and watch jtop, I do not see any activity on NVDEC.
Now, when I play exactly the same video (videotestsrc), but from the Jetson through udpsink rather than from the PCAP, I do succeed in capturing and displaying the video.
During the experiment, another Windows computer connected to the hub succeeded in capturing the video (both from the PCAP and live) and displayed it using the mpv player.
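If the difference really is the replay timing, a replay tool that preserves the captured inter-packet timing might behave differently from Colasoft. As an illustration only (tcpreplay is a separate tool not discussed above, and eth0 is an assumed interface name), a Linux machine could replay the capture at its recorded rate with:

sudo tcpreplay --intf1=eth0 test.pcap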

I added GST_DEBUG=5; maybe you will see something wrong:

0:01:33.712617856  4265 0xaaaaab34a000 DEBUG                GST_BUS gstbus.c:379:gst_bus_post:<bus1> [msg 0xaaaaab736ca0] dropped
0:01:33.712647136  4265 0xaaaaab34a000 INFO        GST_ELEMENT_PADS gstpad.c:2187:gst_pad_unlink: unlinked udpsrc0:src and rtpmp2tdepay0:sink
0:01:33.712708416  4265 0xaaaaab34a000 DEBUG                GST_BUS gstbus.c:340:gst_bus_post:<bus1> [msg 0xaaaaab6f89c0] posting on bus structure-change message: 0xaaaaab6f89c0, time 99:99:99.999999999, seq-num 107, element 'sink', GstMessageStructureChange, type=(GstStructureChangeType)GST_STRUCTURE_CHANGE_TYPE_PAD_UNLINK, owner=(GstElement)"\(GstUDPSrc\)\ udpsrc0", busy=(boolean)false;