I am trying to stream video to an AGX Orin, and every streaming approach I have tried has resulted in either no video output from the receiving pipeline or a solid green frame. To start, I am just trying to stream to localhost on the AGX itself.
My streaming pipeline is
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter ! videoconvert ! x264enc ! rtph264pay ! udpsink host=localhost port=5000
If I replace everything from x264enc onward with autovideosink, it displays the camera feed just fine.
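For reference, that working preview pipeline is roughly:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter ! videoconvert ! autovideosink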
My receiving pipeline is
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
I have tried removing decodebin and videoconvert, and I have tried several variations of the caps.
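For example, one of those variations (dropping videoconvert and letting decodebin negotiate directly with the sink) was roughly:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! autovideosink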
Using videotestsrc instead of the USB camera source does not change anything.
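The videotestsrc variant of the sender was along the lines of:
gst-launch-1.0 -v videotestsrc ! videoconvert ! x264enc ! rtph264pay ! udpsink host=localhost port=5000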
Output from the sending pipeline:
$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter ! videoconvert ! x264enc ! rtph264pay ! udpsink host=localhost port=5000
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)10/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)10/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)10/1, format=(string)Y42B, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
Redistribute latency...
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)10/1, format=(string)Y42B, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)10/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)10/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, codec_data=(buffer)017a001fffe1001c677a001fbcd9405005bb016a02020280000003008000000a478c18cb01000568ebecb22c, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high-4:2:2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, sprop-parameter-sets=(string)"Z3oAH7zZQFAFuwFqAgICgAAAAwCAAAAKR4wYyw\=\=\,aOvssiw\=", profile-level-id=(string)7a001f, profile=(string)high-4:2:2, payload=(int)96, ssrc=(uint)4189946296, timestamp-offset=(uint)1681988701, seqnum-offset=(uint)11685, a-framerate=(string)10
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, sprop-parameter-sets=(string)"Z3oAH7zZQFAFuwFqAgICgAAAAwCAAAAKR4wYyw\=\=\,aOvssiw\=", profile-level-id=(string)7a001f, profile=(string)high-4:2:2, payload=(int)96, ssrc=(uint)4189946296, timestamp-offset=(uint)1681988701, seqnum-offset=(uint)11685, a-framerate=(string)10
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, codec_data=(buffer)017a001fffe1001c677a001fbcd9405005bb016a02020280000003008000000a478c18cb01000568ebecb22c, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high-4:2:2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 1682060928
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 11685
Should I see new timestamps / seqnums in this output to indicate that packets are being sent? The output does not change when I start the receiving pipeline.
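In case it helps, is there a simple way to confirm that RTP packets are actually leaving the sender? I assume something like tcpdump on the loopback interface would show them if they were:
sudo tcpdump -i lo -n udp port 5000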