Stream Bayer Data with GStreamer

Hi,
You may try modifying the caps with different combinations:

'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12'

The RAW data should be 8-bit, so depth=(string)10 looks wrong. You may remove it and try again.

You may also try this and see if the received RTP stream can be saved:

gst-launch-1.0 -v udpsrc uri=udp://<client IP>:5000 ! application/x-rtp ! filesink location=sample.raw
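
If nothing gets written to the file, a quick extra check (just a sketch, reusing the placeholder <client IP> and port from the command above) is to replace filesink with fakesink dump=true, which prints a hex dump of every received buffer, so you can see immediately whether any UDP packets arrive at all:

gst-launch-1.0 -v udpsrc uri=udp://<client IP>:5000 ! fakesink dump=true

If nothing is printed here, the packets are likely not reaching the receiver at all, and the problem is probably in the network rather than in the RTP caps.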

@DaneLLL Thanks for the reply

You may try modifying the caps with different combinations

We have modified the caps as per your suggestion, but the result is the same: we can’t dump the data on the receiver side.

The RAW data should be 8-bit, so depth=(string)10 looks wrong. You may remove it and try again.

We have also changed the bit depth as per your suggestion; no change in the result.
Actually, our sensor outputs 10-bit data, which is why we had given a 10-bit depth in the pipeline:

[0]: 'RG10' (10-bit Bayer RGRG/GBGB)
		Size: Discrete 3280x2464
			Interval: Discrete 0.048s (21.000 fps)
		Size: Discrete 3280x1848
			Interval: Discrete 0.036s (28.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1640x1232
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.017s (60.000 fps)

We have checked with the Wireshark tool on the respective LAN node that the data from the server pipeline arrives as RTP packets, so the issue is mainly on the receiver side. One more point: when we use the same pipeline on the receiver side as a loopback, it works fine and we are able to dump the data locally.

Also sharing some logs from the receiver side:

gst-launch-1.0 -v udpsrc uri=udp://<client_IP>:5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)8,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
# UDP streaming on the receiver-side Jetson itself as loopback

gst-launch-1.0 -v udpsrc uri=udp://localhost:5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:32.569931329
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Could you please let us know whether the root cause is something network-related, such as the port we are using while streaming externally from the Jetson to the laptop over UDP, rather than the pipeline we have tried?
Any help would be appreciated

Hi,
We can run the commands:
[Server]

$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=640, height=480, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=10.19.107.92 port=5001

[Client]

$ gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
$ DISPLAY=:0 gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! videoconvert ! xvimagesink sync=0

Please try a smaller resolution and a different port number in your setup.

Argus output is the frame data after the ISP engine, so it is YUV420 8-bit.
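
If you want to double-check that, one quick sketch (assuming a 1280x720 capture; any of the listed sensor modes works the same way) is to dump a single frame to a file and verify its size, which should be width x height x 1.5 bytes for 8-bit YUV420 (I420 or NV12), e.g. 1280 x 720 x 1.5 = 1382400 bytes:

# Dump one frame from the camera and check the file size
gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! 'video/x-raw(memory:NVMM),width=1280, height=720, format=NV12' ! nvvidconv ! video/x-raw ! filesink location=frame.yuv
ls -l frame.yuv

If the size matches, the data leaving nvvidconv is indeed 8-bit YUV420; the 10-bit Bayer depth only exists before the ISP.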

@DaneLLL Thanks for the reply

Argus output is the frame data after the ISP engine, so it is YUV420 8-bit.

Noted thanks

Please try a smaller resolution and a different port number in your setup.

We have tried reducing the resolution as well as changing the port in these pipelines:

# Server
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=60/1, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=192.168.1.5 port=5001
# Client
gst-launch-1.0 udpsrc uri=udp://192.168.1.5:5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)1280, height=(string)720" ! rtpvrawdepay ! filesink location=sample.raw

But we failed to dump the captured raw data even after changing the port and resolution.
We have tried a USB camera as well with 640x480 resolution, but the result is the same.
Could you please help us sort out the issue further?
Can you try the same pipelines we used for UDP streaming, with the Jetson as server and a Linux PC as client, to verify whether the issue is in the pipeline?

$ gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
$ DISPLAY=:0 gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! videoconvert ! xvimagesink sync=0

We are giving the full URI like this, udpsrc uri=udp://192.168.1.5:5001, instead of just the port.
Does that make any difference?

Hi,
Please try this and see if h264 streaming works in your setup:
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL

We can run both RAW data streaming and h264 streaming successfully. Please clarify whether it fails in RAW data streaming only, or in both use cases.

@DaneLLL We have used the below pipelines for H264 UDP streaming:

# Server
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=60/1, format=NV12' ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! rtph264pay ! udpsink host=192.168.1.5 port=5000 sync=0
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 4 
   Output Stream W = 1280 H = 720 
   seconds to Run    = 0 
   Frame Rate = 59.999999 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
H264: Profile = 66, Level = 0 

#Client 
gst-launch-1.0 udpsrc uri=udp://192.168.1.5:5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

The result is the same: we can’t capture either H264 or RAW data on the receiver side over UDP.
Could you please help us identify the root cause of the issue?

Hi,
Please try the command:

$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264

If it still does not work, it looks to be an issue in the network. Maybe UDP is blocked due to certain firewall settings. You can check this part.
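
One way to check that (a sketch, assuming tcpdump is available on the client PC; adjust the port to whatever your pipeline uses) is to watch for incoming UDP packets on the client while the Jetson is streaming:

# On the client PC, while the sender pipeline is running on the Jetson
sudo tcpdump -i any -n udp port 5000 -c 10

If packets show up here but gst-launch-1.0 still writes nothing, the issue is more likely in the caps/pipeline; if nothing shows up, it points to the network or a firewall.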

@DaneLLL Thanks for the input

Please try the command:
$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264

We have tried the above pipeline too on the receiver side, but the result is the same: we failed to capture H264 data.

# client
gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

We are giving the full URI like this, udpsrc uri=udp://192.168.1.5:5000, instead of just the port.
Will that make any difference in the pipeline?

If it still does not work, it looks to be an issue in the network. Maybe UDP is blocked due to certain firewall settings. You can check this part.

We have verified with Wireshark that the data is coming from the server side (Jetson) as RTP packets.

Could you please let us know whether the issue is on the server (Jetson) or the client (PC) side, and also how to unblock the UDP port from the software side?

Hi,
We have tried server (Jetson) with client (PC running Ubuntu 18.04 or 20.04) or client (Jetson). All commands work well. Maybe you can try client (Jetson). If client (Jetson) works, the issue may be in your client (PC). If client (Jetson) does not work, it is still likely that certain network settings trigger the failure.

You may try:

  • Be sure that no firewall blocks UDP/5000.
  • Add rtpjitterbuffer. Note that its latency property is set to 2000 ms by default; you may decrease it once it works.
  • Disable the udpsink property auto-multicast in the sender, especially if using WiFi.
  • Store the stream into a file container so that decoding is simpler:

Sender:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=60/1, format=NV12' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 ! h264parse ! rtph264pay ! queue ! udpsink host=192.168.1.5 port=5000 auto-multicast=0 sync=0

Receiver:

# client
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264 ! rtpjitterbuffer ! queue ! rtph264depay ! h264parse ! matroskamux ! filesink location=test.mkv -e

# Stop after 10 s with Ctrl-C (once then wait for completion)

# Play back
gst-play-1.0 test.mkv

That being said, this is getting far from your original topic.
You may want to better explain your case.

  • There are very few use cases that would require Bayer data; the only ones I can see are monochrome cases, or when you want to exclude Argus debayering or auto-tuning from the loop. For such a case, you have all the answers in my first answer (edited many times for clearness and completeness). You would try that on a fresh JP5 install, without any attempt to patch v4l2src nor build gstreamer nor any plugin.
  • Streaming debayered raw video may be an option if you need lossless quality and have the required bandwidth available. This might need some buffer-size adjustments for UDP (see the sketch after this list).
  • Streaming compressed video such as H264 needs less bandwidth, but may lose quality. Tuning the encoding parameters may help preserve quality.
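
As a rough sketch of the buffer-size adjustment mentioned in the second point (the buffer value, the 192.168.1.5 address and the 720p mode are only examples from this thread, not validated settings), you could enlarge the kernel socket buffers on both ends when streaming raw video:

# Sender (Jetson): raw video over RTP with a larger kernel send buffer
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=60/1, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=192.168.1.5 port=5001 buffer-size=33554432 auto-multicast=0 sync=0

# Receiver (PC): matching caps with a larger kernel receive buffer
gst-launch-1.0 udpsrc port=5001 buffer-size=33554432 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)1280, height=(string)720" ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw

You may also need to raise the kernel limits (net.core.rmem_max / net.core.wmem_max via sysctl) for such socket buffer sizes to actually take effect.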

@DaneLLL We have tried UDP streaming with the Jetson as client and the PC as server; it works well with both H264 and RAW data. Can you provide a solution for streaming RAW data over UDP with the Jetson as server and the PC as client?

@Honey_Patouceul Thanks for the info

Be sure that no firewall blocks UDP/5000.

Can you explain how to check whether UDP/5000 is blocked, and also how to unblock it in software? We are able to stream from the PC as server to the Jetson as client.

Disable the udpsink property auto-multicast in the sender, especially if using WiFi.

What difference does it make when the auto-multicast=true property is enabled on the pipeline?

You would have the receiver to listen on UDP/5000 with:

nc -l -u -p 5000

that would output received bytes to stdout.

Then from sender try:

while true; do echo 'If you see this on receiver, then no firewall blocks UDP/5000'; sleep 1; done | nc -u <receiver_IP_address> 5000

that will send the string each second to receiver on port 5000 with UDP.

If the receiver outputs the string, then it can go through and it is not blocked.
If not, then you would have to investigate. If it is a firewall, it would depend on what has been installed. I can’t guess, but the basic firewall on Linux would be ufw.
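
For example, if ufw happens to be the firewall on your client PC (just an assumption; adapt this to whatever firewall is actually installed), you could check and open UDP/5000 like this:

# On the client PC
sudo ufw status verbose        # is ufw active, and what does it allow/deny?
sudo ufw allow 5000/udp        # allow incoming UDP on port 5000
sudo ufw reload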

By default udpsink enables multicast. Multicast allows several hosts on the LAN to receive the stream, so the sender does not have to send a copy to each receiver. However, multicast may not work well over WiFi. If you don’t need to send to several hosts, you may disable it. Note that multicast uses special IP addresses for a multicast group, such as 224.1.1.1.
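
Purely for illustration (the 224.1.1.1 group address is only an example, and auto-multicast is left at its default of true here), a multicast variant of the earlier h264 stream would look like this, whereas a unicast setup simply sends to the receiver's own IP, optionally with auto-multicast=0 as in the pipeline above:

# Sender (Jetson): stream to a multicast group instead of a single host
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, format=NV12' ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! rtph264pay ! udpsink host=224.1.1.1 port=5000 sync=0

# Receiver (any host on the LAN that joins the group)
gst-launch-1.0 udpsrc address=224.1.1.1 port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264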

BTW, did you try my proposals? What exactly fails?
I may provide some workarounds for your issues, but I don’t have time to teach gstreamer or networking.

Hi,
Please re-flash the Xavier NX developer kit with JetPack 4.6.3 or 5.1.1 and give it a try. Probably you have done certain software customization, so it does not work properly. We have validated the setup and commands. If you use the Xavier NX developer kit with the default image, it is supposed to work fine, provided there is no additional firewall in the network.
