Stream Bayer Data with GStreamer

Hi,
Please try the command:

$ gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! filesink location=test.nv12 -e

@DaneLLL Thanks for the input

$ gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! filesink location=test.nv12 -e

@DaneLLL The pipeline worked

One more point:
We need to stream the raw Bayer image data via tcpserversink locally and capture it with tcpclientsrc to store the Bayer data in a file. We have tried some pipelines, but we couldn't store the Bayer data into the file using tcpclientsrc and filesink.

#server 
gst-launch-1.0 -vvv v4l2src   ! 'video/x-bayer,width=3264,height=2464,format=rggb,framerate=21/1' ! tcpserversink port=8888 host=192.168.55.1 recover-policy=keyframe sync-method=latest-keyframe

#Server logs
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstTCPServerSink:tcpserversink0: current-port = 8888
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstTCPServerSink:tcpserversink0.GstPad:sink: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
#client
gst-launch-1.0 -v tcpclientsrc host=192.168.55.1 port=8888 ! filesink location=bayer.raw

#client logs
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.000284682
Setting pipeline to NULL ...
Freeing pipeline ...

Could you confirm whether this is the right method to stream Bayer data for our use case, or whether we are missing something?
Any help would be appreciated

@krishnaprasad.k ,
I’ve added a point to my previous answer that may help in your case. It doesn’t require any v4l2src patching.

If the bayer plugin is not available on your system, it may be blacklisted because of a missing dependency.
Try:

# Clear gstreamer cache
rm ~/.cache/gstreamer-1.0/registry.aarch64.bin

# This will rebuild the cache and tell what plugins are blacklisted if any
gst-inspect-1.0 -b

# Check library dependencies for that plugin with:
ldd /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstbayer.so

Note, though, that debayering with the bayer2rgb GStreamer element is CPU-based and would be a bottleneck on Jetson for high pixel rates.
Usually, on Jetson, Argus is used for debayering with the ISP, providing NV12 raw video.
Otherwise, it is better to debayer on the receiver PC.

Hi,
We would suggest encoding to h264/h265 for streaming. Streaming RAW data through the network requires significant bandwidth. Please refer to the discussion in
UDP-Raw Stream Issue On Nvidia Jetson Devices - #8 by DaneLLL
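To put a number on that bandwidth (a back-of-the-envelope sketch of mine, not from the linked post), the raw NV12 mode used in this thread already exceeds gigabit Ethernet:

```python
# Back-of-the-envelope bandwidth for raw NV12 at the sensor mode used above.
# NV12 carries 12 bits per pixel (8-bit luma plane + half-size chroma plane).
width, height, fps = 3264, 2464, 21
bytes_per_frame = width * height * 3 // 2   # 1.5 bytes per pixel

print(bytes_per_frame)                      # 12063744 bytes per frame
print(bytes_per_frame * fps * 8 / 1e9)      # ~2.03 Gbps, more than a 1 GbE link
```

This is why an encoded stream (h264/h265) is usually recommended for network transport.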

@DaneLLL Thanks for the info

We would suggest encode to h264/h265 for streaming. Streaming RAW data through the network requires significant bandwidth.

As per our use case, we need to stream Bayer data only. Could you please suggest some methods to achieve this?

We have tried streaming NV12 data to validate the path, using the pipelines below.

#server as **Jetson Xavier-NX**

gst-launch-1.0 nvarguscamerasrc num-buffers=1000 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay !  'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=192.168.55.100 port=5000

#Client PC

gst-launch-1.0 -v udpsrc host=192.168.55.100 port=5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw 

But sample.raw has zero data, while the same pipeline works fine when streaming locally inside the Jetson.
Could you please let us know how to fix this issue?

One more point:
We have tried streaming without rtpvrawpay on the Jetson side, using the pipeline below.

#Server side
 gst-launch-1.0 nvarguscamerasrc  ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! udpsink host=192.168.55.100 port=5000

#Error log
WARNING: from element /GstPipeline:pipeline0/GstUDPSink:udpsink0: Attempting to send a UDP packets larger than maximum size (12063744 > 65507)
Additional debug info:
gstmultiudpsink.c(722): gst_multiudpsink_send_messages (): /GstPipeline:pipeline0/GstUDPSink:udpsink0:
Reason: Error sending message: Message too long

Could you please clarify the reason for requiring RTP payloaders like rtpvrawpay for UDP streaming?
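The need for an RTP payloader follows directly from the numbers in the error log above (my own illustration, using the values from the udpsink warning): a whole NV12 frame is far larger than the biggest possible UDP datagram, so rtpvrawpay must split each frame into many MTU-sized RTP packets (per RFC 4175) before udpsink can send them.

```python
# One raw NV12 frame vs. the largest possible UDP datagram payload.
frame_bytes = 3264 * 2464 * 3 // 2      # 12063744, as in the udpsink warning
max_udp_payload = 65507                  # 65535 - 8 (UDP header) - 20 (IP header)

print(frame_bytes > max_udp_payload)     # True: a frame cannot fit in one datagram

# rtpvrawpay splits each frame into MTU-sized RTP packets instead; with a
# typical ~1400-byte payload that is thousands of packets per frame:
print(frame_bytes // 1400 + 1)
```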

Hi,
For UDP streaming, you can also refer to
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL

In your commands, please make sure 192.168.55.100 is the IP address of the client.

@DaneLLL Thanks for the reply

In your commands, please make sure 192.168.55.100 is the IP address of the client.

Yes, you are correct.

We found some information about UDP streaming in the link you shared.

We have used the below GStreamer pipelines for UDP streaming over a 1 Gbps LAN, but we can’t dump the data to a file on the receiver side.
The same pipeline works when streamed locally on the Jetson device (server and client are both the Jetson).

#Server
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=<client IP> port=5000
Note: in the previous post we shared the wrong pipeline for the client.
#Client  
gst-launch-1.0 -v udpsrc uri=udp://<client IP>:5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw

Could you please help us find the root cause of the issue, or suggest a method to debug it?

Hi,
You may try modifying the caps to different combinations:

'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12'

The RAW data should be 8-bit, so depth=(string)10 looks wrong. You may remove it and try again.

You may also try this and see whether the received RTP stream can be saved:

gst-launch-1.0 -v udpsrc uri=udp://<client IP>:5000 ! application/x-rtp ! filesink location=sample.raw

@DaneLLL Thanks for the reply

You may try modifying the caps to different combinations

We have modified the caps as per your suggestion, but the result is the same: we can’t dump the data on the receiver side.

The RAW data should be 8-bit, so depth=(string)10 looks wrong. You may remove it and try again.

We have also changed the bit depth as per your suggestion; no change in the result.
Actually, our sensor outputs 10-bit data, which is why we had given 10-bit depth in the pipeline:

[0]: 'RG10' (10-bit Bayer RGRG/GBGB)
		Size: Discrete 3280x2464
			Interval: Discrete 0.048s (21.000 fps)
		Size: Discrete 3280x1848
			Interval: Discrete 0.036s (28.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1640x1232
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.017s (60.000 fps)
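For reference (my own estimate, not from the thread), the 10-bit RG10 mode above is even heavier on the wire than NV12, since V4L2 typically stores 10-bit Bayer samples in 16-bit containers:

```python
# Rough wire bandwidth for the full-resolution RG10 sensor mode
# (10-bit Bayer, assumed stored as 16 bits per pixel, no packing).
width, height, fps = 3280, 2464, 21
bytes_per_frame = width * height * 2            # 2 bytes per pixel

gbps = bytes_per_frame * fps * 8 / 1e9
print(round(gbps, 2))                           # well above a 1 Gbps LAN
```

So even if the receiver pipeline were fixed, unpacked raw Bayer at this mode could not fit through the 1 Gbps link mentioned earlier.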

We have checked with the Wireshark tool, on the respective LAN node, that the data from the server pipeline arrives as RTP packets. The issue is mainly on the receiver side. One more point: we have run the same pipeline on the receiver side as a loopback, and it works fine; we are able to dump the data locally.

Also sharing some logs from the receiver side:

gst-launch-1.0 -v udpsrc uri=udp://<client_IP>:5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)8,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
# UDP streaming on the receiver-side Jetson itself, as loopback

gst-launch-1.0 -v udpsrc uri=udp://localhost:5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)3264, height=(int)2464, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, encoding-name=(string)RAW, depth=(string)10, sampling=(string)YCbCr-4:2:0, width=(string)3264, height=(string)2464, framerate=(fraction)21/1, format=(string)NV12, clock-rate=(int)90000
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:32.569931329
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Could you please let us know whether the root cause of the issue is something network-related, such as the port we are using while streaming externally from the Jetson to the laptop over UDP, rather than the pipeline we have tried?
Any help would be appreciated.

Hi,
We can run the commands:
[Server]

$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=640, height=480, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=10.19.107.92 port=5001

[Client]

$ gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
$ DISPLAY=:0 gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! videoconvert ! xvimagesink sync=0

Please try smaller resolution and different port number in your setup.

Argus output is the frame data after ISP engine, so it is YUV420 8-bit.

@DaneLLL Thanks for the reply

Argus output is the frame data after ISP engine, so it is YUV420 8-bit.

Noted thanks

Please try smaller resolution and different port number in your setup.

We have tried reducing the resolution as well as changing the port in these pipelines:

# Server
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=60/1, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=192.168.1.5 port=5001
# Client
gst-launch-1.0 udpsrc uri=udp://192.168.1.5:5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)1280, height=(string)720" ! rtpvrawdepay ! filesink location=sample.raw

But we failed to dump the captured raw data, even after changing the port and resolution.
We have tried a USB camera as well, with resolution 640x480, but the result is the same.
Could you please help us sort out the issue further?
Can you try the same pipelines we used for UDP streaming, with the Jetson as server and a Linux PC as client,
to verify whether the issue is in the pipeline?

$ gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw
$ DISPLAY=:0 gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! videoconvert ! xvimagesink sync=0

We are giving the full URI, like udpsrc uri=udp://192.168.1.5:5001, instead of just the port.
Does that make any difference?

Hi,
Please try this and see if h264 streaming works in your setup:
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL

We can run both RAW data streaming and h264 streaming successfully. You may clarify whether it fails in RAW data streaming only, or in both use cases.

@DaneLLL We have used the below pipelines for H264 UDP streaming.

# Server
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=60/1, format=NV12' ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! rtph264pay ! udpsink host=192.168.1.5 port=5000 sync=0
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 4 
   Output Stream W = 1280 H = 720 
   seconds to Run    = 0 
   Frame Rate = 59.999999 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
H264: Profile = 66, Level = 0 

#Client 
gst-launch-1.0 udpsrc uri=udp://192.168.1.5:5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

The result is the same: we can’t capture either H264 or RAW data on the receiver side using the UDP protocol.
Could you please provide the root cause of the issue?

Hi,
Please try the command:

$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264

If it still does not work, it looks to be an issue in network. Maybe UDP is blocked due to certain firewall settings. You can check this part.
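One quick software-side check (a minimal sketch of my own, analogous to testing with `nc` from the shell) is to verify that a plain UDP datagram reaches the receiving port at all, independently of GStreamer; run the receiver half on the client machine and point the sender at its IP:

```python
# Minimal UDP reachability check. Assumes port 5000 is free; replace
# "127.0.0.1" with the client machine's IP when testing across the network.
import socket

recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 5000))
recv.settimeout(2.0)

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(b"probe", ("127.0.0.1", 5000))

data, addr = recv.recvfrom(1024)
print(data)          # b'probe' when the UDP path is open
recv.close()
send.close()
```

If the datagram never arrives when sent from the Jetson to the PC, a firewall or routing issue, rather than the GStreamer pipeline, is the likely cause.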

@DaneLLL Thanks for the input

Please try the command:
$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264

We have tried the above pipeline on the receiver side too, but the result is the same: we failed to capture the H264 data.

# client
gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! video/x-h264 ! filesink location=sample.h264
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

We are giving the full URI, like udpsrc uri=udp://192.168.1.5:5000, instead of just the port.
Does it make any difference in the pipeline?

If it still does not work, it looks to be an issue in network. Maybe UDP is blocked due to certain firewall settings. You can check this part.

We have verified with Wireshark that the data is coming from the server side (Jetson) as RTP packets.

Could you please let us know whether the issue is on the server (Jetson) or client (PC) side, and also how to unblock the UDP port from the software side?

Hi,
We have tried server (Jetson) with client (PC running Ubuntu 18.04 or 20.04) or client (Jetson). All commands work well. Maybe you can try a Jetson as the client. If client (Jetson) works, the issue may be in your client (PC). If client (Jetson) does not work, it is still likely that certain network settings trigger the failure.

You may try:

  • Be sure that no firewall blocks UDP/5000.
  • Add rtpjitterbuffer. Note that its latency property defaults to 2000 ms; you may decrease it once it works.
  • Disable the udpsink property auto-multicast in the sender, especially if using WiFi.
  • Store into a file container so that decoding is simpler:

Sender:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=60/1, format=NV12' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 ! h264parse ! rtph264pay ! queue ! udpsink host=192.168.1.5 port=5000 auto-multicast=0 sync=0

Receiver:

# client
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264 ! rtpjitterbuffer ! queue ! rtph264depay ! h264parse ! matroskamux ! filesink location=test.mkv -e

# Stop after 10 s with Ctrl-C (once then wait for completion)

# Play back
gst-play-1.0 test.mkv

That being said, this is getting far from your original topic.
You may want to better explain your use case.

  • There are very few use cases that would require Bayer data; the only ones I can see are monochrome cases, or excluding Argus debayering or auto-tuning from the loop. For such a case, you have all the answers in my first answer (edited many times for clarity and completeness). You would try that on a fresh JP5 install, without any attempt to patch v4l2src, rebuild gstreamer, or rebuild any plugin.
  • Streaming debayered raw video may be an option if you need lossless quality and have the required bandwidth available. This might need some buffer-size adjustments for UDP.
  • Streaming compressed video such as H264 needs less bandwidth, but may lose quality. Tuning the encoding parameters may help preserve quality.
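The buffer-size adjustments mentioned above can be sketched like this (the values are illustrative assumptions on my part, not taken from the thread):

```shell
# Raise the kernel's maximum UDP receive buffer (example value, ~25 MB)
sudo sysctl -w net.core.rmem_max=26214400

# Then request a larger socket buffer on the receiving udpsrc via its
# buffer-size property (in bytes), keeping the rest of the pipeline as before:
#   gst-launch-1.0 udpsrc port=5000 buffer-size=26214400 ... ! rtpvrawdepay ! ...
```

Without a large enough socket buffer, bursts of RTP packets carrying a single raw frame can be silently dropped by the kernel before GStreamer ever sees them.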

@DaneLLL We have tried UDP streaming with the Jetson as client and the PC as server, and it works well with both H264 and RAW data. Can you provide some solution to stream RAW data over UDP with the Jetson as server and the PC as client?

@Honey_Patouceul Thanks for the info

Be sure that no firewall blocks UDP/5000.

Can you explain how to check whether UDP/5000 is blocked, and also how to unblock it in software? We ask because we are able to stream with the PC as server and the Jetson as client.

disable udpsink property auto-multicast in sender, especially if using WiFi.

What difference does it make when the auto-multicast=true property is enabled in the pipeline?

You would have the receiver listen on UDP/5000 with:

nc -l -u -p 5000

that would output received bytes to stdout.

Then from sender try:

while true; do echo 'If you see this on receiver, then no firewall blocks UDP/5000'; sleep 1; done | nc -u <receiver_IP_address> 5000

that will send the string each second to the receiver on port 5000 over UDP.

If the receiver outputs the string, then traffic can get through and the port is not blocked.
If not, then you would have to investigate. If it is a firewall, it depends on what has been installed; I can’t guess, but the basic firewall on Linux would be ufw.

By default, udpsink enables multicast. Multicast allows several hosts on the LAN to receive the stream, so the sender does not have to send to each receiver individually. However, multicast may not work well over WiFi. If you don’t need to send to several hosts, you may disable it. Note that multicast uses special IP addresses for a multicast group, such as 224.1.1.1.
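The multicast address range mentioned above can be checked programmatically; a small illustration using Python's standard library:

```python
# IPv4 multicast addresses live in 224.0.0.0/4; ordinary unicast addresses
# like the ones used in this thread are not multicast.
import ipaddress

print(ipaddress.ip_address("224.1.1.1").is_multicast)      # True
print(ipaddress.ip_address("192.168.1.5").is_multicast)    # False
```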

BTW, did you try my proposals? What fails?
I may provide some workarounds for your issues, but I don’t have time to teach GStreamer or networking.