Stream Bayer Data with GStreamer

We have a custom carrier board with a Jetson Xavier NX and an imx219 image sensor. We need to stream Bayer data using the GStreamer v4l2src element.

The camera is 10-bit; details are listed below:

[0]: 'RG10' (10-bit Bayer RGRG/GBGB)
		Size: Discrete 3280x2464
			Interval: Discrete 0.048s (21.000 fps)
		Size: Discrete 3280x1848
			Interval: Discrete 0.036s (28.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1640x1232
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.017s (60.000 fps)
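
For reference, a listing like the one above can be obtained with (assuming the sensor is /dev/video0):

v4l2-ctl -d /dev/video0 --list-formats-ext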

In the imx219 camera device tree, the format is bayer rggb:

                                        mclk_khz = "24000";
                                        num_lanes = "2";
                                        tegra_sinterface = "serial_a";
                                        phy_mode = "DPHY";
                                        discontinuous_clk = "yes";
                                        dpcm_enable = "false";
                                        cil_settletime = "0";

                                        active_w = "1920";
                                        active_h = "1080";
                                        mode_type = "bayer";
                                        pixel_phase = "rggb";
                                        csi_pixel_bit_depth = "10";
                                        readout_orientation = "90";
                                        line_length = "3448";
                                        inherent_gain = "1";
                                        mclk_multiplier = "9.33";
                                        pix_clk_hz = "182400000";

We have tested the pipeline below, and it fails to run:

gst-launch-1.0 v4l2src ! 'video/x-bayer, format=rggb, width=1920, height=1080, framerate=30/1' ! multifilesink location=image.raw

Pipeline error logs:

Pipeline is live and does not need PREROLL ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

We have seen a similar post on the forum; the v4l2src plugin supports only 8-bit Bayer formats. The plugin caps are listed below:

      video/x-bayer
                 format: { (string)bggr, (string)gbrg, (string)grbg, (string)rggb }
                  width: [ 1, 32768 ]
                 height: [ 1, 32768 ]
              framerate: [ 0/1, 2147483647/1 ]
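
You can confirm the caps of the stock plugin on your system with (a quick check, not from the original post):

gst-inspect-1.0 v4l2src | grep -B 1 -A 5 x-bayer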

Could you please let us know if there is any option to stream 10/16-bit Bayer data from the camera using GStreamer v4l2src?
Any help would be appreciated.
Note: we are using JetPack 5.1.1.

Hi,
The v4l2src plugin is a native GStreamer plugin and is open source, so if you need the format to be supported, you can download the source code and customize it to add the format.

Another possible solution is to use the appsrc plugin. You can do v4l2_ioctl() calls in your application to capture frame data and send it to the next element.

@DaneLLL Thanks for the reply

The v4l2src plugin is a native GStreamer plugin and is open source, so if you need the format to be supported, you can download the source code and customize it to add the format.

We have seen the post below about applying patches to support the 10-bit Bayer format in the v4l2src plugin:

post
It throws errors on JetPack 5.1.1.
Could you please let us know if there is any working patch available from NVIDIA to support the 10-bit Bayer format in the v4l2src plugin?

Another possible solution is to use the appsrc plugin. You can do v4l2_ioctl() calls in your application to capture frame data and send it to the next element.

Could you please clarify the above point with an example or some reference, so that we can proceed further?

Hi,
We don't have the patch. Generally we use the prebuilt libs and do not customize the plugins. We will see if other users can check this use case and share their experience.

For building the GStreamer framework from source code, you can check:
Accelerated GStreamer — Jetson Linux Developer Guide documentation

@DaneLLL Thanks for the reply

We don't have the patch. Generally we use the prebuilt libs and do not customize the plugins. We will see if other users can check this use case and share their experience.

So we have to do our own customization of the v4l2src plugin to support the 10-bit Bayer format and test it. Is that correct?

Another possible solution is to use the appsrc plugin. You can do v4l2_ioctl() calls in your application to capture frame data and send it to the next element.

Could you please also clarify the above point with an example or some reference? It would help us proceed further.

Hi,
Please refer to the source code of v4l2-ctl:
v4l2-ctl.cpp\v4l2-ctl\utils - v4l-utils.git - media (V4L2, DVB and IR) applications and libraries

It is the command for capturing frame data through the V4L2 interface, and its source code serves as a demonstration. You may also refer to the source code of the v4l2 plugin; the implementation should be similar.
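
As a rough sketch of that approach (my illustration, not code from the thread): the following reads RG10 frames from stdin, e.g. piped from the v4l2-ctl command used later in this thread, and pushes them through appsrc. The GRAY16_LE caps stand in for the 10-bit Bayer data, and the fakesink tail is a placeholder for your real downstream elements; all names, sizes, and caps are assumptions to adapt.

appsrc_rg10.c

/* Minimal sketch (hypothetical): push RG10 frames read from stdin into a
 * GStreamer pipeline through appsrc. */
#include <gst/gst.h>
#include <stdio.h>

#define WIDTH  1280
#define HEIGHT 720
#define FPS    60
#define FRAME_SIZE (WIDTH * HEIGHT * 2)  /* RG10: one 16-bit word per pixel */

int main (int argc, char** argv)
{
	gst_init(&argc, &argv);

	GError *err = NULL;
	GstElement *pipeline = gst_parse_launch(
		"appsrc name=src is-live=true format=time "
		"caps=\"video/x-raw,format=GRAY16_LE,width=1280,height=720,framerate=60/1\" "
		"! queue ! fakesink", &err);
	if (!pipeline) {
		g_printerr("Failed to create pipeline: %s\n", err->message);
		return -1;
	}
	GstElement *src = gst_bin_get_by_name(GST_BIN(pipeline), "src");
	gst_element_set_state(pipeline, GST_STATE_PLAYING);

	guint64 frame = 0;
	GstFlowReturn flow = GST_FLOW_OK;
	while (flow == GST_FLOW_OK) {
		GstBuffer *buf = gst_buffer_new_allocate(NULL, FRAME_SIZE, NULL);
		GstMapInfo map;
		gst_buffer_map(buf, &map, GST_MAP_WRITE);
		size_t got = fread(map.data, 1, FRAME_SIZE, stdin);
		gst_buffer_unmap(buf, &map);
		if (got < FRAME_SIZE) {          /* EOF or short read: stop */
			gst_buffer_unref(buf);
			break;
		}
		GST_BUFFER_PTS(buf) = gst_util_uint64_scale(frame, GST_SECOND, FPS);
		GST_BUFFER_DURATION(buf) = gst_util_uint64_scale(1, GST_SECOND, FPS);
		++frame;
		/* The "push-buffer" action signal does not take ownership */
		g_signal_emit_by_name(src, "push-buffer", buf, &flow);
		gst_buffer_unref(buf);
	}
	g_signal_emit_by_name(src, "end-of-stream", &flow);

	/* Wait for EOS (or an error) before tearing down */
	GstBus *bus = gst_element_get_bus(pipeline);
	GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
			GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
	if (msg)
		gst_message_unref(msg);
	gst_object_unref(bus);
	gst_object_unref(src);
	gst_element_set_state(pipeline, GST_STATE_NULL);
	gst_object_unref(pipeline);
	return 0;
}

Build and feed it from v4l2-ctl, for example:

gcc -Wall -O2 -o appsrc_rg10 appsrc_rg10.c $(pkg-config --cflags --libs gstreamer-1.0)
v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,sensor_mode=4 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./appsrc_rg10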

You may try the following:

1. Build an RG10-to-rggb converter

You would create a file bayer10_8.c with the following content:

#include <stdlib.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

int main (int argc, char** argv) 
{       
	char filename[64];

	int readItems;
	int width, height, size;
	unsigned int curLevel, curItem; 
	unsigned short *sbuf;
	unsigned char *cbuf;
	unsigned int curFrame = 0;
	bool bRun = true;
	
	if (argc < 3) {
		fprintf(stderr, "Usage: bayer10_8 width height [any 3rd arg for saving to files in out directory to be created beforehand]\n");
		return -1;
	}
	width = atoi(argv[1]);
	if (width < 1) {
	   fprintf(stderr, "Error bad width\n");
	   return -2;
	}
	height = atoi(argv[2]);
	if (height < 1) {
	   fprintf(stderr, "Error bad height\n");
	   return -3;
	}
	size = width*height;

	sbuf = malloc(2*size);
	if (!sbuf) {
		fprintf(stderr, "Error failed to allocate sbuf.");
		return -4;
	}
	cbuf = malloc(size);
	if (!cbuf) {
		fprintf(stderr, "Error failed to allocate cbuf.");
		return -5;
	}

	while(bRun) {
		curLevel = 0;
		while (curLevel < size) {
			readItems = fread(sbuf, 2, size - curLevel, stdin);
			if (readItems < 1) {
				bRun = false;
				break;
			}
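			/* Convert each 16-bit RG10 word to an 8-bit sample, keeping its 8 MSBs */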
			for (curItem=curLevel; curItem < curLevel + readItems; ++curItem)
				cbuf[curItem] = (unsigned char)(sbuf[curItem]>>6);
			curLevel += readItems;
		}
		if (!bRun)
			break;
			
		if (argc > 3) {
			sprintf(filename, "out/frame_%dx%d_%05d.rggb", width, height, curFrame);
			FILE* fout = fopen(filename, "wb");
			if (!fout) {
				fprintf(stderr, "Error: failed to open file %s for writing.\nDid you create out folder ?\n", filename);
				return -6;
			}
			fwrite(cbuf, size, 1, fout);
			fclose(fout);
		}
		else
			fwrite(cbuf, 1, size, stdout);
		++curFrame;
	}
	free(sbuf);
	free(cbuf);
	return 0;
}

Here we shift 6 bits to the right, following the RG10 pixel layout: each 10-bit sample occupies bits 13:4 of its 16-bit little-endian word, so the shift keeps the sample's 8 most significant bits (for example, a full-scale sample stored as 0x3FF0 becomes 0x3FF0 >> 6 = 0xFF).

and build with:

gcc -Wall -O3 -o bayer10_8 bayer10_8.c

With only two arguments, the converter writes to stdout.
Passing a third argument makes it write each frame as a separate rggb file into an out folder, which must be created beforehand.

2. Set your sensor mode from Argus:

# Set 1280x720@60
gst-launch-1.0 nvarguscamerasrc sensor-mode=4 num-buffers=1 ! 'video/x-raw(memory:NVMM),width=1280,height=720,framerate=60/1' ! fakesink 

3. Preview: debayer and display with GStreamer:

v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,sensor_mode=4 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./bayer10_8 1280 720 | gst-launch-1.0 filesrc location=/dev/stdin blocksize=$(expr 1280 \* 720) ! video/x-bayer,format=rggb,width=1280,height=720,framerate=60/1 ! queue ! bayer2rgb ! queue ! videoconvert ! xvimagesink

4. Try a capture into images:

Be aware that this can fill your disk very quickly. It is better to save to an external disk, or to stop with Ctrl-C after a few seconds.

# Create out folder if not yet done
mkdir out

# Or clean folder if it exists
rm out/*

# Run a capture with 1280x720@60 fps... 
v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,sensor_mode=4 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./bayer10_8 1280 720 f

# Stop with Ctrl-C

# Check for created files
ls out/*

5. Note for manually adjusting gain or exposure

For adjusting gain or exposure, you have to activate the override_enable control.
It may not work until Argus has been run once, but that was done in step 2.
Then you should be able to set your values:

v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,gain=50,exposure=5000,frame_rate=60000000,sensor_mode=4,override_enable=1 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./bayer10_8 1280 720 | gst-launch-1.0 filesrc location=/dev/stdin blocksize=$(expr 1280 \* 720) ! video/x-bayer,format=rggb,width=1280,height=720,framerate=60/1 ! queue ! bayer2rgb ! videoconvert ! xvimagesink
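
Before setting controls, you may want to check which control names and ranges your driver actually exposes (standard v4l2-ctl usage, not specific to this thread):

v4l2-ctl -d /dev/video0 --list-ctrls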

6. Streaming 8-bit Bayer rggb as GRAY8 and storing on the receiver side

There is no support for streaming Bayer video that I'm aware of, but for your case it is possible to cheat and pretend the data is GRAY8 while it is actually Bayer rggb. For that you would try:

Sender:

v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,sensor_mode=4 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./bayer10_8 1280 720 | gst-launch-1.0 filesrc location=/dev/stdin blocksize=$(expr 1280 \* 720) do-timestamp=1 ! queue ! video/x-raw,format=GRAY8,width=1280,height=720,framerate=60/1 ! matroskamux streamable=1 ! tcpserversink sync=0

Receiver:

gst-launch-1.0 tcpclientsrc ! queue ! matroskademux ! multifilesink location=frame_received_%05d.rggb

The above is for local (Jetson-only) testing.

Obviously you would adjust the host and port properties for your case (assuming that no firewall blocks TCP port 4953, the default):
Sender:

v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,sensor_mode=4 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./bayer10_8 1280 720 | gst-launch-1.0 filesrc location=/dev/stdin blocksize=$(expr 1280 \* 720) do-timestamp=1 ! queue ! video/x-raw,format=GRAY8,width=1280,height=720,framerate=60/1 ! matroskamux streamable=1 ! tcpserversink host=<Jetson_IP> sync=0

Receiver:

gst-launch-1.0 tcpclientsrc host=<Jetson_IP> ! queue ! matroskademux ! multifilesink location=frame_received_%05d.rggb
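
To quickly check the received frames on the PC, a sketch like this may work, assuming gst-plugins-bad is installed there (the caps and playback framerate are illustrative):

gst-launch-1.0 multifilesrc location=frame_received_%05d.rggb caps="video/x-bayer,format=rggb,width=1280,height=720,framerate=10/1" ! bayer2rgb ! videoconvert ! autovideosink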

7. Streaming 10-bit Bayer as GRAY16 requires twice the bandwidth

Still, this can be achieved with a different converter producing 16-bit little-endian Bayer data, over a 1 Gb/s wired Ethernet link (1280 × 720 × 2 bytes × 60 fps ≈ 885 Mb/s, which just fits).

bayer10_16le_10bpp.c
#include <stdlib.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

int main (int argc, char** argv) 
{       
	char filename[64];

	int readItems;
	int width, height, size;
	unsigned int curLevel, curItem; 
	unsigned short *sin_buf, *sout_buf;
	unsigned int curFrame = 0;
	bool bRun = true;
	
	if (argc < 3) {
		fprintf(stderr, "Usage: bayer10_8 width height [any 3rd arg for saving to files in out directory to be created beforehand]\n");
		return -1;
	}
	width = atoi(argv[1]);
	if (width < 1) {
	   fprintf(stderr, "Error bad width\n");
	   return -2;
	}
	height = atoi(argv[2]);
	if (height < 1) {
	   fprintf(stderr, "Error bad height\n");
	   return -3;
	}
	size = width*height;

	sin_buf = malloc(2*size);
	if (!sin_buf) {
		fprintf(stderr, "Error failed to allocate sin_buf.");
		return -4;
	}
	sout_buf = malloc(2*size);
	if (!sout_buf) {
		fprintf(stderr, "Error failed to allocate sout_buf.");
		return -5;
	}

	while(bRun) {
		curLevel = 0;
		while (curLevel < size) {
			readItems = fread(sin_buf, 2, size - curLevel, stdin);
			if (readItems < 1) {
				bRun = false;
				break;
			}
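			/* Right-shift so each 10-bit sample sits in the low bits of its 16-bit word */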
			for (curItem=curLevel; curItem < curLevel + readItems; ++curItem)
				sout_buf[curItem] = (sin_buf[curItem]>>4);
			curLevel += readItems;
		}
		if (!bRun)
			break;
			
		if (argc > 3) {
			sprintf(filename, "out/frame_%dx%d_%05d.rggb", width, height, curFrame);
			FILE* fout = fopen(filename, "wb");
			if (!fout) {
				fprintf(stderr, "Error: failed to open file %s for writing.\nDid you create out folder ?\n", filename);
				return -6;
			}
			fwrite(sout_buf, 2, size, fout);
			fclose(fout);
		}
		else
			fwrite(sout_buf, 2, size, stdout);
		++curFrame;
	}
	free(sin_buf);
	free(sout_buf);
	return 0;
}

This may look dark depending on how you debayer it. You may also try the full 16-bit scale with:

bayer10_16le.c
#include <stdlib.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

int main (int argc, char** argv) 
{       
	char filename[64];

	int readItems;
	int width, height, size;
	unsigned int curLevel, curItem; 
	unsigned short *sin_buf, *sout_buf;
	unsigned int curFrame = 0;
	bool bRun = true;
	
	if (argc < 3) {
		fprintf(stderr, "Usage: bayer10_8 width height [any 3rd arg for saving to files in out directory to be created beforehand]\n");
		return -1;
	}
	width = atoi(argv[1]);
	if (width < 1) {
	   fprintf(stderr, "Error bad width\n");
	   return -2;
	}
	height = atoi(argv[2]);
	if (height < 1) {
	   fprintf(stderr, "Error bad height\n");
	   return -3;
	}
	size = width*height;

	sin_buf = malloc(2*size);
	if (!sin_buf) {
		fprintf(stderr, "Error failed to allocate sin_buf.");
		return -4;
	}
	sout_buf = malloc(2*size);
	if (!sout_buf) {
		fprintf(stderr, "Error failed to allocate sout_buf.");
		return -5;
	}

	while(bRun) {
		curLevel = 0;
		while (curLevel < size) {
			readItems = fread(sin_buf, 2, size - curLevel, stdin);
			if (readItems < 1) {
				bRun = false;
				break;
			}
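			/* Scale each 10-bit sample up to the full 16-bit range */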
			for (curItem=curLevel; curItem < curLevel + readItems; ++curItem)
				sout_buf[curItem] = ((sin_buf[curItem]>>4)<<6);
			curLevel += readItems;
		}
		if (!bRun)
			break;
			
		if (argc > 3) {
			sprintf(filename, "out/frame_%dx%d_%05d.rggb", width, height, curFrame);
			FILE* fout = fopen(filename, "wb");
			if (!fout) {
				fprintf(stderr, "Error: failed to open file %s for writing.\nDid you create out folder ?\n", filename);
				return -6;
			}
			fwrite(sout_buf, 2, size, fout);
			fclose(fout);
		}
		else
			fwrite(sout_buf, 2, size, stdout);
		++curFrame;
	}
	free(sin_buf);
	free(sout_buf);
	return 0;
}

and build as:

gcc -Wall -O3 -o bayer10_16 bayer10_16le.c

# or
gcc -Wall -O3 -o bayer10_16 bayer10_16le_10bpp.c

Then just stream with:

v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,sensor_mode=4 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./bayer10_16 1280 720 | gst-launch-1.0 filesrc location=/dev/stdin blocksize=$(expr 1280 \* 720 \* 2) do-timestamp=1 ! queue ! video/x-raw,format=GRAY16_LE,width=1280,height=720,framerate=60/1 ! multipartmux ! tcpserversink host=<Jetson_IP> sync=0

and receive with:

gst-launch-1.0 tcpclientsrc host=<Jetson_IP> ! multipartdemux single-stream=1 ! video/x-raw,format=GRAY16_LE,width=1280,height=720,framerate=60/1 ! multifilesink location=<wherever_you_have_room_for_that>/frame_1280x720_%05d.rggb16le

@DaneLLL Thanks for the info

We are able to capture the Bayer rggb data using the GStreamer v4l2src plugin by applying the patch from this post.
However, the patch works only on JetPack 4.6.x; on JetPack 5.x the libraries fail to build.
If we share the error logs from applying the patch to the GStreamer v4l2src plugin on JetPack 5.x, can you help us find the root cause?

One more point,
We want to confirm whether the captured Bayer image is correct. Could you please let us know if there is any way to convert the Bayer format to YUV using GStreamer?

@Honey_Patouceul Thanks for the support
We are able to capture the Bayer image using GStreamer after applying the patches to the v4l2src source code.
We now want to convert the captured Bayer image to YUV/RGB.
In the pipeline below, you mentioned bayer2rgb for displaying the Bayer image:

v4l2-ctl -d /dev/video0 --set-ctrl=bypass_mode=0,gain=50,exposure=5000,frame_rate=60000000,sensor_mode=4,override_enable=1 --set-fmt-video=width=1280,height=720,pixelformat=RG10 --stream-mmap --stream-to=- | ./bayer10_8 1280 720 | gst-launch-1.0 filesrc location=/dev/stdin blocksize=$(expr 1280 \* 720) ! video/x-bayer,format=rggb,width=1280,height=720,framerate=60/1 ! queue ! bayer2rgb ! videoconvert ! xvimagesink

But this plugin is not present on the Jetson Xavier NX. We have verified with:

gst-inspect-1.0 bayer2rgb
No such element or plugin 'bayer2rgb'

Can you help us build the above plugin for the Jetson Xavier NX platform, which runs Ubuntu 18.04?

Hi,
We are not sure if there is a GStreamer plugin for debayering. Since there is a hardware ISP engine, we generally use that. For debayering through a software plugin, we would need other users to share their experience.

It looks like there is a patch from a third party; you may check with them for further assistance.

It should be available with the GStreamer bad plugins:

Plugin Details:
  Name                     bayer
  Description              Elements to convert Bayer images
  Filename                 /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstbayer.so
  Version                  1.16.3
  License                  LGPL
  Source module            gst-plugins-bad
  Source release date      2020-10-21
  Binary package           GStreamer Bad Plugins (Ubuntu)
  Origin URL               https://launchpad.net/distros/ubuntu/+source/gst-plugins-bad1.0

Try

sudo apt install gstreamer1.0-plugins-bad

@Honey_Patouceul Thanks for the info

The package is already installed:

sudo apt install gstreamer1.0-plugins-bad
Reading package lists... Done
Building dependency tree       
Reading state information... Done
gstreamer1.0-plugins-bad is already the newest version (1.14.5-0ubuntu1~18.04.1).
0 upgraded, 0 newly installed, 0 to remove and 317 not upgraded.

gst-inspect-1.0 bayer2rgb
No such element or plugin 'bayer2rgb'

But the bayer2rgb element is still not present on the Xavier NX platform. Are there any other options to build the plugin?

@DaneLLL thanks for the info

It looks like there is a patch from a third party; you may check with them for further assistance.

Can you provide some reference for the patch?

We have tried to capture NV12 using the pipeline below, but we can't open the captured image in the vooya application:

gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! filesink location=test.nv12 -e

Could you please help us capture a YUV image from nvargus, and tell us how to view the captured YUV image properly?

Hi,
Please try the command:

$ gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! filesink location=test.nv12 -e
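
A quick sanity check (illustrative, not from the original reply): an NV12 frame is width × height × 1.5 bytes, so the file should be about 3264 × 2464 × 1.5 = 12,063,744 bytes. In vooya you would then set the same resolution and a 4:2:0 semi-planar (NV12) format, if supported.

ls -l test.nv12
# expect roughly 12063744 bytes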

@DaneLLL Thanks for the input

$ gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! filesink location=test.nv12 -e

@DaneLLL The pipeline worked

One more point,
We need to stream the raw Bayer image data via a TCP server locally and capture it with a TCP client to store the Bayer data in a file. We have tried some pipelines, but we couldn't store the Bayer data into a file using tcpclientsrc and filesink.

#server 
gst-launch-1.0 -vvv v4l2src   ! 'video/x-bayer,width=3264,height=2464,format=rggb,framerate=21/1' ! tcpserversink port=8888 host=192.168.55.1 recover-policy=keyframe sync-method=latest-keyframe

#Server logs
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstTCPServerSink:tcpserversink0: current-port = 8888
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstTCPServerSink:tcpserversink0.GstPad:sink: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-bayer, width=(int)3264, height=(int)2464, format=(string)rggb, framerate=(fraction)21/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
#client
gst-launch-1.0 -v tcpclientsrc host=192.168.55.1 port=8888 ! filesink location=bayer.raw

#client logs
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.000284682
Setting pipeline to NULL ...
Freeing pipeline ...

Could you confirm whether this is the right method to stream Bayer data for our use case, or are we missing something?
Any help would be appreciated.

@krishnaprasad.k,
I've added a point to my previous answer that may help in your case. It doesn't need any v4l2src patching.

As for the bayer plugin not being available on your system, it may be blacklisted because of a missing dependency.
Try:

# Clear gstreamer cache
rm ~/.cache/gstreamer-1.0/registry.aarch64.bin

# This will rebuild the cache and tell what plugins are blacklisted if any
gst-inspect-1.0 -b

# Check library dependencies for that plugin with:
ldd /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstbayer.so

Note, though, that debayering with the bayer2rgb GStreamer element is CPU-based and would be a bottleneck on Jetson for high pixel rates.
Usually on Jetson, Argus is used for debayering with the ISP, providing NV12 raw video.
Otherwise, it is better to debayer on the receiver PC.
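
As an illustrative sketch of debayering on the PC (assuming the GRAY8-over-matroska stream from step 6 above, and that capssetter from gst-plugins-bad is available there), the receiver could restore the Bayer caps and display instead of saving files:

gst-launch-1.0 tcpclientsrc host=<Jetson_IP> ! queue ! matroskademux ! capssetter join=false replace=true caps="video/x-bayer,format=rggb,width=1280,height=720,framerate=60/1" ! bayer2rgb ! videoconvert ! xvimagesink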

Hi,
We would suggest encoding to H.264/H.265 for streaming. Streaming RAW data through the network requires significant bandwidth. Please refer to the discussion in
UDP-Raw Stream Issue On Nvidia Jetson Devices - #8 by DaneLLL
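
For reference, a typical hardware-encoded streaming pipeline on Jetson looks like this (illustrative values; nvv4l2h265enc is the hardware H.265 encoder):

# Sender (Jetson)
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1,format=NV12' ! nvv4l2h265enc bitrate=8000000 ! h265parse ! rtph265pay config-interval=1 ! udpsink host=<client IP> port=5000

# Receiver (PC)
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=video,encoding-name=H265,clock-rate=90000" ! rtph265depay ! h265parse ! avdec_h265 ! videoconvert ! autovideosink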

@DaneLLL Thanks for the info

We would suggest encoding to H.264/H.265 for streaming. Streaming RAW data through the network requires significant bandwidth.

As per our use case, we need to stream Bayer data only. Could you please suggest some methods to achieve this?

We have tried to stream NV12 data to validate the path, using the pipelines below.

#server as **Jetson Xavier-NX**

gst-launch-1.0 nvarguscamerasrc num-buffers=1000 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay !  'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=192.168.55.100 port=5000

#Client PC

gst-launch-1.0 -v udpsrc host=192.168.55.100 port=5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw 

But sample.raw contains zero data, while the same pipeline works fine when streaming locally inside the Jetson.
Could you please let us know how to fix this issue?

One more point:
We have tried to stream without rtpvrawpay on the Jetson side, using the pipeline below.

#Server side
 gst-launch-1.0 nvarguscamerasrc  ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! udpsink host=192.168.55.100 port=5000

#Error log
WARNING: from element /GstPipeline:pipeline0/GstUDPSink:udpsink0: Attempting to send a UDP packets larger than maximum size (12063744 > 65507)
Additional debug info:
gstmultiudpsink.c(722): gst_multiudpsink_send_messages (): /GstPipeline:pipeline0/GstUDPSink:udpsink0:
Reason: Error sending message: Message too long

Could you please clarify the reason RTP payloaders like rtpvrawpay are needed for UDP streaming?

Hi,
For UDP streaming, you can also refer to
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL

In your commands, please make sure 192.168.55.100 is the IP address of the client.
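
As an editorial note, the warning in the earlier log also shows why a payloader is needed: one NV12 frame at 3264x2464 is 12,063,744 bytes, far above the 65,507-byte limit of a single UDP datagram, so rtpvrawpay fragments each frame into MTU-sized RTP packets. On the receiving side, the caps on udpsrc must match what the payloader actually negotiated; run the sender with -v and copy the printed application/x-rtp caps. An illustrative guess (rtpvrawpay does not list NV12 in its caps, so nvvidconv likely negotiates I420, i.e. sampling YCbCr-4:2:0 with depth 8):

gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)3264, height=(string)2464" ! rtpvrawdepay ! filesink location=sample.raw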

@DaneLLL Thanks for the reply

In your commands, please make sure 192.168.55.100 is the IP address of the client.

Yes, that is correct.

We got some information about UDP streaming from the link you shared.

We have used the GStreamer pipelines below for UDP streaming over a 1 Gb/s LAN, but we can't dump the data to a file on the receiver side.
The same pipelines work when streaming locally on the Jetson device (server and client both on the Jetson).

#Server
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv ! video/x-raw ! rtpvrawpay ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW' ! udpsink host=<client IP> port=5000
Note: in the previous post we shared the wrong pipeline for the client.
#Client  
gst-launch-1.0 -v udpsrc uri=udp://<client IP>:5000 ! 'application/x-rtp, media=(string)video, encoding-name=(string)RAW,depth=(string)10,sampling=(string)YCbCr-4:2:0,width=(string)3264, height=(string)2464, framerate=21/1,format=NV12' ! queue ! rtpvrawdepay ! video/x-raw ! filesink location=sample.raw

Could you please help us find the root cause of the issue, or suggest some method to debug it?