Stream 4K video from the Jetson Nano 12.3 MP camera with rtpjpegpay

hi everybody
I want to stream 4K frames from the Jetson Nano's 12.3 MP camera with an RTP payloader and a JPEG encoder (rtpjpegpay).
My frame size is 2464x3820.
The receiver side (rtpjpegdepay) reports a dimension error,
but it works very well with a lower image size.
With a little searching I found that the RTP JPEG payloader only works with smaller images (maximum 2040x2040).
I tried an H.264 encoder with an RTP payloader, but that's not good for me (5 s latency and fps are my bottlenecks).
I also found something about streaming frames larger than 2040, but I have no idea what is happening there.
Can you help me with that?
Do you have any idea how to get rid of this problem?

Is there any way to split each frame into multiple chunks smaller than the full frame and send them out separately to overcome this problem? (like the rtpvrawpay chunks-per-frame property)

This is the link I was talking about:
https://gstreamer-bugs.narkive.com/yqKt2pw6/bug-684955-new-rtpjpegpay-doesn-t-support-width-or-height-greater-than-2040

What is your pipeline?

If you specify insert-sps-pps and a low idrinterval for nvv4l2h264enc, the latency should be reduced.

$ … nvv4l2h264enc insert-sps-pps=1 idrinterval=4 ! …
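
For reference, a rough end-to-end sketch around those settings could look like this (untested here; the 3280x2464@21 sensor mode, host 127.0.0.1 and port 5000 are just example values to adjust):

Sender:

$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=3280,height=2464,framerate=21/1' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=4 ! h264parse ! rtph264pay config-interval=1 ! udpsink host=127.0.0.1 port=5000

Receiver:

$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,media=video,encoding-name=H264,clock-rate=90000,payload=96' ! rtph264depay ! h264parse ! nvv4l2decoder ! nvegltransform ! nveglglessink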

It also seems the rtpjpegpay bug was fixed a long time ago. What happens if you include the width and height in caps such as

$ … nvjpegenc ! image/jpeg, width=3820,height=2464 ! rtpjpegpay ! …

Otherwise you have to set the width and height in the RTP header to zero and add x-dimensions. I'm not quite sure what the syntax is for doing that from the command line.
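
(For reference, that 2040 limit comes from the RTP/JPEG payload format itself, RFC 2435: the payload header stores width and height as 8-bit fields in units of 8 pixels, so the largest dimension it can signal is 255 × 8 = 2040. Larger sizes have to be signalled out of band, which is what the x-dimensions attribute is for.)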

I have been struggling with this myself, though, so if anybody knows how to write custom RTP headers/extensions, please share.

Oh, I see.
What is the proper decoder for nvv4l2h264enc?
According to my research, the Jetson has dedicated hardware to encode and decode H.264 video, and it should be usable in a GStreamer pipeline with nvh264enc on the sender and nvh264dec on the receiver.
But when I try to use these elements in my pipeline, I get an error:
[erroneous pipeline: no element “nvh264enc”]
Am I doing anything wrong, or does the Jetson Nano not have NVENC hardware?

And about the JPEG encoder:
On the sender side, when I set the nvarguscamerasrc (Jetson Nano camera) properties to send data at that size (width=2830, height=2464), no error comes back to me,
but on the receiver side rtpjpegdepay returns an error (invalid dimension).
About setting width and height to zero and adding x-dimensions: I read that in the link ([Bug 684955] New: rtpjpegpay doesn't support width or height greater than 2040), but my problem is the same as yours, I don't know what the syntax is for doing this.
Every attempt produces an error, because according to gst-inspect the width and height parameters of rtpjpegpay start from 1, not zero.

The elements are nvv4l2h264enc and nvv4l2decoder.

Try running your pipeline with -v to get verbose output.
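
As far as I know, nvh264enc/nvh264dec belong to the desktop nvcodec plugin and are not shipped on Jetson; on the Nano the hardware codecs are exposed through the nvv4l2 elements instead. You can check what is actually installed with, for example:

$ gst-inspect-1.0 | grep -i 264
$ gst-inspect-1.0 nvv4l2h264enc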

Do you mean that nvv4l2h264enc and nvv4l2decoder use the dedicated NVIDIA hardware accelerator (NVENC), while some other encoder elements (omxh264enc, x264enc, etc.) are software encoders?

Yes, they’re hardware accelerated. omxh264enc is also hardware encoding, but it is deprecated.

Hi,
Thanks mhd0425 for providing the information. Please use the v4l2 plugins, since the omx plugins are deprecated.
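
If it helps, the usual replacements are, as far as I know: omxh264enc -> nvv4l2h264enc, omxh265enc -> nvv4l2h265enc, and omxh264dec / omxh265dec -> nvv4l2decoder.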

I think that the main issue is RTP/UDP for this case.

Indeed, using TCP streaming would work:

Sender:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=3280,height=2464,framerate=21/1' ! nvvidconv compute-hw=GPU ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! image/jpeg,format=MJPG ! matroskamux ! tcpserversink

Receiver:

gst-launch-1.0 tcpclientsrc ! matroskademux ! image/jpeg,width=3280,height=2464,framerate=21/1 ! nvv4l2decoder mjpeg=1 ! queue ! nvegltransform ! nveglglessink
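
Note that these pipelines rely on the default tcpserversink / tcpclientsrc host and port, so as written they only run on the same machine (if I remember the defaults correctly). For two machines you would set them explicitly, for example (addresses and port here are placeholders):

… ! matroskamux ! tcpserversink host=0.0.0.0 port=5000
gst-launch-1.0 tcpclientsrc host=<sender-ip> port=5000 ! matroskademux ! …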

Using RTP/UDP may not work… You may try to increase the max buffer size, but I haven’t been able to use it reliably… It seems to me that the UDP stack gets less and less usable with each new L4T version (might be related to docker or something else).
Here using IPv6 localhost:

On sender:

sudo sysctl -w net.core.wmem_max=33554432
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=3280,height=2464,framerate=21/1' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! image/jpeg,format=MJPG ! rtpjpegpay ! queue ! udpsink host=::1 port=5004 -v

On receiver:

sudo sysctl -w net.core.rmem_max=33554432
gst-launch-1.0 udpsrc address=::1 port=5004 ! application/x-rtp,media=video,encoding-name=JPEG,clock-rate=90000,a-framerate=\"21,000000\",x-dimensions=\"3280,2464\" ! queue ! rtpjpegdepay ! image/jpeg,width=3280,height=2464 ! nvv4l2decoder mjpeg=1 ! queue ! nvegltransform ! nveglglessink
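
If you stay with UDP, you may also try the buffer-size property of udpsink/udpsrc (kernel socket buffer size in bytes) on top of the sysctl settings above, for example (the value is just an example):

… ! rtpjpegpay ! queue ! udpsink host=::1 port=5004 buffer-size=33554432
gst-launch-1.0 udpsrc address=::1 port=5004 buffer-size=33554432 ! application/x-rtp,…

but as mentioned I haven’t found it fully reliable with frames this large.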
