I’m trying to send a single frame from the camera over the network using the following pipelines:
To send:
gst-launch-1.0 nvcamerasrc num-buffers=1 ! "video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420" ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000 -e
To receive:
gst-launch-1.0 -v udpsrc port=5000 num-buffers=1 ! application/x-rtp,payload=26,encoding-name=JPEG ! rtpjpegdepay ! filesink location=test_jpeg.jpg sync=false async=false -e
but I’m getting an empty test_jpeg.jpg. What am I doing wrong? Verbose output on the receiver side:
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "application/x-rtp\,\ payload\=\(int\)26\,\ encoding-name\=\(string\)JPEG\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000"
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = "application/x-rtp\,\ payload\=\(int\)26\,\ encoding-name\=\(string\)JPEG\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000"
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = "image/jpeg\,\ framerate\=\(fraction\)0/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080"
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = "image/jpeg\,\ framerate\=\(fraction\)0/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080"
Hi,
If you run your sender pipeline as follows, you will see the actual number of buffers sent; in my case it reports 56:
GST_DEBUG=1,*udp*:7 gst-launch-1.0 nvcamerasrc num-buffers=1 ! "video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420" ! perf ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000 -e -v
0:00:01.156834869 5541 0x5ec0a0 LOG multiudpsink gstmultiudpsink.c:791:gst_multiudpsink_render_buffers:<udpsink0> 56 buffers, 168 memories -> to be sent to 1 clients
Changing the num-buffers property to 56 on the receiver pipeline works well for me. The number of buffers depends on the maximum packet size (mtu) property of the rtpjpegpay element; you can increase it to reduce the number of buffers (see the sketch below).
The receiver pipeline will also work correctly if you leave the num-buffers property at its default value of -1, which is better, since I’ve seen that the number of buffers can vary a little.
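For example, a rough sketch of raising the MTU on the sender (rtpjpegpay inherits the mtu property from its base payloader; whether datagrams this large get through depends on your network, though over loopback they should be fine):
gst-launch-1.0 nvcamerasrc num-buffers=1 ! "video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420" ! nvjpegenc ! rtpjpegpay mtu=60000 ! udpsink host=127.0.0.1 port=5000 -e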
Hi. Thanks for the reply, but I still can’t get it to work.
To send I now use:
GST_DEBUG=1,*udp*:7 gst-launch-1.0 nvcamerasrc num-buffers=1 ! "video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420" ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000 -e -v
As you said, I get the following debug output:
0:00:00.635740071 6343 0x5bc140 LOG multiudpsink gstmultiudpsink.c:791:gst_multiudpsink_render_buffers:<udpsink0> 264 buffers, 792 memories -> to be sent to 1 clients
To receive:
gst-launch-1.0 -v udpsrc port=5000 ! application/x-rtp,payload=26,encoding-name=JPEG ! rtpjpegdepay ! filesink location=test_jpeg.jpg sync=false async=false -e
But I still get an empty file. Output on the receiver side:
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "application/x-rtp\,\ payload\=\(int\)26\,\ encoding-name\=\(string\)JPEG\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000"
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = "application/x-rtp\,\ payload\=\(int\)26\,\ encoding-name\=\(string\)JPEG\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000"
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = "image/jpeg\,\ framerate\=\(fraction\)0/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080"
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = "image/jpeg\,\ framerate\=\(fraction\)0/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080"
Hi. Thanks for your reply, but it also doesn’t help.
To send I use:
gst-launch-1.0 -v nvcamerasrc num-buffers=1 ! "video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420" ! nvjpegenc ! rtpjpegpay ! tcpclientsink host=127.0.0.1 port=5000
To receive:
gst-launch-1.0 -v tcpserversrc port=5000 ! application/x-rtp,payload=26,encoding-name=JPEG ! rtpjpegdepay ! filesink location=test_jpeg.jpg -e
and I get the following errors on the receiver side:
(gst-launch-1.0:6802): GStreamer-CRITICAL **: gst_segment_to_running_time: assertion 'segment->format == format' failed
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = "image/jpeg\,\ framerate\=\(fraction\)0/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080"
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = "image/jpeg\,\ framerate\=\(fraction\)0/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080"
WARNING: from element /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0: Could not decode stream.
Additional debug info:
gstrtpbasedepayload.c(492): gst_rtp_base_depayload_handle_buffer (): /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0:
Received invalid RTP payload, dropping
Hi,
You may go to http://gstreamer-devel.966125.n4.nabble.com/
Except for nvjpegenc, the other elements are 3rd-party plugins. Users on the GStreamer forum will have more experience and can help you out. You could get working pipelines first and then replace jpegenc with nvjpegenc.
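For example, a minimal all-software sketch to validate the RTP/UDP path first (this assumes the stock videotestsrc and jpegenc elements; once it works, swap in nvcamerasrc and nvjpegenc):
gst-launch-1.0 videotestsrc num-buffers=1 ! video/x-raw,width=1920,height=1080 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000 -e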
Your use case is not clear to me, but this might help.
You can stream your camera in JPEG format over UDP with:
# Capture server
gst-launch-1.0 -v nvcamerasrc ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! nvjpegenc ! rtpjpegpay ! udpsink host=localhost port=5000
Now on the receiver side, you first have to check whether the kernel socket receive buffer is large enough for your resolution. For this case, we’ll use 1 MB buffers. The typical default is 224 KB, so you’ll probably have to increase it.
# Get kernel socket reception buffer max size
cat /proc/sys/net/core/rmem_max
# Increase to 1MB if current value is lower
sudo sysctl -w net.core.rmem_max=1048576
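If you want the larger buffer to persist across reboots, you can also write it to /etc/sysctl.conf (standard sysctl mechanism):
# Make the buffer size permanent (optional)
echo 'net.core.rmem_max=1048576' | sudo tee -a /etc/sysctl.conf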
The receiver should now be able to view the remote camera with:
# Client live view
gst-launch-1.0 -v udpsrc port=5000 buffer-size=1048576 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! xvimagesink
To save frames instead, you can use multifilesink and slow down the framerate. Here it runs at 1 fps and saves to a single file that is rewritten each second:
# Client saving to same file each second
gst-launch-1.0 -ev udpsrc port=5000 buffer-size=1048576 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! videorate ! image/jpeg,framerate=1/1 ! multifilesink location=test.jpg max-file-duration=1
multifilesink has many options that can help meet your specific use case.
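For example, here is a rough, untested sketch that writes numbered files and keeps only the most recent ten, using multifilesink’s location pattern and its max-files property:
# Client saving numbered files, keeping the last 10
gst-launch-1.0 -ev udpsrc port=5000 buffer-size=1048576 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! videorate ! image/jpeg,framerate=1/1 ! multifilesink location=frame_%05d.jpg max-files=10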
I’m trying to use the hardware decoder to decode an MJPEG stream instead of using OpenCV.
For example:
cv::Mat mat_sm = mjpeg2rgb(frame_buffer);  // pseudocode: decode an MJPEG frame buffer to an RGB cv::Mat
How do I code this using GStreamer?
Would you please provide a sample?
Hi wangxin112,
Please install the tegra_multimedia_api samples and refer to 12_camera_v4l2_cuda.
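If a gst-launch prototype is a useful starting point, here is a rough sketch of hardware MJPEG decode. This assumes the nvjpegdec element shipped with L4T and a V4L2 camera that outputs MJPEG; element names and caps may differ on your release:
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! nvjpegdec ! video/x-raw ! xvimagesink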