I am using a C++ application with OpenCV on a Jetson Nano.
I want to stream live video from a camera with GStreamer, sending the stream over UDP to a Windows PC.
This is my GStreamer pipeline on the Jetson Nano:
appsrc ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv ! video/x-raw(memory:NVMM),format=(string)NV12,width=1920,height=1080 ! nvv4l2h264enc bitrate=8000000 maxperf-enable=true insert-sps-pps=true ! rtph264pay pt=96 config-interval=4 ! udpsink host=192.168.178.203 port=8080
The application writes cv::Mat frames into this pipeline.
On the PC, gst-launch-1.0 is used with this pipeline:
udpsrc port=8080 ! application/x-rtp, payload=96
! rtph264depay ! h264parse ! avdec_h264
! autovideosink sync=false
Everything works well if the frame (a cv::Mat) written to the pipeline has a resolution of 1280x720 or smaller.
When I feed 1920x1080 frames into the pipeline, the receiver cannot display the video correctly: the pixel rows are somehow displaced and mixed up.
Strangely enough, everything works fine when I stream directly from the camera on the Jetson:
gst-launch-1.0 -v tcamsrc ! tcamdutils ! video/x-raw,format=BGRx,width=1280,height=720,framerate=30/1 ! nvvidconv ! "video/x-raw(memory:NVMM),format=NV12" ! nvv4l2h264enc bitrate=8000000 maxperf-enable=1 insert-sps-pps=true ! rtph264pay pt=96 config-interval=4 ! udpsink host=192.168.178.203 port=8080
I cannot see where the difference could be.
I hope someone can help; thank you in advance.