Hi,
I have an OpenCV program that initializes 3 GStreamer pipelines, grabs the video streams, and sends them to a server on a Windows host PC using imagezmq. All 3 streams reach the host, but displaying all 3 at once introduces a lot of latency; with only two it runs almost in real time. I want to try encoding the streams so that less data is sent, then decoding on the host PC's end, which will hopefully reduce the latency.
My issue is that while researching how to do this, I found that the Orin Nano only supports software encoding, so I am trying to build my pipelines with x264enc but cannot get it working.
I am using a Jetson Orin Nano with JetPack 6.0. My current pipelines look like this:
pipeline1 = (
"v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=360,framerate=30/1 ! "
"videoconvert ! queue max-size-buffers=2 ! appsink drop=1"
)
pipeline2 = (
"v4l2src device=/dev/video2 ! video/x-raw,format=YUY2,width=640,height=360,framerate=30/1 ! "
"videoconvert ! queue max-size-buffers=2 ! appsink drop=1"
)
pipeline3 = (
"v4l2src device=/dev/video4 ! video/x-raw,format=YUY2,width=640,height=360,framerate=30/1 ! "
"videoconvert ! queue max-size-buffers=2 ! appsink drop=1"
)
How can I get encoding working in these pipelines? And is encoding the best solution for my latency problem, or is there a better approach I should look into? Thanks in advance for any help!