Hello, I am trying to use the nvcompositor GStreamer plugin to create a 4K multiview/quad-split from 4x 1080p feeds coming from the ISP via nvcamerasrc, and compress the result to JPEG. I am running L4T release 28.2 on a TX2 dev kit with 4 cameras connected over CSI.
Here is my prototype pipeline:
gst-launch-1.0 -v -e \
  nvcompositor name=comp \
    sink_0::xpos=0    sink_0::ypos=0 \
    sink_1::xpos=1920 sink_1::ypos=0 \
    sink_2::xpos=0    sink_2::ypos=1080 \
    sink_3::xpos=1920 sink_3::ypos=1080 ! nvjpegenc ! filesink location=test.jpg \
  nvcamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=(fraction)60/1' ! comp.sink_0 \
  nvcamerasrc sensor-id=1 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=(fraction)60/1' ! comp.sink_1 \
  nvcamerasrc sensor-id=2 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=(fraction)60/1' ! comp.sink_2 \
  nvcamerasrc sensor-id=3 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=(fraction)60/1' ! comp.sink_3
It fails with "WARNING: erroneous pipeline: could not link comp to nvjpegenc0". The debug output shows several attempts to link the two elements before it eventually gives up.
I can successfully connect nvcamerasrc to nvjpegenc (staying in NVMM space), so I am at a loss as to why I can’t put the nvcompositor in between.
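For reference, the working single-camera case is roughly the following (the exact caps and output filename here are illustrative, not copied verbatim from my setup):

gst-launch-1.0 -v -e \
  nvcamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=(fraction)60/1' ! \
  nvjpegenc ! filesink location=single.jpg

That pipeline produces a valid JPEG, so the nvcamerasrc -> nvjpegenc path in NVMM memory is fine on its own; the link only fails once nvcompositor is inserted between them.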
Any insight would be appreciated.