Debayering raw data from USB camera

I’m trying to get a high-res stream from a USB camera (The Imaging Source) but am running into an issue: for a 3072x2048 stream, CPU usage sits at 100% due to debayering, and the frame rate drops to 9 fps as a result. In the pipeline below, tcambin is the element performing the debayering:

gst-launch-1.0 -v tcambin ! video/x-raw, format=BGRx,width=3072,height=2048,framerate=30/1 ! queue ! videoconvert ! fpsdisplaysink video-sink=fakesink text-overlay=false

Is there any type of hardware-accelerated debayering that can be performed on the TX2, or is this as good as it gets?

Pieces of the puzzle:
The ISP can do de-bayering, but I don’t know if there’s a good path in from USB (as opposed to from CSI).
You can also use CUDA for de-bayering; I don’t know how much latency this would add. (A minimal kernel sketch follows after this list.)
Finally, a basic no-frills de-bayering algorithm should be able to run on one CPU core at 30 Hz. Is tcambin running a fancier filter, or doing things other than de-bayering? Or is there a lot of GStreamer overhead?
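In case it helps, here is a minimal CUDA sketch of such a no-frills de-bayer. It assumes an 8-bit RGGB Bayer layout and simply resolves each 2x2 cell to a single BGRx colour (nearest-neighbour quality); the pattern, bit depth and kernel name are assumptions for illustration, not what tcambin actually does.

// debayer_rggb.cu -- minimal nearest-neighbour de-bayer sketch (not what tcambin does).
// Assumes 8-bit RGGB Bayer input; the camera's actual pattern/bit depth may differ.
#include <stdint.h>
#include <cuda_runtime.h>

__global__ void debayer_rggb_to_bgrx(const uint8_t* __restrict__ bayer,
                                     uint8_t* __restrict__ bgrx,
                                     int width, int height)
{
    // One thread handles one 2x2 Bayer cell (R G / G B).
    int cx = (blockIdx.x * blockDim.x + threadIdx.x) * 2;
    int cy = (blockIdx.y * blockDim.y + threadIdx.y) * 2;
    if (cx + 1 >= width || cy + 1 >= height)
        return;

    uint8_t r  = bayer[cy * width + cx];
    uint8_t g0 = bayer[cy * width + cx + 1];
    uint8_t g1 = bayer[(cy + 1) * width + cx];
    uint8_t b  = bayer[(cy + 1) * width + cx + 1];
    uint8_t g  = (uint8_t)(((int)g0 + (int)g1) / 2);

    // Write the same colour to all four pixels of the cell (BGRx byte order).
    for (int dy = 0; dy < 2; ++dy)
        for (int dx = 0; dx < 2; ++dx) {
            uint8_t* out = &bgrx[((cy + dy) * width + (cx + dx)) * 4];
            out[0] = b;
            out[1] = g;
            out[2] = r;
            out[3] = 0xFF;  // padding byte of BGRx
        }
}

One thread per 2x2 cell keeps the indexing trivial; a bilinear or edge-aware interpolation would look better at a modest extra cost.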

videoconvert may also use a lot of CPU time at this resolution. Have you tried boosting the Jetson?

sudo nvpmodel -m 0
sudo /home/nvidia/jetson_clocks.sh

Hi,
We suggest you implement de-bayering on CUDA.

The HW ISP only takes CSI inputs.
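Going the CUDA route, a host-side launcher for the kernel sketched earlier could look roughly like this. It assumes the kernel is defined in the same .cu file and leaves out the capture side (V4L2 or a GStreamer appsink feeding d_bayer), so the buffer names are hypothetical.

#include <stdint.h>
#include <stdio.h>
#include <cuda_runtime.h>

// Kernel from the earlier sketch, assumed to be defined in this same file.
__global__ void debayer_rggb_to_bgrx(const uint8_t*, uint8_t*, int, int);

int main()
{
    const int width = 3072, height = 2048;
    uint8_t *d_bayer = nullptr, *d_bgrx = nullptr;
    cudaMalloc(&d_bayer, width * height);          // 8-bit Bayer input
    cudaMalloc(&d_bgrx,  width * height * 4);      // BGRx output

    // ... copy one captured Bayer frame into d_bayer with cudaMemcpy here ...

    dim3 block(16, 16);                            // one thread per 2x2 cell
    dim3 grid((width / 2 + block.x - 1) / block.x,
              (height / 2 + block.y - 1) / block.y);
    debayer_rggb_to_bgrx<<<grid, block>>>(d_bayer, d_bgrx, width, height);
    cudaDeviceSynchronize();
    printf("debayer: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaFree(d_bayer);
    cudaFree(d_bgrx);
    return 0;
}

The remaining work is wiring this into the GStreamer pipeline (for example via appsink/appsrc) and keeping the host-to-device copies from becoming the new bottleneck.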