Live fisheye correction preview stream via Argus

Hello!

I have made an Argus program with a Raspberry Pi HQ camera, but I am having a problem running a corrected fisheye-lens preview stream. At first I found a VPI tutorial, but it only works on the Xavier platform.

Any suggestions on how to do this in Argus? Preferably CUDA-accelerated. I have been thinking of OpenCV's fisheye correction, but then how can I stream the OpenCV data into a PreviewStream in Argus?
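
To make the idea concrete, this is roughly the correction step I was considering on the OpenCV side (just a sketch: the camera matrix and distortion coefficients are placeholders, and it assumes OpenCV was built with CUDA support):

```cpp
// Sketch of the CUDA-accelerated fisheye correction step.
// Requires OpenCV built with the cudawarping module; K and D below are
// placeholder values, not a real calibration.
#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>
#include <opencv2/cudawarping.hpp>

int main()
{
    cv::Size size(1920, 1080);

    // Placeholder fisheye intrinsics; replace with results from cv::fisheye::calibrate().
    cv::Mat K = (cv::Mat_<double>(3, 3) << 900, 0, 960,
                                           0, 900, 540,
                                           0,   0,   1);
    cv::Mat D = (cv::Mat_<double>(4, 1) << -0.05, 0.01, 0.0, 0.0);

    // Build the undistortion maps once on the CPU...
    cv::Mat mapX, mapY;
    cv::fisheye::initUndistortRectifyMap(K, D, cv::Mat::eye(3, 3, CV_64F), K,
                                         size, CV_32FC1, mapX, mapY);

    // ...then upload them so each frame can be remapped on the GPU.
    cv::cuda::GpuMat gpuMapX(mapX), gpuMapY(mapY);

    // Per frame: upload, remap on the GPU, download (the frame would come from Argus).
    cv::Mat frame(size, CV_8UC3, cv::Scalar::all(0)); // dummy frame for this sketch
    cv::cuda::GpuMat gpuFrame(frame), gpuCorrected;
    cv::cuda::remap(gpuFrame, gpuCorrected, gpuMapX, gpuMapY, cv::INTER_LINEAR);

    cv::Mat corrected;
    gpuCorrected.download(corrected);
    return 0;
}
```

The part I am missing is how to feed the corrected frames back into an Argus preview stream.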

Thanks!

hello therealmatiss,

could you confirm which JetPack release you're working with?
thanks

Hello! Sorry for the late reply. I still need this - I have JetPack 4.

hello therealmatiss,

may I know the FOV of your lens?
there's hardware acceleration for this on Xavier,
please also check the DeepStream documentation; you may refer to the Gst-nvdewarper plugin to de-warp the camera input.
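
as an illustration only, a DeepStream preview pipeline around Gst-nvdewarper could look roughly like the sketch below; it is adapted from the dewarper example in the DeepStream documentation, and the exact caps, element properties, and config-file contents depend on your DeepStream version, so please verify them with gst-inspect-1.0 nvdewarper and the dewarper sample config.

```cpp
// Rough illustration of a preview pipeline using Gst-nvdewarper.
// The caps, streammux settings, and config_dewarper.txt contents are
// assumptions based on the DeepStream dewarper sample; adjust them to
// match your dewarper config (output size, number of surfaces, etc.).
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *error = nullptr;
    GstElement *pipeline = gst_parse_launch(
        // Argus capture, converted to RGBA NVMM buffers for the dewarper.
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1 ! "
        "nvvideoconvert ! video/x-raw(memory:NVMM), format=RGBA ! "
        // Dewarper projection parameters (e.g. fisheye FOV) come from the config file.
        "nvdewarper config-file=config_dewarper.txt source-id=0 ! "
        // streammux/tiler sizes and surface counts must match the dewarper config.
        "m.sink_0 nvstreammux name=m width=960 height=752 batch-size=4 num-surfaces-per-frame=4 ! "
        "nvmultistreamtiler width=960 height=752 ! nvegltransform ! nveglglessink",
        &error);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", error ? error->message : "unknown");
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Run until an error or end-of-stream.
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg) gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```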

you may need some color conversion, since OpenCV works with the RGB color space but Argus outputs YUV;
you may also check the Tutorials page for using OpenCV on Jetson platforms.
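
as a rough sketch of that conversion step (the NV12 buffer below is a placeholder for a frame mapped from the capture side):

```cpp
// Sketch of the color conversion: Argus delivers YUV (typically NV12),
// while most OpenCV routines expect BGR/RGB. The buffer here is a dummy
// stand-in for a mapped NV12 frame from the capture pipeline.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

int main()
{
    const int width = 1920, height = 1080;

    // NV12 layout: full-resolution Y plane followed by an interleaved,
    // half-resolution UV plane, i.e. height * 3 / 2 rows of `width` bytes.
    std::vector<unsigned char> yuvData(width * height * 3 / 2, 0); // dummy buffer

    cv::Mat yuv(height * 3 / 2, width, CV_8UC1, yuvData.data());

    // Convert to BGR so the frame can be processed with OpenCV (e.g. fisheye remap).
    cv::Mat bgr;
    cv::cvtColor(yuv, bgr, cv::COLOR_YUV2BGR_NV12);
    return 0;
}
```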
thanks