Issues decoding RTSP stream using nvv4l2decoder with Jetpack 4.4

First, apologies to @benky58un for squatting this topic.

Only in the case of a MIPI/CSI camera accessed through the ISP with Argus. In most other cases, such as with v4l2src not going through the ISP, frames would be received in CPU-allocated memory.

Yes, an application linked against OpenCV would only receive frames into a CPU-allocated cv::Mat when using OpenCV videoio.
However, you can access NVMM buffers with the GStreamer plugin nvivafilter. It is intended to perform CUDA operations on NVMM-hosted frames, so you can use it with OpenCV CUDA. You would have to output RGBA frames from this plugin. You may have a look at this example.
Also note that you can directly access frames from the GStreamer buffer.
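For the plain CPU path through videoio, here is a sketch of the kind of pipeline string an application could hand to cv2.VideoCapture. The RTSP URI, the latency value, and the rtsp_to_bgr_pipeline helper name are illustrative assumptions, not something from this thread; the element names (rtspsrc, rtph264depay, nvv4l2decoder, nvvidconv, videoconvert, appsink) are the standard Jetson GStreamer plugins:

```python
# Hypothetical helper building a GStreamer pipeline string for OpenCV videoio.
# URI and latency are placeholders; adjust for your stream.
def rtsp_to_bgr_pipeline(uri, latency_ms=200):
    """Decode an RTSP/H264 stream into CPU BGR frames for cv::Mat."""
    return (
        f"rtspsrc location={uri} latency={latency_ms} ! "
        "rtph264depay ! h264parse ! nvv4l2decoder ! "  # HW decode into NVMM NV12
        "nvvidconv ! video/x-raw,format=BGRx ! "       # NVMM -> CPU memory, NV12 -> BGRx
        "videoconvert ! video/x-raw,format=BGR ! "     # BGRx -> BGR for OpenCV
        "appsink drop=1"
    )

pipeline = rtsp_to_bgr_pipeline("rtsp://127.0.0.1:8554/test")
print(pipeline)
# An OpenCV application would then open it with:
#   cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
```

Note that videoconvert runs on CPU, which is why BGRx is requested from nvvidconv first: nvvidconv cannot output 3-channel BGR, so the final BGRx-to-BGR step is left to videoconvert.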

There may be (from tail to head) a BGR buffer, a BGRx buffer, an NVMM NV12 buffer, and some H264 encoded/depayed/RTP buffers. For 4K resolution, the worst case (BGRx) would only be about 33 MB per frame, so a few copies should be affordable (I have no Nano, but I think that even with Linux and Ubuntu you would have more than 3.5 GB available for your application).
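As a quick back-of-the-envelope check of those per-frame sizes at 4K (3840x2160), ignoring any padding or stride overhead an actual allocation may add:

```python
# Rough per-frame buffer sizes for the formats in the chain above, at 4K.
w, h = 3840, 2160
sizes_mb = {
    "NV12 (1.5 bytes/px)": w * h * 1.5 / 1e6,
    "BGR  (3 bytes/px)":   w * h * 3 / 1e6,
    "BGRx (4 bytes/px)":   w * h * 4 / 1e6,
}
for name, mb in sizes_mb.items():
    print(f"{name}: {mb:.1f} MB")
# NV12: 12.4 MB, BGR: 24.9 MB, BGRx: 33.2 MB -> worst case matches the ~33 MB figure
```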