Puzzled by missing "Prime" tab in `nvidia-settings` app

I believe I’ve [almost] succeeded in setting up dual GPUs (iGPU/dGPU) on my Ubuntu 22.04 laptop:

lspci | egrep 'VGA|3D'
0000:00:02.0 VGA compatible controller: Intel Corporation TigerLake-H GT1 [UHD Graphics] (rev 01)
0000:01:00.0 VGA compatible controller: NVIDIA Corporation GA104M [GeForce RTX 3070 Mobile / Max-Q] (rev a1)
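As a quick sanity check (assuming an Xorg session rather than Wayland), the X server should list both GPUs as providers when PRIME offload is set up:

```shell
# List the DRI providers known to the X server; with PRIME render offload
# working, both the Intel (modesetting) and NVIDIA providers should appear.
xrandr --listproviders
```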

nvidia-smi shows no load no matter what I do on my laptop:

unless I choose the "Launch using Discrete Graphics Card" option:

(note that the opaqueness disappears in dGPU mode)
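For reference, the "Launch using Discrete Graphics Card" menu entry is equivalent to setting NVIDIA's documented render-offload environment variables, so the dGPU path can also be tested from a terminal (glxinfo comes from the mesa-utils package):

```shell
# Force a single process onto the NVIDIA dGPU via PRIME render offload.
# While it runs, nvidia-smi should list the process.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
    glxinfo | grep "OpenGL renderer"

# For Vulkan applications the analogous variable is:
#   __VK_LAYER_NV_optimus=NVIDIA_only
```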

The primary purpose was to offload video rendering to the dGPU, yet it still appears to be rendering on the iGPU:

sudo intel_gpu_top

intel-gpu-top: Intel Tigerlake (Gen12) @ /dev/dri/card0 -  116/ 119 MHz;  92% RC6;      168 irqs/s

         ENGINES     BUSY                                                                                 MI_SEMA MI_WAIT
       Render/3D    0.21% |▎                                                                            |      0%      0%
         Blitter    0.60% |▌                                                                            |      0%      0%
           Video    2.41% |█▉                                                                           |      0%      0%
    VideoEnhance    4.82% |███▊                                                                         |      0%      0%

However, I am puzzled by the missing Prime tab in NVIDIA settings:

even though I think I’ve set up nvidia-prime correctly:

prime-select query
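For completeness, on Ubuntu the nvidia-prime profile is inspected and switched with `prime-select`; a logout (or reboot) is needed after switching:

```shell
prime-select query            # prints the active profile: nvidia, intel, or on-demand
sudo prime-select on-demand   # render on the iGPU, offload individual apps to the dGPU
sudo prime-select nvidia      # use the dGPU for everything
```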

My files:
xorg.conf (575 Bytes)
nvidia-bug-report.log (244.4 KB)

I’d really appreciate it if somebody could guide me on setting NVIDIA up correctly, so that the full information about my GPU setup is accessible (and manageable) via nvidia-settings.

(secure boot is disabled)

The “PRIME” tab is not standard; it only exists in Ubuntu’s patched version of nvidia-settings. So if you got your driver elsewhere, e.g. from a PPA, it’s not there.
Video decoding on the NVIDIA GPU through VA-API/VDPAU in offload mode is, AFAIK, not possible due to limitations in Xorg. Using a video application which uses NVDEC/NVENC directly might be possible.
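For example, mpv can call NVDEC directly; a minimal sketch, assuming an NVIDIA driver with NVDEC support and a file named `video.mp4` (placeholder):

```shell
# Decode on the NVIDIA GPU via NVDEC instead of VA-API/VDPAU.
mpv --hwdec=nvdec video.mp4
# While it plays, nvidia-smi should show an mpv process and decoder load.
```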