Cannot acquire buffers with a 10-bit NvBuffer color format when the bit depth of the cameras is set to 10 bit in the device tree

I have the following setup:
IMX568c sensors, Jetson Orin NX, Forecr DSBOARD ORNX carrier board, JetPack 5.1.2 (L4T 35.4.1); the camera images are acquired with libargus.
The device tree is customized; I attached it here (had to change ‘*.dtsi’ to ‘*.txt’ to upload).
tegra234-camera-vc-mipi-cam.txt (32.0 KB)
When everything in the device tree file is set to 10 bit (pixel clock frequency, csi_pixel_bit_depth, min_bits_per_pixel), flashing completes without errors, but I get a timeout when trying to acquire buffers with the NvBufSurfaceColorFormat set to the 10-bit format (NVBUF_COLOR_FORMAT_NV24_10LE). When this timeout happens, the following error appears in the dmesg output:

NVRM gpumgrGetSomeGpu: Failed to retrieve pGpu - Too early call!.
NVRM nvAssertFailedNoLog: Assertion failed: NV_FALSE @ gpu_mgr.c:296

But when I set the color format to the 8-bit one (NVBUF_COLOR_FORMAT_NV24), buffers can be acquired.
I tried changing the pixel clock frequency, since it depends on the bit depth and had previously been set to the value for 8-bit images, but the error persists.
Ideally I want 10-bit images from the cameras, as specified when flashing.
Am I missing anything in the device tree file, or somewhere else, that might prevent me from acquiring 10-bit buffers?
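
For reference, the 10-bit related settings in my device tree look roughly like this (a trimmed sketch in the usual Tegra camera device tree style, not the exact contents of the attached file; values are illustrative):

    mode0 {
            /* ... other mode properties unchanged ... */
            csi_pixel_bit_depth = "10";       /* was "8" */
            dynamic_pixel_bit_depth = "10";
            pix_clk_hz = "250000000";         /* illustrative value for the 10-bit mode */
    };

    tegra-camera-platform {
            /* ... */
            min_bits_per_pixel = "10";
    };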

Argus only supports PIXEL_FMT_YCbCr_420_888/PIXEL_FMT_RAW16.
Below is the pipeline:

Sensor output(raw10/12/14) → argus → PIXEL_FMT_YCbCr_420_888/PIXEL_FMT_RAW16
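
For example, with an EGL output stream the pixel format is chosen on the stream settings; a minimal sketch (resolution and error handling are placeholders):

    #include <Argus/Argus.h>

    Argus::OutputStream* createRaw16Stream(Argus::ICaptureSession* iSession)
    {
        using namespace Argus;

        UniqueObj<OutputStreamSettings> settings(
            iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
        IEGLOutputStreamSettings* iSettings =
            interface_cast<IEGLOutputStreamSettings>(settings);
        if (!iSettings)
            return NULL;

        /* RAW16: un-demosaiced sensor data (raw10/12/14) in a 16-bit container.
           PIXEL_FMT_YCbCr_420_888 would request ISP-processed YUV instead. */
        iSettings->setPixelFormat(PIXEL_FMT_RAW16);
        iSettings->setResolution(Size2D<uint32_t>(2472, 2064)); /* example mode size */

        return iSession->createOutputStream(settings.get());
    }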

I should have specified: we use the BufferOutputStream interface to set up and acquire buffers. There we specify the NvBufSurfaceColorFormat as NVBUF_COLOR_FORMAT_NV24. Ideally we would like to be able to acquire in NVBUF_COLOR_FORMAT_NV24_10LE.
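
Roughly, the relevant part of our setup looks like this (trimmed sketch: one buffer, error handling and object lifetime management omitted, sizes are examples):

    #include <Argus/Argus.h>
    #include <EGL/egl.h>
    #include <nvbufsurface.h>

    bool setupBufferStream(Argus::ICaptureSession* iSession, EGLDisplay eglDisplay)
    {
        using namespace Argus;

        UniqueObj<OutputStreamSettings> settings(
            iSession->createOutputStreamSettings(STREAM_TYPE_BUFFER));
        IBufferOutputStreamSettings* iSettings =
            interface_cast<IBufferOutputStreamSettings>(settings);
        iSettings->setBufferType(BUFFER_TYPE_EGL_IMAGE);

        UniqueObj<OutputStream> stream(iSession->createOutputStream(settings.get()));
        IBufferOutputStream* iStream = interface_cast<IBufferOutputStream>(stream);

        /* Allocate the backing NvBufSurface; this is where the color format
           is selected (NV24 works, NV24_10LE leads to the acquire timeout). */
        NvBufSurfaceCreateParams params = {};
        params.gpuId = 0;
        params.width = 2472;   /* example mode size */
        params.height = 2064;
        params.colorFormat = NVBUF_COLOR_FORMAT_NV24; /* want NVBUF_COLOR_FORMAT_NV24_10LE */
        params.layout = NVBUF_LAYOUT_PITCH;
        params.memType = NVBUF_MEM_SURFACE_ARRAY;

        NvBufSurface* surf = NULL;
        if (NvBufSurfaceCreate(&surf, 1, &params) != 0)
            return false;
        if (NvBufSurfaceMapEglImage(surf, 0) != 0)
            return false;

        /* Wrap the EGLImage as an Argus Buffer and register it with the stream. */
        UniqueObj<BufferSettings> bufSettings(iStream->createBufferSettings());
        IEGLImageBufferSettings* iBufSettings =
            interface_cast<IEGLImageBufferSettings>(bufSettings);
        iBufSettings->setEGLImage(surf->surfaceList[0].mappedAddr.eglImage);
        iBufSettings->setEGLDisplay(eglDisplay);

        UniqueObj<Buffer> buffer(iStream->createBuffer(bufSettings.get()));
        return buffer.get() != NULL;
    }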

The two formats you mentioned are only available in the EGLOutputStream interface, which we do not use.

Maybe check NvBufferTransform
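
Something along these lines (on JetPack 5 the NvBufSurface version of that call is NvBufSurfTransform; whether NV24_10LE is accepted as a destination format would need to be verified):

    #include <nvbufsurface.h>
    #include <nvbufsurftransform.h>

    /* Convert an acquired surface into a separately allocated 10-bit surface.
       dst10le is assumed to have been created with NVBUF_COLOR_FORMAT_NV24_10LE
       and the same resolution as src. */
    bool convertTo10Bit(NvBufSurface* src, NvBufSurface* dst10le)
    {
        NvBufSurfTransformParams params = {};
        params.transform_flag = NVBUFSURF_TRANSFORM_FILTER;
        params.transform_filter = NvBufSurfTransformInter_Nearest;

        NvBufSurfTransform_Error err = NvBufSurfTransform(src, dst10le, &params);
        return err == NvBufSurfTransformError_Success;
    }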

Is there any way to avoid the EGLOutputStream?