Cameras used with direct V4L2 (no ISP subsystem)

Hello,

We use a MIPI camera with the TX1.
I understand from the "Tegra Linux Driver Development" guide that some cameras can be used directly with V4L2, right?

"In applications support a direct V4L2 interface, use this interface to communicate to the NVIDIA V4L2 driver
without having to use the SCF library. Use this path for a YUV sensor since this sensor has a built-in ISP and frame does not need extra processing…
Read the following sections to learn how to develop these; our examples use OmniVision OV5693 sensor, and the
source code for OV5693 sensor is available to customers.
"

But the OV5693 does not output YUV, only RAW!
So how can it be used with direct V4L2?

Another thing: can I assume that latency with the direct V4L2 approach is much better than going through the camera subsystem (ISP)?

Thanks,
ranran

The main point here is that the OV5693 is not a YUV sensor but a Bayer sensor, so it can go through the ISP for debayering (and more…).
If you want to use V4L2 with the OV5693, you would have to debayer yourself. Note that you would get 10-bit Bayer, while gstreamer only expects 8-bit Bayer. Some details here.
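To illustrate the "debayer by yourself" part, here is a rough sketch assuming the frame was dumped as in the earlier snippet: one 2592x1944 frame in frame.raw, one sample per little-endian 16-bit word with the 10 valid bits in the low bits, RGGB pattern. All of those are assumptions about the driver's output, not facts from the thread; it shifts the data down to 8 bits and does a crude 2x2 demosaic with NumPy.

```python
# Rough sketch: convert a 10-bit Bayer dump to an 8-bit RGB image.
# Assumptions: frame.raw holds one 2592x1944 frame, one sample per little-endian
# 16-bit word with the 10 valid bits in the low bits, RGGB pattern. Some Tegra
# pipelines left-justify the samples or pad each line (bytesperline > width*2);
# adjust the shift/stride if the image looks dark, clipped, or skewed.
import numpy as np

WIDTH, HEIGHT = 2592, 1944

raw = np.fromfile("frame.raw", dtype=np.uint16, count=WIDTH * HEIGHT)
raw = raw.reshape(HEIGHT, WIDTH)

# 10-bit -> 8-bit: drop the two least significant bits.
bayer8 = (raw >> 2).astype(np.uint8)

# Crude 2x2 "demosaic": each RGGB quad becomes one RGB pixel (halves resolution).
r  = bayer8[0::2, 0::2]
g1 = bayer8[0::2, 1::2]
g2 = bayer8[1::2, 0::2]
b  = bayer8[1::2, 1::2]
g  = ((g1.astype(np.uint16) + g2.astype(np.uint16)) // 2).astype(np.uint8)

rgb = np.dstack([r, g, b])  # shape (HEIGHT // 2, WIDTH // 2, 3)
```

This 2x2 binning halves the resolution; a proper demosaicing algorithm (bilinear or better) keeps full resolution, which is part of what the ISP path gives you for free.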

thanks