I have an interesting situation. I have a video stream and an FFmpeg pipeline that “work” and produce color output on CSI-0 of an Orin Nano dev kit.
The DT sets the pixel_phase to YUYV; however, I have to run with the rawvideo option and uyvy422 as the pixel_format in the FFmpeg pipeline in order to get decent output. I then also have to deinterlace and crop the image in the pipeline (see the sketch below). I'm curious why this is the case, and I need a better answer because horizontal artifacts appear in the final image whenever there is motion or a lighting change.
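For reference, the capture side of my pipeline is along these lines (the device node, resolution, frame rate, crop geometry, and output sink below are representative placeholders, not my exact values):

ffmpeg -f rawvideo -pixel_format uyvy422 -video_size 1920x1080 -framerate 30 \
    -i /dev/video0 \
    -vf "yadif,crop=1920:1072:0:4" \
    -pix_fmt yuv420p -c:v libx264 out.mp4

yadif handles the deinterlacing and crop trims the image, which are the two extra steps I mentioned above.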