How to use the IMX274 camera on Xavier smoothly?

1, LI-JXAV-MIPI-ADPT 4CAM V1.0;
2, three Leopard IMX274 cameras;
3, L4T R32.1; /dev/video0, video1, video2, and video3 exist;
4, OpenCV 3.3.1 is removed and OpenCV 3.4.0 is installed;
5, cmd: "gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv flip-method=0 ! 'video/x-raw, format=(string)I420' ! xvimagesink -e" works fine;
6, the sample code below runs, but the video is not smooth (see also the pipeline sketch after this list):

#include <stdio.h>
#include <opencv2/opencv.hpp>

using namespace cv;
using namespace std;

int main(int argc, char** argv)
{
  // Open the camera through a GStreamer pipeline; appsink delivers I420 frames.
  VideoCapture cap("nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=30/1 ! nvvidconv ! video/x-raw, format=I420 ! appsink");

  if (!cap.isOpened())
    {
      cout << "Failed to open camera." << endl;
      return -1;
    }

  for (;;)
    {
      Mat frame;
      cap >> frame;
      if (frame.empty())
        break;

      // Convert the I420 frame to BGR before displaying it.
      Mat bgr;
      cvtColor(frame, bgr, COLOR_YUV2BGR_I420);
      imshow("original", bgr);
      waitKey(1);
    }

  cap.release();
  return 0;
}

7, cmd: "v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=3 --stream-to=IMX274.raw -d /dev/video0" runs, but the only output I see is "<<<"; dmesg is attached;
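Regarding item 6: for reference, here is a sketch of the appsink pipeline form I have seen recommended for OpenCV on Jetson, where nvvidconv outputs BGRx and videoconvert produces BGR, so appsink hands OpenCV a ready-to-use frame and no per-frame cvtColor is needed. The drop/max-buffers appsink options and the explicit CAP_GSTREAMER flag are my assumptions, not something I have verified on this board:

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
  // Let nvvidconv convert NVMM NV12 to BGRx, then videoconvert to BGR so
  // appsink delivers a BGR frame directly (no cvtColor per frame).
  // drop=true / max-buffers=1 are my assumption to avoid frames queueing up.
  std::string pipeline =
      "nvarguscamerasrc sensor-id=0 ! "
      "video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=30/1 ! "
      "nvvidconv ! video/x-raw, format=BGRx ! "
      "videoconvert ! video/x-raw, format=BGR ! "
      "appsink drop=true max-buffers=1";

  // Force the GStreamer backend so OpenCV does not fall back to V4L.
  cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
  if (!cap.isOpened())
    {
      std::cout << "Failed to open camera." << std::endl;
      return -1;
    }

  cv::Mat frame;
  for (;;)
    {
      if (!cap.read(frame) || frame.empty())
        break;
      cv::imshow("original", frame);
      if (cv::waitKey(1) == 27)   // press ESC to quit
        break;
    }
  cap.release();
  return 0;
}

I have not confirmed yet whether this variant improves smoothness.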

The output video quality is bad, so I changed "format=I420" according to Accelerated_GStreamer_User_Guide.pdf, for example to "format=BGRA", and then it reports:

nvidia@nvidia-desktop:~/Downloads$ ./simple_opencv 
VIDEOIO ERROR: V4L: device nvarguscamerasrc sensor-id=1 ! video/x-raw(memory:NVMM), width=1280, height=720,format=NV12, framerate=30/1 ! nvvidconv ! video/x-raw,format=BGRA ! appsink: Unable to query number of channels

(Argus) Error Timeout:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function openSocketConnection(), line 215)
(Argus) Error Timeout: Cannot create camera provider (in src/rpc/socket/client/SocketClientDispatch.cpp, function createCameraProvider(), line 102)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:515 Failed to create CameraProvider
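If I understand correctly, the "VIDEOIO ERROR: V4L: device nvarguscamerasrc ..." line means the whole pipeline string was handed to the V4L backend, i.e. the GStreamer backend was not used for this string. The Argus timeout afterwards may be a separate issue; I have read that restarting the nvargus-daemon service can clear it, but I have not confirmed this. Below is a small check sketch, assuming the OpenCV 3.4 VideoCapture constructor that takes an apiPreference; the BGRx/videoconvert caps are also my assumption:

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
  // Print the build configuration to confirm GStreamer support is compiled in.
  std::cout << cv::getBuildInformation() << std::endl;

  // Explicitly request the GStreamer backend when opening the pipeline,
  // so OpenCV cannot silently fall back to V4L.
  cv::VideoCapture cap(
      "nvarguscamerasrc sensor-id=0 ! "
      "video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=30/1 ! "
      "nvvidconv ! video/x-raw, format=BGRx ! "
      "videoconvert ! video/x-raw, format=BGR ! appsink",
      cv::CAP_GSTREAMER);
  std::cout << "isOpened: " << cap.isOpened() << std::endl;
  return 0;
}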

I tried to adapt the code of "argus_camera" into OpenCV's capture, because the output video is perfect with "argus_camera --device=0", but it is a big task. I also referred to Installing on Linux, but it makes me dizzy. Could experts help me? Thanks in advance.
dmesg.txt (81.6 KB)

Could you try modifying the 1280x720 to 1920x1080 to check whether it gets smoother?

VideoCapture cap("nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=30/1 ! nvvidconv ! video/x-raw, format=I420 ! appsink");

Thanks for the reply. It gets worse after the modification.

How about removing "format=NV12" so that it is the same as your #5 experiment?

VideoCapture cap("nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1 ! nvvidconv ! video/x-raw, format=I420 ! appsink");

Thanks. There is almost no improvement. I think OpenCV's imshow mechanism needs to be optimized. Argus's render mechanism is good, but handling the picture format there is harder than in OpenCV, so I am trying to display the OpenCV frame through the Argus render path. I hope NVIDIA will consider this issue.
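To narrow down whether the capture side or the imshow/display side is the slow part, I am timing both separately with a rough sketch like the one below (the pipeline string and the BGR caps are my assumption; the numbers are only indicative):

#include <opencv2/opencv.hpp>
#include <iostream>

// Rough per-frame timing of capture vs. display, to see which side falls below 30 fps.
int main()
{
  cv::VideoCapture cap(
      "nvarguscamerasrc sensor-id=0 ! "
      "video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1 ! "
      "nvvidconv ! video/x-raw, format=BGRx ! "
      "videoconvert ! video/x-raw, format=BGR ! appsink",
      cv::CAP_GSTREAMER);
  if (!cap.isOpened())
    return -1;

  cv::Mat frame;
  for (int i = 0; i < 300; ++i)
    {
      double t0 = (double)cv::getTickCount();
      cap >> frame;                       // capture time
      double t1 = (double)cv::getTickCount();
      if (frame.empty())
        break;
      cv::imshow("original", frame);
      cv::waitKey(1);                     // display time
      double t2 = (double)cv::getTickCount();

      double freq = cv::getTickFrequency();
      std::cout << "capture " << (t1 - t0) / freq * 1000.0 << " ms, "
                << "display " << (t2 - t1) / freq * 1000.0 << " ms" << std::endl;
    }
  return 0;
}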