What is the default output format of the Jetson board camera?

Hi,
I’d like to know the default output format of the Jetson board camera.

Hello chay1991,

It depends on your use case: will you use the NVIDIA ISP or not?
I suggest you refer to the Camera Software Development Solution documentation and check the [Camera Architecture Stack].
Thanks

Hi,
I get the data from the Jetson camera using the following code:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>

// Build a GStreamer pipeline: capture I420 frames from nvcamerasrc and convert to BGR for OpenCV
std::string get_tegra_pipeline(int width, int height, int fps) {
    return "nvcamerasrc ! video/x-raw(memory:NVMM), width=(int)" + std::to_string(width) + ", height=(int)" +
            std::to_string(height) + ", format=(string)I420, framerate=(fraction)" + std::to_string(fps) +
            "/1 ! nvvidconv flip-method=2 ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink";
}

int main() {
    // Options
    int WIDTH = 1920;
    int HEIGHT = 1080;
    int FPS = 30;

    // Define the GStreamer pipeline
    std::string pipeline = get_tegra_pipeline(WIDTH, HEIGHT, FPS);
    std::cout << "Using pipeline: \n\t" << pipeline << "\n";

    // Create the OpenCV capture object and make sure it opened
    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
    if (!cap.isOpened()) {
        std::cout << "Connection failed" << std::endl;
        return -1;
    }

    // View video
    cv::Mat frame;
    while (true) {
        cap >> frame;  // Grab a new frame from the camera
        if (frame.empty()) break;

        cv::imshow("Display window", frame);
        if (cv::waitKey(1) == 27) break;  // waitKey refreshes the window; exit on ESC
    }
    return 0;
}

Does this mean the output format is I420?

You may try nvarguscamerasrc instead of nvcamerasrc and NV12 format instead of I420.

Sorry,
I did not understand what you mean exactly.
I just want to know the default output format of the camera, and how to get the data in my program.

Sorry your case is a bit unclear to me…

Xavier has no onboard camera by default. You can add either a CSI camera or a USB camera.

For example, if I use the camera module from the TX1/TX2 devkit, this CSI camera module has an OV5693 sensor. This sensor sends frames in Bayer format. It would show up as /dev/video0. Details can be obtained with the following (you would need the v4l-utils package installed):

v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'BG10'
	Name        : 10-bit Bayer BGBG/GRGR
		Size: Discrete 2592x1944
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 2592x1458
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.008s (120.000 fps)
		Size: Discrete 640x480
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 2592x1944
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 2592x1944
			Interval: Discrete 0.033s (30.000 fps)
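
As an aside, if you want to use those modes programmatically, the listing above can be parsed. A minimal Python sketch (the `parse_formats` helper is hypothetical, and it assumes output laid out exactly as `v4l2-ctl` prints it above):

```python
import re

def parse_formats(text):
    """Extract (width, height, fps) tuples from v4l2-ctl --list-formats-ext output."""
    modes = []
    size = None
    for line in text.splitlines():
        m = re.search(r"Size: Discrete (\d+)x(\d+)", line)
        if m:
            size = (int(m.group(1)), int(m.group(2)))
            continue
        m = re.search(r"\(([\d.]+) fps\)", line)
        if m and size:
            modes.append((size[0], size[1], float(m.group(1))))
            size = None
    return modes

sample = """\
	Size: Discrete 1280x720
		Interval: Discrete 0.008s (120.000 fps)
	Size: Discrete 640x480
		Interval: Discrete 0.033s (30.000 fps)
"""
print(parse_formats(sample))  # -> [(1280, 720, 120.0), (640, 480, 30.0)]
```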

You can see that the format available from V4L is 10-bit Bayer (BG10). This means you would have to debayer the frames to convert them into the BGR format expected by your OpenCV code (imshow).
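
To illustrate what debayering means, here is a toy Python sketch (the helper names are made up for illustration; it does simple averaging on a BGGR mosaic, far cruder than what the ISP actually does):

```python
def debayer_bggr_2x2(tile):
    """Collapse one 2x2 BGGR tile [[B, G], [G, R]] into a single (b, g, r) pixel.

    Toy example: real debayering interpolates across neighbouring tiles,
    handles edges, and keeps full resolution.
    """
    b = tile[0][0]
    g = (tile[0][1] + tile[1][0]) / 2.0  # average of the two green sites
    r = tile[1][1]
    return (b, g, r)

def debayer_image(raw):
    """Debayer a full BGGR mosaic (list of rows) at half resolution."""
    h, w = len(raw), len(raw[0])
    return [[debayer_bggr_2x2([raw[y][x:x + 2], raw[y + 1][x:x + 2]])
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# One 2x2 mosaic tile: B=10, G=20/30, R=40 -> one pixel (10, 25.0, 40)
print(debayer_image([[10, 20], [30, 40]]))  # -> [[(10, 25.0, 40)]]
```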

GStreamer can be used for debayering 8-bit Bayer frames, but it does not support higher bit depths.

An alternative (and much better) solution is to bypass this and send the frames to the ISP, which will debayer them while adding no load to your CPUs. On L4T versions up to R28.2 on TX1/TX2, there is a GStreamer plugin, nvcamerasrc, that does this and outputs I420 frames by default (some other formats, such as NV12, may also be used).

In your OpenCV code, you use a GStreamer pipeline starting with nvcamerasrc to get I420 frames and send them into the nvvidconv plugin, which converts them into BGRx and copies them to CPU memory; the videoconvert plugin then converts them into the BGR format that is delivered to your OpenCV application.
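
That pipeline can also be built as a parameterised string, so swapping the source plugin or format is a one-argument change. A minimal Python sketch (the `tegra_pipeline` helper is hypothetical; plugin and format names are the ones used in this thread):

```python
def tegra_pipeline(width, height, fps, src="nvcamerasrc", fmt="I420", flip=2):
    """Build the GStreamer pipeline string used in the OpenCV code above."""
    return (
        f"{src} ! video/x-raw(memory:NVMM), width=(int){width}, "
        f"height=(int){height}, format=(string){fmt}, "
        f"framerate=(fraction){fps}/1 ! "
        f"nvvidconv flip-method={flip} ! video/x-raw, format=(string)BGRx ! "
        f"videoconvert ! video/x-raw, format=(string)BGR ! appsink"
    )

# Default pipeline for TX1/TX2 (nvcamerasrc, I420)
print(tegra_pipeline(1920, 1080, 30))
# Xavier/R31 variant: just change the source plugin and format
print(tegra_pipeline(1920, 1080, 30, src="nvarguscamerasrc", fmt="NV12"))
```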

On Xavier with R31, you would use another plugin, nvarguscamerasrc, instead of nvcamerasrc. Its default output format is NV12, so you would change your pipeline to:

std::string get_tegra_pipeline(int width, int height, int fps) {
    return "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)" + std::to_string(width) + ", height=(int)" +
            std::to_string(height) + ", format=(string)NV12, framerate=(fraction)" + std::to_string(fps) +
            "/1 ! nvvidconv flip-method=2 ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink";
}

If you are using another camera, it may be different (see this). If it is a CSI camera and all drivers/firmware/configs are installed, you may try to see what nvarguscamerasrc outputs:

gst-launch-1.0 -v nvarguscamerasrc ! fakesink
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
...
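
The negotiated format shows up in the caps line above. A small Python sketch (the `parse_caps` helper is hypothetical and assumes the simple caps string layout shown) to pull the fields out:

```python
import re

def parse_caps(caps):
    """Turn a simple GStreamer caps string into a dict of field -> value."""
    fields = {}
    # skip the media type ("video/x-raw(memory:NVMM)") before the first comma
    for part in caps.split(", ")[1:]:
        name, value = part.split("=", 1)
        # strip the type annotation, e.g. "(string)NV12" -> "NV12"
        value = re.sub(r"^\((?:int|string|fraction)\)", "", value)
        fields[name] = value
    return fields

caps = ("video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, "
        "format=(string)NV12, framerate=(fraction)30/1")
print(parse_caps(caps))
# -> {'width': '1920', 'height': '1080', 'format': 'NV12', 'framerate': '30/1'}
```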

Can you please show me a Python example?

Will this work?
convertedImage = cv2.cvtColor(image, cv2.COLOR_BAYER_GR2RGB)

Hi karthikbalu.meng,

You could try this.

import cv2

def read_cam():
    # Capture NV12 from nvarguscamerasrc and convert to BGR for OpenCV
    cap = cv2.VideoCapture("nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)24/1 ! nvvidconv flip-method=2 ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink", cv2.CAP_GSTREAMER)
    if cap.isOpened():
        cv2.namedWindow("demo", cv2.WINDOW_AUTOSIZE)
        while True:
            ret_val, img = cap.read()
            if not ret_val:
                break
            cv2.imshow('demo', img)
            cv2.waitKey(10)
    else:
        print("camera open failed")

    cv2.destroyAllWindows()

if __name__ == '__main__':
    read_cam()

Duplicate question, answered here.
With L4T releases after R28.2.0, use nvarguscamerasrc instead of nvcamerasrc and NV12 instead of I420.
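
The plugin choice can be wrapped in one helper. A hedged Python sketch (`camera_source` is hypothetical; the release cut-off is the one stated in this thread, and version parsing is simplified to the first two numeric components):

```python
def camera_source(l4t_release):
    """Pick the camera source plugin and its default format for an L4T release.

    Per this thread: releases up to R28.2 use nvcamerasrc (default I420),
    later releases use nvarguscamerasrc (default NV12).
    """
    major, minor = (int(x) for x in l4t_release.lstrip("R").split(".")[:2])
    if (major, minor) <= (28, 2):
        return "nvcamerasrc", "I420"
    return "nvarguscamerasrc", "NV12"

print(camera_source("R28.2"))  # -> ('nvcamerasrc', 'I420')
print(camera_source("R31.1"))  # -> ('nvarguscamerasrc', 'NV12')
```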