MT9M021 Streaming

I’ve been working on getting a Leopard LI-M021C-MIPI camera working with the TX1. Communication works, and I can take pictures using yavta:

yavta /dev/video0 -c1 -n1 -s1280x960 -fSGRBG12 -Fdata.raw
Device /dev/video0 opened.
Device `vi-output-0, daxc02 6-0010' on `platform:vi:0' (driver 'tegra-video') is a video capture (without mplanes) device.
Video format set: SGRBG12 (32314142) 1280x720 (stride 2560) field none buffer size 1843200
Video format: SGRBG12 (32314142) 1280x720 (stride 2560) field none buffer size 1843200
3 buffers requested.
length: 1843200 offset: 0 timestamp type/source: mono/EoF
Buffer 0/0 mapped at address 0x7fb6dc6000.
length: 1843200 offset: 1843200 timestamp type/source: mono/EoF
Buffer 1/0 mapped at address 0x7fb6c04000.
length: 1843200 offset: 3686400 timestamp type/source: mono/EoF
Buffer 2/0 mapped at address 0x7fb6a42000.
0 (0) [E] none 0 1843200 B 141.972970 142.006559 1.233 fps ts mono/EoF
Captured 1 frames in 0.844493 seconds (1.184141 fps, 2182608.904751 B/s).
3 buffers released.

The pictures are well formed, and I can debayer them to get the correct colors:

bayer2rgb --input=data.raw --output=data.tiff --width=1280 --height=720 --bpp=16 --first=GRBG --method=BILINEAR --tiff --swap

I’m having trouble figuring out where I need to go to get video streaming working. I followed the Ridgerun guide to recompile gstreamer on the TX1. The only change is that I added the RAW12 formats instead of the RAW10 formats.

However, when I run the test stream…

DISPLAY=:0 gst-launch-1.0 -vvv v4l2src num-buffers=10  ! 'video/x-bayer, width=(int)1280, height=(int)720, format=(string)grbg, framerate=(fraction)30/1' ! fakesink silent=false

I just get errors:

(gst-plugin-scanner:1681): GStreamer-WARNING **: Failed to load plugin '/home/ubuntu/gst_1.11.1/out/lib/gstreamer-1.0/libgstkmssink.so': /home/ubuntu/gst_1.11.1/out/lib/gstreamer-1.0/libgstkmssink.so: undefined symbol: drmModeGetFB
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = event   ******* (fakesink0:sink) E (type: stream-start (10254), GstEventStreamStart, stream-id=(string)2b7f5679e9f0cee0776e7c9d96ffc3b4682b26f7b8931798b4254fc99bdbde9b, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE, group-id=(uint)0;) 0x49c9d0
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = event   ******* (fakesink0:sink) E (type: eos (28174), ) 0x49ca40
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(2951): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000844635
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

The dmesg output is strange: set_format gets called over and over again before it finally gives up and fails.

[   39.138088] (NULL device *): camera_common_dpd_disable: csi 0
[   39.138101] daxc02 6-0010: camera_common_mclk_enable: enable MCLK with 74250000 Hz
[   39.138121] daxc02 6-0010: daxc02_power_on
[   39.356193] (NULL device *): camera_common_dpd_enable: csi 0
[   39.356331] daxc02 6-0010: daxc02_power_off
[   39.359007] daxc02 6-0010: camera_common_mclk_disable: disable MCLK
[   39.801014] (NULL device *): camera_common_dpd_disable: csi 0
[   39.801026] daxc02 6-0010: camera_common_mclk_enable: enable MCLK with 74250000 Hz
[   39.801046] daxc02 6-0010: daxc02_power_on
[   40.013488] (NULL device *): camera_common_dpd_enable: csi 0
[   40.013624] daxc02 6-0010: daxc02_power_off
[   40.016379] daxc02 6-0010: camera_common_mclk_disable: disable MCLK
[   40.053439] (NULL device *): camera_common_dpd_disable: csi 0
[   40.053483] daxc02 6-0010: camera_common_mclk_enable: enable MCLK with 74250000 Hz
[   40.053546] daxc02 6-0010: daxc02_power_on
[   40.462446] daxc02 6-0010: mt9m021_get_format
[   40.462503] daxc02 6-0010: daxc02_g_input_status
[   40.462867] daxc02 6-0010: daxc02_g_input_status
[   40.464619] daxc02 6-0010: mt9m021_set_format
[   40.464659] daxc02 6-0010: mt9m021_set_format
[   40.464671] daxc02 6-0010: mt9m021_set_format
[   40.464680] daxc02 6-0010: mt9m021_set_format
[   40.464689] daxc02 6-0010: mt9m021_set_format
[   40.464698] daxc02 6-0010: mt9m021_set_format
[   40.464707] daxc02 6-0010: mt9m021_set_format
[   40.464716] daxc02 6-0010: mt9m021_set_format
[   40.464732] daxc02 6-0010: mt9m021_set_format
[   40.464741] daxc02 6-0010: mt9m021_set_format
[   40.464750] daxc02 6-0010: mt9m021_set_format
[   40.464794] daxc02 6-0010: mt9m021_set_format
[   40.464806] daxc02 6-0010: mt9m021_set_format
[   40.464828] daxc02 6-0010: mt9m021_set_format
[   40.464842] daxc02 6-0010: mt9m021_set_format
[   40.464861] daxc02 6-0010: mt9m021_set_format
[   40.464876] daxc02 6-0010: mt9m021_set_format
[   40.464885] daxc02 6-0010: mt9m021_set_format
[   40.464893] daxc02 6-0010: mt9m021_set_format
[   40.464902] daxc02 6-0010: mt9m021_set_format
[   40.464910] daxc02 6-0010: mt9m021_set_format
[   40.464919] daxc02 6-0010: mt9m021_set_format
[   40.464928] daxc02 6-0010: mt9m021_set_format
[   40.464941] daxc02 6-0010: mt9m021_set_format
[   40.464950] daxc02 6-0010: mt9m021_set_format
[   40.464959] daxc02 6-0010: mt9m021_set_format
[   40.464980] daxc02 6-0010: mt9m021_set_format
[   40.464990] daxc02 6-0010: mt9m021_set_format
[   40.465009] daxc02 6-0010: mt9m021_set_format
[   40.465021] daxc02 6-0010: mt9m021_set_format
[   40.465038] daxc02 6-0010: mt9m021_set_format
[   40.465053] daxc02 6-0010: mt9m021_set_format
[   40.465062] daxc02 6-0010: mt9m021_set_format
[   40.465071] daxc02 6-0010: mt9m021_set_format
[   40.465080] daxc02 6-0010: mt9m021_set_format
[   40.465089] daxc02 6-0010: mt9m021_set_format
[   40.465097] daxc02 6-0010: mt9m021_set_format
[   40.465106] daxc02 6-0010: mt9m021_set_format
[   40.465119] daxc02 6-0010: mt9m021_set_format
[   40.465128] daxc02 6-0010: mt9m021_set_format
[   40.465137] daxc02 6-0010: mt9m021_set_format
[   40.465158] daxc02 6-0010: mt9m021_set_format
[   40.465167] daxc02 6-0010: mt9m021_set_format
[   40.465185] daxc02 6-0010: mt9m021_set_format
[   40.465197] daxc02 6-0010: mt9m021_set_format
[   40.465215] daxc02 6-0010: mt9m021_set_format
[   40.465229] daxc02 6-0010: mt9m021_set_format
[   40.465238] daxc02 6-0010: mt9m021_set_format
[   40.465246] daxc02 6-0010: mt9m021_set_format
[   40.465255] daxc02 6-0010: mt9m021_set_format
[   40.465263] daxc02 6-0010: mt9m021_set_format
[   40.465272] daxc02 6-0010: mt9m021_set_format
[   40.465280] daxc02 6-0010: mt9m021_set_format
[   40.465294] daxc02 6-0010: mt9m021_set_format
[   40.465302] daxc02 6-0010: mt9m021_set_format
[   40.465311] daxc02 6-0010: mt9m021_set_format
[   40.465333] daxc02 6-0010: mt9m021_set_format
[   40.465343] daxc02 6-0010: mt9m021_set_format
[   40.465361] daxc02 6-0010: mt9m021_set_format
[   40.465373] daxc02 6-0010: mt9m021_set_format
[   40.465389] daxc02 6-0010: mt9m021_set_format
[   40.465404] daxc02 6-0010: mt9m021_set_format
[   40.465413] daxc02 6-0010: mt9m021_set_format
[   40.465422] daxc02 6-0010: mt9m021_set_format
[   40.465431] daxc02 6-0010: mt9m021_set_format
[   40.465439] daxc02 6-0010: mt9m021_set_format
[   40.465448] daxc02 6-0010: mt9m021_set_format
[   40.465456] daxc02 6-0010: mt9m021_set_format
[   40.465470] daxc02 6-0010: mt9m021_set_format
[   40.465478] daxc02 6-0010: mt9m021_set_format
[   40.465487] daxc02 6-0010: mt9m021_set_format
[   40.468087] (NULL device *): camera_common_dpd_enable: csi 0
[   40.468111] daxc02 6-0010: daxc02_power_off
[   40.470978] daxc02 6-0010: camera_common_mclk_disable: disable MCLK

Any thoughts? Is my set_format function not returning, or setting, something?

I think I figured out those errors. My pipeline specified 30/1 fps, but what I put in camera_common_frmfmt was 60 fps. Making that switch lets me get raw images out of gstreamer. Not sure where to go for streaming video, though. Will keep poking at it.

gst-launch-1.0 -vvv v4l2src num-buffers=10  ! 'video/x-bayer, width=(int)1280, height=(int)960, format=(string)grbg, framerate=(fraction)60/1' ! multifilesink location=test%d.raw  -v
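
For anyone hitting the same not-negotiated error: the framerate the driver advertises in its frame format table has to match the caps requested in the pipeline. Here’s a rough sketch of what that table entry could look like for a single 1280x960@60 mode, assuming the camera_common_frmfmt layout from the L4T headers (the array name and mode index are just for illustration, not my actual driver):

#include <media/camera_common.h>

/* 60 fps is the only rate this mode advertises; the gstreamer caps must match it */
static const int mt9m021_60fps[] = { 60 };

static const struct camera_common_frmfmt mt9m021_frmfmt[] = {
    {
        .size           = { 1280, 960 },   /* width, height */
        .framerates     = mt9m021_60fps,
        .num_framerates = 1,
        .hdr_en         = false,
        .mode           = 0,               /* index of this mode in the driver's mode table */
    },
};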

From what I’ve gathered, nvcamerasrc does not have any support for bayer, so I need to use v4l2src. I’ve tried this pipeline to do the bayer conversion:

gst-launch-1.0 v4l2src num-buffers=1 ! 'video/x-bayer, width=(int)1280, height=(int)960, format=(string)grbg, framerate=(fraction)60/1' ! bayer2rgb ! filesink location=data.raw  -v

but it crashes.

I keep seeing people talk about the “ISP”. What does that stand for?

Have you tried the command nvgstcapture?

It should interrogate the modes available on your camera and attempt to play video on your screen. If you get that going, you can play around with a lot of the camera’s features.

ISP: Image Signal Processor. Among other things, it’s used to convert the raw bayer data coming out of your camera into a useful image format like I420 or NV12, which can then be consumed by many things such as video encoders.

Yeah, I’ve tried nvgstcapture a few times; it usually just errors out and closes.

Your thread on getting the Raspberry Pi camera working was quite helpful in my own attempts.

The MT9M021 is a parallel sensor and goes through a converter chip to produce MIPI. So I went ahead and switched the conversion from RAW12 to RAW8 to hopefully make things work better. Unfortunately, while I can still take pictures using yavta and convert them using bayer2rgb, gstreamer stopped working with it (even after switching back to the default build, which should handle 8-bit).

Will keep experimenting.

Hi Atrer,
To get nvcamerasrc and nvgstcapture working, you have to make sure your device tree is correct.
And nvcamerasrc does support bayer via the internal ISP.

Good to know that they support bayer. I’m trying to figure out what’s wrong with my device tree. It seems to match the examples.

Following the instructions in this thread: https://devtalk.nvidia.com/default/topic/971131/nvcamera-daemon-crash-when-using-ov5693-camera/

I ran the nvcamera-daemon in the console and examined what was going on when I tried to launch either nvcamerasrc or nvgstcapture. Here’s the output:

NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
Sensor_CheckDriverVersion: Mixed or missing V4L2 controls!
Sensor_CheckDriverVersion: Make sure your kernel driver implements either
Sensor_CheckDriverVersion: V4L2_CID_FRAME_LENGTH + V4L2_CID_COARSE_TIME(_SHORT)
Sensor_CheckDriverVersion:      or  
Sensor_CheckDriverVersion: V4L2_CID_FRAME_RATE + V4L2_CID_EXPOSURE
Sensor_InitializeV4L2Items: Failure ------------
NvPclDriver_V4L2_Sensor_Initialize: Failure ---------------------
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureServiceDeviceSensor.cpp, function open(), line 121)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 516)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 702)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 263)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function getSource(), line 422)
./debug_daemon.sh: line 6:  2124 Segmentation fault      sudo /usr/sbin/nvcamera-daemon

My driver has those V4L2 controls defined and accessible via v4l2-ctl, so I assume the root of the issue is the “No module data found” message.

How exactly does the camera daemon map the modules? I’ll attach my dtsi.
tegra210-daxc02.dtsi.txt (8.44 KB)

For more information, you can download the documentation from http://developer.nvidia.com/embedded/dlc/l4t-documentation-24-2-1 and check the “Sensor Driver Programming Guide” chapter.

Yes, that chapter was helpful when I was first getting everything working. All the device tree stuff seems to match the documentation.

The only big difference I see is that when I wrote the driver, I tried as much as possible not to use the camera_common framework. Perhaps that is the issue? Is camera_common responsible for registering the module?

Everything works with yavta and gstreamer, so I thought my driver was fine and it was an issue with my device tree.

What is responsible for adding the data to “NvPclHwGetModuleList” to be found by the daemon?

You are correct, it has nothing to do with camera_common. NvPclHwGetModuleList gets its data from the device tree, like the definition below.

tegra-camera-platform {
    compatible = "nvidia, tegra-camera-platform";
    modules {
        module0 {
            badge = "e3326_front_P5V27C";
            position = "rear";
            orientation = "1";
            drivernode0 {
                pcl_id = "v4l2_sensor";
                proc-device-tree = "/proc/device-tree/host1x/i2c@546c0000/ov5693_c@36";
            };
        };
    };
};

So it’s just tegra-camera-platform that I have to worry about? That’s helpful.

Here’s mine, taken from my TX1:

dtc -I dtb tegra210-jetson-tx1-p2597-2180-a01-devkit.dtb | less
tegra-camera-platform {
    compatible = "nvidia, tegra-camera-platform";
    modules {
        module0 {
            badge = "daxc02_master_mt9m021";
            position = "front";
            orientation = [31 00];
            status = "okay";

            drivernode0 {
                pcl_id = "v4l2_sensor";
                devname = "daxc02 6-0010";
                proc-device-tree = "/proc/device-tree/host1x/i2c@546c0000/daxc02@10";
                status = "okay";
            };
        };
    };
};

Looks like I have all the parts, and the proc-device-tree path is valid. Is the badge parsed at all? Do I need to change my naming convention?

You can refer to the thread https://devtalk.nvidia.com/default/topic/971131 to enable the logs and check whether your device tree is being parsed or not.

Kill the nvcamera-daemon, then:

root@tegra-ubuntu:/home/ubuntu# export enableCamScfLogs=1
root@tegra-ubuntu:/home/ubuntu# export enableCamPclLogs=1
root@tegra-ubuntu:/home/ubuntu# /usr/sbin/nvcamera-daemon

Yes sir, found that thread. That’s where I found out that my module data isn’t getting picked up.

Which way did you use? I suggest using the second one.

Using Plugin Manager
Using Main Platform Device Tree File


I went with the second option, the main platform device tree file.

I rechecked and confirmed that I commented out the two lines

//#include "tegra210-jetson-cv-camera-plugin-manager.dtsi"
//#include "tegra210-platforms/tegra210-jetson-cv-camera-modules.dtsi"

And added my own dtsi to tegra210-jetson-cv-base-p2597-2180-a00.dtsi

#include "tegra210-platforms/tegra210-daxc02.dtsi"

which I’ll attach.

Thanks for helping with this!

Also to reiterate:
The camera gets probed and registered as /dev/video0
I can take pictures with yavta
I can take pictures with gstreamer
I cannot use nvcamerasrc or nvgstcapture
tegra210-daxc02.dtsi.txt (8.44 KB)

Hi Atrer,
First, I can’t find any problem with your attached device tree. But if your module doesn’t have a focuser, you need to remove the declaration below. Also, use the dtc utility to disassemble the .dtb back to .dtsi and check that those parts are exactly as in your code.

            drivernode1 {
                status = "disabled";
                pcl_id = "v4l2_focuser_stub";
                proc-device-tree = "";
            };

Second, you may need to run v4l2-compliance to make sure all the CIDs are implemented correctly.
It looks like your sensor driver does not implement the controls below:
V4L2_CID_FRAME_LENGTH, V4L2_CID_COARSE_TIME(_SHORT), V4L2_CID_EXPOSURE

    NvU32 cid;
    NvU32 oldVersionCidCount = 0;
    NvU32 newVersionCidCount = 0;

    /* Check for new CIDs */
    V4L2DeviceFindControlByName(pContext->pV4L2Device,
                                "Frame Rate", &cid);
    if (cid != 0)
        newVersionCidCount++;
    V4L2DeviceFindControlByName(pContext->pV4L2Device,
                                "Exposure", &cid);
    if (cid != 0)
        newVersionCidCount++;

    /* Check for old CIDs */
    V4L2DeviceFindControlByName(pContext->pV4L2Device,
                                "Frame Length", &cid);
    if (cid != 0)
        oldVersionCidCount++;
    V4L2DeviceFindControlByName(pContext->pV4L2Device,
                                "Coarse Time", &cid);
    if (cid != 0)
        oldVersionCidCount++;
    V4L2DeviceFindControlByName(pContext->pV4L2Device,
                                "Coarse Time Short", &cid);
    if (cid != 0)
        oldVersionCidCount++;

    /* Check for mixed or missing CIDs */
    if ((newVersionCidCount && oldVersionCidCount) ||
        (!newVersionCidCount && !oldVersionCidCount) ||
        (oldVersionCidCount == 0 && newVersionCidCount < 2) ||
        (newVersionCidCount == 0 && oldVersionCidCount < 2))
    {
        NvOsDebugPrintf("%s: Mixed or missing V4L2 controls!\n", __func__);
        NvOsDebugPrintf("%s: Make sure your kernel driver implements either\n", __func__);
        NvOsDebugPrintf("%s: V4L2_CID_FRAME_LENGTH + V4L2_CID_COARSE_TIME(_SHORT)\n", __func__);
        NvOsDebugPrintf("%s:      or  \n", __func__);
        NvOsDebugPrintf("%s: V4L2_CID_FRAME_RATE + V4L2_CID_EXPOSURE\n", __func__);

        return NvError_BadParameter;
    }
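
On the driver side, the registration this name lookup expects looks roughly like the sketch below. This is not the actual daxc02 code: the CID macro is whatever your driver already uses, the ops pointer is hypothetical, and only the .name string matters to V4L2DeviceFindControlByName.

static const struct v4l2_ctrl_config ctrl_frame_length = {
    .ops   = &mt9m021_ctrl_ops,       /* hypothetical control ops */
    .id    = V4L2_CID_FRAME_LENGTH,
    .name  = "Frame Length",          /* the daemon matches on this exact string */
    .type  = V4L2_CTRL_TYPE_INTEGER,
    .flags = V4L2_CTRL_FLAG_SLIDER,
    .min   = 0,
    .max   = 0x7fff,
    .def   = 1984,
    .step  = 1,
};

/* registered at probe time with something like:
 *     v4l2_ctrl_new_custom(&priv->ctrl_handler, &ctrl_frame_length, NULL);
 */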

I’ve attached the decompiled dtb and the results of the compliance test.

This portion was directly from the dtb. I’ll have to add drivernode1 back in. I wasn’t aware it was necessary since it was set to disabled.

Good call on the v4l2-compliance test. I had run it before but hadn’t looked closely at the output. Looks like I have a few failures in the Format ioctls that I overlooked.

v4l2-compliance -d /dev/video0
Format ioctls:
	test VIDIOC_ENUM_FMT/FRAMESIZES/FRAMEINTERVALS: OK
	test VIDIOC_G/S_PARM: OK (Not Supported)
	test VIDIOC_G_FBUF: OK (Not Supported)
	test VIDIOC_G_FMT: OK
	fail: v4l2-test-formats.cpp(330): colorspace >= 0xff
	fail: v4l2-test-formats.cpp(432): testColorspace(pix.pixelformat, pix.colorspace, pix.ycbcr_enc, pix.quantization)
	fail: v4l2-test-formats.cpp(732): Video Capture is valid, but TRY_FMT failed to return a format
	test VIDIOC_TRY_FMT: FAIL
	fail: v4l2-test-formats.cpp(330): colorspace >= 0xff
	fail: v4l2-test-formats.cpp(432): testColorspace(pix.pixelformat, pix.colorspace, pix.ycbcr_enc, pix.quantization)
	fail: v4l2-test-formats.cpp(952): Video Capture is valid, but no S_FMT was implemented
	test VIDIOC_S_FMT: FAIL
	test VIDIOC_G_SLICED_VBI_CAP: OK (Not Supported)
	test Cropping: OK (Not Supported)
	test Composing: OK (Not Supported)
	fail: v4l2-test-formats.cpp(1486): node->can_scale && node->frmsizes_count[v4l_format_g_pixelformat(&cur)]
	test Scaling: OK

I’ll attach the full logs and see about fixing those errors.
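
As a starting point, here’s a rough sketch of a subdev set_fmt handler that should satisfy the TRY_FMT/S_FMT and colorspace checks. The pad-op signature shown is the newer-kernel form (older L4T kernels pass a struct v4l2_subdev_fh instead of a pad config), and none of the names below are from my actual driver:

static int mt9m021_set_fmt(struct v4l2_subdev *sd,
                           struct v4l2_subdev_pad_config *cfg,
                           struct v4l2_subdev_format *format)
{
    struct v4l2_mbus_framefmt *fmt = &format->format;

    /* Clamp whatever was requested to the single mode exposed here */
    fmt->width      = 1280;
    fmt->height     = 960;
    fmt->code       = MEDIA_BUS_FMT_SGRBG12_1X12;
    fmt->field      = V4L2_FIELD_NONE;
    /* Return a defined colorspace so compliance doesn't see a garbage value */
    fmt->colorspace = V4L2_COLORSPACE_SRGB;

    if (format->which == V4L2_SUBDEV_FORMAT_TRY) {
        /* TRY_FMT: only record the result in the pad config */
        cfg[format->pad].try_fmt = *fmt;
        return 0;
    }

    /* S_FMT: program the sensor registers for this mode here */
    return 0;
}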

That said, those controls are settable through v4l2-ctl, so it’s strange to me that they aren’t found. Here’s the full list:

v4l2-ctl -d /dev/video0 -l
User Controls

	horizontal_flip (int)    : min=0 max=1 step=1 default=0 value=0
	vertical_flip (int)    : min=0 max=1 step=1 default=0 value=0
	gain_red (int)    : min=0 max=255 step=1 default=16 value=16
	gain_green_r (int)    : min=0 max=255 step=1 default=16 value=16
	gain_green_b (int)    : min=0 max=255 step=1 default=16 value=16
	gain_blue (int)    : min=0 max=255 step=1 default=16 value=16
	gain_column (int)    : min=0 max=3 step=1 default=0 value=0

Camera Controls

	auto_exposure (int)    : min=1 max=2 step=1 default=1 value=1
	frame_length (int)    : min=0 max=32767 step=1 default=1984 value=1984 flags=slider
	coarse_time (int)    : min=2 max=32761 step=1 default=32698 value=32698 flags=slider
	coarse_time_short (int)    : min=2 max=32761 step=1 default=32698 value=32698 flags=slider
	group_hold (intmenu): min=0 max=1 default=0 value=0
	hdr_enable (intmenu): min=0 max=0 default=0 value=0
	bypass_mode (intmenu): min=0 max=1 default=0 value=0
	   gain (int)    : min=0 max=255 step=1 default=16 value=16 flags=slider
	exposure (int)    : min=1 max=672 step=1 default=256 value=256

Image Processing Controls

	test_pattern (menu)   : min=0 max=4 default=0 value=0

dtb.txt (387 KB)
compliance.txt (3.06 KB)

Atrer,
While debugging is ongoing, I am wondering whether you have asked Leopard for MT9M021 sensor driver support for Jetson? Thanks.

Chijen,

Leopard doesn’t provide any drivers for the MIPI version of their camera, hence my work. However, it’s one of the few global-shutter camera options we could find. Leopard did provide me with register maps and some basic information, and I was able to find an old MT9M021 Beagleboard driver to use as a starting point.

Once we get the driver working, we’re going to open source it. Hopefully the documentation I’m putting together will help those working on similar projects.

Atrer,
Thanks for clarifying.
Then please make the v4l2-compliance test pass first.