L4T 24.2 Raspberry Pi CSI camera driver debug

Xilinx has some documents for the D-PHY interface:

Have you compared the differences between the OV5647 and the OV5693?

It will take a while for me to get an Auvidea J20 or J100 to try the Pi cameras.

Some Progress,

I think I figured out the CSI problem and I can get images from the camera now… although not very good ones, so I’ve switched to getting colorbars for the moment:

Apparently I’ve got something configured incorrectly. Here is a non-colorbar image:

(the ‘bright’ part is the fluorescent strip above)

I’ve just been capturing raw data packets and converting the raw data to PNG.

It’s strange: I specify 1920x1080 for the image size, and the images that come out are 4,147,200 bytes, or ~4 MB, which is twice 1920 x 1080 = 2,073,600. I was expecting 4x the image size, which would group together a block of four BGGR values. Perhaps I’m not understanding something; I’ll have to read up on this.

When I run the following command I get a socket read error:

ubuntu@tegra-ubuntu:~/Projects/video-tester$ gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvtee ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvjpegenc ! filesink location=test.jpg -e


Setting pipeline to PAUSED ...

Available Sensor modes : 
1920 x 1080 FR=30.000000 CF=0x1009208a10 SensorModeType=4 CSIPixelBitDepth=10 DynPixelBitDepth=10
1280 x 720 FR=60.000000 CF=0x1009208a10 SensorModeType=4 CSIPixelBitDepth=10 DynPixelBitDepth=10
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

NvCameraSrc: Trying To Set Default Camera Resolution. Selected 1920x1080 FrameRate = 30.000000 ...

Socket read error. Camera Daemon stopped functioning.....
^Chandling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 0:00:04.621666769
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Looking in dmesg I find this line:

[ 1128.562553] nvcamera-daemon[2269]: unhandled level 2 translation fault (11) at 0x00000010, esr 0x92000006

I’ve updated the GitHub repo with the current configuration.

I’ve still got a lot to figure out, but any ideas about what to try would be great.

Dave

I believe for a Bayer RAW10 image, the size of a 1920x1080 frame should be 4 MB. For a Bayer RAW8 image, it should be 2 MB. The log shows a bit depth of 10 bits, so I believe the size is correct.
Did you debayer the image?
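To make the arithmetic concrete, here is a quick sanity check in Python (a minimal sketch; the 2-bytes-per-pixel layout for the unpacked case is my assumption, based on the sizes reported above):

# Expected buffer sizes for a 1920x1080 Bayer frame under different layouts.
width, height = 1920, 1080
pixels = width * height            # 2,073,600 pixels

unpacked_raw10 = pixels * 2        # 2 bytes per 10-bit pixel -> 4,147,200 bytes (~4 MB)
packed_raw10 = pixels * 10 // 8    # 4 pixels per 5 bytes -> 2,592,000 bytes (~2.5 MB)
raw8 = pixels                      # 1 byte per pixel -> 2,073,600 bytes (~2 MB)

print(unpacked_raw10, packed_raw10, raw8)

The unpacked number matches the 4,147,200 bytes you are seeing.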

@yahoo2016

Nice Call!

Unfortunately there are no lights on in the office to get some images so I’ll get some tomorrow.

Great work! Color bars are more useful than real images for driver development. Your posts and git are very valuable for all developers.

Hi, good job getting the sensor working!

About the size of the image, it still seems weird to me; for a 1920x1080 frame I believe the size should be:

1920 x 1080 x 10 / 8 = 2,592,000 bytes -> ~2.5 MB

This is because 1920 is already a multiple of 4.

https://developer.android.com/reference/android/graphics/ImageFormat.html#RAW10
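For reference, that packing puts 4 pixels into every 5 bytes. A minimal sketch of the unpacking (my own illustration of the format described on that page, not code from any driver):

# Android-style RAW10: every 5 bytes hold 4 pixels. Bytes 0-3 carry the high
# 8 bits of pixels 0-3; byte 4 carries the four 2-bit low parts (pixel 0 in
# bits 0-1, ..., pixel 3 in bits 6-7).
def unpack_raw10_group(chunk):
    assert len(chunk) == 5
    low = chunk[4]
    return [(chunk[i] << 2) | ((low >> (2 * i)) & 0x3) for i in range(4)]

# Four full-scale pixels: five 0xFF bytes -> [1023, 1023, 1023, 1023]
print(unpack_raw10_group(bytes([0xFF] * 5)))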

Here is the raw data for the colorbars from the camera:

http://www.cospandesign.com/images/uploaded/colorbars.raw

I’ve opened it up in a hex editor; here’s what it looks like:

//Section 1 (first Row)
00000000: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
00000010: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
00000020: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
//Section 1 (Second Row)
00000f00: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
00000f10: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
00000f20: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................


//Section 2 (First Row)
00000290: 0000 ff03 0000 ff03 0000 ff03 0000 ff03  ................
000002a0: 0000 ff03 0000 ff03 0000 ff03 0000 ff03  ................
000002b0: 0000 ff03 0000 ff03 0000 ff03 0000 ff03  ................
//Section 2 (Second Row)
00001190: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
000011a0: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
000011b0: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................

//Section 3 (First Row)
00000520: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
00000530: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
00000540: ff03 ff03 ff03 ff03 ff03 ff03 ff03 ff03  ................
//Section 3 (Second Row)
00001420: ff03 0000 ff03 0000 ff03 0000 ff03 0000  ................
00001430: ff03 0000 ff03 0000 ff03 0000 ff03 0000  ................
00001440: ff03 0000 ff03 0000 ff03 0000 ff03 0000  ................

It seems as though there is padding out to 16 bits and a byte swap.

If that’s the case, then:

Section 1: White
Blue = 0x3FF
Green1 = 0x3FF
Green2 = 0x3FF
Red = 0x3FF

Section 2: Yellow
Blue = 0x00
Green1 = 0x3FF
Green2 = 0x3FF
Red = 0x3FF

Section 3: Cyan
Blue = 0x3FF
Green1 = 0x3FF
Green2 = 0x3FF
Red = 0x00

I’m not doing the rest of them :/
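In case anyone wants to check the remaining bars, here is roughly what I’m doing to pull the quads out (a quick sketch; it assumes unpacked 10-bit pixels stored as little-endian 16-bit words, a 1920-pixel line, and the band offsets from the dump above):

import struct

ROW_BYTES = 1920 * 2  # bytes per line of unpacked RAW10

def quad(data, col):
    # 2x2 pixel quad [[r0c0, r0c1], [r1c0, r1c1]] starting at column col
    return [[struct.unpack_from("<H", data, row * ROW_BYTES + (col + c) * 2)[0]
             for c in range(2)]
            for row in range(2)]

with open("colorbars.raw", "rb") as f:
    data = f.read()

# band starts from the hex dump: 0x000, 0x290, 0x520 -> columns 0, 328, 656
for name, col in [("section 1", 0), ("section 2", 328), ("section 3", 656)]:
    print(name, [[hex(v) for v in row] for row in quad(data, col)])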

I captured an image and I’ve obviously got something wrong with the configuration:

Here is the data from the above image:
http://www.cospandesign.com/images/uploaded/data.raw

Image formats can be confusing, e.g., Android RAW10 vs. camera RAW10. I captured from the TX1 5MP camera using V4L and got a 10 MB image returned from the V4L buffer. I believe the 10-bit image from the CSI-2 interface and V4L driver is not packed, i.e., 2 bytes for each 10-bit pixel.

I’ll take a look at your colorbar and real images.

The image looks fine using Bayer pattern GBRG instead of GRBG.

Can you take an image with bright R,G,B colors?
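For anyone following along, those two orders differ only in which channel sits at each position of the repeating 2x2 tile. A small sketch to illustrate (the helper is my own, not from any driver):

# Row-major 2x2 tiles: (top-left, top-right, bottom-left, bottom-right)
BAYER_TILES = {
    "GRBG": ("G", "R", "B", "G"),
    "GBRG": ("G", "B", "R", "G"),
    "RGGB": ("R", "G", "G", "B"),
    "BGGR": ("B", "G", "G", "R"),
}

def channel(pattern, x, y):
    # channel sampled at pixel (x, y) for the given Bayer order
    tl, tr, bl, br = BAYER_TILES[pattern]
    return ((tl, tr), (bl, br))[y % 2][x % 2]

# Picking the wrong order swaps the R and B sites, which tints the result:
print(channel("GBRG", 0, 1), channel("GRBG", 0, 1))  # R vs. B at the same pixel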

Just as a status update, we got the Raspberry Pi v2 camera (IMX219) working as well, using the J20 board from Auvidea, but at this point we are playing a bit with the settings to get the right resolution.

https://devtalk.nvidia.com/default/topic/963288/jetson-tx1/6-raspberry-pi-cameras/post/5011148/#5011148

This is great to hear! Good job guys.

Hi Dave,

Thanks! One question: what tool do you use to debayer the image? We normally use raw2rgbpnm, but I was wondering which one you use.

Our next step is to try integrating the driver to make use of the ISP so we can debayer there.

-David

I found this on GitHub: https://github.com/jdthomas/bayer2rgb

I use this command:

./bayer2rgb --input=data.raw --output=data.tiff --width=1920 --height=1080 --bpp=16 --first=GBRG --method=BILINEAR --tiff

I then use ImageMagick to convert the TIFF to PNG:

convert data.tiff data.png

Awesome, thank you. I will give it a try.

@yahoo2016

Thanks for the feedback. Tomorrow I can capture some more images with better light.

I’m not completely positive about this, but I think there are three inter-related factors:

  1. The period of your clock, which is generated from the assortment of PLL multipliers and dividers.
  2. The exposure time, which I believe is a count of clock periods. So if your clock frequency doubles, you also need to double your exposure count to keep the same integration time.
  3. The gain, which is how much amplification you apply to the input. Say you capture 100 photons in one image at a given exposure; if you cut the exposure in half, then for the same light you capture half as many photons, but you can multiply the output by a gain of 2, so you effectively end up with the same signal level (see the sketch below).
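A toy numeric version of point 3, with made-up numbers (not actual OV5647 register values):

photons_full = 100        # photons collected at the reference exposure
exposure_scale = 0.5      # cut the exposure time in half
gain = 2.0                # compensate with 2x analog gain

signal = photons_full * exposure_scale * gain
print(signal)             # 100.0 -- same output level, but the noise is amplified too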

If you use too much gain the images have more noise. Below is an image in low light where I needed to boost the gain:

Unfortunately, when I try to boost the gain too high, things get ugly really fast.

I had to fix up a TX1 devboard for someone on my team, and it gave me an opportunity to capture images from an OV5693. I was looking forward to this because I wanted to see if the Bayer output from that sensor was any different from what the OV5647 generated.

Apparently not; here are both the colorbars and a captured image. I couldn’t set the image size down to 1920x1080, but the pixel format of the colorbars from the OV5693 and the OV5647 looks very similar.

EDIT: I DID NOT CAPTURE COLORBARS, THE OV5693_COLORBARS.RAW IS A NORMAL VIDEO CAPTURE

http://www.cospandesign.com/images/uploaded/ov5693_colorbars.raw
http://www.cospandesign.com/images/uploaded/ov5693_image.raw

Side note: I would have created images of them, but the tool I’ve been using segfaulted for some reason when I used it on these larger images.

So, for some reason the ISP on the TX1 can process images from the OV5693 but not the OV5647.

I believe that in order to use the ISP with nvcamerasrc you need an ISP configuration file. We are currently working on this to see if we can get it working with the IMX219.

https://devtalk.nvidia.com/default/topic/970967/isp-configuration-files/

Furthermore, we were able to verify that we can capture from any of the 6 ports available on the J20, and we fixed the resolution problems.

For the OV5647 we also see saturation at some points; we are currently checking it. There are some weird vertical white lines over the data.

The size of the above OV5693 images (7,464,960 bytes) seems incorrect for 2592x1944. I got 10,077,696 bytes from the OV5693.

I thought the TX1 ISP was not used for raw Bayer cameras. If the colorbars work perfectly but real images do not, it’s likely the settings (the three factors you pointed out) are incorrect.

What if you use some of the automatic features of the sensor, like auto white balance?

On the other hand, about the ISP: yes, v4l2src will bypass the ISP, and there is no way to use the ISP with Bayer cameras unless you are an ODM working with NVIDIA. But now that we have the driver for the IMX219, we will try to use that driver with nvcamerasrc to see if we can pass the frames through the ISP.

@yahoo2016

You’re right, I wasn’t paying attention. I looked at the raw data again and found I was incorrect: the colorbars from the OV5693 and OV5647 do not match up. I must have made a mistake when saving the images for the OV5693 and compared two previous OV5647 colorbars.

… okay trying again…

I tried again today to get the colorbars by setting /sys/module/ov5693/parameters/test_mode to 1, but it didn’t enable the colorbars. So I haven’t captured any colorbars for the OV5693; I fixed up the previous post so no one gets confused.

I’ll have to look at this a bit more.

@DavidSoto, thanks for pointing me to that post.

Hi,

Just to let you know that we were able to capture from the cameras using our V4L2 driver and integrate that with nvcamerasrc, so now we can also use the ISP to do the conversion from Bayer to YUV and render the image. When doing that we don’t see the saturation on the image, so the saturation might be caused by the tool used to convert from Bayer to PNG or another format. Check this:

https://devtalk.nvidia.com/default/topic/975962/jetson-tx1/support-for-raspberry-pi-v2-imx219-and-v1-ov5647-cameras-using-isp-multi-camera/

-David

Thanks @DavidSoto!

I looked into rebuilding GStreamer with support for 10-bit images and was able to capture PNGs and video too. Although 3 out of 5 times I get a strange issue where the top half of the frame appears to contain the entire image while the bottom half is black.

Have you ever run into this?

I’m curious if it has something to do with sending short packets for each lane. I’ve been trying to get back to this problem, but I had to finish up designing a board and it took longer than I expected.

Congratulations on getting the drivers for the two cameras up and running.

Dave