I think I figured out the CSI problem and I can get images from the camera now… although, not very good ones, so I’ve switched to getting colorbars at the moment:
(the ‘bright’ part is the fluorescent strip above)
I’ve just been capturing raw data packets and converting the raw to png.
It’s strange: I specify 1920x1080 for the image size, and the images that come out are 4,147,200 bytes (about 4 MB), which is twice 1920x1080 = 2,073,600. I was expecting four times the image size, grouping together each 2x2 block of BGGR values. Perhaps I’m not understanding something; I’ll have to read up on this.
When I run the following command I get a socket read error:
ubuntu@tegra-ubuntu:~/Projects/video-tester$ gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvtee ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvjpegenc ! filesink location=test.jpg -e
Setting pipeline to PAUSED ...
Available Sensor modes :
1920 x 1080 FR=30.000000 CF=0x1009208a10 SensorModeType=4 CSIPixelBitDepth=10 DynPixelBitDepth=10
1280 x 720 FR=60.000000 CF=0x1009208a10 SensorModeType=4 CSIPixelBitDepth=10 DynPixelBitDepth=10
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
NvCameraSrc: Trying To Set Default Camera Resolution. Selected 1920x1080 FrameRate = 30.000000 ...
Socket read error. Camera Daemon stopped functioning.....
^Chandling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 0:00:04.621666769
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I believe for a Bayer RAW10 image, a 1920x1080 frame should be 4 MB; for a Bayer RAW8 image, it should be 2 MB. The log shows a bit depth of 10 bits, so I believe the size is correct.
Did you debayer?
Image formats can be confusing, e.g., Android RAW10 vs. camera RAW10. I captured from the TX1 5MP camera using V4L2 and got a 10 MB image returned from the V4L2 buffer. I believe 10-bit images from the CSI-2 interface and the V4L2 driver are not packed, i.e., 2 bytes for each 10-bit pixel.
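To sanity-check those numbers, here's a quick calculation (a sketch; the helper name is mine): with unpacked 10-bit data each pixel occupies 2 bytes, so a 1920x1080 frame is exactly the 4,147,200 bytes observed, while RAW8 at 1 byte per pixel gives the expected 2 MB.

```python
def bayer_frame_size(width, height, bytes_per_pixel):
    """Buffer size for a raw Bayer frame (one mosaic sample per pixel)."""
    return width * height * bytes_per_pixel

print(bayer_frame_size(1920, 1080, 2))  # 4147200 -- unpacked RAW10, 2 bytes/pixel
print(bayer_frame_size(1920, 1080, 1))  # 2073600 -- RAW8, 1 byte/pixel
```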
I’ll take a look at your color bars and real images.
The image looks fine using Bayer pattern GBRG instead of GRBG.
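For anyone converting raw captures by hand, here's a minimal nearest-neighbor debayer sketch in NumPy for a GBRG mosaic (half-resolution output; the function name and the assumption that unpacked 10-bit data is loaded as uint16 are mine, not from any driver):

```python
import numpy as np

def debayer_gbrg_half(raw):
    """Half-resolution demosaic of a GBRG Bayer mosaic.

    GBRG 2x2 block layout:
        G B
        R G
    `raw` is a (H, W) array with H and W even, e.g. unpacked
    10-bit data loaded as uint16.
    """
    g0 = raw[0::2, 0::2].astype(np.uint32)  # green samples, even rows
    b  = raw[0::2, 1::2]                    # blue samples
    r  = raw[1::2, 0::2]                    # red samples
    g1 = raw[1::2, 1::2].astype(np.uint32)  # green samples, odd rows
    g  = ((g0 + g1) // 2).astype(raw.dtype) # average the two greens
    return np.dstack([r, g, b])             # (H/2, W/2, 3) RGB

# Example: a single GBRG block becomes one RGB pixel.
block = np.array([[100, 300],   # G B
                  [200, 150]],  # R G
                 dtype=np.uint16)
print(debayer_gbrg_half(block)[0, 0])  # [200 125 300] -> R, G, B
```

Picking the wrong pattern (e.g. GRBG instead of GBRG) shifts which samples land in which channel, which is why the colors look wrong until the pattern matches.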
Just as a status update: we got the Raspberry Pi v2 camera (IMX219) working as well, using the J20 board from Auvidea, but at this point we are still playing with the settings to get the right resolution.
Thanks for the feedback. Tomorrow I can capture some more images with better light.
I’m not completely positive about this, but I think there are three inter-related factors:
1. The period of your clock, which is generated from the assortment of PLL multipliers and dividers.
2. The exposure time, which I believe is a count measured in that period, so if your clock frequency doubles you also need to double your exposure count to keep the same exposure time.
3. The gain, which is how much amplification you apply to the input. Say you capture 100 photons in one image at a given exposure; if you cut the exposure in half, you capture half as many photons for the same light, but you can multiply the output by a gain of 2 and effectively end up with the same light level.
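The exposure/gain trade-off in that last point can be checked with a trivial calculation (illustrative numbers only; the helper name is mine):

```python
def effective_signal(photons_at_full_exposure, exposure_scale, gain):
    """Output level after scaling exposure time and applying gain."""
    return photons_at_full_exposure * exposure_scale * gain

full   = effective_signal(100, 1.0, 1.0)  # full exposure, unity gain
halved = effective_signal(100, 0.5, 2.0)  # half the exposure, 2x gain
print(full, halved)  # 100.0 100.0 -- same level, but the gained image is noisier
```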
If you use too much gain, the images have more noise. Below is an image in low light where I needed to boost the gain:
I had to fixup a TX1 devboard for someone on my team and it gave me an opportunity to capture images from an OV5693. I was looking forward to this because I wanted to see if the bayer output from that image was any different from what the OV5647 generated.
Apparently not; here are both the colorbars and a captured image. I couldn’t set the image size down to 1920x1080, but the colorbar patterns from the OV5693 and the OV5647 look very similar.
EDIT: I DID NOT CAPTURE COLORBARS, THE OV5693_COLORBARS.RAW IS A NORMAL VIDEO CAPTURE
I believe that in order to use the isp with nvcamerasrc you need an isp configuration file. We are currently working on this to see if we can get it working with imx219.
The size of the above OV5693 images (7,464,960 bytes) seems incorrect for 2592x1944. I got 10,077,696 bytes from the OV5693.
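For what it's worth, a quick check of what bytes-per-pixel figure 7,464,960 would imply at 2592x1944 (it matches neither unpacked nor tightly packed RAW10, which supports the size being wrong):

```python
width, height = 2592, 1944
pixels = width * height

print(pixels * 2)        # 10077696 -- unpacked RAW10 (2 bytes/pixel)
print(pixels * 10 // 8)  # 6298560  -- tightly packed RAW10 (10 bits/pixel)
print(7464960 / pixels)  # ~1.48 bytes/pixel -- matches neither layout
```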
I thought the TX1 ISP was not used for raw Bayer cameras. If the color bars worked perfectly but real images do not, it’s likely the settings (the three factors you pointed out) were incorrect.
What if you use some of the automatic features of the sensor, like auto white balance?
On the other hand, about the ISP: yes, v4l2src will bypass the ISP, and there is no way to use the ISP with Bayer cameras unless you are an ODM working with NVIDIA. But now that we have the driver for the IMX219, we will try to use that driver with nvcamerasrc to see if we can pass the frames through the ISP.
You’re right, I wasn’t paying attention. I looked at the raw data again and found I was incorrect: the colorbars from the OV5693 and OV5647 do not match up. I must have made a mistake when saving the OV5693 images and compared two previous OV5647 colorbar captures.
… okay trying again…
I tried again today to get the colorbars by setting /sys/module/ov5693/parameters/test_mode to 1, but it didn’t enable the colorbars. So I haven’t captured any colorbars for the OV5693; I’ve fixed up the previous post so no one gets confused.
Just to let you know that we were able to capture from the cameras using our V4L2 driver and integrate it with nvcamerasrc, so now we can also use the ISP to do the conversion from Bayer to YUV and render the image. When doing that, we don’t see the saturation in the image, so the saturation might be caused by the tool used to convert from Bayer to PNG or any other format. Check this:
I looked into rebuilding GStreamer with support for 10-bit images and was able to capture PNGs and video too, although about 3 out of 5 times I get a strange issue where the top half of the image appears to be processed using the whole image while the bottom half is black.
Have you ever run into this?
I’m curious whether it has something to do with sending short packets on each lane. I’ve been trying to get back to this problem, but I had to finish designing a board and that took longer than I expected.
Congratulations on getting the drivers for the two cameras up and running.