I am using a Leopard Imaging IMX185 camera connected to a Jetson TX1 development kit via a Leopard Imaging adapter. My JetPack version is JetPack-L4T-2.3.1 and my kernel version is 24.2.1.
I am able to capture from the camera using nvgstcapture-1.0 and also using the nvcamerasrc element in a GStreamer pipeline.
However, our requirement is to capture RAW data directly from the sensor (we will eventually be moving to a monochrome sensor). When I try to capture using yavta, I get all-black frames (every byte is 0x00). I modified yavta to fill the buffers with a 0xab pattern before queueing them, and the frames come back with the same content after capture, i.e. no data was written into the buffers.
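For reference, the fill-pattern check described above can be scripted offline on the dumped frames; a minimal sketch (the 0xab fill byte is just the value from my test, and the classification names are my own):

```python
# Classify a raw frame dump: frames are pre-filled with 0xAB before
# queueing, so a file that still contains only that byte (or only 0x00)
# means the driver never wrote pixel data into the buffer.

def frame_status(data: bytes, fill: int = 0xAB) -> str:
    """Return 'all-zero', 'untouched-fill', or 'has-data'."""
    if all(b == 0 for b in data):
        return "all-zero"
    if all(b == fill for b in data):
        return "untouched-fill"
    return "has-data"

# Example: a buffer the driver never touched keeps its fill pattern.
print(frame_status(bytes([0xAB]) * 16))  # -> untouched-fill
```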
Could you please tell me:
- What could be causing the V4L2 direct-capture failure?
- Is it possible to use the nvcamerasrc element to capture raw Bayer data (i.e. with the ISP doing nothing)?
Here is more information about my setup:
Jetson TX1 Development Kit
JetPack 2.3.1, Linux kernel tegra-24.2.1
IMX185 Leopard Imaging camera connected to the J21 connector of the TX1 via the Leopard Imaging MIPI adapter
- nvcamerasrc works:
gst-launch-1.0 nvcamerasrc fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! filesink location=file.raw
→ This works
→ I added logs in drivers/media/platform/tegra/camera/channel.c; this capture goes via the bypass path, so VI bypass works
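As a sanity check on the working pipeline, the size of file.raw should be an exact multiple of one I420 frame; a quick sketch, assuming filesink adds no container overhead:

```python
# I420 stores a full-resolution Y plane plus quarter-resolution U and V
# planes, i.e. 1.5 bytes per pixel.
def i420_frame_size(width: int, height: int) -> int:
    return width * height * 3 // 2

print(i420_frame_size(1920, 1080))  # -> 3110400 bytes per frame
```

So a 300-frame capture at 1920x1080 should be exactly 300 * 3110400 bytes.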
- v4l2 capture does not work
v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat=RG12 --stream-mmap --stream-count=1 -d /dev/video0 --stream-to=test.raw
./yavta /dev/video0 -c3 -s1920x1080 -fSRGGB10 -I -Fov.raw
→ This does not work. This capture goes via the tegra-video V4L2 path. I added logs in the kernel driver and found that queue frame → soc_channel_capture_frame → updating the V4L2 timestamps all happen at the expected frame rate, but the received frames contain no data.
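Once the V4L2 path does start delivering data, a quick way to inspect it: both RG12 and SRGGB10 are normally stored one pixel per little-endian 16-bit word with the sample in the low bits, so a sketch like this can decode the dump (this assumes the unpacked 16-bit-container layout; samples above 2**bits - 1 would indicate that assumption is wrong):

```python
import struct

def unpack_bayer16(data: bytes, bits: int):
    """Unpack raw Bayer samples stored as little-endian 16-bit words.

    bits: sample depth (10 for SRGGB10, 12 for RG12). The mask keeps
    only the low sample bits of each word.
    """
    count = len(data) // 2
    words = struct.unpack("<%dH" % count, data[: count * 2])
    mask = (1 << bits) - 1
    return [w & mask for w in words]

# Example with two synthetic 10-bit samples: 0x03FF and 0x0001.
print(unpack_bayer16(b"\xff\x03\x01\x00", 10))  # -> [1023, 1]
```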