We’re developing a PCIe HDMI capture card that supports V4L2 and DMA buffer (DMABUF) mode, so we’re looking for reference driver source code.
Which code should we refer to?
Hi,
You may refer to 12_camera_v4l2_cuda. Frame capture goes through the v4l2 ioctl() interface.
- Set V4L2_MEMORY_DMABUF in ioctl(VIDIOC_REQBUFS):
struct v4l2_requestbuffers rb;
memset(&rb, 0, sizeof(rb));
rb.count = V4L2_BUFFERS_NUM;
rb.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
rb.memory = V4L2_MEMORY_DMABUF;
ioctl(ctx->cam_fd, VIDIOC_REQBUFS, &rb);
- The DMA buffer is an NvBuffer. You need to allocate the buffers and queue each one with ioctl(VIDIOC_QBUF):
struct v4l2_buffer buf;
memset(&buf, 0, sizeof(buf));
buf.index = index;
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.memory = V4L2_MEMORY_DMABUF;
ioctl(ctx->cam_fd, VIDIOC_QUERYBUF, &buf);
buf.m.fd = NVBUFFER_FD; /* dmabuf FD of the allocated NvBuffer */
ioctl(ctx->cam_fd, VIDIOC_QBUF, &buf);
And then you can capture frames by calling:
struct v4l2_buffer v4l2_buf;
memset(&v4l2_buf, 0, sizeof(v4l2_buf));
v4l2_buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
v4l2_buf.memory = V4L2_MEMORY_DMABUF;
ioctl(ctx->cam_fd, VIDIOC_DQBUF, &v4l2_buf);
You can first check whether capture works with the v4l2-ctl command. If that works, you can then try 12_camera_v4l2_cuda.
We have to develop our own V4L2 kernel driver that supports DMA buffer mode, because we’re developing a customized PCIe HDMI capture board.
Which code should we refer to in “kernel/nvidia/drivers/media/i2c”, or in another folder?
Hi,
There is a USB camera driver in the kernel source code. It is based on the V4L2 spec, and most USB cameras are plug-and-play with it. Please take a look at:
https://www.kernel.org/doc/html/v4.14/media/v4l-drivers/uvcvideo.html
https://linuxtv.org/downloads/legacy/video4linux/v4l2dwgNew.html
There is the following description in NVIDIA Jetson Linux Developer Guide.
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
V4L2 Kernel Driver (Version 2.0)
This topic is based on the Video for Linux 2 (V4L2) driver for the Sony IMX185 sensor, located at:
/kernel/nvidia/drivers/media/i2c/imx185.c
Examine this source file to obtain a complete understanding of the driver.
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
Will the source code under the i2c folder be helpful for developing the driver?
Also, there is no imx185.c file in that folder.
Which code is better to refer to?
We’re looking for driver source code that supports DMABUF and VIDIOC_REQBUFS.
Which code should we refer to in “kernel/nvidia/drivers/media/i2c”, or in another folder?
Hi,
We would suggest referring to the UVC driver. Please look at
kernel-4.9/drivers/media/usb/uvc/uvc_v4l2.c
kernel-4.9/drivers/media/usb/uvc/uvc_driver.c
It shows how to register the v4l2 ioctls so that frames can be captured through v4l2 with USB cameras.
Hi,
The sensor drivers in kernel/nvidia/drivers/media/i2c are for devices connected to the NVCSI interface, such as the default camera, ov5693. Your device goes through the PCIe interface, so a sensor driver may not be a good reference. We suggest you refer to the UVC driver to understand how the v4l2 ioctls are registered and used for video capture.
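For a custom capture driver, the usual route to supporting VIDIOC_REQBUFS with DMABUF on the kernel side is to build on the videobuf2 framework rather than hand-rolling the buffer ioctls. A hedged, non-runnable sketch of the relevant registration (struct and function names such as my_ioctl_ops and my_queue_init are hypothetical; many required fields are omitted):

```c
/* Kernel-side sketch: expose REQBUFS/QBUF/DQBUF with DMABUF support
 * by delegating the buffer ioctls to videobuf2 helpers. */
#include <media/v4l2-ioctl.h>
#include <media/videobuf2-v4l2.h>
#include <media/videobuf2-dma-contig.h>

static const struct v4l2_ioctl_ops my_ioctl_ops = {
    /* videobuf2 implements the buffer ioctls, including DMABUF import/export */
    .vidioc_reqbufs   = vb2_ioctl_reqbufs,
    .vidioc_querybuf  = vb2_ioctl_querybuf,
    .vidioc_qbuf      = vb2_ioctl_qbuf,
    .vidioc_dqbuf     = vb2_ioctl_dqbuf,
    .vidioc_expbuf    = vb2_ioctl_expbuf,
    .vidioc_streamon  = vb2_ioctl_streamon,
    .vidioc_streamoff = vb2_ioctl_streamoff,
};

static int my_queue_init(struct vb2_queue *q)
{
    q->type     = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    /* Advertise MMAP (so v4l2-ctl --stream-mmap works) plus DMABUF. */
    q->io_modes = VB2_MMAP | VB2_DMABUF;
    q->mem_ops  = &vb2_dma_contig_memops;
    /* q->ops, q->drv_priv, q->buf_struct_size, locking, etc. also required. */
    return vb2_queue_init(q);
}
```

The driver then only has to implement the vb2_ops callbacks (queue_setup, buf_queue, start/stop_streaming) that feed buffers to the PCIe DMA engine; the DMABUF plumbing itself comes from videobuf2.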
Can the 12_camera_v4l2_cuda sample support YUYV 4K 60p?
A “stack smashing detected” error occurs in our sample code, which is based on that sample.
What do you think?
After that, we moved forward a little, but we still have some trouble.
When we read the buffer size with NvBufferGetParams after creating the buffer with NvBufferCreateEx, the buffer size is zero, yet each function returns no error.
What do you think is the cause?
Hi,
Please share which parameter of NvBufferParams returns zero so that we can take a look. Since a YUV422 frame is (width * height * 2) bytes, you should be able to calculate the expected value.
We have found the issue in our driver.
It seems that data is received when we use “v4l2-ctl --stream-mmap=3 --stream-count=300”.
We’d like to use GStreamer.
What pipeline should we use?
Hi,
Please refer to the steps in the Jetson Nano FAQ, under “Q: I have a USB camera. How can I launch it on Jetson Nano?”
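For reference, a typical v4l2src pipeline of the kind that FAQ describes looks like the following. The device node, resolution, framerate, and sink are assumptions to adapt to the capture card's actual format:

```shell
# Assumed device node and caps; adjust to what the driver actually reports.
gst-launch-1.0 v4l2src device=/dev/video0 ! \
    "video/x-raw, format=YUY2, width=3840, height=2160, framerate=60/1" ! \
    nvvidconv ! "video/x-raw(memory:NVMM)" ! nvoverlaysink
```

nvvidconv moves the frames into NVMM (NvBuffer) memory so the hardware-accelerated Jetson elements can consume them; for a quick software-only check, videoconvert followed by xvimagesink can be substituted for the last two elements.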