I need to render frames from an nvv4l2 camera source to an OpenGL renderer. Is there anything in the Jetson Multimedia API for this? I found one example, but it renders from nvarguscamerasrc and uses NVIDIA's API (setDisplay) for that. Is there something equivalent for an nvv4l2 source? If not, is there an alternative for displaying v4l2src frames in an OpenGL renderer?
Please check if the nv3dsink plugin fits your use case. It is open source, so you can download the source code package and inspect it. If you use JetPack 4.6.3 (r32.7.3), please download:
Driver Package (BSP) Sources
We can run a GStreamer pipeline like:
gst-launch-1.0 nvv4l2camerasrc ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! nv3dsink
Thanks @DaneLLL, the pipeline works and I'm able to render the frames. But I'm getting a latency of 130 ms, which is too high. Can we do something to bring it below 100 ms?
Please try … ! nv3dsink sync=0 and check if there is improvement. It disables the synchronization mechanism in GStreamer.
Thanks, DaneLLL, for the quick response, but the latency is still 130 ms.
Please also execute the steps and check:
- Run sudo nvpmodel -m 0 and sudo jetson_clocks
- Enable VIC at maximum clock:
Nvvideoconvert issue, nvvideoconvert in DS4 is better than Ds5? - #3 by DaneLLL
After executing the steps, the system is in maximum performance mode. If there is still not much improvement, we would suspect the latency comes from the camera source itself.
Thanks for the quick response. After enabling VIC at maximum clock I'm still getting a latency of 130 ms.
While trying to reduce latency, I found two interesting observations:
- Even when I pass V4L2_BUFFERS_NUM as 1 or 2, after the ioctl call I still receive 3 frames with my custom camera, whereas with a USB camera, if I ask for 1 buffer I get 1. So my question is: is there a way to ask for 1 buffer and forcibly get exactly 1? I think that would solve my problem.
struct v4l2_requestbuffers rb;
memset(&rb, 0, sizeof(rb));
rb.count = V4L2_BUFFERS_NUM;
rb.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
rb.memory = V4L2_MEMORY_MMAP;
if (ioctl(ctx->cam_fd, VIDIOC_REQBUFS, &rb) < 0)
    ERROR_RETURN("Failed to request v4l2 buffers: %s (%d)",
                 strerror(errno), errno);
- Also, when I printed the difference between system time and frame capture time, I got a difference of 2 s, compared with 20 ms for the USB camera.
It looks like the latency is from the camera source. After executing the steps, the system is at maximum performance mode. Do you use a YUV sensor or a USB camera? Generally there is a buffering mechanism in capturing frames, and it may not work properly if you try to reduce the buffer number in the sensor driver.
I'm using a CSI camera that gives YUYV frames at 30 fps (frame resolution: 400 x 400).
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.