Capture video to an H.264 file using V4L2 and the NVIDIA Multimedia APIs with a UYVY sensor (e-CAM130_CUXVR)


I have this setup:
Jetson Xavier
Jetpack R32 2.1
Camera: e-Con Systems e-CAM130_CUXVR (UYVY format)

I need to capture the camera stream with V4L2 and save it as an H.264 video file at 30 FPS on the SSD. At 4K resolution it was not possible to capture video at 30 FPS using GStreamer, since frames are dropped, and e-Con support suggested using the Multimedia APIs.

For that purpose, I’m trying to get this application to work:

(Note: The original example in /usr/src/tegra_multimedia_api/samples/12_camera_v4l2_cuda works fine)

When I run the application with these parameters: ./camera_v4l2_cuda -d /dev/video0 -s 1920x1080 -f UYVY -n 10

I get this output:

libv4l2_nvvidconv (0):(802) (INFO) : Allocating (4) OUTPUT PLANE BUFFERS Layout=0
libv4l2_nvvidconv (0):(818) (INFO) : Allocating (4) CAPTURE PLANE BUFFERS Layout=0
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
H264: Profile = 100, Level = 50 
encoded frame size 9797 
[ERROR] (/media/interactive/SSD/projects/common/classes/NvV4l2ElementPlane.cpp:256) <enc0> Output Plane:Error while Qing buffer: Device or resource busy
ERROR: conv_capture_dqbuf_thread_callback(): (line:638) Failed to queue buffer on ENC output plane
[ERROR] (/media/interactive/SSD/projects/common/classes/NvV4l2ElementPlane.cpp:178) <enc0> Capture Plane:Error while DQing buffer: Broken pipe
encoded frame size 9797 
[ERROR] (/media/interactive/SSD/projects/common/classes/NvV4l2ElementPlane.cpp:178) <enc0> Capture Plane:Error while DQing buffer: Broken pipe
encoded frame size 9797 
[ERROR] (/media/interactive/SSD/projects/common/classes/NvV4l2ElementPlane.cpp:256) <enc0> Capture Plane:Error while Qing buffer: Device or resource busy
ERROR: enc_capture_dqbuf_thread_callback(): (line:666) Failed to queue buffer on ENC capture plane

Could someone please guide me on how to solve this problem?

Thank you very much in advance

Please apply this patch

Hello DaneLLL,

The file recording works OK!

I’ve applied the patch to the original demo 12_camera_v4l2_cuda, and then added the file-write part taken from the 10_camera_recording example.

Now I need to save the streams from 4 cameras in a synchronized way. What approach is recommended? Is there a multi-camera example with synchronization?

Thanks in advance

We only have the 12_camera_v4l2_cuda sample, which launches a single v4l2src. You may refer to it and adapt it to your use case.
If you need to composite 4 camera frames into one frame, you may try NvBufferComposite(). The NvBuffer APIs are defined in nvbuf_utils.h, FYR.

Thanks DaneLLL,

I’m adapting the example 12_camera_v4l2_cuda, duplicating all the code to manage 2 cameras. I’m not using NvBufferComposite(), but I’m using 2 contexts, 2 calls to NvEglRenderer::createEglRenderer(), 2 calls to NvVideoEncoder::createVideoEncoder(), etc. The idea is to have a window for each camera.

  1. I’d like to hear your opinion on that approach.

  2. The problem is that the callback for the second camera, “enc_capture_dqbuf_thread_callback()”, is never called. I’m still investigating, but it appears to be some kind of blocking caused by the first camera. The first camera is shown OK on the screen, but the second is black. If I disable capture for the first camera, the second is shown OK. I cannot see both at the same time.

Thanks for your help!

Instead of running two encoding/rendering threads in a single process, how about running a single encoding/rendering thread in each process? In that case you can run something like:

$ ./camera_v4l2_cuda -d /dev/video0 & ./camera_v4l2_cuda -d /dev/video1 &