V4l2-ctl with --set-ctrl bypass_mode=1

Hi guys,
I want to capture through Argus on a Tegra NX. Right now I capture with this application:

v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 -d /dev/video0 --stream-to=raw.RG10

So how can we capture through the ISP with --set-ctrl bypass_mode=1? If we set it, it doesn't capture.
And how can we capture in DMA mode? Please give an example.
Thanks so much.

hello hhami.20401,

please try a gst pipeline with the nvarguscamerasrc plugin; you may also see the developer guide, Accelerated GStreamer, for sample pipelines.

No, I don't want to use the Argus API or the nvarguscamerasrc plugin; I want to use “v4l2-ctl”, so how can I capture?
Thanks so much.

please check Camera Software Architecture; you must use libargus to access the internal ISP (i.e. the [Camera Core] block).

Hi, thanks for your attention.
Even without the Argus API and the /usr/lib/aarch64-linux-gnu/tegra/libnvargus_socketclient.so library (libargus, as you mentioned), I used the /usr/src/jetson_multimedia_api/samples/unittest_samples/camera_unit_sample source code with the /usr/lib/aarch64-linux-gnu/libv4l2.so library to capture the Bayer camera, and both of them use the ISP. Only v4l2-ctl with --set-ctrl bypass_mode=0 and the /usr/src/jetson_multimedia_api/samples/v4l2cuda source code (MMAP mode, RG10 pixel format) don't use the ISP. So how can we use v4l2-ctl with --set-ctrl bypass_mode=1, or use the /usr/src/jetson_multimedia_api/samples/v4l2cuda source code and still go through the ISP, in either MMAP or DMA mode? In other words, I don't want to use libargus or libv4l2, and I want to capture from the Bayer sensor.
thanks so much.

actually, this is not implemented.
you may refer to the kernel sources below for more details.
$public_sources/kernel_src/kernel/nvidia/drivers/media/platform/tegra/camera/vi/channel.c

static void tegra_channel_buffer_queue(struct vb2_buffer *vb)
{
...
         /* for bypass mode - do nothing */
         if (chan->bypass)
                 return;
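
to illustrate: below is a minimal sketch (not NVIDIA code, error handling omitted) of the plain V4L2 MMAP capture path that v4l2-ctl follows when bypass_mode=0, assuming the same 1920x1080 RG10 node on /dev/video0 as in your command. because tegra_channel_buffer_queue() does nothing when chan->bypass is set, the queued buffer is never handed to the VI hardware, so with bypass_mode=1 the VIDIOC_DQBUF below would simply never complete.

#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
        int fd = open("/dev/video0", O_RDWR);   /* blocking mode */

        /* negotiate the raw Bayer format the sensor mode provides */
        struct v4l2_format fmt = {0};
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 1920;
        fmt.fmt.pix.height = 1080;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB10;   /* 'RG10' */
        ioctl(fd, VIDIOC_S_FMT, &fmt);

        /* request a single MMAP buffer and map it into userspace */
        struct v4l2_requestbuffers req = {0};
        req.count = 1;
        req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_REQBUFS, &req);

        struct v4l2_buffer buf = {0};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = 0;
        ioctl(fd, VIDIOC_QUERYBUF, &buf);
        void *mem = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                         MAP_SHARED, fd, buf.m.offset);

        /* queue the buffer and start streaming; this is the point where the
         * driver's buffer_queue hook runs, and where bypass mode drops it */
        ioctl(fd, VIDIOC_QBUF, &buf);
        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        ioctl(fd, VIDIOC_STREAMON, &type);

        /* blocks until the VI writes a raw frame (never returns in bypass mode) */
        ioctl(fd, VIDIOC_DQBUF, &buf);
        fwrite(mem, 1, buf.bytesused, stdout);   /* raw Bayer data, no ISP */

        ioctl(fd, VIDIOC_STREAMOFF, &type);
        munmap(mem, buf.length);
        close(fd);
        return 0;
}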

thanks for your attention,
Does it mean that the /usr/src/jetson_multimedia_api/samples/unittest_samples/camera_unit_sample source code and the corresponding library /usr/lib/aarch64-linux-gnu/libv4l2.so don't go through this function (because they use the ISP, and I think bypass_mode=1), so it must return? And what about the /usr/lib/aarch64-linux-gnu/tegra/libnvargus_socketclient.so library and the Argus API: do they also not go through this function (because they use the ISP, and I think bypass_mode=1), so it must return? Then why do they capture? Do they go through some other function?
thanks so much.

hello hhami.20401,

please refer to the Camera Architecture Stack; frames go through different hardware blocks depending on your userspace application.

that's a v4l2 application (note: you may dig into the code; it's sending sensor controls via v4l2 APIs),
hence it's not using the internal NVIDIA ISP.

let me revise this; the above was incorrect.
camera_unit_sample is an application using libargus low-level APIs; it goes through libargus and uses the ISP to process the frames.
checking the [Camera Architecture Stack], the ISP belongs to the [Camera Core] block.

as you can see in the Camera Software Architecture, it's a libargus application.

Hi, thanks for your attention.
Based on https://docs.nvidia.com/jetson/archives/r35.4.1/DeveloperGuide/text/SD/CameraDevelopment/CameraSoftwareDevelopmentSolution.html#camera-architecture-stack

The camera architecture includes the following NVIDIA components:

    libargus: Provides a low-level API based on the camera core stack.

    nvarguscamerasrc: NVIDIA camera GStreamer plugin that provides options to control ISP properties using the ARGUS API.

    v4l2src: A standard Linux V4L2 application that uses direct kernel IOCTL calls to access V4L2 functionality.

I didn't get the point: does libargus use v4l2src, V4L2, and IOCTL calls?
Based on https://docs.nvidia.com/jetson/l4t-multimedia/l4t_mm_camera_unit_sample.html


Do camera_unit_sample and the /usr/lib/aarch64-linux-gnu/libv4l2.so library use libargus? You mentioned “that's a v4l2 application (note: you may dig into the code; it's sending sensor controls via v4l2 APIs), hence it's not using the internal NVIDIA ISP”, so I got confused. Does using libargus mean going through the ISP? Which ISP is internal? Is there another type of ISP? I don't know the difference between the internal ISP and any other type of ISP, if one exists.
Which type of ISP does camera_unit_sample use? Which type of ISP does libargus use?
Is libargus the lowest-level API for Argus capture, or are the v4l2 APIs the lowest-level API?
Thanks so much.

hello hhami.20401,

FYI, I've also revised my previous comments to correct some mistakes.

here's an example to demonstrate the different pipelines being used.
terminating the Argus daemon service is the quickest way to check whether an application uses standard v4l IOCTLs or libargus.
you may execute $ sudo pkill nvargus-daemon and then run camera_unit_sample.
you should then see an error report like the following.

Opening in BLOCKING MODE
(Argus) Error FileOperationFailed: Connecting to nvargus-daemon failed: Connection refused (in src/rpc/socket/client/SocketClientDispatch.cpp, function openSocketConnection(), line 204)
(Argus) Error FileOperationFailed: Cannot create camera provider (in src/rpc/socket/client/SocketClientDispatch.cpp, function createCameraProvider(), line 106)
ArgusV4L2_Open failed: Connection refused
Opening in BLOCKING MODE 
Device does not support V4L2_CAP_VIDEO_CAPTURE_MPLANE
Camera is in error

however, the v4l pipeline should still be functional even with the Argus daemon service terminated.
for example,

$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG12 --set-ctrl bypass_mode=0 --set-ctrl sensor_mode=0 --stream-mmap --stream-count=100
<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 27.26 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 28.69 fps

some Q&As follow…

for example, if you're using a YUV camera sensor, it's the external ISP (which is on the camera module) that processes the frames, and it outputs YUV content to the CSI brick.
libargus cannot support such format types; they only work with the v4l pipeline.
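
as an illustration only (not from the documentation), a small standalone check like the sketch below, which just enumerates what the video node advertises via VIDIOC_ENUM_FMT, can tell the two cases apart: a node that only reports Bayer formats such as RG10/RG12 needs the internal ISP (through libargus) to get processed YUV/RGB output, while a node that already reports YUV formats is typically backed by an external ISP on the camera module. /dev/video0 is just an assumed device node here; v4l2-ctl --list-formats reports the same information.

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
        int fd = open("/dev/video0", O_RDONLY);   /* assumed device node */

        struct v4l2_fmtdesc desc;
        memset(&desc, 0, sizeof(desc));
        desc.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

        /* iterate until VIDIOC_ENUM_FMT reports no more formats */
        while (ioctl(fd, VIDIOC_ENUM_FMT, &desc) == 0) {
                printf("%c%c%c%c  %s\n",
                       desc.pixelformat & 0xff,
                       (desc.pixelformat >> 8) & 0xff,
                       (desc.pixelformat >> 16) & 0xff,
                       (desc.pixelformat >> 24) & 0xff,
                       (const char *)desc.description);
                desc.index++;
        }

        close(fd);
        return 0;
}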

they're the same; both are using the ISP, which belongs to the [Camera Core] block.

as you can see in the above example demonstrating the different pipelines,
the v4l2 API and the libargus API go through different code flows.
