In need of a flow diagram for calls made via LibArgus to the CSI firmware

I’m working with the Argus library on a Jetson device and trying to understand how Argus API calls are translated into interactions with the underlying software and hardware components. Specifically, I’m interested in tracing the path of a single Argus API call (e.g., Argus::ICaptureSession::capture(...)) down through the kernel to the CSI firmware — I want to know what this one capture call is dispatched to at each layer.
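For reference, here is a minimal sketch of the kind of call I mean, based on the public Argus headers shipped with the jetson_multimedia_api package (no output stream is attached and error handling is omitted, so this only illustrates the call path, not a working capture):

    #include <Argus/Argus.h>
    #include <vector>

    int main()
    {
        using namespace Argus;

        // Entry point into libargus.
        UniqueObj<CameraProvider> provider(CameraProvider::create());
        ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);

        std::vector<CameraDevice*> devices;
        iProvider->getCameraDevices(&devices);

        // Session and request objects hold the per-capture state.
        UniqueObj<CaptureSession> session(
            iProvider->createCaptureSession(devices[0]));
        ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

        UniqueObj<Request> request(iSession->createRequest());
        // A real app would create an OutputStream and call
        // IRequest::enableOutputStream() before submitting the request.

        // This is the call whose path down to the CSI firmware I want to trace.
        iSession->capture(request.get());
        return 0;
    }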

From what I understand, the Argus library sits on top of the NVIDIA Multimedia API (MM API) framework, which acts as a middleware layer between the high-level API and the lower-level kernel drivers and hardware components. However, I’m struggling to find documentation or resources that clearly explain this flow and the interfaces between the MM API framework and the kernel drivers/hardware blocks.

Could someone from NVIDIA provide insights into the following:

  1. How does the Argus library interface with the MM API framework, and what are the key components involved in this interaction?
  2. Within the MM API framework, how are the Argus API calls translated into configurations and commands for the relevant hardware blocks like the ISP, VIC, and potentially the GPU?
  3. What are the specific kernel drivers and interfaces (e.g., device nodes, IOCTLs) that the MM API framework interacts with to program the hardware blocks? (a small probe sketch follows this list)
  4. Are there any resources, documentation, or sample code available that could help me better understand this flow and the interfaces between the various software and hardware components involved in the multimedia pipeline on Jetson devices?
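To make question 3 concrete, below is a hedged sketch of the kind of kernel-side endpoint I have in mind: directly querying a V4L2 capture node exposed by the VI driver. The node path /dev/video0 is an assumption and may differ per board:

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>
    #include <cstdio>

    int main()
    {
        // Assumed path: the first VI/CSI capture node on a typical Jetson.
        int fd = open("/dev/video0", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        // VIDIOC_QUERYCAP reports which kernel driver backs this node.
        v4l2_capability cap{};
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0)
            printf("driver: %s, card: %s\n",
                   (const char *)cap.driver, (const char *)cap.card);

        close(fd);
        return 0;
    }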

Any guidance or pointers in this regard would be greatly appreciated, as it would help me better comprehend the inner workings of the Argus library and the NVIDIA multimedia stack on Jetson devices. Thanks.

hello Shrihari33,

please check NVIDIA camera software architecture for reference.

Hello JerryChang.
Thanks so much for the link; it really helped me go through the flow of calls. But I’m afraid I still cannot get hold of the ‘libargus’ source code. I want to know where libargus calls into after it is invoked by the libargus application. I need a trace of the abstracted CSI calls (e.g., “capture_frame”, “setup”) from libargus down to the lower-level stack.

I would really appreciate any help around this. Thanks!

hello Shrihari33,

according to the software architecture, we do not publish the sources of [libargus] and [Camera Core].
however, please download the MMAPI package, i.e. $ sudo apt install nvidia-l4t-jetson-multimedia-api
you may then obtain the sources of the [libargus application] and also [nvarguscamerasrc].

BTW, here’s the documentation: Libargus Camera API.

Hello JerryChang, thanks so much for the documentation link. I appreciate it.
The package you mentioned does not contain the sources for Argus itself; it just contains the headers.

I have a few doubts though, which I would really appreciate if you could clear them out because I couldn’t find any solution in the documentation, nor online.

The image in the documentation showcases exactly how data flows through the whole system.
After a fair bit of research, I understood the direct V4L2 interactions with the sensor device, and then I realised that GStreamer is a nice wrapper for controlling the capture pipeline. I understand the flow when I consider the direct V4L2 method. But when I tried to make sense of the flow via libargus, I was left with several questions:

  1. What exactly is libargus? I understand that nvarguscamerasrc handles the Tegra driver configuration, but we also see that it interacts with the GStreamer application. So what exactly are libargus and the libargus app? I assume that in either route we must have a GStreamer application running, i.e., GStreamer is the software that creates the source-to-sink pipeline. Please correct me if I’m wrong.
  2. It is also said that libargus is a low-level API, but isn’t it a higher-level abstraction?
  3. I don’t understand what the “camera core” stack is. My guess is that nvarguscamerasrc relays ISP-related calls to libargus, which hands them off to the camera core, which in turn interacts with the Tegra drivers. But is this really true? If so, why three levels of indirection, if I may ask? Why couldn’t nvarguscamerasrc call the Tegra drivers directly?
  4. When using libargus, are the device setup calls (ioctl calls) relayed via the camera core, or directly via the GStreamer application through v4l2src?

I understand this might be too much to ask for, but I would truly appreciate any help around this. Thanks JerryChang!

hello Shrihari33,

let’s check whether the package is downloaded correctly.
you should be able to see the Argus samples ([libargus application]) after installing the MMAPI package.
for instance, /usr/src/jetson_multimedia_api/argus/samples/*

let me give you a quick overview,
here are two pre-built libraries:
/usr/lib/aarch64-linux-gnu/tegra/libnvargus.so
/usr/lib/aarch64-linux-gnu/tegra/libnvscf.so
these two are not public sources.
libnvargus.so is the [libargus] block, and libnvscf.so covers the [Camera Core] and [Tuning] blocks.
so, when using libargus, operations always go through [Camera Core] down to the kernel layer.
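as a quick sanity check on your target, here’s a small (hypothetical) program that just confirms both pre-built libraries are present and loadable at the paths listed above; link with -ldl.

    #include <dlfcn.h>
    #include <cstdio>

    int main()
    {
        const char *libs[] = {
            "/usr/lib/aarch64-linux-gnu/tegra/libnvargus.so",
            "/usr/lib/aarch64-linux-gnu/tegra/libnvscf.so",
        };
        for (const char *path : libs) {
            // RTLD_LAZY defers symbol resolution; we only test loadability.
            void *handle = dlopen(path, RTLD_LAZY);
            printf("%s -> %s\n", path, handle ? "loaded" : dlerror());
            if (handle) dlclose(handle);
        }
        return 0;
    }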

hello JerryChang,

I appreciate the quick reply. Yes, I do have the samples directory and have looked through them, but the problem I’m facing is not knowing how exactly the CSI / VI calls are abstracted throughout this flow.

I sort of understand the libargus APIs now. But if I need to understand the flow of, say, the CSI calls that are made (from the higher level down to the lower level), I presume I need the sources of libargus and the camera core. How could I get my hands on these sources?

One last question: what exactly is nvarguscamerasrc here? Could you perhaps clarify that? The docs mention that it handles ISP-related properties, but doesn’t the camera core do that too? It would be of great help if you could let me know why there are two levels of indirection. Is nvarguscamerasrc involved in the capture pipeline when using libargus?

I truly appreciate your reply, thanks!

hello Shrihari33,

nvarguscamerasrc is a GStreamer plugin, and its sources are publicly released.
for instance,
you’re able to access the camera stream and render a camera preview via the gst pipeline below.
$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),framerate=30/1,format=NV12' ! nvvidconv ! xvimagesink
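for reference, here’s a minimal sketch of driving that same pipeline from C++ via the GStreamer API (the pipeline string is copied from the command above; error handling omitted):

    #include <gst/gst.h>

    int main(int argc, char **argv)
    {
        gst_init(&argc, &argv);

        // Same pipeline as the gst-launch-1.0 command above.
        GstElement *pipeline = gst_parse_launch(
            "nvarguscamerasrc ! "
            "video/x-raw(memory:NVMM),framerate=30/1,format=NV12 ! "
            "nvvidconv ! xvimagesink", nullptr);

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        // Let the preview run for a few seconds, then tear down.
        g_usleep(5 * G_USEC_PER_SEC);

        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }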

you may visit the NVIDIA Jetson Linux 35.5.0 page to download the [Driver Package (BSP) Sources] package.
please extract the gst-nvarguscamera_src.tbz2 package to obtain the sources.

hello JerryChang,

Thank you so much for the explanation. I appreciate it. I shall have a look into it when I get to work with the board today.

Have a good day!
