Argus library questions

Hi all. I have some questions about the Argus library, probably directed more at the NVIDIA people.

First question: given the task “receive a frame from a camera into GPU memory with minimal delay”, is the Argus library the fastest method?

Next, I’m trying to compile and run a minimal Argus program. After flashing the latest JetPack, I see the library at /usr/lib/aarch64-linux-gnu/tegra/libargus.so and the headers at /home/nvidia/tegra_multimedia_api/argus/include/ – is it correct to use them from there?

The program is minimal:

// printMsg() and REQUIRE_MSG() are local logging/assert helpers.
printMsg("Argus: Begin");

using namespace Argus;
UniqueObj<CameraProvider> cameraProvider(CameraProvider::create());

// The interface cast fails (returns null) if the driver could not be initialized.
ICameraProvider* iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
REQUIRE_MSG(iCameraProvider, STR("Failed to establish libargus connection"));

printMsg("Argus OK");

When I run it as a user other than the “nvidia” user, the output is:

Error: Can't initialize nvrm channel
Error: Can't initialize nvrm channel
Couldn't create ddkvic Session: Cannot allocate memory
main() begin
Argus: Begin
NvIspCtrlInitialize: Error opening ctrl node /dev/nvhost-ctrl-isp (Permission denied)Error: Can't initialize nvrm channel
SCF: Error ResourceError:  (propagating from src/services/capture/CaptureServiceDeviceIsp.cpp, function open(), line 167)
SCF: Error ResourceError:  (propagating from src/services/capture/CaptureServiceDevice.cpp, function initialize(), line 314)
SCF: Error InvalidState: Isp is not opened (in src/services/capture/CaptureServiceDeviceIsp.cpp, function close(), line 195)
SCF: Error ResourceError:  (propagating from src/services/capture/CaptureService.cpp, function startService(), line 937)
SCF: Error InvalidState: Queue mutex not initialized (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamerautils/inc/QueueImpl.h, function dequeue(), line 227)
SCF: Error ResourceError:  (propagating from src/components/ServiceHost.cpp, function startServices(), line 120)
SCF: Error ResourceError:  (propagating from src/api/CameraDriver.cpp, function initialize(), line 153)
SCF: Error ResourceError:  (propagating from src/api/CameraDriver.cpp, function getCameraDriver(), line 100)
(Argus) Error ResourceError:  (propagating from src/api/GlobalProcessState.cpp, function createCameraProvider(), line 204)
Failed to establish libargus connection

Here the problems are:

  1. A capture library prints errors to stdout/stderr; it would be better to return error codes or at least throw exceptions so the main application can handle them.
  2. The library runs code before the main() function (via global constructors), which is not good.
  3. What do these errors mean?

When I run it via sudo, or from the “nvidia” user without sudo, it gives:

main() begin
Argus: Begin
OFParserGetVirtualDevice: virtual device driver node not found in proc device-tree
OFParserGetVirtualDevice: virtual device driver node not found in proc device-tree
LoadOverridesFile: looking for override file [/Calib/camera_override.isp] 1/16LoadOverridesFile: looking for override file [/data/nvcam/settings/camera_overrides.isp] 2/16LoadOverridesFile: looking for override file [/opt/nvidia/nvcam/settings/camera_overrides.isp] 3/16LoadOverridesFile: looking for override file [/var/nvidia/nvcam/settings/camera_overrides.isp] 4/16LoadOverridesFile: looking for override file [/data/nvcam/camera_overrides.isp] 5/16LoadOverridesFile: looking for override file [/data/nvcam/settings/e3326_front_P5V27C.isp] 6/16LoadOverridesFile: looking for override file [/opt/nvidia/nvcam/settings/e3326_front_P5V27C.isp] 7/16LoadOverridesFile: looking for override file [/var/nvidia/nvcam/settings/e3326_front_P5V27C.isp] 8/16---- imager: No override file found. ----
Argus OK

This one looks better (and the pointer is not null), but it still tells us that something is wrong.

How do I fix this? It’s a fresh JetPack install.

@Hexagonal
There are some sample apps you can start with. Also, download the L4T documentation from the download center to check the details of the Argus API.

/home/nvidia/tegra_multimedia_api/argus/samples

Hi ShaneCCC.

I see the samples (which don’t compile after a default installation), and I started with them. I also found the Argus.0.96.pdf doc.

I’m even able to capture a single JPEG image with it successfully.

But in the process, the library prints messages like these to stdout (why?):

OFParserGetVirtualDevice: virtual device driver node not found in proc device-tree
OFParserGetVirtualDevice: virtual device driver node not found in proc device-tree
LoadOverridesFile: looking for override file [/Calib/camera_override.isp] 1/16LoadOverridesFile: looking for override file [/data/nvcam/settings/camera_overrides.isp] 2/16LoadOverridesFile: looking for override file [/opt/nvidia/nvcam/settings/camera_overrides.isp] 3/16LoadOverridesFile: looking for override file [/var/nvidia/nvcam/settings/camera_overrides.isp] 4/16LoadOverridesFile: looking for override file [/data/nvcam/camera_overrides.isp] 5/16LoadOverridesFile: looking for override file [/data/nvcam/settings/e3326_front_P5V27C.isp] 6/16LoadOverridesFile: looking for override file [/opt/nvidia/nvcam/settings/e3326_front_P5V27C.isp] 7/16LoadOverridesFile: looking for override file [/var/nvidia/nvcam/settings/e3326_front_P5V27C.isp] 8/16---- imager: No override file found. ----
SCF: Error InvalidState:  NonFatal ISO BW requested not set. Requested = 2147483647 Set = 4687500 (in src/services/power/PowerServiceCore.cpp, function setCameraBw(), line 653)
NVMAP_IOC_WRITE failed: Invalid argument
NVMAP_IOC_READ failed: Invalid argument
NVMAP_IOC_READ: Offset 0 SrcStride 2048 pDst 0x7f3dcfdcf0 DstStride 2048 Count 1

How can I understand what the library is complaining about? The API documentation doesn’t help here.

It’s a default TX2 development board with the default camera and the latest default firmware (JetPack), flashed by me from a fresh install of the recommended Ubuntu version. I would expect it to work without any errors (and the samples to compile after the default setup).

Maybe the correct question is: “What is the lowest-level and most robust API to capture raw frames from a CSI camera on the Jetson TX2 board?”

Is the GStreamer API lower or higher level compared to Argus? Or is it just different?

Is there a robust low-level API for working with the camera, with the lowest latency possible? Without std::vectors in headers, and without global C++ objects that run before main() and write to stderr?

If you want to get raw data, then use v4l2-ctl:

v4l2-ctl -d /dev/video0 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --stream-to=ov.raw

Thank you for pointing that out.

So v4l2 is the lowest-level and fastest API?

Strangely, the Argus “IImage” instance saves a correct JPEG, but the same instance returns garbage data when mapBuffer() is called. It returns correct image dimensions, the stride is correct, and the image has two buffers, for the luma and chroma planes of YUV420, of approximately the right size (a bit larger). But the memory for the luma plane contains what looks like random data.

In the samples, the “yuvJpeg” sample gets pixel data via a mapBuffer() call, but it only prints a few bytes, so you can’t check whether it’s a correct image or not.

@Hexagonal
Even if you print all of them, you still can’t check whether it’s correct or not, because it’s the data output from the ISP, not the RAW data from the sensor.

But Argus was configured to produce YUV420, so it should be either planar YUV420 or a packed format.

Or at least some useful format; otherwise, how do we use the received picture?
Or is it some kind of metadata?

I want to know the data format of mapBuffer() output.
I am trying to capture data from 6 cameras at once on a TX2. I am using a Leopard Imaging IMX274 array.
I am running the sensors at 1920x1080 at 60 fps. When I run the libargus code with 2 sensors per capture session, I get 60 fps for each camera. But I couldn’t use the data from the EGL image.
I tried to convert I420 to RGB using NvVideoConverter, but I get full FPS only with 2 sensors.
The moment I reach 4 sensors, the FPS drops to 30, and with all 6 sensors I only get 7 FPS, which is totally useless.
What’s the efficient way to get RGB frames from all 6 sensors at 60 FPS?

@Hexagonal
Yes, it’s YUV420 or NV21 from the ISP. It’s not the same as the raw data from v4l2-ctl.
What I said is that the buffer from Argus has already been cropped/downscaled, so you don’t know what the correct data is.

@veshnu
You may need to break down which component consumes the most time to figure it out.

ShaneCCC, maybe you know, or could point me to it:

What is the best code example of low-latency video input on Jetson? With libArgus or by any other means.

The task is just “to get the image from the sensor into GPU memory with minimal delay, in any image format”.

(a reasonable question for a real-time vision platform)

@Hexagonal
Not sure if the cudaHistogram sample is what you expect.

I would like to follow up on this question. Is there a way in libArgus (or maybe in a different streaming library) to specify a pointer to your already-allocated GPU memory, and then stream the video frame data to that location regardless of the format (raw)?

In the cudaHistogram sample, this is abstracted away from us behind EGL streams (i.e., the call to cuGraphicsResourceGetMappedEglFrame). This can’t be the simplest way to acquire raw data from the frame.

The native buffer functionality also seems inefficient. The way it’s set up, it looks like I have to copy to the native buffer (copyToNvBuffer), and then do an NvBuffer2Raw if I want the raw data, which is basically two full copies. Any help would be much appreciated!

Thanks,
aruby

I am sorry to say libargus does not support getting RAW data yet.

I assume that I need to stream data at the v4l2 level instead if I want control over data formatting myself, rather than relying on the performance of the Argus copyToNvBuffer functionality. I already know that the conversions in that function are too slow for my application.

Is there another, faster way you can suggest to directly access the buffer data of the frames being captured from the camera?

The memory layout of Argus output is block linear. If you need pitch linear, you still need the conversion.

It is a HW conversion and should be faster than a SW conversion. What is your resolution, and how long does each conversion take?

If you use OpenGL, please refer to tegra_multimedia_api/argus/samples/openglBox. GL operations do not need copyToNvBuffer().