Getting Single Images at Variable FPS / Resolution from CSI Camera

I have an RPi V2 CSI camera. I would like to query the camera every 1-2 seconds for a single image at low resolutions, then rapidly start a video stream at high resolution based on image content.

I was initially using GStreamer through OpenCV in Python, but this does not work for my use case: the frame rate and resolution are fixed once the pipeline is created, stream setup takes 2+ seconds, and reading images from the stream at a lower frame rate leads to synchronization issues.

Based on my reading, I think using libargus and writing some Python bindings would be the best way forward. This is probably trivial, yet I cannot find documentation anywhere; where can I find the libargus headers / source? I have currently flashed JP 4.4; do I really need to reflash with the L4T Multimedia API image? If I do, won’t I lose all the benefits of JP 4.4, like cuDNN 8, TensorRT 7.1.3, CUDA 10.2, etc.?

Sorry if these are simple questions, I am a fresh CS graduate and not a dedicated embedded developer.

You need to download the Multimedia API sample source code from the sdkmanager.
There is also some CUDA sample code there that may help.


Just to clarify, this means I’d need to re-flash my Jetson Nano using the sdkmanager while selecting the Multimedia API as one of the packages to include?

I’m a little confused about how package management works. I believe I flashed my Jetson Nano with L4T 32.3.1. From there, I followed instructions here to upgrade to JetPack 4.4.1. The documentation here seems to suggest JP 4.4.1 comes with the Multimedia API. Is this not the case? If so, where can I find the headers and source for libargus?

Running sudo apt show nvidia-l4t-jetson-multimedia-api results in:

Package: nvidia-l4t-jetson-multimedia-api
Version: 32.4.4-20201016124427
Priority: standard

yet I’m still unable to run anything. Notably, I have no “tegra_multimedia_api” directory under home. Is the Libargus library only meant for internal usage at NVIDIA or something? I am surprised that there’s a complete lack of usable documentation.

Hi gerardmaggiolino,

The Multimedia API is installed in:

/usr/src/jetson_multimedia_api

Thank you for providing this.

I’m building my project with CMake, but the build fails because libargus.so is not found.

    # Nanocam source with all Libargus libraries.
    file(GLOB ARGUS_SRC_LIST
            "${MM_ROOT}/samples/common/classes/*.cpp"
            "${MM_ROOT}/argus/samples/utils/*.cpp")
    link_directories("/usr/lib/aarch64-linux-gnu/tegra"
            "/usr/local/cuda/lib64"
            "/usr/lib/aarch64-linux-gnu"
            "/lib/aarch64-linux-gnu")
    include_directories("${MM_ROOT}/include"
            "${MM_ROOT}/include/libjpeg-8b"
            "${MM_ROOT}/argus/samples/utils"
            "/usr/include/libdrm"
            "/usr/local/cuda/include")
    file(GLOB SRC_LIST "${PROJECT_SOURCE_DIR}/src/*.cpp")
    add_library(nanocam STATIC "${SRC_LIST}" "${ARGUS_SRC_LIST}")
    target_include_directories(nanocam PUBLIC "${PROJECT_SOURCE_DIR}/include")
    target_link_libraries(nanocam PUBLIC
            "argus"
            "nvjpeg"
            "drm"
            "nvbuf_utils"
            "nvosd"
            "EGL"
            "GLESv2"
            "X11"
            "cuda")

The build fails with

[ 93%] Built target nanocam
[ 96%] Linking CXX executable SecurityCameraOnboard
/usr/bin/ld: cannot find -largus

I searched this forum, but only found threads saying the libargus.so source is not provided, or suggesting I follow the README.txt in /usr/src/jetson_multimedia_api/libargus, which I have done.

I have tried to follow along with the Rules.mk and Makefiles under /usr/src/jetson_multimedia_api/samples, but I cannot work out what I should be including differently. If I do not link the library “argus”, the linker fails at the first mention of “Argus” in my source.

I have also tried searching with find / -iname libargus* but to no avail.

Maybe libnvargus:

/usr/lib/aarch64-linux-gnu/tegra/libnvargus.so
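
In case it helps anyone following along: one way to pick that library up from CMake might look like this. This is an untested sketch; the path is the one quoted above, and the nanocam target name is taken from the earlier post.

```cmake
# Locate the prebuilt Argus library shipped with L4T (path as quoted above)
# instead of relying on -largus resolving from the default search dirs.
find_library(NVARGUS_LIB nvargus
        PATHS "/usr/lib/aarch64-linux-gnu/tegra")
target_link_libraries(nanocam PUBLIC "${NVARGUS_LIB}")
```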

Hi gerardmaggiolino,

Please follow “/usr/src/jetson_multimedia_api/argus/README.TXT” steps to install all requirements.


Thank you all for the great responses from all the NVIDIA people, this has helped a bunch.

In case other people have similar questions, here’s a summary of the information:

  • By default, the JetPack flash includes the Multimedia API. If you used the sdkmanager to flash, you might not have included the Multimedia API, in which case you must re-flash.
  • The Multimedia API is not in $HOME as many tutorials suggest, but in /usr/src/jetson_multimedia_api or /usr/src/nvidia/tegra_multimedia_api.
  • Libargus is not open source, but the compiled library is provided as libnvargus.so.
  • Learning from the samples in /usr/src/jetson_multimedia_api/samples, which the online libargus documentation refers to (e.g. 00_video_decode, 01_video_encode, …), is difficult, as they are long and have complex build rules.
  • Learning from /usr/src/jetson_multimedia_api/argus/samples is much easier: the samples are simple and complete, and the build rules are straightforward.

The samples at the location of the last bullet point have simple CMakeLists.txt rules that you can adopt directly. The /usr/src/jetson_multimedia_api/argus/cmake/FindArgus.cmake file is particularly useful, as it includes everything you need to build with libargus.

If you do adopt the file, make sure to change the relative path in find_path(ARGUS_INCLUDE_DIR Argus/Argus.h HINTS ...) to the absolute path /usr/src/jetson_multimedia_api/argus/include.
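
Putting the last two points together, a minimal sketch of pulling FindArgus.cmake into your own project might look like this. The module and include paths are the ones given above; ARGUS_INCLUDE_DIR is the variable named in the shipped module, but check the file itself for the exact result variables it sets.

```cmake
# Make the shipped Find module visible to CMake, then let it locate Argus.
list(APPEND CMAKE_MODULE_PATH "/usr/src/jetson_multimedia_api/argus/cmake")
find_package(Argus REQUIRED)

# Inside FindArgus.cmake, change the relative HINTS to the absolute path:
#   find_path(ARGUS_INCLUDE_DIR Argus/Argus.h
#           HINTS "/usr/src/jetson_multimedia_api/argus/include")
include_directories("${ARGUS_INCLUDE_DIR}")
```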

Overall, there is an immense amount of code in argus/samples, but despite the 10,000+ lines of C++ code developed as examples, there is pretty much zero library documentation. There is no introduction to how to effectively use the library, the workflow of classes, etc. It’s definitely hard to get familiar with.