VIC Features API on Orin NX

Hello,

I am looking into the VIC module on the Jetson Orin NX for our product. The Orin Series System-On-Chip TRM lists the VIC's supported features, but it is not clear to me which API (MMAPI or VPI) should be used to access some of these features and/or modules. Some of the features are listed below.

7.3.1.3.6 Rotation - {45, 315, etc.} degrees

Subpixel Source Rectangles

Bi-linear filter

7.3.1.3.7 De-interlacing - DiSi1, BOB, WEAVE

7.3.1.3.9 Blender

7.3.1.4.1 4x4 bicubic pixel interpolation filter

Thanks,

Vangogh

Hi,
We have implemented these functions in the NvBufSurface APIs. Please check the samples:
Jetson Linux API Reference: 07_video_convert (NvBufSurface conversion) | NVIDIA Docs
Jetson Linux API Reference: 13_argus_multi_camera (multi image capture & composite) | NVIDIA Docs

We have also implemented the nvvidconv and nvcompositor plugins in GStreamer. You can get the source code of the plugins from

Jetson Linux Release 36.4.4 | NVIDIA Developer
Driver Package (BSP) Sources
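As a quick way to exercise the VIC path through nvvidconv, here is a hedged pipeline sketch. The flip-method values come from `gst-inspect-1.0 nvvidconv`; note that this property covers 90°-step rotations and flips only, not the arbitrary-angle rotation (45°, 315°, etc.) listed in the TRM:

```shell
# Sketch: rotate a test source 90° counterclockwise via nvvidconv (VIC-backed).
# flip-method per gst-inspect-1.0 nvvidconv:
#   0=none 1=ccw-90 2=180 3=cw-90 4=h-flip 5=upper-right-diag 6=v-flip 7=upper-left-diag
gst-launch-1.0 videotestsrc num-buffers=60 ! \
  "video/x-raw, width=(int)640, height=(int)480" ! \
  nvvidconv flip-method=1 ! \
  "video/x-raw(memory:NVMM), format=(string)NV12" ! \
  fakesink
```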

Hi Dane,

Thank you for getting back to me on this; I really appreciate the links to the sources and examples where I can research these topics further.

Best regards,

vangogh

Hi Dane,

The data sheet (Jetson_Orin_NX_DS-10712-001_0.5.pdf) discusses the Image Signal Processor, but I don’t see any detailed discussion of its features compared to the other modules covered in the TRM. Could you point me to a resource with more information on using the ISP, and on the API for it?

Thank you,

vangogh

I was able to answer my own question regarding the API for the ISP: libargus and the nvarguscamerasrc GStreamer plugin are the resources I need to look at.

Hi,
You can install the samples by running the commands on Jetson device:

$ sudo apt update
$ sudo apt install nvidia-l4t-jetson-multimedia-api

The samples are installed to

/usr/src/jetson_multimedia_api
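For reference, building one of the installed samples might look like the sketch below (it assumes the usual per-sample Makefile layout of jetson_multimedia_api; running the binary without arguments prints its usage):

```shell
# Sketch: build the 07_video_convert sample in place after installing
# the nvidia-l4t-jetson-multimedia-api package.
cd /usr/src/jetson_multimedia_api/samples/07_video_convert
sudo make
./video_convert    # run without arguments to print the usage and supported formats
```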

Hi,

I have an imx219 camera connected to my Orin NX, and have a few questions about which features of which hardware block are being utilized.

  1. When I use the /jetson_multimedia_api/argus/samples/denoise sample, I don’t observe any VIC activity in jtop. Does this mean that setDenoiseMode() and setDenoiseStrength() use the Jetson ISP (Jetson Orin NX data sheet, section 1.6.3 Hardware Noise Reduction)?

  2. Using nvarguscamerasrc:

gst-launch-1.0 nvarguscamerasrc ! "video/x-raw(memory:NVMM), width=(int)640, height=(int)480, format=(string)NV12, framerate=(fraction)30/1, tnr-mode=2" ! queue ! nveglglessink -e

a) For the above command, I notice the VIC and GPU being utilized. Is the VIC being used for the RG10 → NV12 conversion, while the GPU is being used because of nveglglessink?

b) nvarguscamerasrc uses setDenoiseMode() and setDenoiseStrength(). So, would it be correct to say that the Jetson ISP’s hardware denoising is used here instead of the TNR capability of the VIC?

c) The original command I tried used nvdrmvideosink, but it did not work and I had to switch to nveglglessink. In my setup, the Jetson is connected to a regular monitor. What would cause nvdrmvideosink in the pipeline to produce an error?

  3. nvvidconv does not seem to have a tnr property. Does this mean that the VIC’s TNR capability is usable only through JMMAPI’s NvVideoConverter helper class and VPI?

  4. Is the lens distortion correction feature of the VIC usable only through VPI?

  5. The TRM mentions a 4x4 bicubic pixel interpolation filter. Do the pixel interpolation filters mentioned in the TRM map to nvvidconv’s interpolation-method property? If so, how is the 4x4 bicubic interpolation method selected? I am on JetPack R36.4.3.

  6. gst-inspect-1.0 nvv4l2camerasrc does not show interlace-mode in my setup. Is that expected? How are de-interlacing and inverse telecine to be exercised? Is there a sample or gst command?

Thank you,

vangogh

Hi,

Yes, the VIC is used when calling copyToNvBuffer(), createNvBuffer(), and NvBufSurfaceTransform(). The GPU usage is from nveglglessink. The plugins are open source, and you may check the source code.

Your understanding is correct.

Please check
Accelerated GStreamer — NVIDIA Jetson Linux Developer Guide

If you disable the Ubuntu desktop and load the DRM driver, it should work fine.
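A sketch of what disabling the desktop can look like on a stock L4T image, assuming gdm is the running display manager (adjust for lightdm), followed by the earlier pipeline with nvdrmvideosink swapped in:

```shell
# Sketch, assuming gdm is the display manager holding the DRM device.
sudo systemctl stop gdm
sudo loginctl terminate-seat seat0   # release the display so nvdrmvideosink can claim it

gst-launch-1.0 nvarguscamerasrc ! \
  "video/x-raw(memory:NVMM), width=(int)640, height=(int)480, format=(string)NV12, framerate=(fraction)30/1" ! \
  queue ! nvdrmvideosink -e
```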

Please use VPI - Vision Programming Interface: Temporal Noise Reduction

You are right.

The interpolation methods are defined as Nearest, Bilinear, etc. You may try the methods.
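To experiment with the methods mentioned above, a sketch using nvvidconv’s interpolation-method property; the enum values in the comment are taken from `gst-inspect-1.0 nvvidconv` on recent L4T releases and may differ on other versions:

```shell
# Sketch: upscale a test source with a chosen interpolation method.
# interpolation-method per gst-inspect-1.0 nvvidconv:
#   0=Nearest 1=Bilinear 2=5-Tap 3=10-Tap 4=Smart 5=Nicest
gst-launch-1.0 videotestsrc num-buffers=60 ! \
  "video/x-raw, width=(int)320, height=(int)240" ! \
  nvvidconv interpolation-method=5 ! \
  "video/x-raw(memory:NVMM), width=(int)1280, height=(int)960, format=(string)NV12" ! \
  fakesink
```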

De-interlacing is supported in video decoding. It is not supported for camera input.

You may refer to the examples of NvBufSurface + VPI.

Hi,

Thank you for clarifying my doubts. I have a few more questions:

  1. “Lens shading compensation” is mentioned in the ISP section of the Orin NX data sheet. I wasn’t able to correlate this feature to an nvarguscamerasrc property or a libargus API. How is the ISP configured/programmed to carry out this feature?

  2. jetson_multimedia_api/samples/unittest_samples/transform_unit_sample implements transformation for a subset of formats. It uses fill_bytes_per_pixel(), where the bytes per pixel per plane are hardcoded. Where can one find the bytes-per-pixel-per-plane information for all of the color formats in nvbufsurface.h?

  3. Does transform_unit_sample only handle a subset of formats? Would it be correct to say that the VIC can handle all formats found in NvBufSurfaceColorFormat in nvbufsurface.h?

Thanks,

vangogh