I am looking into the VIC module on the Jetson Orin NX for our product. The Orin Series System-on-Chip TRM lists the features the VIC supports, but it is not clear to me which API (MMAPI or VPI) should be used to exercise some of these features/modules. Some of the features are listed below.
Thank you for getting back to me on this. I really appreciate you sending me the links to the sources and examples where I can research these topics further.
The data sheet (Jetson_Orin_NX_DS-10712-001_0.5.pdf) discusses the Image Signal Processor, but I don't see any detailed discussion of its features compared to the other modules in the TRM. Could you point me to a resource where I can get more information on using the ISP, and on which API to use for it?
I have an IMX219 camera connected to my Orin NX, and I have a few questions about which features of which hardware block are being utilized.
When I run the /jetson_multimedia_api/argus/samples/denoise sample, I don't observe any VIC activity in jtop. Would this mean that setDenoiseMode() and setDenoiseStrength() use the Jetson ISP (Jetson Orin NX data sheet, section 1.6.3, Hardware Noise Reduction)?
a) For the above command I notice both the VIC and the GPU being utilized. Is the VIC being used for the RG10 → NV12 conversion, while the GPU is used because of nveglglessink?
b) nvarguscamerasrc uses setDenoiseMode() and setDenoiseStrength(). So, would it be correct to say that the Jetson ISP's hardware denoising is used here rather than the TNR capability of the VIC?
c) The original command I tried used nvdrmvideosink. That did not work, and I had to switch to nveglglessink. In my setup the Jetson is connected to a regular monitor. What would cause nvdrmvideosink in the pipeline to produce an error?
nvvidconv does not seem to have a tnr property. Does this mean that the VIC's TNR capability is usable only through the JMMAPI NvVideoConverter helper class and through VPI?
Is the lens distortion correction feature of the VIC usable only through VPI?
The TRM mentions a 4x4 bicubic pixel interpolation filter. Do the pixel interpolation filters mentioned in the TRM map to nvvidconv's interpolation-method property? If so, how is the 4x4 bicubic filter selected? I am on JetPack R36.4.3.
gst-inspect-1.0 nvv4l2camerasrc does not show an interlace-mode property in my setup. Is that expected? How are de-interlacing and inverse telecine to be exercised? Is there a sample or a gst-launch command?
Yes, the VIC is used by copyToNvBuffer(), createNvBuffer(), and NvBufSurfaceTransform(). The GPU usage comes from nveglglessink. The plugins are open source, so you may check the source code.
Thank you for clarifying my doubts. I have a few more questions:
"Lens shading compensation" is mentioned in the ISP section of the Orin NX data sheet. I wasn't able to correlate this feature with any nvarguscamerasrc property or libargus API. How is the ISP configured/programmed to carry out this feature?
jetson_multimedia_api/samples/unittest_samples/transform_unit_sample implements the transformation for a subset of formats. It uses fill_bytes_per_pixel(), where the bytes per pixel per plane are hardcoded. Where can one find the bytes-per-pixel-per-plane information for all of the color formats in nvbufsurface.h?
Does transform_unit_sample only handle a subset of formats? Would it be correct to say that the VIC can handle all of the formats listed in NvBufSurfaceColorFormat in nvbufsurface.h?