Network DMA into hardware buffer

Hello,

I am using an Orin NX and exploring the jetson_multimedia_api/samples/05_jpeg_encode sample. A few questions:

  1. Here, does “hardware buffer” mean memory internal to the NVJPEG unit that is mmap’d into the CPU address space?
  2. Can a network device DMA directly into such a hardware buffer?

Thanks.

Hi,

The DMA buffer is NvBufSurface, and hardware engines such as VIC, NVENC, and NVJPG can access the frame data directly. For mapping to the CPU, please call NvBufSurfaceMap() and NvBufSurfaceUnMap().
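
For reference, a minimal sketch of that mapping flow (assuming an already-allocated NvBufSurface; error handling trimmed, untested):

    #include "nvbufsurface.h"

    /* Map plane 0 of an NvBufSurface for CPU reads, then unmap. */
    int read_plane0(NvBufSurface *surf)
    {
        if (NvBufSurfaceMap(surf, 0, 0, NVBUF_MAP_READ) != 0)
            return -1;
        NvBufSurfaceSyncForCpu(surf, 0, 0);  /* sync caches before CPU access */
        unsigned char *data =
            (unsigned char *)surf->surfaceList[0].mappedAddr.addr[0];
        /* ... read pixels through data ... */
        NvBufSurfaceUnMap(surf, 0, 0);
        return 0;
    }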

If the device supports v4l2, the frame data can be captured into NvBufSurface directly. Please refer to the sample:

/usr/src/jetson_multimedia_api/samples/12_v4l2_camera_cuda
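
That sample queues the NvBufSurface dmabuf fds into the V4L2 capture queue with V4L2_MEMORY_DMABUF, along these lines (a trimmed sketch, not the sample verbatim):

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    /* Queue a dmabuf fd (e.g. from an NvBufSurface) as a V4L2 capture buffer. */
    int queue_dmabuf(int v4l2_fd, unsigned int index, int dmabuf_fd)
    {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_DMABUF;
        buf.index  = index;
        buf.m.fd   = dmabuf_fd;
        return ioctl(v4l2_fd, VIDIOC_QBUF, &buf);
    }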

Data alignment has to be considered for certain resolutions. For some resolutions, the pitch and width of the DMA buffer are not identical, so you need to check whether the v4l2 source can generate frame data that fits the alignment.
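
When the pitch is wider than the row width, a CPU-side copy has to go row by row. A sketch of that (assuming the surface is already mapped for write; untested):

    #include <string.h>
    #include "nvbufsurface.h"

    /* Copy a tightly packed frame into a possibly padded NvBufSurface plane. */
    void copy_packed_to_plane0(NvBufSurface *surf, const unsigned char *src)
    {
        NvBufSurfacePlaneParams *p = &surf->surfaceList[0].planeParams;
        unsigned char *dst =
            (unsigned char *)surf->surfaceList[0].mappedAddr.addr[0];
        unsigned int row_bytes = p->width[0] * p->bytesPerPix[0];
        for (unsigned int y = 0; y < p->height[0]; y++)
            memcpy(dst + y * p->pitch[0],  /* destination rows are pitch apart */
                   src + y * row_bytes,    /* source rows are tightly packed  */
                   row_bytes);
    }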

DaneLLL,
Thank you for pointing me to this application. When I run the 12_v4l2_camera_cuda application, I notice a black box in the upper-left corner drawn by cuda_postprocess(); this is as expected. The rest of the frame appears green. Is this the expected behavior?

A couple other questions:

  1. How do I use this application to display the captured image or a video stream without it appearing green?

  2. The application sets the filter to NvBufSurfTransformInter_Algo3. This enum is documented as
    “Specifies GPU-Lanzos, VIC-Smart interpolation.” Does this mean that if the operation were performed on a CUDA stream, the GPU Lanczos algorithm would be used?
    → I tried setting the session parameters with compute_mode set to GPU while gpu_id and stream are 0, and this failed. Would this be because of an incorrect gpu_id or stream id? (See the sketch after this list.)
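
For reference, what I tried corresponds roughly to this sketch; the explicit cudaStreamCreate() is my assumption about what a working GPU-mode configuration needs instead of passing 0:

    #include <cuda_runtime.h>
    #include "nvbufsurftransform.h"

    /* Configure the NvBufSurfTransform session for GPU compute. */
    NvBufSurfTransform_Error setup_gpu_session()
    {
        cudaStream_t stream;
        if (cudaStreamCreate(&stream) != cudaSuccess)
            return NvBufSurfTransformError_Execution_Error;

        NvBufSurfTransformConfigParams config = {};
        config.compute_mode = NvBufSurfTransformCompute_GPU;
        config.gpu_id       = 0;       /* the iGPU on Orin NX */
        config.cu_stream    = stream;
        return NvBufSurfTransformSetSessionParams(&config);
    }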

Thanks

Hi,
Please refer to the guidance in

Jetson AGX Orin FAQ
Q: I have a USB camera. How can I launch it on AGX Orin?

And see if you can get the camera preview by following it.

If you cannot see a valid camera preview, please run the following command and share the output for reference:

$ v4l2-ctl --list-formats-ext

so that we can suggest next steps.

$ v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture

    [0]: 'RG10' (10-bit Bayer RGRG/GBGB)
            Size: Discrete 3280x2464
                    Interval: Discrete 0.048s (21.000 fps)
            Size: Discrete 3280x1848
                    Interval: Discrete 0.036s (28.000 fps)
            Size: Discrete 1920x1080
                    Interval: Discrete 0.033s (30.000 fps)
            Size: Discrete 1640x1232
                    Interval: Discrete 0.033s (30.000 fps)
            Size: Discrete 1280x720
                    Interval: Discrete 0.017s (60.000 fps)

Hi,
The camera is a Bayer sensor. Do you have a sensor driver ready for using the ISP engine? For debayering, we suggest using the ISP engine; please check the sensor driver programming guide to enable the sensor through the ISP engine.
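
A quick way to check whether the ISP path works, assuming the driver is in place and JetPack's GStreamer plugins are installed, is a preview pipeline such as (the display sink may vary by setup):

$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! nvvidconv ! xvimagesink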

Hi,

I wasn’t quite clear on what you said. To answer your question: I don’t have a special driver for the camera.

I wanted to add that I am able to get video streaming to work using the JetsonHacks CSI camera application, so I think the camera is using a standard driver that is available in the image. Would I still need to enable the sensor through the ISP engine?

Thanks

Hi,

Do you mean this application:
CSI-Camera/simple_camera.py at master · JetsonHacksNano/CSI-Camera · GitHub

From that repository, yes, but I used simple_camera.cpp instead.

Hi,
The camera looks to be a Bayer sensor that works through the Argus stack. Please check these samples and try:

/usr/src/jetson_multimedia_api/samples/09_argus_camera_jpeg/
/usr/src/jetson_multimedia_api/samples/10_argus_camera_recording/
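
For orientation, the Argus setup those samples build on begins roughly like this (a minimal sketch; the real samples add output streams, capture requests, and an EGLStream consumer):

    #include <Argus/Argus.h>
    #include <vector>

    int main()
    {
        using namespace Argus;

        /* The provider owns the Argus/ISP stack. */
        UniqueObj<CameraProvider> provider(CameraProvider::create());
        ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);
        if (!iProvider)
            return 1;

        std::vector<CameraDevice*> devices;
        iProvider->getCameraDevices(&devices);
        if (devices.empty())
            return 1;  /* no sensor visible to Argus: driver/ISP not set up */

        /* One capture session per sensor; requests and output streams
           are created from here (see samples 09 and 10). */
        UniqueObj<CaptureSession> session(
            iProvider->createCaptureSession(devices[0]));
        return interface_cast<ICaptureSession>(session) ? 0 : 1;
    }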