Using VideoSource frames with VPI in C++

Hello, all!

I’ve been trying for a while to wrap data from a camera stream obtained via videoSource::Capture into a VPIImage, but have failed repeatedly. The VPI documentation examples only seem to use OpenCV as the video/image source.

How can I wrap the uchar3* image output from videoSource::Capture into a VPIImage? (I would like to take advantage of managed memory on Jetson Nano).

Thanks in advance!

Hi,

Is videoSource::Capture the OpenCV video source?
If so, can you use the sample directly?

Could you share more details about the issue you are seeing?
(e.g. source code, logs, error messages, …)

Thanks.

Hello, thanks for the fast reply.

> Is videoSource::Capture the OpenCV video source?

No, the ‘videoSource’ I refer to is the class from jetson-utils/video, which captures camera frames using gstCamera (in my case a CSI camera). As far as I can tell from the gstCamera code, the data is already treated as device memory, but it is returned as a uchar3*, where each pixel is stored in the x, y, z fields (x: R, y: G, z: B?).

I would like to process this data using VPI algorithms, which, as far as I understand from the VPI documentation, requires the data to be in VPIImage format. It is also not clear whether the data must be locked (vpiImageLock) when it comes from videoSource, as shown in the VPI samples.

I am currently unable to figure out how to do this wrapping, from uchar3* to VPIImage + VPIImageData.

I tried using vpiImageCreateCUDAMemWrapper and vpiImageCreateHostMemWrapper without success; maybe I am not using them properly.

Some code samples showing how to wrap a uchar3* in a VPIImage would be much appreciated.

For those who might be interested, I managed to make it work with the following code (safety checks omitted):

        ...
        //Init jetson-utils videoSource
        camera = videoSource::Create(input_res.c_str(), camera_options);

        // capture frame:
        uchar3* image = NULL;
        camera->Capture(&image);

        // Manually fill VPIImageData with your video parameters
        VPIImageData vpiImageData = {};  // zero-init so unused fields are not left uninitialized
        vpiImageData.format = VPIImageFormat::VPI_IMAGE_FORMAT_RGB8;
        vpiImageData.numPlanes = 1;
        vpiImageData.planes[0].data = image;
        vpiImageData.planes[0].width = 1280;
        vpiImageData.planes[0].height = 720;
        vpiImageData.planes[0].pitchBytes = 3 * 1280;
        vpiImageData.planes[0].pixelType = VPIPixelType::VPI_PIXEL_TYPE_3U8;

        // vpiImageData now describes the camera frame; see below for wrapping it into a VPIImage

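To actually submit this to VPI algorithms, the VPIImageData still has to be wrapped into a VPIImage. This is roughly what I use on VPI 1.x, assuming vpiImageCreateCUDAMemWrapper takes the image data, flags, and an output image (error checking omitted again):

        // Wrap the CUDA buffer described by vpiImageData into a VPIImage (VPI 1.x)
        VPIImage vpiImage = NULL;
        vpiImageCreateCUDAMemWrapper(&vpiImageData, 0, &vpiImage);

        // ... submit vpiImage to VPI algorithms here ...

        // Destroying the wrapper does not free the underlying camera buffer
        vpiImageDestroy(vpiImage);
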
I am able to export the contents to an OpenCV Mat to check for data consistency, and it seems to be working well, except that the R and B channels are swapped. How can I solve this? I tried changing VPIImageData.format to VPI_IMAGE_FORMAT_BGR8, but it does not fix it; maybe it's a problem with the color conversion in gstManager?

Using the following code to view the stream in OpenCV:

        vpiImageDataExportOpenCVMat(vpiImageData, &cvOut);
        cv::imshow("output", cvOut);
        cv::waitKey(1);
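
My current guess is that the swap only happens on the OpenCV side: vpiImageDataExportOpenCVMat just wraps the buffer, so the resulting Mat still holds RGB data while cv::imshow interprets 3-channel images as BGR. If that is the case, converting before display should fix the preview. A sketch, using a separate Mat so the camera buffer itself is not overwritten:

        vpiImageDataExportOpenCVMat(vpiImageData, &cvOut);

        // cvOut wraps the RGB frame; convert into a separate Mat for display,
        // since imshow expects BGR channel order.
        cv::Mat bgr;
        cv::cvtColor(cvOut, bgr, cv::COLOR_RGB2BGR);
        cv::imshow("output", bgr);
        cv::waitKey(1);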

Hi,

Are you looking for a way to convert between jetson-utils and VPI?
Below is a Python sample that might help:

https://elinux.org/Jetson/L4T/TRT_Customized_Example#VPI_with_Jetson-utils

Thanks.

Hello,

Unfortunately, the Python examples are not useful, since the data first goes through numpy, leading to a completely different API that does not exist in C++.
I managed to view the image with the code from my last reply (filling in VPIImageData manually), but the R and B color channels are swapped when exported to cv::Mat. I would really appreciate some help with this.

Please help; I am struggling a lot trying to understand how to do the uchar3* to VPIImage conversion.

Is VPIImage an object or just a pointer to image data (pixels in a given image format)?

When should vpiImageLock be used? What is its purpose, considering the shared-memory architecture of Jetson boards?

Documentation on using jetson-utils with VPI from C++ is pretty much nonexistent, but the documentation encourages the use of videoSource, so please help with that.

If this is not the correct channel for development support, please point me to the one that is.

@_padreco the uchar3* from jetson-utils is just a pointer to CUDA memory. VPIImage is a struct, and you should be able to use vpiImageCreateWrapper() with VPI_IMAGE_BUFFER_CUDA_PITCH_LINEAR:

https://docs.nvidia.com/vpi/group__VPI__Image.html#ga3e7cf2520dd568a7e7a9a6876ea7995c
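
Roughly something like the sketch below (untested and from memory, so double-check the struct and argument names against the docs above; width, height, and image are the variables from your capture code):

        // Untested sketch: wrap the uchar3* CUDA buffer from videoSource into a VPIImage
        // using the VPI 2.x pitch-linear CUDA buffer description.
        VPIImageData data = {};
        data.bufferType = VPI_IMAGE_BUFFER_CUDA_PITCH_LINEAR;
        data.buffer.pitch.format               = VPI_IMAGE_FORMAT_RGB8;
        data.buffer.pitch.numPlanes            = 1;
        data.buffer.pitch.planes[0].pixelType  = VPI_PIXEL_TYPE_3U8;
        data.buffer.pitch.planes[0].width      = width;
        data.buffer.pitch.planes[0].height     = height;
        data.buffer.pitch.planes[0].pitchBytes = 3 * width;  // tightly packed RGB8
        data.buffer.pitch.planes[0].data       = image;      // uchar3* from videoSource::Capture()

        VPIImage img = NULL;
        vpiImageCreateWrapper(&data, NULL, 0, &img);          // NULL params / 0 flags = defaults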

In Python, both VPI and jetson-utils support __cuda_array_interface__, so interoperability should be transparent, without needing to go through numpy:

https://github.com/dusty-nv/jetson-inference/blob/master/docs/aux-image.md#cuda-array-interface

Hello, @dusty_nv

Thanks for the reply, it clarified a lot, and thanks for the Python example, but my application requires C++.

I am currently using VPI version 1.2 (the latest available for Jetson Nano), which does not have the function you suggested, but it does have vpiImageCreateCUDAMemWrapper, which I was trying to use to wrap the data into a VPIImage. However, with the code example given above, the blue and red channels ended up swapped.

In the end, I decided to go with CUDA-compiled OpenCV, since some of the functionality I was looking for (RGB to HSV conversion) is missing from the VPI library.
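
For reference, the GPU conversion I ended up with looks roughly like this (assuming OpenCV built with CUDA and the cudaimgproc module; resolution hard-coded as in my earlier snippet):

        #include <opencv2/cudaimgproc.hpp>

        // Wrap the uchar3* CUDA buffer from videoSource::Capture without copying,
        // then convert RGB -> HSV on the GPU.
        cv::cuda::GpuMat d_rgb(720, 1280, CV_8UC3, image, 3 * 1280);
        cv::cuda::GpuMat d_hsv;
        cv::cuda::cvtColor(d_rgb, d_hsv, cv::COLOR_RGB2HSV);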

Thanks for the help and for the good work on jetson-utils, and please, whenever possible, try to add examples of VPI + jetson-utils usage.

If you’re looking for OpenCV CUDA RGB to HSV, you may want to see this topic: GPU Acceleration Support for OpenCV Gstreamer Pipeline - #3 by Honey_Patouceul

Thanks for the reply. I have managed to get what I wanted using OpenCV with CUDA.

Unfortunately, I had to abandon the use of VPI.
