Argus to OpenCV, or a datatype for a CUDA kernel

Hi,

I’m using the Jetson Multimedia API samples to develop my own application.

Basically, I need to grab frames from a MIPI camera using libargus, and then run some image processing on that frame buffer with a CUDA kernel.

If I could wrap it in a cv::cuda::GpuMat, that would be useful for multiple purposes.

I tried to implement this based on an existing forum post, but that approach seems to be outdated: the Argus-camera-to-OpenCV-Mat example below gives a bus error.
Is it still valid to use the NvBufferMemMap / NvBufferMemSyncForCpu / NvBufferMemUnMap methods, or how else can I achieve this?

Thank you.

bool CaptureConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)
{
    char filename[FILENAME_MAX];
    sprintf(filename, "output%03u.jpg", (unsigned)frameNumber);

    // Map the dma-buf fd for CPU read access and sync the cache.
    void *pdata = NULL;
    NvBufferMemMap(fd, 0, NvBufferMem_Read, &pdata);
    NvBufferMemSyncForCpu(fd, 0, &pdata);

    // Wrap the mapped plane in a cv::Mat (assumes packed RGBA) and convert to BGR.
    cv::Mat imgbuf = cv::Mat(CAPTURE_SIZE.height(), CAPTURE_SIZE.width(), CV_8UC4, pdata);
    cv::Mat display_img;
    cv::cvtColor(imgbuf, display_img, cv::COLOR_RGBA2BGR);

    // Unmap the buffer, then write the converted copy to disk.
    NvBufferMemUnMap(fd, 0, &pdata);

    cv::imwrite(filename, display_img);

    return true;
}
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
CONSUMER: Waiting until producer is connected...
PRODUCER: Available Sensor modes :
PRODUCER: [0] W=3856 H=2180 
PRODUCER: [1] W=2608 H=1964 
PRODUCER: [2] W=1920 H=1080 
PRODUCER: [3] W=1928 H=1090 
PRODUCER: [4] W=3856 H=2180 
PRODUCER: [5] W=2608 H=1964 
PRODUCER: [6] W=1920 H=1080 
PRODUCER: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
Bus error (core dumped)

hello jahwan.oh,

May I also confirm which JetPack release you’re working with?
Please check the release tag with $ cat /etc/nv_tegra_release for confirmation.

Hi, @JerryChang
Here you go.

nx-dev:~$ cat /etc/nv_tegra_release

R35 (release), REVISION: 2.1, GCID: 32413640, BOARD: t186ref, EABI: aarch64, DATE: Tue Jan 24 23:38:33 UTC 2023

hello jahwan.oh,

It’s now using NvBufSurface.
You may also refer to the developer guide, Buffer Management API module.
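For instance, here is a minimal sketch of the NvBufSurface equivalent of your NvBufferMemMap code. It is illustrative only: it assumes a batch size of 1 and a pitch-linear RGBA surface, and the helper name saveFrameToJpeg is made up. Note also that if your Argus stream is actually NV12/YUV420, or if the plane pitch is larger than width * 4 bytes, treating the mapping as a packed CV_8UC4 image can fault, which may be the cause of your bus error.

#include <opencv2/opencv.hpp>
#include "nvbufsurface.h"

// Sketch: map a dma-buf fd through NvBufSurface and wrap it in a cv::Mat.
// Assumes batch size 1 and a pitch-linear RGBA surface.
bool saveFrameToJpeg(int fd, const char *filename, int width, int height)
{
    NvBufSurface *surf = NULL;
    if (NvBufSurfaceFromFd(fd, (void **)&surf) != 0)
        return false;

    // Map plane 0 of buffer 0 for CPU read access and sync the cache.
    if (NvBufSurfaceMap(surf, 0, 0, NVBUF_MAP_READ) != 0)
        return false;
    NvBufSurfaceSyncForCpu(surf, 0, 0);

    // Respect the pitch: the mapped plane may be wider than width * 4 bytes.
    void *pdata  = surf->surfaceList[0].mappedAddr.addr[0];
    size_t pitch = surf->surfaceList[0].planeParams.pitch[0];

    cv::Mat imgbuf(height, width, CV_8UC4, pdata, pitch);
    cv::Mat display_img;
    cv::cvtColor(imgbuf, display_img, cv::COLOR_RGBA2BGR);

    NvBufSurfaceUnMap(surf, 0, 0);

    return cv::imwrite(filename, display_img);
}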

Please also download the MMAPI package: $ sudo apt install nvidia-l4t-jetson-multimedia-api
You may refer to the sample application 07_video_convert for a demonstration.
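If you want the frame on the GPU as a cv::cuda::GpuMat for your CUDA kernels, without a CPU copy, one common route is to map the NvBufSurface to an EGLImage and register it with CUDA. Below is a rough sketch, not a definitive implementation: it assumes a pitch-linear RGBA surface, an existing CUDA context, and an OpenCV build with CUDA support, and processOnGpu is a made-up name.

#include <cuda_runtime.h>
#include <cudaEGL.h>
#include <opencv2/core/cuda.hpp>
#include "nvbufsurface.h"

// Sketch: expose an NvBufSurface to CUDA via EGL and wrap plane 0 in a GpuMat.
// A CUDA context must already exist (e.g. created via cudaSetDevice/cudaFree(0)).
bool processOnGpu(NvBufSurface *surf, int width, int height)
{
    // Create an EGLImage for buffer 0 and register it with CUDA.
    if (NvBufSurfaceMapEglImage(surf, 0) != 0)
        return false;

    CUgraphicsResource resource = NULL;
    if (cuGraphicsEGLRegisterImage(&resource,
                                   surf->surfaceList[0].mappedAddr.eglImage,
                                   CU_GRAPHICS_MAP_RESOURCE_FLAGS_NONE) != CUDA_SUCCESS)
        return false;

    CUeglFrame eglFrame;
    cuGraphicsResourceGetMappedEglFrame(&eglFrame, resource, 0, 0);

    // For a pitch-linear frame, pPitch[0] is the device pointer of plane 0.
    cv::cuda::GpuMat d_img(height, width, CV_8UC4,
                           eglFrame.frame.pPitch[0], eglFrame.pitch);

    // ... run your CUDA kernels or cv::cuda operations on d_img here ...
    cudaDeviceSynchronize();

    cuGraphicsUnregisterResource(resource);
    NvBufSurfaceUnMapEglImage(surf, 0);
    return true;
}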

@JerryChang Thank you for the advice.

I was using the Jetson Multimedia API that is included in JetPack.
But how do you usually include this API in a custom project?

I managed to make a CMakeLists.txt based on the Jetson Multimedia API's Makefile and Rules.mk (a rough sketch is below).
But is there a better or easier way to include MMAPI, e.g. find_package(MMAPI) with a FindMMAPI.cmake that you could provide?
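For reference, this is roughly the kind of hand-rolled CMakeLists.txt I mean. It is only a sketch: the target name, main.cpp, and the include/library paths are assumptions based on a default JetPack 5 installation, so adjust them for your setup.

cmake_minimum_required(VERSION 3.17)
project(argus_capture LANGUAGES CXX)

# Assumed default JetPack 5 locations; adjust for your installation.
set(MMAPI_DIR /usr/src/jetson_multimedia_api)
set(TEGRA_LIB_DIR /usr/lib/aarch64-linux-gnu/nvidia)

find_package(OpenCV REQUIRED)
find_package(CUDAToolkit REQUIRED)

add_executable(argus_capture
    main.cpp
    # ... plus any helper sources reused from ${MMAPI_DIR}/samples/common/classes ...
)

target_include_directories(argus_capture PRIVATE
    ${MMAPI_DIR}/include
    ${MMAPI_DIR}/argus/include
    ${OpenCV_INCLUDE_DIRS})

target_link_directories(argus_capture PRIVATE ${TEGRA_LIB_DIR})

target_link_libraries(argus_capture
    ${OpenCV_LIBS}
    CUDA::cudart
    nvargus_socketclient
    nvbufsurface
    nvbufsurftransform
    pthread)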

Best Regards,

hello jahwan.oh,

Let me double-confirm the custom project's environment.
Are you going to build the sample application on a different setup? Why not build it on the native JetPack release and deploy the app to the custom project?

We are not using MMAPI currently, but from now on we want to use it actively.
To integrate MMAPI, I thought some kind of FindMMAPI.cmake used from a CMakeLists.txt would be an easy solution.

Thank you.
