Hi,
I’m using the Jetson Multimedia API samples to develop my own application.
Basically, I need to grab frames from a MIPI camera using libargus,
and then run some image processing on that frame buffer with a CUDA kernel.
If I could wrap it in a cv::cuda::GpuMat, that would be useful for multiple purposes (a rough sketch of the GPU path I’m aiming for is at the end of this post).
Based on existing forum posts I tried to implement this, but they seem to be outdated:
the Argus camera to OpenCV Mat example below gives a bus error.
Is it still valid to use the NvBufferMemMap / NvBufferMemSyncForCpu / NvBufferMemUnMap methods,
or how else can I achieve this?
Thank you.
bool CaptureConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)
{
    char filename[FILENAME_MAX];
    snprintf(filename, sizeof(filename), "output%03u.jpg", (unsigned)frameNumber);

    // Query the buffer geometry so the cv::Mat row step uses the real pitch of
    // plane 0 instead of assuming width * 4 bytes per row.
    NvBufferParams params;
    if (NvBufferGetParams(fd, &params) != 0)
        return false;

    // Note: direct CPU mapping like this assumes a pitch-linear RGBA buffer;
    // a block-linear buffer would need NvBufferTransform into a pitch-linear one first.
    void *pdata = NULL;
    if (NvBufferMemMap(fd, 0, NvBufferMem_Read, &pdata) != 0)
        return false;
    NvBufferMemSyncForCpu(fd, 0, &pdata);

    cv::Mat imgbuf(CAPTURE_SIZE.height(), CAPTURE_SIZE.width(), CV_8UC4,
                   pdata, params.pitch[0]);
    cv::Mat display_img;
    cv::cvtColor(imgbuf, display_img, cv::COLOR_RGBA2BGR);

    NvBufferMemUnMap(fd, 0, &pdata);
    cv::imwrite(filename, display_img);
    return true;
}
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
CONSUMER: Waiting until producer is connected...
PRODUCER: Available Sensor modes :
PRODUCER: [0] W=3856 H=2180
PRODUCER: [1] W=2608 H=1964
PRODUCER: [2] W=1920 H=1080
PRODUCER: [3] W=1928 H=1090
PRODUCER: [4] W=3856 H=2180
PRODUCER: [5] W=2608 H=1964
PRODUCER: [6] W=1920 H=1080
PRODUCER: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
Bus error (core dumped)
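
For context, this is roughly the GPU-side path I’m aiming for, sketched from the EGL interop used in the multimedia API samples (NvEGLImageFromFd / cuGraphicsEGLRegisterImage). I have not verified it on my setup; the helper name processOnGpu is my own, the buffer is assumed to be pitch-linear RGBA, and the EGLDisplay and CUDA context are assumed to be initialized elsewhere:

#include "nvbuf_utils.h"
#include <cudaEGL.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <opencv2/core/cuda.hpp>

// Sketch: map an Argus dmabuf fd into CUDA and wrap it as a cv::cuda::GpuMat.
// Assumes a pitch-linear RGBA buffer, an initialized CUDA context, and a valid
// EGLDisplay (eglGetDisplay(EGL_DEFAULT_DISPLAY) + eglInitialize done elsewhere).
bool processOnGpu(int fd, EGLDisplay display)
{
    EGLImageKHR eglImage = NvEGLImageFromFd(display, fd);
    if (eglImage == EGL_NO_IMAGE_KHR)
        return false;

    CUgraphicsResource resource = NULL;
    if (cuGraphicsEGLRegisterImage(&resource, eglImage,
            CU_GRAPHICS_MAP_RESOURCE_FLAGS_NONE) != CUDA_SUCCESS)
    {
        NvDestroyEGLImage(display, eglImage);
        return false;
    }

    CUeglFrame eglFrame;
    if (cuGraphicsResourceGetMappedEglFrame(&eglFrame, resource, 0, 0) == CUDA_SUCCESS
        && eglFrame.frameType == CU_EGL_FRAME_TYPE_PITCH)
    {
        // Wrap the mapped device pointer; no copy is made.
        cv::cuda::GpuMat d_img(eglFrame.height, eglFrame.width, CV_8UC4,
                               eglFrame.frame.pPitch[0], eglFrame.pitch);

        // ... run the CUDA kernel / cv::cuda processing on d_img here ...

        cuCtxSynchronize();
    }

    cuGraphicsUnregisterResource(resource);
    NvDestroyEGLImage(display, eglImage);
    return true;
}

If this EGL route is still the recommended one on current JetPack, I’d be happy to go that way instead of the CPU mapping above.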