How to get YUV Buffer pointer?

Hi NVIDIA,
I have dumped the YUV buffer successfully, and I got the buffer by fd.
However, I would prefer a pointer to the buffer, because I want to send the buffer to my own algorithm and use the processed buffer for preview.
If I use the fd to get the buffer, I have to memcpy it, which costs too much time and cannot meet the 73 fps requirement. So I think I should use the buffer pointer directly to save time.
How can I get a pointer to the YUV buffer, or is there a better way to meet my requirement?

hello 932315769,

you may check the Argus examples to implement your use case.
Could you please also briefly share what your actual use case is?
Thanks

main_1125.cpp (12.8 KB)
Hi Jerry,
I have uploaded the sample I am using now.
I would like to get the YUV buffer from the EGLStream and send it to a custom algorithm and to the preview. For example, I get the buffer every frame, use OpenGL to scale the image, and render it to a surface on the screen.
I think that if I can get a pointer to the buffer, I can do the algorithmic processing and render the preview in real time.

hello 932315769,

may I know what the processing is, and which processor it runs on?
Note:
memory surfaces accessed internally use a block-linear layout. A memory copy (a CPU operation) is involved here, since the format must be pitch linear for the EGLStream.

Hi Jerry,
In my use case, I don't need the original NVIDIA preview from libargus. I only want to get the image data so that I can capture, preview, and do other processing with it.
However, the image data needs to meet some criteria, such as pixel format (YUV) and frame rate (72 fps), and it should be processed by the ISP, so I want to get it through libargus.

Hi,
Please convert the buffer to pitch linear through NvBufferTransform() and then call NvBufferMemMap() to get a CPU pointer to each plane. All functions are declared in

/usr/src/jetson_multimedia_api/include/nvbuf_utils.h
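For reference, here is a minimal sketch of that flow, assuming `src_fd` is the block-linear dmabuf fd obtained from the capture, the format is NV12, and the frame dimensions are known; error handling is abbreviated, and `my_algorithm()` is a hypothetical placeholder for the custom processing.

```cpp
// Sketch: convert a block-linear capture buffer to pitch linear with
// NvBufferTransform() (VIC hardware, no CPU memcpy), then map each plane
// with NvBufferMemMap() to get a CPU pointer. Assumes NV12; error checks omitted.
#include <nvbuf_utils.h>

void process_frame(int src_fd, int width, int height)
{
    // Allocate a pitch-linear destination buffer once and reuse it per frame.
    static int dst_fd = -1;
    if (dst_fd < 0) {
        NvBufferCreateParams create = {0};
        create.width = width;
        create.height = height;
        create.layout = NvBufferLayout_Pitch;
        create.payloadType = NvBufferPayload_SurfArray;
        create.colorFormat = NvBufferColorFormat_NV12; // assumption: NV12 capture
        create.nvbuf_tag = NvBufferTag_NONE;
        NvBufferCreateEx(&dst_fd, &create);
    }

    // Hardware-accelerated block-linear -> pitch-linear conversion.
    NvBufferTransformParams tx = {0};
    tx.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    tx.transform_filter = NvBufferTransform_Filter_Smart;
    NvBufferTransform(src_fd, dst_fd, &tx);

    NvBufferParams params = {0};
    NvBufferGetParams(dst_fd, &params);

    // Map each plane, sync it for CPU access, and hand the raw pointer
    // to the custom algorithm; then sync back and unmap.
    for (unsigned int plane = 0; plane < params.num_planes; ++plane) {
        void *ptr = nullptr;
        NvBufferMemMap(dst_fd, plane, NvBufferMem_Read_Write, &ptr);
        NvBufferMemSyncForCpu(dst_fd, plane, &ptr);
        // my_algorithm(ptr, params.pitch[plane], params.height[plane]); // hypothetical
        NvBufferMemSyncForDevice(dst_fd, plane, &ptr);
        NvBufferMemUnMap(dst_fd, plane, &ptr);
    }
}
```

Reusing the destination buffer across frames avoids a per-frame allocation, and the NvBufferMemSyncForCpu()/NvBufferMemSyncForDevice() pair keeps caches coherent when the CPU reads or writes the mapped planes.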

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.