Can I pass NvBuffer to IFrameProducer::presentFrame?

I am trying to use the IFrameProducer API, which was introduced earlier this year.
In the documentation I see only a single use case:

  1. call IFrameProducer::getFrame(FrameBuf** frame,…)
  2. call IFrameBuf::loadInputImageFromFile(const char *fileName)
  3. call IFrameProducer::presentFrame
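In code, I read that as something like the following (my own sketch: only the three calls above are from the documentation; the exact signatures, error handling, and the use of interface_cast are my guesses, and getFrame() takes further arguments that the documentation excerpt elides):

```cpp
// Per-frame loop as I understand the documented use case.
// Only getFrame(), loadInputImageFromFile() and presentFrame() are from
// the docs; everything else here is my assumption.
bool produceOneFrame(EGLStream::IFrameProducer *iFrameProducer,
                     const char *imagePath)
{
    EGLStream::FrameBuf *frame = nullptr;
    if (iFrameProducer->getFrame(&frame /*, ... */) != Argus::STATUS_OK)
        return false;

    EGLStream::IFrameBuf *iFrame =
        Argus::interface_cast<EGLStream::IFrameBuf>(frame);
    if (!iFrame)
        return false;

    // One image file per frame -- this is the part I would like to avoid.
    if (iFrame->loadInputImageFromFile(imagePath) != Argus::STATUS_OK)
        return false;

    return iFrameProducer->presentFrame(frame) == Argus::STATUS_OK;
}
```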

This assumes that I have all frames available as separate files, doesn't it?
Is it possible to make it read frames from one continuous stream?
Or, better yet, can I pass NvBuffer to presentFrame?

Thank you

presentFrame will lock the buffer for read access; it will be unlocked when it is returned from EGL.
Please also see the [FrameBuf State Diagram] in FrameProducer.h, for example:
/usr/src/jetson_multimedia_api/argus/include/EGLStream/FrameProducer.h

Yes, but the current loadInputImageFromFile is not very useful: you are expecting me to split a video into individual frames, one per file.
My question is whether there are any alternatives to avoid having individual files.
I would like to read frames from one long video stream and write them to IFrameBuf.
Or, even better, put a frame into a user-allocated NvBuffer and pass that buffer to presentFrame.
Is it possible to do that?

If you would like to share NvBufSurface between processes, the working solution is this method:
NvBuffer to NvBufSurface copy without CPU - #7 by DaneLLL

It is not supported to get NvBufSurface while using IFrameProducer.

No, I do not need to share between processes.
I need to feed a video stream to IFrameProducer.
Not a single image file, but a long continuous stream of frames, as efficiently as possible.

IFrameProducer is implemented for an Argus source (a Bayer camera sensor using the hardware ISP engine), so it cannot be used in this use case.

Do you know if EGLImage can be hooked up to EGLStream? NvBufSurface can be mapped to EGLImage. If EGLImage can be hooked up to EGLStream, you can use an EGLStream Producer/Consumer pair for this use case.
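A rough sketch of the NvBufSurface-to-EGLImage half of this idea, based on nvbufsurface.h (creating the surface and hooking the resulting EGLImage into an EGLStream producer are not shown, and the helper function name is made up):

```cpp
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <nvbufsurface.h>

// Map buffer index 0 of an existing NvBufSurface to an EGLImage that an
// EGLStream producer could then insert into the stream. Returns
// EGL_NO_IMAGE_KHR on failure. Call NvBufSurfaceUnMapEglImage() when done.
EGLImageKHR mapToEglImage(NvBufSurface *surf)
{
    if (NvBufSurfaceMapEglImage(surf, 0) != 0)
        return EGL_NO_IMAGE_KHR;
    return surf->surfaceList[0].mappedAddr.eglImage;
}
```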

No, I need to pass raw video to the Argus engine for auto-exposure, denoising and other enhancements. I am not sure how EGLStream can help with passing frames to Argus.

I found that, in fact, it is possible to write frames to a FrameBuf using this kludge:
EGLStream::FrameBuf *buffer {};
Argus::Status status = iFrameProducer->getFrame(&buffer);
EGLStream::IFrameBuf *ibuffer = Argus::interface_cast<EGLStream::IFrameBuf>(buffer);
*)ibuffer)[3], 0, 0, frame_width, frame_height, frame_data);
but I would prefer to use a documented API for that.
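(For context on why this is a kludge: the last line indexes the object's vtable by position, which depends entirely on the compiler's vtable layout. A standalone illustration in plain C++, with no Argus types involved and every name made up:)

```cpp
#include <cassert>

// Toy stand-in for an interface class; nothing here is Argus API.
struct Toy {
    virtual int a() { return 1; }   // expected vtable slot 0
    virtual int b() { return 2; }   // expected vtable slot 1
    virtual int c() { return 3; }   // expected vtable slot 2
    virtual int d() { return 4; }   // expected vtable slot 3
    virtual ~Toy() = default;
};

// Call whatever function sits in vtable slot 3 -- the same kind of trick
// the snippet above plays on IFrameBuf. This works only while the compiler
// lays the vtable out in declaration order, which the C++ standard does not
// guarantee; any added or reordered virtual breaks it silently.
int callSlot3(Toy *obj)
{
    using Fn = int (*)(Toy *);
    Fn fn = reinterpret_cast<Fn>((*reinterpret_cast<void ***>(obj))[3]);
    return fn(obj);
}
```

On GCC/Clang (Itanium C++ ABI) this calls d() and returns 4; reorder the declarations and the same index silently calls a different function.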

Your use case is not supported. The Argus software stack is a complete pipeline, from a Bayer sensor capturing RAW data to YUV420 data at the ISP output. Feeding RAW data stored in memory to the ISP engine is not supported.

The method you shared may work, but it is not tested and may have issues.

But you added the FrameProducer interface recently, and I am the first person who has tried using it.
I found it to be inadequate: Argus is a video pipeline, not a still-photo one,
yet the only API call you implemented is IFrameBuf::loadInputImageFromFile, which can feed individual photos to Argus but not video, which makes no sense.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.