Argus API: get raw Bayer frames into GStreamer using zero data copies

Hi,

I am using the Xavier devkit + Sony IMX274 for testing, but I am having problems using the Argus API.

I am working on a project with a real-time image pipeline that operates on the raw Bayer image and ends with a demosaic function before display.

So far I have made it work in GStreamer using v4l2src (but I don’t think this is optimal for performance, and it blocks future ISP usage):
v4l2src (x-bayer, format=bggr) → custom_cuda_plugin → fakesink

I would like to have the Bayer frames come in through the Argus API and then into my GStreamer plugin containing CUDA code.
I have looked at many of the samples, and the one I’m looking at now is “cudaBayerDemosaic”.
That seems to give me the frames, but somehow the NVIDIA plugins never support raw frames, and it’s the same for NvBuffers.
I get stuck on “nveglstreamsrc” and “nvcamerasrc” because they only support NV12 and similar formats, never Bayer images.

Can you help me find a way to build the pipeline below with zero data copies?
argus_API → x-bayer, format=bggr (NVMM) → custom_cuda_plugin → other_gstreamer_plugins

I would like to use the Argus API so that in the future I could move some of the image algorithms into the ISP (I hope), and for reduced latency, which is very important for my application.

Hi,
A possible solution is to customize nvv4l2camerasrc to run like:

capture RAW frame into CUDA buffer -> implement CUDA code for debayering -> NV12 or RGBA

So that you can run like:

nvv4l2camerasrc ! video/x-raw(memory:NVMM),format=NV12 or RGBA ! other_gstreamer_plugins

For capturing RAW frames into a CUDA buffer, please refer to

/usr/src/jetson_multimedia_api/samples/v4l2cuda/
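For the debayer step, a kernel along these lines could be a starting point. This is only a rough sketch and not taken from the samples; the row pitch handling and the bit shift for 10/12-bit sensor data are assumptions you would need to adapt to your sensor mode:

__global__ void debayerBGGRtoRGBA(const unsigned short *raw, size_t rawPitch,
                                  uchar4 *rgba, size_t rgbaPitch,
                                  int width, int height, int shift)
{
    // Each thread handles one 2x2 Bayer cell (B G / G R for BGGR).
    int x = (blockIdx.x * blockDim.x + threadIdx.x) * 2;
    int y = (blockIdx.y * blockDim.y + threadIdx.y) * 2;
    if (x + 1 >= width || y + 1 >= height)
        return;

    const unsigned short *row0 = (const unsigned short *)((const char *)raw + y * rawPitch);
    const unsigned short *row1 = (const unsigned short *)((const char *)raw + (y + 1) * rawPitch);

    unsigned int b = row0[x] >> shift;                        // blue sample
    unsigned int g = ((row0[x + 1] + row1[x]) >> 1) >> shift; // average of the two greens
    unsigned int r = row1[x + 1] >> shift;                    // red sample

    uchar4 px = make_uchar4(min(r, 255u), min(g, 255u), min(b, 255u), 255);
    for (int dy = 0; dy < 2; ++dy)
    {
        // Nearest-neighbour: replicate the cell colour to all four output pixels.
        uchar4 *out = (uchar4 *)((char *)rgba + (y + dy) * rgbaPitch);
        out[x] = px;
        out[x + 1] = px;
    }
}

You would launch it with a 2D grid covering width/2 x height/2 cells, then pass the resulting RGBA buffer downstream.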

Hi DaneLLL

I am unsure what you want me to do.
When you say “customize nvv4l2camerasrc”, does that mean I can find the code for this plugin somewhere?
I searched for how to modify nvv4l2camerasrc and found several forum posts, but never how or where to find the code.

I would actually prefer to use the Argus plugin, and I think I already have a decent solution for v4l2src in GStreamer, which does something similar to the “v4l2cuda” sample.

Do you have a solution using the Argus API, or can I modify nvv4l2camerasrc to use the Argus API (assuming I can find the code for nvv4l2camerasrc)?

Hi,
The source code is in the package:
https://developer.nvidia.com/embedded/l4t/r32_release_v6.1/sources/t186/public_sources.tbz2

Please download it and check. There is no existing code for this use case, so you would need to do the implementation yourself: convert the RAW data into RGBA or NV12 through CUDA.

Hi DaneLLL,

I found some time to look at the nvarguscamerasrc plugin (R35 release), and I would like some help using the raw Bayer frame and sending it through the GStreamer pipeline. I have patched GStreamer to accept rggb16 (and similar Bayer formats); I hope that patch will be upstreamed one day.

I looked at the “rawBayerOutput” and “cudaBayerDemosaic” samples and tried to adapt the plugin to extract the same Bayer RAW16 image.

Basically, I want to add this line:
iStreamSettings->setPixelFormat(PIXEL_FMT_RAW16);
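For context, the Argus samples configure a RAW16 EGLStream roughly like this (a sketch adapted from the rawBayerOutput/cudaBayerDemosaic pattern; iCaptureSession and iSensorMode stand for the already-acquired ICaptureSession and ISensorMode interfaces, and the exact names inside nvarguscamerasrc differ):

Argus::UniqueObj<Argus::OutputStreamSettings> streamSettings(
    iCaptureSession->createOutputStreamSettings(Argus::STREAM_TYPE_EGL));
Argus::IEGLOutputStreamSettings *iStreamSettings =
    Argus::interface_cast<Argus::IEGLOutputStreamSettings>(streamSettings);
if (iStreamSettings)
{
    // Ask for Bayer RAW16 instead of the default YUV output.
    iStreamSettings->setPixelFormat(Argus::PIXEL_FMT_RAW16);
    // The samples size the RAW stream to the sensor-mode resolution.
    iStreamSettings->setResolution(iSensorMode->getResolution());
}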

I have also tried changing the NVMM color format to different values to see if that helps. For some reason NVMM doesn’t seem to have a 10/16-bit Bayer format? The formats I tried:
NVBUF_COLOR_FORMAT_NV12, NVBUF_COLOR_FORMAT_RGBA, NVBUF_COLOR_FORMAT_UYVY
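For reference, allocating an NVMM buffer with one of those formats through the NvBufSurface API looks roughly like this (a sketch using NvBufSurfaceCreate from jetson_multimedia_api, not the exact call the plugin makes; the parameters are illustrative):

#include <stdint.h>
#include "nvbufsurface.h"

static NvBufSurface *allocateRgbaNvmmBuffer(uint32_t width, uint32_t height)
{
    NvBufSurfaceCreateParams params = {0};
    params.gpuId = 0;
    params.width = width;
    params.height = height;
    params.colorFormat = NVBUF_COLOR_FORMAT_RGBA;   // one of the formats listed above
    params.layout = NVBUF_LAYOUT_PITCH;
    params.memType = NVBUF_MEM_SURFACE_ARRAY;       // NVMM surface memory

    NvBufSurface *surf = NULL;
    if (NvBufSurfaceCreate(&surf, 1, &params) != 0)
        return NULL;                                // allocation failed
    return surf;
}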

I have tried several ideas, but they all end up with the same error, pasted below. I must be looking in the wrong place to fix it.

The plugin fails every time with a timeout at this line, and then errors out:
iEventProvider_ptr->waitForEvents(src->queue.get(), WAIT_FOR_EVENT_TIMEOUT);

I get this error from Argus every time. I don’t see where it comes from; I guess that code is not public?

(Argus) Error InvalidState: (propagating from src/api/ScfCaptureThread.cpp, function run(), line 110)
SCF: Error InvalidState: Output buffer not sized correctly for raw DOL output (in src/components/CaptureSetupEngineImpl.cpp, function chooseGenInstFunc(), line 200)
SCF: Error InvalidState: (propagating from src/components/CaptureSetupEngineImpl.cpp, function doGetInstructions(), line 2134)
SCF: Error InvalidState: (propagating from src/components/CaptureSetupEngine.cpp, function getInstructionList(), line 308)
SCF: Error InvalidState: (propagating from src/components/CaptureSetupEngine.cpp, function setupCC(), line 216)
SCF: Error InvalidState: (propagating from src/api/Session.cpp, function capture(), line 792)

Can you assist with the above problem?

Hi,
The RAW formats are not supported in NvBufSurface, so you would need to convert the RAW data into a supported format (NV12, RGBA, UYVY) and then pass it to the next element. It may not be possible to pass CUDA buffers to the next element.

Hi, it would be nice if you could elaborate a bit more.

So you want me to make a CUDA consumer that takes the raw image and puts it into, say, RGBA, using the R and G channels for the data and B and A set to zero, for instance?
Can I get from CUDA back to NVMM again and send it through the GStreamer pipeline?

Do you plan to add RAW support in NVMM? It makes sense given that Argus supports it, if you ask me. What people out in the world use it for is up to them :)

Hi,
Since the RAW format is not supported in NvBuffer, we would need to copy the RAW data to a CPU buffer and then pass it to the next element as something like video/x-bayer, format=bggr. This has an additional memory copy, so it may not be a good solution. Therefore we would suggest converting to an NvBuffer-supported format in the implementation.

There’s a hardware ISP engine in AGX Xavier, so generally we get NV12 data after the ISP. Your use case is unique and there’s no existing implementation; you would need to check the current code and do the customization.
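To get the CUDA output back into NVMM, the usual route is the EGLImage interop: map the NvBufSurface to an EGLImage, register it with CUDA, and let the debayer code write into the mapped pointer. A minimal sketch (assumes a pitch-linear RGBA NvBufSurface and a current CUDA context; error handling omitted):

#include "nvbufsurface.h"
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <cudaEGL.h>

void writeIntoNvmmBuffer(NvBufSurface *surf)
{
    // Expose the NVMM buffer as an EGLImage.
    NvBufSurfaceMapEGLImage(surf, 0);
    EGLImageKHR eglImage = surf->surfaceList[0].mappedAddr.eglImage;

    // Register the EGLImage with CUDA and fetch the frame description.
    CUgraphicsResource resource;
    cuGraphicsEGLRegisterImage(&resource, eglImage,
                               CU_GRAPHICS_MAP_RESOURCE_FLAGS_NONE);
    CUeglFrame eglFrame;
    cuGraphicsResourceGetMappedEglFrame(&eglFrame, resource, 0, 0);

    // For a pitch-linear surface this is a device pointer the debayer kernel
    // can write RGBA into; eglFrame.pitch is the row stride in bytes.
    void *devPtr = eglFrame.frame.pPitch[0];
    (void)devPtr; // ... launch the CUDA debayer kernel here ...

    cuGraphicsUnregisterResource(resource);
    NvBufSurfaceUnMapEGLImage(surf, 0);
}

The buffer can then be pushed downstream as video/x-raw(memory:NVMM),format=RGBA.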
