Camera ingest hardware architecture on Jetson Xavier NX


I have a USB3 Basler camera that I would like to use with GStreamer on a Jetson Xavier NX. It was suggested that I use the Imaginghub library.

I don't fully understand the topic, so perhaps someone could explain.
I want to use GStreamer because I feel it's fast and reliable, and you can split the incoming camera stream into an appsink and a file/UDP stream sink directly in the pipeline. But since the GStreamer example above (link) uses the Basler API to access the camera, I am afraid I will spend a lot of CPU time reading frames and then passing the captures on to GStreamer.
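For reference, the kind of split I mean can be expressed with a tee element. Below is a minimal sketch of such a pipeline description, assuming a V4L2-visible camera and Jetson's hardware-accelerated plugins (nvvidconv, nvv4l2h264enc); the device path, host, and port are placeholders:

```python
# Sketch of a pipeline description string (for Gst.parse_launch() or
# gst-launch-1.0) that splits one camera stream into an appsink branch
# and a UDP (H.264/RTP) branch. Assumptions: the camera appears as
# /dev/video0 and outputs YUY2; adjust caps to match the real sensor.

def build_split_pipeline(host="127.0.0.1", port=5000):
    # tee duplicates buffers by reference, so the split itself is cheap;
    # each branch gets its own queue so the branches run in their own threads.
    return (
        "v4l2src device=/dev/video0 ! video/x-raw,format=YUY2 "
        "! tee name=t "
        # Branch 1: hand raw frames to the application.
        "t. ! queue ! videoconvert ! appsink name=sink "
        # Branch 2: move frames into NVMM memory, encode on the hardware
        # encoder, and stream over UDP.
        "t. ! queue ! nvvidconv ! video/x-raw(memory:NVMM) "
        "! nvv4l2h264enc ! h264parse ! rtph264pay "
        f"! udpsink host={host} port={port}"
    )

print(build_split_pipeline())
```

The second branch could just as well end in a filesink after a muxer; the tee is what lets both consumers share one capture.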

I get the feeling this approach does not stay close to the hardware, meaning the data goes back and forth a lot between hardware-accelerated processing and slower memory processing on the CPUs rather than the GPU. How does this data stream work? Can someone explain how the data is processed and moved around on the Jetson Xavier NX? The top of the attached image shows the BaslerToGStream API structure. The second picture shows my doodle of how I imagine the data flowing from the USB3 port to the CPU and on to the encoder when using the API. I just want the GStreamer processing to be as fast as possible and not lose time. The last one is blank if you want to doodle an explanation. I hope I'm making sense…


Jetson Xavier NX
Basler 90umNIR:

Additional information: am I on the right track if I start looking into playbin? To my understanding, it is possible to inject a source using some sort of emit() signal:

appsrc name=source ! videoconvert ! …
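To sketch what the appsrc side of that would involve: the application (grabbing via Basler's pylon API, for instance) pushes each frame into the pipeline with `appsrc.emit("push-buffer", gst_buffer)`, and appsrc must advertise caps describing the raw bytes. The helper below is plain Python; the GRAY8 format and 1920×1080 resolution are placeholder assumptions for a mono NIR sensor:

```python
# appsrc needs two things from the application side:
#  1. a caps string describing the raw frames it will receive, and
#  2. buffers whose size exactly matches one frame in that format
#     (pushed via appsrc.emit("push-buffer", gst_buffer)).

# Bytes per pixel for a few common packed raw formats.
BYTES_PER_PIXEL = {"GRAY8": 1, "YUY2": 2, "BGRx": 4, "RGBA": 4}

def appsrc_caps(fmt="GRAY8", width=1920, height=1080, fps=30):
    """Caps string the appsrc must advertise so downstream elements
    know how to interpret the raw bytes pushed into it."""
    return (f"video/x-raw,format={fmt},width={width},height={height},"
            f"framerate={fps}/1")

def frame_size(fmt="GRAY8", width=1920, height=1080):
    """Size in bytes of one packed frame; each pushed buffer must carry
    exactly this many bytes or the pipeline will error out."""
    return width * height * BYTES_PER_PIXEL[fmt]

print(appsrc_caps())      # caps for a mono 1080p source (assumed values)
print(frame_size("YUY2")) # -> 4147200 bytes per 1080p YUY2 frame
```

With python-gi installed, the push loop itself wraps each grabbed frame in a `Gst.Buffer` and emits it into the named appsrc; that emit() is, as far as I understand, the injection mechanism you are asking about.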

Here are the steps to launch USB cameras on Jetson platforms:
Jetson Nano FAQ
Q: I have a USB camera. How can I launch it on Jetson Nano?

If the source supports the YUV422 format, it can be run without an extra memory copy from CPU buffers to NVMM buffers (NvBuffer). If it does not support YUV422, you would need to use appsrc and convert/copy the data into NVMM buffers, which incurs some CPU usage.
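The two cases can be sketched as pipeline descriptions (for gst-launch-1.0 or `Gst.parse_launch()`). Element names are the standard Jetson plugins; the device path, resolution, format, and output file are placeholder assumptions:

```python
# Case 1: the camera outputs YUV422 (YUY2) over V4L2. nvvidconv takes the
# frames straight into NVMM (NvBuffer) memory for the hardware encoder,
# so no extra CPU-side copy is needed.
yuv422_pipeline = (
    "v4l2src device=/dev/video0 "
    "! video/x-raw,format=YUY2,width=1920,height=1080 "
    "! nvvidconv ! video/x-raw(memory:NVMM) "
    "! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=out.mkv"
)

# Case 2: the camera needs a vendor SDK (e.g. pylon), so the application
# grabs frames on the CPU and pushes them in through appsrc. The copy into
# NVMM still happens in nvvidconv, and that copy is where the extra CPU
# usage comes from. GRAY8 is an assumed format for a mono NIR sensor.
appsrc_pipeline = (
    'appsrc name=source is-live=true format=time '
    'caps="video/x-raw,format=GRAY8,width=1920,height=1080,framerate=30/1" '
    "! nvvidconv ! video/x-raw(memory:NVMM) "
    "! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=out.mkv"
)

print(yuv422_pipeline)
print(appsrc_pipeline)
```

In both cases the encode itself runs on the hardware encoder; the difference is only whether the frames enter NVMM memory directly from the capture or via a CPU-side appsrc copy.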