DeepStream with Basler and Omron Cameras (Custom image capture APIs)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson Xavier / Basler USB3 camera / Omron USB3 camera
• DeepStream Version
• JetPack Version (valid for Jetson only)
4.4 +
• TensorRT Version

I have trained a model with TLT and would like to deploy my TLT models with the Python Deepstream API.

The problem I am facing is that the cameras cannot be seen using

Instead, one must install the vendor's camera drivers and then use their own custom API.

How can I write my Python application so that it uses the custom camera API to grab frames and then sends each of these frames to the Python DeepStream API (feeding them into some sort of sink pad)? I see no other way to use third-party cameras with DeepStream.

Camera --> pypylon.framegrab() --> Python DeepStream sink --> run through the DeepStream pipeline.

Please let me know. Thank you!

Any updates? Bump.

Hi @mbufi,
Sorry for the delay!

The problem I am facing is that the cameras cannot be seen using

Is there any /dev/video* device node? If not, it looks like the USB camera was not enumerated successfully, and you could file a BSP ticket for the driver issue.
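The quickest check is `ls /dev/video*` in a shell; the same check can be done from Python, which is a minimal sketch of what the suggestion above amounts to:

```python
import glob

def v4l2_nodes():
    """Return the V4L2 device nodes the kernel has enumerated.

    An empty list means no camera was enumerated as a standard video
    device -- consistent with a camera that is only reachable through
    the vendor's own driver/API.
    """
    return sorted(glob.glob("/dev/video*"))

print(v4l2_nodes())
```

If this prints an empty list for a plugged-in USB3 camera, the device is not exposed through V4L2 and standard GStreamer sources such as `v4l2src` cannot open it.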

For the application, you can refer to the sample.
And you can refer to the instructions to install the DeepStream Python bindings.


Hi @mchi

Thanks for the reply. I am curious whether this makes sense, because they are custom developer API drivers. I am surprised no one else has run into this same issue.

The DeepStream install and Python API are straightforward to me, but how to integrate 3rd-party cameras is not. I think this is one of the most common problems developers run into with DeepStream.


DeepStream is a GStreamer-based SDK, so it can integrate 3rd-party GStreamer plugins.
You can develop the camera capture as a GStreamer plugin and use it in DeepStream.