I am trying to use DeepStream for inference with an ADlink NEON-2000 camera. The camera on these devices is from Basler and can be accessed via Pylon. ADlink has a software suite (EVA SDK) that aims to enable low-code/no-code deployment of AI vision projects, but I would rather use DeepStream, as it seems better maintained, more widely used, and more advanced.
The device comes with DeepStream installed (though it doesn't seem to be able to use the camera natively). My aim is to follow either this tutorial, or this one, to connect it to Azure IoT Hub. However, I have two main issues:
To connect the camera to DeepStream, I am first trying to access it with GStreamer. However, this does not seem to work very well. I have installed a plugin that almost works. Running the command

```
gst-inspect-1.0 pylonsrc
```

gives me an error, but if I give the exact path,

```
gst-inspect-1.0 /usr/local/lib/gstreamer-1.0/pylonsrc
```

it does acknowledge that the plugin exists. The files seem to be installed in all the correct places. Trying something like this:

```
gst-launch-1.0 /usr/local/lib/gstreamer-1.0/pylonsrc ! xvimagesink
```

also gives an error. Does anyone have advice on how to correctly get Pylon working with GStreamer?
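For reference, my understanding from the GStreamer documentation is that `gst-inspect-1.0` and `gst-launch-1.0` take element names rather than `.so` file paths, and that the plugin's install directory needs to be on GStreamer's plugin search path. A sketch of what I would expect to work, assuming the plugin library really is in `/usr/local/lib/gstreamer-1.0` and registers an element named `pylonsrc`:

```shell
# Put the plugin's install directory on GStreamer's search path
export GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0

# Clear the per-user registry cache so the directory is rescanned
rm -rf ~/.cache/gstreamer-1.0

# Reference the element by name, not by the path to the .so file
gst-inspect-1.0 pylonsrc
gst-launch-1.0 pylonsrc ! videoconvert ! xvimagesink
```

I have not confirmed this works on the NEON-2000 itself; it is just what the standard plugin-discovery mechanism suggests should happen.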
If I get the camera accessible via GStreamer, how do I then access it within DeepStream? In the DeepStream config files, there are four input options:

```
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
```

How would I make the pylon plugin accessible here? This question here is somewhat similar to mine, but all the answers link to GitHub repos that give 404 errors.
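From what I can tell, a custom GStreamer element can't be named directly in those config files, so one workaround I've seen suggested is to re-serve the camera pipeline over RTSP (e.g. with gst-rtsp-server's `test-launch` example) and then consume it as a type=4 source. A hypothetical `[source0]` fragment, assuming the RTSP server runs locally on port 8554 with mount point `/pylon` (both values made up for illustration):

```ini
[source0]
enable=1
# 4 = RTSP, per the Type comment above
type=4
uri=rtsp://127.0.0.1:8554/pylon
num-sources=1
gpu-id=0
```

I don't know whether this is the intended route for this hardware, so corrections are very welcome.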
If anyone has had any experience using an ADlink camera, I would love to hear about your experiences. If anyone could share how they have managed to get either a Basler Camera, or ADlink device working with Deepstream, I would be incredibly grateful!