Can I use DeepStream for robotics (computer vision) on a Jetson Nano?

My question is simple: how feasible is it to use DeepStream for basic robotics that relies on computer vision?

By simple I mean using only computer vision, and not other sensors such as lidar, gyros, IMUs, etc., for positioning and other forms of data input to the robot.

Or should one use the Isaac SDK?
I have a Jetson Nano B01, so I don’t think it supports the SDK. Also, my field of application is just computer vision at the moment, as I am just trying things out.

Isaac SDK is more suitable for this use case. Please take a look at
DeepStream for Robotics — ISAAC 2020.2 documentation

I understand the Isaac SDK is the sure way to go, yet I feel DeepStream may be enough, as I just want to get the coordinates of the detected objects from the camera. My robot is stationary rather than dynamic, so it has a fixed coordinate system and moves within set bounds.
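For a fixed camera looking at a planar, bounded workspace like this, the usual post-processing step is to map the detector's pixel bounding boxes into workspace coordinates with a precomputed homography. The sketch below assumes you have already obtained a bounding box (e.g. from DeepStream's object metadata) and a 3x3 homography `H` calibrated offline; the matrix values here are made-up placeholders, not real calibration data.

```python
# Sketch: convert a detected object's pixel bounding-box centre into
# fixed robot-workspace coordinates via a planar homography.
# H below is a PLACEHOLDER calibration matrix, assumed to be computed
# offline (e.g. from four known pixel/workspace point pairs).
H = [
    [0.002, 0.0,   -0.64],
    [0.0,   0.002, -0.48],
    [0.0,   0.0,    1.0],
]

def bbox_center(left, top, width, height):
    """Centre of a detector bounding box, in pixel coordinates."""
    return left + width / 2.0, top + height / 2.0

def pixel_to_workspace(u, v, h):
    """Apply homography h to pixel (u, v); returns (x, y) in workspace units."""
    x = h[0][0] * u + h[0][1] * v + h[0][2]
    y = h[1][0] * u + h[1][1] * v + h[1][2]
    w = h[2][0] * u + h[2][1] * v + h[2][2]
    return x / w, y / w  # divide out the projective scale

# Example: a detection reported with left=300, top=220, width=40, height=60
u, v = bbox_center(300, 220, 40, 60)
x, y = pixel_to_workspace(u, v, H)
```

In a DeepStream pipeline the `left`/`top`/`width`/`height` values would come from each object's rect params in the frame metadata, so this mapping can run in the same probe that reads the detections.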