Jetson Inference, failing to build CUDA engine & failing to load networks

I’m trying to run a simple jetson-inference detection Python program on the Jetson Nano, and the code is throwing the error shown in the image below. I don’t exactly know what it means and need a little help with it. I’ve also attached an image of the network (ssd-mobilenet) files in the data folder.
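For context, the program is essentially the stock detection example from the Hello AI World tutorial, something along these lines (the camera URI and threshold here are assumptions; ours may differ):

```python
#!/usr/bin/env python3
# Minimal detection loop, closely following the stock detectnet.py example
# from jetson-inference (the camera URI and threshold are assumptions).
import jetson.inference
import jetson.utils

# load the SSD-Mobilenet-v2 detection network
net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)

camera = jetson.utils.videoSource("csi://0")       # MIPI CSI camera ("/dev/video0" for USB)
display = jetson.utils.videoOutput("display://0")  # window on the local display

while display.IsStreaming():
    img = camera.Capture()        # grab the next frame
    detections = net.Detect(img)  # runs inference; builds the TensorRT engine on first use
    display.Render(img)
    display.SetStatus("detectNet | {:.0f} FPS".format(net.GetNetworkFPS()))
```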

The error messages say ‘check CUDA installation’.
Did you check it?

And jetson-inference should be built on the Jetson Nano itself, not on an x86 desktop.

I don’t exactly know how to check it. If you could point me to a forum discussion about that, it would be nice.

jetson-inference was built on the Nano itself. I also tried to run it through the Docker container, and it showed the same error.

I used this thread: How do I check if I installed CUDA and cuDNN successfully?
CUDA 10.2 does exist on my system; I ran through some of the samples and they all worked.
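For what it’s worth, here is roughly how we checked, as a sketch (the paths are assumptions based on a default JetPack 4.x image):

```python
#!/usr/bin/env python3
# Quick sanity check that the CUDA toolkit is present.
# The paths below are assumptions based on a default JetPack 4.x image.
import os
import subprocess

# JetPack symlinks /usr/local/cuda to the installed toolkit
cuda_home = "/usr/local/cuda"
print("CUDA directory exists:", os.path.isdir(cuda_home))

# version.txt records the toolkit version, e.g. "CUDA Version 10.2.89"
version_file = os.path.join(cuda_home, "version.txt")
if os.path.isfile(version_file):
    with open(version_file) as f:
        print(f.read().strip())

# nvcc should report the same version if it is on the PATH
try:
    print(subprocess.check_output(["nvcc", "--version"], universal_newlines=True))
except (OSError, subprocess.CalledProcessError):
    print("nvcc not found on PATH; try", os.path.join(cuda_home, "bin", "nvcc"))
```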

Hmm, I haven’t seen this error before. Did you run any of the cuBLAS samples from the CUDA Toolkit? It seems that something happened to your CUDA install, but I’m not sure what.

I would recommend flashing the latest Nano JetPack image to a fresh SD card and trying again to see if that fixes it. If you re-flash the same SD card, remember to back up your work before re-flashing it.

Hey Dusty, @dusty_nv
We bought a new microSD card, flashed the Jetson Nano 2GB OS image onto it, and then moved on to running the Docker container. This time around it threw a segmentation fault and didn’t even run anything. We suspect it may be because the Jetson Nano 2GB module we have uses JetPack 4.5, which might not be compatible with jetson-inference.
We are currently trying to get JetPack 4.4.1 onto the Nano. Let us know what you think.

Flashing the Jetson Nano 2GB with JetPack 4.4.1 did the trick, and we were finally able to run the Docker container and the other programs.

OK gotcha, glad you got it working! The Nano 2GB should work on JetPack 4.5 too, but I will double-check it against the jetson-inference:r32.5.0 container image.
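In case it helps anyone else hitting this, you can confirm which L4T release your SD card image is running (and hence which container tag should match) with something like this sketch; the parsing assumes the standard /etc/nv_tegra_release header found on JetPack images:

```python
#!/usr/bin/env python3
# Map the L4T release string to a jetson-inference container tag.
# Parsing assumes the standard /etc/nv_tegra_release header on JetPack images.
import re

with open("/etc/nv_tegra_release") as f:
    header = f.readline()  # e.g. "# R32 (release), REVISION: 5.0, GCID: ..."

match = re.search(r"R(\d+).*?REVISION:\s*([\d.]+)", header)
if match:
    l4t = "{}.{}".format(match.group(1), match.group(2))
    print("L4T release:", l4t)  # 32.5.0 = JetPack 4.5, 32.4.4 = JetPack 4.4.1
    print("matching container tag: jetson-inference:r" + l4t)
```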