Inference with a custom Keras/TensorFlow model on Jetson Nano



Please forgive my lack of English and technical knowledge; I am a newbie. :)

Steps I followed:

  1. Created a custom model on my laptop (Intel i3, Windows 10, TensorFlow 2.0 CPU). Tried inference, and it works on my laptop.

  2. Saved this model.

  3. Transferred and loaded this model on the Jetson Nano.
    The Jetson Nano was flashed with the official image from the Jetson Nano website.
    It has CUDA 10.0 and cuDNN 7.5. The Nano runs the official NVIDIA TensorFlow 2.0 build, downloaded from the Jetson Nano website, along with pip3, python3, and the other development essentials listed on the same page.

  4. Tried running inference with this model on the Nano. However, it fails with "failed to get convolution algorithm. cuDNN failed to initialize".

  5. Tried converting the 'sample.h5' Keras model obtained in step 2 into a UFF/TensorRT graph file. Tried inference on the Nano again, but got the same error as in step 4.

  6. In the next step, I created a model similar to the one in step 1, but now in Google Colab (GPU runtime, TensorFlow 1.15, cuDNN 7.6.5, CUDA 10.0.130). Inference also works then and there in Google Colab itself.
    However, when I download this model and load it on the Jetson Nano, it gives me the same cuDNN error.
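For reference, steps 2-3 can be sketched as follows. The tiny model below is only a stand-in, since the actual custom architecture is not shown in the post; 'sample.h5' is the filename mentioned in step 5.

```python
import tensorflow as tf

# Stand-in for the custom model from step 1 (hypothetical architecture).
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation='relu', input_shape=(32, 32, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# Step 2: save the model on the laptop.
model.save('sample.h5')

# Step 3: load the same file on the Jetson Nano.
loaded = tf.keras.models.load_model('sample.h5')
loaded.summary()
```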

I tried the solutions mentioned on the page below and in many other blogs and pages, but failed to solve it.

Is the problem that the Jetson Nano's cuDNN/CUDA version needs to match Google Colab's?
If yes, how should I change the cuDNN/CUDA version on my Jetson Nano? CUDA and cuDNN come preinstalled when flashing the memory card with the image (I did nothing extra to install CUDA/cuDNN on the Jetson Nano; it came loaded with the versions mentioned in step 3).

If no, then please let me know if I can work the other way around and change the cuDNN version on Google Colab so that it matches my Nano.

I will wait for your reply, as I am stuck and need to complete and submit this project as soon as possible.

Thanks and regards,

Moving to the Jetson Nano forum so the Jetson team can take a look.


First, please note that TensorRT adds TensorFlow 2.0 support starting from v7.x.
So please use TensorFlow 1.x for now if you want to run your model with TensorRT on the Jetson platform.
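As a rough sketch of the TensorFlow 1.x path (filenames are assumptions): freeze the Keras model into a self-contained `.pb` graph, which the `convert-to-uff` tool shipped with TensorRT can then turn into a UFF file. The `tf.compat.v1` aliases are used here so the same calls resolve under either TF release.

```python
import tensorflow as tf

# Freezing requires graph mode (the default in TF 1.x).
tf.compat.v1.disable_eager_execution()

# Hypothetical model; in the post this would be the saved 'sample.h5'.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(3,))])

# Grab the session Keras is using and bake the variables into constants,
# so the exported graph carries its own weights.
sess = tf.compat.v1.keras.backend.get_session()
frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
    sess, sess.graph_def, [out.op.name for out in model.outputs])

with open('sample.pb', 'wb') as f:
    f.write(frozen.SerializeToString())

# On the Nano, the UFF converter installed with TensorRT can then be run:
#   convert-to-uff sample.pb -o sample.uff
```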

For the cuDNN initialization issue, would you mind reflashing and installing all the packages again?
A possible reason is that a package is somehow broken, which leads to this error.
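Aside from a broken install, a workaround often reported for this exact error on memory-constrained boards like the Nano (an assumption about the cause, not a confirmed diagnosis here) is to let TensorFlow grow its GPU memory allocation on demand instead of grabbing it all up front, which can leave cuDNN without room for its workspace:

```python
import tensorflow as tf

# Enable on-demand GPU memory allocation; run this before building or
# loading any model. On a machine without a GPU the list is simply empty.
gpus = tf.config.experimental.list_physical_devices('GPU')
for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)
```

The same `tf.config.experimental` calls are available in both TF 1.15 and TF 2.0, so this applies to either setup described above.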

Another thing: you will need to install the TensorFlow package built for the same JetPack version for compatibility.
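A quick way to check which JetPack/L4T release the board was flashed with, and then install NVIDIA's matching TensorFlow wheel. The `jp/v43` index tag below is only an example for JetPack 4.3; substitute the tag that matches your board's release.

```shell
# Show the L4T release the board was flashed with (maps to a JetPack version).
head -n 1 /etc/nv_tegra_release

# Install NVIDIA's TensorFlow 1.x build for that JetPack release.
pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v43 'tensorflow<2'
```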