torch.cuda.is_available() returns False on Docker image "dustynv/jetson-inference:r32.5.0"

Here is the log from inside my Docker container:

root@0ffb2af38cf4:/jetson-inference# python
Python 3.6.9 (default, Oct  8 2020, 12:12:24)
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> print(torch.cuda.is_available())
False

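For reference, a bit more detail than the bare False can be gathered by also printing the PyTorch build information. A minimal diagnostic sketch, assuming the same interpreter inside the container:

import torch

# A CUDA build version of None would indicate a CPU-only wheel was installed,
# whereas a version string together with zero visible devices points at the
# container not seeing the GPU.
print("PyTorch version:", torch.__version__)
print("CUDA build version:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())
print("Visible CUDA devices:", torch.cuda.device_count())
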
Your topic was posted in the wrong category. I am moving this to the Jetson Nano category for visibility.

Hi @wqhwqz2008, did you start the container with the docker/run.sh script from jetson-inference? Are you able to run imagenet.py or detectnet.py on a test image from inside the container? That would confirm that the GPU is working.
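
A quick way to confirm whether the container can see the GPU at all is to check for the Tegra GPU device nodes from inside it. A minimal sketch, assuming typical L4T device node names (these can vary between JetPack releases); docker/run.sh is expected to expose them by launching Docker with --runtime nvidia:

import os

# Typical Jetson/L4T GPU device nodes; the exact set may differ by release.
candidate_nodes = ["/dev/nvhost-ctrl", "/dev/nvhost-ctrl-gpu", "/dev/nvhost-gpu", "/dev/nvmap"]
missing = [node for node in candidate_nodes if not os.path.exists(node)]

if missing:
    print("Missing GPU device nodes:", missing)
    print("The container was likely not started with --runtime nvidia; try docker/run.sh.")
else:
    print("GPU device nodes are present; the NVIDIA container runtime appears to be active.")

If the nodes are missing, restarting the container via docker/run.sh (or docker run with --runtime nvidia) should expose them, after which torch.cuda.is_available() can be re-checked.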

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.