Nvidia-smi stops working after VM reboot

Problem: After rebooting an Ubuntu VM on Hyper-V, nvidia-smi stops working. The driver still shows as installed, but nvidia-smi reports that no devices were found even though the system clearly shows the GPU is present.

Details: Running an Ubuntu Server 20.04 LTS VM on Windows Server 2019 Hyper-V. Using DDA (Discrete Device Assignment) passthrough to pass an NVIDIA P2000 video card to the Ubuntu VM.

Steps:
I installed the latest NVIDIA driver for Ubuntu using the .run file. Everything works: the driver shows as active and nvidia-smi runs fine. But after every reboot I have to rerun the .run installer, which is a pain. Secure Boot is disabled on the Hyper-V VM.

Any ideas how to resolve this so I don’t have to reinstall every time I reboot the headless Ubuntu VM?

Please run nvidia-bug-report.sh as root and attach the resulting nvidia-bug-report.log.gz file to your post.
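For reference, generating the report is a one-liner; this sketch assumes the script was put on the PATH by the driver installer:

```shell
# Must run as root; writes nvidia-bug-report.log.gz in the current directory
sudo nvidia-bug-report.sh
```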

Thanks @generix. Sorry it took a while to get back to this. I upgraded to 535.129.03 and, oddly, nvidia-smi was still working after a reboot this time and I didn’t have to reinstall.

Attached is my log file.
nvidia-bug-report.log.gz (258.1 KB)

You were using the runfile installer without the --dkms option, so the driver modules get lost on kernel updates. Either reinstall with that option, or better, uninstall the runfile driver and use the packaged driver from the Ubuntu repo.
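A rough sketch of the two options; the runfile name and packaged driver version below are illustrative, not taken from the log:

```shell
# Option 1: reinstall the runfile driver with DKMS registration, so the
# kernel modules are rebuilt automatically when the kernel is updated
sudo sh ./NVIDIA-Linux-x86_64-535.129.03.run --dkms

# Option 2 (preferred): remove the runfile driver entirely, then install
# the driver packaged in the Ubuntu repositories
sudo sh ./NVIDIA-Linux-x86_64-535.129.03.run --uninstall
sudo apt update
sudo ubuntu-drivers install    # or e.g.: sudo apt install nvidia-driver-535-server
```

The packaged driver avoids the problem entirely because apt and DKMS handle module rebuilds on every kernel upgrade.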


Great, thank you! I missed that! I will reinstall with --dkms. Appreciate the help!
