watch -n 1 nvidia-smi causes 100% CPU on irq/127-nvidia

As the title says, if I run nvidia-smi continuously, CPU usage on the kernel thread irq/127-nvidia goes to 100%.
I don't see this problem on another box with two GTX 1070s and an i7-6700 installed.
System info is provided below.
Is this system configuration supported?
Thank you!

GPU: GTX 1080 Ti
CPU: Intel® Core™ i5-7500 CPU @ 3.40GHz
Motherboard: Gigabyte B250M-D3H

cat /proc/driver/nvidia/version
NVRM version: NVIDIA UNIX x86_64 Kernel Module 387.12 Thu Sep 28 20:18:48 PDT 2017
GCC version: gcc version 5.4.0 20160609 (Ubuntu 5.4.0-6ubuntu1~16.04.4)

Driver 384.09 has the same problem.
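
For anyone trying to reproduce this: the busy kernel thread shows up in top's thread view while the watch loop runs. Note the IRQ number 127 is specific to my machine and will likely differ on yours:

# in one terminal, poll the GPU every second
watch -n 1 nvidia-smi

# in another, show per-thread CPU usage and look for irq/NNN-nvidia
top -H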

Never mind.

It turns out I need nvidia-persistenced; running it solves the problem.

Can you elaborate on your solution? I’m having the same issue.

My guess is that each nvidia-smi invocation carries a lot of initialization overhead, whereas nvidia-persistenced keeps the driver state initialized, so nvidia-smi only has to talk to the already-running daemon. Is that right?

There isn't much to it: you just need to run nvidia-persistenced. On Ubuntu, there is an nvidia-persistenced.service unit you can put in /etc/systemd/system and then run:

systemctl daemon-reload
systemctl enable nvidia-persistenced
systemctl start nvidia-persistenced
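
If your driver package did not ship a unit file, a minimal sketch of nvidia-persistenced.service might look like the following. This is an assumption based on the unit NVIDIA ships with recent drivers; the exact ExecStart path and options may differ on your system:

[Unit]
Description=NVIDIA Persistence Daemon

[Service]
# nvidia-persistenced daemonizes itself, hence Type=forking
Type=forking
ExecStart=/usr/bin/nvidia-persistenced --verbose
# clean up the daemon's runtime directory on stop (path is an assumption)
ExecStopPost=/bin/rm -rf /var/run/nvidia-persistenced

[Install]
WantedBy=multi-user.target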

As for why it works, there is a whole section on driver persistence here:
http://docs.nvidia.com/deploy/driver-persistence/index.html
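
For a quick one-off test without setting up the daemon, that page also describes the legacy persistence mode, which can be toggled through nvidia-smi itself (needs root, and does not survive a reboot):

sudo nvidia-smi -pm 1
# confirm it took effect
nvidia-smi -q | grep "Persistence Mode"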
