This question is cross-posted on Stack Overflow at http://stackoverflow.com/questions/4455980/manually-control-gpu-fan-speed-on-ubuntu, but I thought I might find more experts here.
I'm having problems with code slowing down the longer a program runs. This is a problem because I need this program to run for a long time: hundreds of thousands, if not millions, of iterations. The longer the program runs, the slower it gets. For one run, the times per iteration are:
          N  time per iteration
[1,]   8192          0.03601563
[2,]   4096          0.05169434
[3,]   2048          0.06622070
[4,]   1024          0.08333984
For another run, in ascending order of N:
          N  time per iteration
[1,]   1024          0.02515625
[2,]   2048          0.02777832
[3,]   4096          0.03501221
[4,]   8192          0.06790527
[5,]  16384          0.14563477
[6,]  32768          0.27092957
[7,]  65536          0.52886856  <-- after 12 hours of running
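For context on how numbers like these can be collected, here is a minimal timing sketch using CUDA events (not my actual harness; the kernel and sizes are placeholder stand-ins for the real integer computation, only the event-timing pattern matters):

    /* Sketch: timing one iteration of an integer kernel with CUDA events. */
    #include <stdio.h>
    #include <cuda_runtime.h>

    __global__ void work(int *data, int n)    /* stand-in for the real kernel */
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] = data[i] * 3 + 1; /* arbitrary integer arithmetic */
    }

    int main(void)
    {
        const int n = 8192;
        int *d_data;
        cudaMalloc(&d_data, n * sizeof(int));
        cudaMemset(d_data, 0, n * sizeof(int));

        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);

        for (int iter = 0; iter < 1000; ++iter) {
            cudaEventRecord(start);
            work<<<(n + 255) / 256, 256>>>(d_data, n);
            cudaEventRecord(stop);
            cudaEventSynchronize(stop);

            float ms = 0.0f;
            cudaEventElapsedTime(&ms, start, stop);  /* elapsed milliseconds */
            printf("%d %f\n", iter, ms / 1000.0f);   /* seconds per iteration */
        }

        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        cudaFree(d_data);
        return 0;
    }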
From these timings I've come to suspect that the GPU heating up might be what slows things down. Could there be other things causing the slowdown?
I’m running on a GTX 480. The computations on the GPU are all integer computations. The memory is static and does not grow.
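To test the heating hypothesis, I'm thinking of logging the GPU temperature alongside the job with something like the sketch below. It assumes nvidia-smi is installed and on the PATH, and it just dumps the raw temperature section rather than parsing it, since the report format varies across driver versions:

    /* Sketch: periodically log GPU temperature while the long job runs.
     * Kill it when the job finishes; correlate timestamps with the
     * per-iteration timing log afterward. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>
    #include <unistd.h>

    static void log_gpu_temperature(void)
    {
        printf("=== %ld ===\n", (long)time(NULL));  /* unix timestamp */
        fflush(stdout);
        /* -q -d TEMPERATURE restricts the query to the temperature section */
        system("nvidia-smi -q -d TEMPERATURE");
    }

    int main(void)
    {
        for (;;) {
            log_gpu_temperature();
            sleep(60);  /* sample once a minute */
        }
        return 0;
    }

If the temperature climbs in step with the per-iteration time, that would point at thermal throttling (and hence at fan control as a fix); if it plateaus while the slowdown keeps growing, something else is going on.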