Diminishing performance?

Since the problem was reproduced with the NVIDIA sample code as well, there must be something fishy…

Can someone from NVIDIA comment on this, please?

By the way,
We recently participated in a roadshow where we showcased our financial algorithms on a multi-Tesla-GPU platform. We were launching our kernels again and again and reporting speedups on the screen. The app ran for more than 2 hours without showing any bugs or slowdowns. Things were normal. But we did not use the driver API or contexts…

Best Regards,
Sarnath

I’m not sure this is related to contexts at all. My program holds on to the same context and never switches, yet it still exhibits the same behavior.

Driver team says:

We could repro on 180.60, but not 181.20. Try 181.20 and report back while we’re following up on our end…

Are these Windows versions?

I actually just upgraded to 180.22 on Linux (released a month ago, newest I know of), and this problem doesn’t seem to be present anymore. I will do more intensive testing overnight. Also, others have mentioned monkeying around with pushing and popping contexts… I will be adding this and will post here if the issue reappears.
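For anyone who wants to try the context push/pop workaround mentioned above, here is a minimal sketch using the CUDA driver API. The kernel handle and module names are placeholders; the calls themselves (cuCtxCreate, cuCtxPopCurrent, cuCtxPushCurrent) are standard driver API functions.

```cuda
#include <cuda.h>
#include <stdio.h>

int main(void)
{
    CUdevice  dev;
    CUcontext ctx;

    cuInit(0);
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);   // context is now current on this thread

    for (int i = 0; i < 1000; ++i) {
        // ... launch your kernel here (placeholder) ...

        // Workaround being discussed: detach and re-attach the context
        // between batches of launches.
        cuCtxPopCurrent(NULL);       // float the context off this thread
        cuCtxPushCurrent(ctx);       // make it current again
    }

    cuCtxDestroy(ctx);
    return 0;
}
```

This only makes sense with the driver API; runtime API users never manage the context explicitly, which matches the earlier report that the slowdown appeared without any context juggling at all.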

Thanks!

I can confirm this appears to be fixed in 181.20 on Windows XP - but that’s nothing new - someone above already stated XP was unaffected - this was potentially a Vista-only issue.

If/when I get the time I’ll boot back into Vista and give it a shot.

That sounds good! Thanks!

Problem should not exist in Linux. We’ve confirmed that it was a Vista specific bug that was fixed between 180.60 and 181.20/22.

Well, this behavior was quite evident on Linux under driver 180.06. That’s the reason I went looking for a thread like this and posted in it. :) It might be an unrelated bug, but the results were the same. As I said earlier, 180.22 fixed whatever the issue was, it seems.

This topic seems to have been resolved, at least somewhat, but I wanted to add that I have been experiencing this problem on our Mac Pro with Leopard. I have a kernel that normally runs in about 40 ms, but after a period of time it shoots up to 100 ms or so. I log in with SSH and run everything in a console, and no one else was using the computer.

What I discovered is that after 15 minutes or so, an OpenGL screen saver was coming on, and it slowed things down in exactly the way I was seeing. If I VNCed into the computer, it disabled the screen saver, and all was good again (of course, I went through days of remotely rebooting every 30 minutes before discovering this). So now I have disabled the screen saver, and it seems to be working again. Hopefully this helps other people who might be experiencing slowdowns like this!
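If you suspect a slowdown like this, logging per-launch kernel time with CUDA events makes the jump (e.g. 40 ms climbing to 100 ms) easy to spot. A minimal sketch using the runtime API; `myKernel` is a placeholder for your own kernel:

```cuda
#include <cuda_runtime.h>
#include <stdio.h>

__global__ void myKernel(void) { /* placeholder workload */ }

int main(void)
{
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (int i = 0; i < 10000; ++i) {
        cudaEventRecord(start, 0);
        myKernel<<<256, 256>>>();
        cudaEventRecord(stop, 0);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("launch %d: %.2f ms\n", i, ms);  // watch for a sudden jump
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}
```

Events time on the GPU itself, so a jump here points at the device being shared (screen saver, compositor) rather than at host-side overhead.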

@jawilson@cae.wisc.edu,

Good piece of information. Similar things have been reported in the past as well… Hopefully it is the same problem haunting others too.

Thank you,

Best Regards,
Sarnath