GPU code execution time varies

Hi guys,

my program can be executed on either the CPU or the GPU (host or device).

If I run it on the CPU, each frame takes X milliseconds to render, i.e., every image takes the same amount of time.

But if I execute it on the GPU, the rendering time per frame varies a lot (you can see it in the attached video).
Measuring the timings did not reveal any variation, so I would like to know if anyone here has an idea
what could cause this problem.

I know this is not much information about the code, but I don't know which part is responsible, so I would just like to know whether there is something I have missed.
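To show what I mean by "measuring the timings", this is roughly how the per-frame timing could be done with CUDA events (a minimal sketch; the `renderFrame` kernel and the buffer size are placeholders, not my actual code):

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Placeholder kernel standing in for the real per-frame render work.
__global__ void renderFrame(float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = i * 0.5f;
}

int main() {
    const int n = 1 << 20;               // placeholder frame buffer size
    float *d_out;
    cudaMalloc(&d_out, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (int frame = 0; frame < 10; ++frame) {
        cudaEventRecord(start);
        renderFrame<<<(n + 255) / 256, 256>>>(d_out, n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);      // wait until the GPU has finished the frame

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("frame %d: %.3f ms\n", frame, ms);
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_out);
    return 0;
}
```

Note that without the cudaEventSynchronize, the measurement would only cover the asynchronous kernel launch, not the GPU work itself, which can hide exactly this kind of variation.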

Sample-Video.zip (MP4 - play with VLC)

Thanks a lot!

Hi Beteigeuze, I couldn't open your video, but I'm running into the same issue. I have code that performs an FFT on an input array. When I run it as an exe, the runtime is stable and short, about 5 seconds. But if I build it as a DLL and measure the total time starting from the function call, the time varies from 5 to 40 seconds. Do you have an idea why this happens, and is there a solution? Thanks.
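For what it's worth, here is a minimal sketch of how the one-time setup cost could be separated from the transform itself (assuming cuFFT; the array size and the in-place C2C transform are placeholders, not my actual code):

```cpp
#include <cstdio>
#include <cuda_runtime.h>
#include <cufft.h>

int main() {
    const int N = 1 << 20;                      // placeholder signal length
    cufftComplex *d_data;
    cudaMalloc(&d_data, N * sizeof(cufftComplex));
    cudaMemset(d_data, 0, N * sizeof(cufftComplex));

    cufftHandle plan;
    cufftPlan1d(&plan, N, CUFFT_C2C, 1);        // plan creation is part of the one-time cost

    // Warm-up run: absorbs context creation, module loading, plan tuning.
    cufftExecC2C(plan, d_data, d_data, CUFFT_FORWARD);
    cudaDeviceSynchronize();

    // Timed run: measures only the transform itself.
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    cufftExecC2C(plan, d_data, d_data, CUFFT_FORWARD);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("FFT time: %.3f ms\n", ms);

    cufftDestroy(plan);
    cudaFree(d_data);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}
```

The first CUDA call in a process pays for context creation and module loading, so if the DLL is timed from the very first call into it, that one-time cost could account for part of the spread you are seeing.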