Hey guys,
I want to compare the execution times of some algorithms on the GPU (using CUDA) and on the CPU.
- I used CUDA events to measure the elapsed time → the duration for one slice was about 0.1608 µs.
- I ran the same algorithm and counted the number of ticks while executing it, using the C# `Stopwatch.ElapsedTicks` property → from the tick count and my CPU frequency I got a duration of about 6.85 µs per slice.
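For reference, the GPU-side measurement followed the usual CUDA event pattern, roughly like this (a minimal sketch; the kernel name, its body, and the launch configuration are placeholders, not my actual algorithm):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Placeholder kernel standing in for the real per-slice algorithm.
__global__ void processSlice(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

int main() {
    const int n = 1 << 20;
    float *d_data;
    cudaMalloc(&d_data, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);                          // timestamp on the GPU timeline
    processSlice<<<(n + 255) / 256, 256>>>(d_data, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);                      // wait until the kernel has finished

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);          // elapsed GPU time in milliseconds
    printf("kernel time: %f us\n", ms * 1000.0f);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_data);
    return 0;
}
```

(`cudaEventElapsedTime` reports milliseconds, which I converted to µs for the numbers above.)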
Why is there such a huge difference between the timings from the CUDA events and the tick count on the CPU?
Thanks for any answers.