When I measure time on the device, I first get the GPU's elapsed cycle count with the clock() function, and then I need to know the GPU's runtime clock frequency to convert cycles into seconds. I tried reading the GPU core clock from GPU-Z and the GPU Clock Speed reported by the GPU Computing SDK, but I don't know which one is the right runtime clock for this conversion. Which should I use?
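For reference, here is a minimal sketch of the approach in question. It assumes (as the CUDA documentation states) that the device-side clock() counter advances at the SM (shader) clock, and that clockRate from cudaGetDeviceProperties() reports that same clock in kHz, so cycles / clockRate gives milliseconds. The dummy sinf loop is just a placeholder workload:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Kernel that times a dummy workload in SM clock cycles using clock().
__global__ void timedKernel(long long *cycles)
{
    clock_t start = clock();

    // Placeholder workload to be timed.
    float acc = 0.0f;
    for (int i = 0; i < 10000; ++i)
        acc += sinf((float)i);

    clock_t stop = clock();

    if (threadIdx.x == 0)
        *cycles = (long long)(stop - start);

    // Keep the compiler from optimizing the loop away.
    if (acc == 12345.678f)
        printf("");
}

int main()
{
    long long *dCycles, hCycles;
    cudaMalloc(&dCycles, sizeof(long long));
    timedKernel<<<1, 1>>>(dCycles);
    cudaMemcpy(&hCycles, dCycles, sizeof(long long), cudaMemcpyDeviceToHost);

    // clockRate is the SM (shader) clock in kHz, i.e. cycles per millisecond.
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);

    double ms = (double)hCycles / prop.clockRate;  // cycles / (cycles/ms) = ms
    printf("elapsed: %lld cycles = %f ms (SM clock %d kHz)\n",
           hCycles, ms, prop.clockRate);

    cudaFree(dCycles);
    return 0;
}
```

Note that GPU-Z's "GPU core clock" can differ from the shader clock on older architectures, and boost clocks can make the effective rate vary at runtime; for wall-clock kernel timing, CUDA events (cudaEventElapsedTime) avoid the cycle-to-time conversion entirely.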