Anyone care to explain why the event elapsed-time function reports its result as a float instead of a 64-bit integer?
[indent][font="Courier New"]cudaError_t cudaEventElapsedTime (float* ms, cudaEvent_t start, cudaEvent_t end)[/font]
[font="Georgia"]Computes the elapsed time between two events (in milliseconds with a resolution of around 0.5 microseconds). If
either event has not been recorded yet, this function returns cudaErrorInvalidValue. If either event has been recorded
with a non-zero stream, the result is undefined.[/font][/indent]