Accuracy on the GPU

How accurate are the calculations done on the graphics card compared to the same calculations done on the CPU? I ask because I ran a method that does some calculations, then ran the exact same thing on the graphics card, and it returned slightly different results. The differences were very minor, on the order of 1e-8, but still there. Are the graphics card's calculations slightly inaccurate?

If you used double-precision calculations in your CPU-based code, and only single precision in your GPU code, that would make a difference.
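
Not from the original thread, just a minimal host-side sketch of that effect (the series and loop count are arbitrary choices for illustration): accumulating the same sum in float and in double gives answers that differ in the trailing digits, with no GPU involved at all.

```cpp
#include <cstdio>

int main() {
    float  sum_f = 0.0f;
    double sum_d = 0.0;
    for (int i = 1; i <= 100000; ++i) {
        sum_f += 1.0f / static_cast<float>(i);   // single precision, as in a typical GPU kernel
        sum_d += 1.0  / static_cast<double>(i);  // double precision, as in typical CPU reference code
    }
    std::printf("float : %.10f\n", sum_f);
    std::printf("double: %.10f\n", sum_d);
    // The two results agree only to roughly single-precision accuracy.
    std::printf("diff  : %.3e\n", sum_d - static_cast<double>(sum_f));
    return 0;
}
```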

Also, remember that the algorithms used on the CPU and GPU might differ in their numerical stability (i.e. the rounding errors introduced by the algorithm), which would affect your result but fundamentally can't be blamed on the GPU.
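
As a hedged illustration of that point (the algorithms and inputs here are my own choice, not anything from the thread): two mathematically equivalent ways of summing the same numbers can accumulate very different rounding error, purely because of how each algorithm is structured.

```cpp
#include <cstdio>

int main() {
    const int n = 1000000;

    // Naive running sum in single precision: error grows with the number of terms.
    float naive = 0.0f;
    for (int i = 0; i < n; ++i) naive += 0.1f;

    // Kahan compensated sum: same inputs, same precision, much smaller rounding error.
    // (Compile without -ffast-math so the compensation term isn't optimized away.)
    float kahan = 0.0f, c = 0.0f;
    for (int i = 0; i < n; ++i) {
        float y = 0.1f - c;
        float t = kahan + y;
        c = (t - kahan) - y;
        kahan = t;
    }

    std::printf("naive: %.6f\n", naive);   // drifts noticeably away from 100000
    std::printf("kahan: %.6f\n", kahan);   // stays very close to 100000
    return 0;
}
```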

You will get the same relative differences on the CPU if you simply change the order of your operations. These are the basic rules of any floating-point calculation.
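
A small sketch of that (the values are arbitrary): floating-point addition is not associative, so summing the very same floats left-to-right versus right-to-left already changes the trailing digits of the result, entirely on the CPU.

```cpp
#include <cstdio>
#include <vector>

int main() {
    std::vector<float> v;
    for (int i = 0; i < 10000; ++i)
        v.push_back(1.0f / static_cast<float>(i + 1));

    float forward = 0.0f, backward = 0.0f;
    for (float x : v)
        forward += x;                      // sum left-to-right (large terms first)
    for (auto it = v.rbegin(); it != v.rend(); ++it)
        backward += *it;                   // sum right-to-left (small terms first)

    std::printf("forward : %.8f\n", forward);
    std::printf("backward: %.8f\n", backward);
    std::printf("diff    : %.3e\n", forward - backward);
    return 0;
}
```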

See the appendix of the programming guide if you want to know exactly which GPU operations deviate from the IEEE floating-point standard, and by how much (which isn't much at all, by the way).

You'll probably get this kind of accuracy issue when comparing different CPUs, too, not just CPU to GPU.

1e-8 is around the least-significant bit of a single-precision number. That bit is essentially noise; it's not any more "accurate" for it to round one way than the other.
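
To put a number on that (a small sketch; 0.1f is just an arbitrary magnitude): the spacing between adjacent single-precision values near 0.1 is about 7.5e-9, so a discrepancy of roughly 1e-8 is literally the last bit of the result rounding one way or the other.

```cpp
#include <cstdio>
#include <cmath>

int main() {
    float x    = 0.1f;
    float next = std::nextafter(x, 1.0f);        // the very next representable float above x
    std::printf("x        = %.12f\n", x);
    std::printf("next ulp = %.12f\n", next);
    std::printf("spacing  = %.3e\n", next - x);  // about 7.45e-09, i.e. on the order of 1e-8
    return 0;
}
```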