Finding machine precision on GPU


Is there an EPS-style function on the GPU? I need to compare a floating-point variable against machine precision. What is the best way to do this? Basically, I need to determine the machine float epsilon on the GPU.



Don't you already know the machine EPS, since you are using either single or double precision on the GPU? The accuracy of the various operations, in ulps, is given in an appendix of the CUDA Programming Guide.