I have a GTX 280 with compute capability 1.3, which supports double precision, but I do not know whether double-type variables are supported in texture memory. Can anybody tell me?
I have not been able to find a texture declaration for the double type!
There is a trick: if you are not using any of the normalization or interpolation features of the texture hardware, you can break the double into two 32-bit integers and store them in an int2 texture.
As seibert says, there is no double-precision texturing support in the hardware. If you want to use textures merely for faster access to double-precision data (due to a not fully coalesced access pattern), the recommended approach is to map an int2 texture over the double data and re-interpret each int2 element as a double on read, using the CUDA device function __hiloint2double(). One can abstract this in any way desired, e.g.:
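A minimal device-side sketch, assuming the legacy texture reference API (appropriate for compute capability 1.3); the names `my_texture` and `fetch_double` are illustrative, not standard:

```cuda
// An int2 texture mapped over the double-precision data.
texture<int2, 1> my_texture;

// Wrapper that hides the reinterpretation: fetch the two 32-bit halves
// and reassemble them with __hiloint2double(). In an int2, .y holds the
// high word and .x the low word of the double.
static __inline__ __device__ double fetch_double(texture<int2, 1> t, int i)
{
    int2 v = tex1Dfetch(t, i);
    return __hiloint2double(v.y, v.x);
}

__global__ void kernel(double *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = fetch_double(my_texture, i);
}
```

On the host, bind the texture over the device array of doubles, e.g. `cudaBindTexture(NULL, my_texture, d_data, n * sizeof(double))`, and unbind with `cudaUnbindTexture(my_texture)` when done. (No `<test>` is given here since the kernel requires a CUDA-capable GPU to execute.)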