texture of char with cudaReadModeNormalizedFloat returns [0.0,1.0] why not [-1.0,1.0]?

Hi all,

I declare a texture as

texture<char, 3, cudaReadModeNormalizedFloat> tex;

The documentation says that I should obtain values in [-1,1], but I always obtain values in [0,1].

If I declare the texture with “cudaReadModeElementType”, I do obtain the negative values.

Any idea what could be the problem?

(Unfortunately, cudaReadModeNormalizedFloat is needed to get linear interpolation, since this is not a float texture.)

Thanks for your help.

– Pium

Here’s my suggestion: Use unsigned char to represent numbers [-1…1] with 0…255.

In the kernel you can then rescale the fetched range [0…1] back to [-1…1] by multiplying by 2 and subtracting 1, as in the sketch below.
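A minimal sketch of what I mean, using the legacy texture reference API (the texture name, kernel name and coordinates are just placeholders, not your actual code):

#include <cuda_runtime.h>

// Hypothetical 3D texture over unsigned char data; because of
// cudaReadModeNormalizedFloat, fetches return floats in [0,1].
texture<unsigned char, 3, cudaReadModeNormalizedFloat> ucharTex;

__global__ void sampleKernel(float* out, float x, float y, float z)
{
    // tex3D returns a float in [0,1] for an unsigned 8-bit texel,
    // with linear interpolation if the filter mode is cudaFilterModeLinear.
    float v = tex3D(ucharTex, x, y, z);

    // Remap [0,1] to [-1,1]: multiply by 2, subtract 1.
    out[0] = 2.0f * v - 1.0f;
}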

Yeah, sure, but why add computations if cudaReadModeNormalizedFloat on a signed type should return values in [-1,1]?
I am forgetting something somewhere; I’d like to know what!

Here’s my guess: The texture interpolation circuit may have a limited range in hardware. For example, most OpenGL applications clamp inputs and outputs to [0…1] by default, unless one uses float textures and a shader-based rendering path. So there may have been a good reason (cost?) to limit the capability of the texture interpolator.

However, a puzzling detail is that the CUDA documentation states that signed textures return [-1…1].

How are you initializing this texture? Can you post some code?

Christian

I have found it :)

Whether the input values are treated as signed or unsigned does not depend on the template type.
It has to be specified in the channelDesc.

During the initialization, I added:

myBeautilfulChannelDesc.f = cudaChannelFormatKindSigned;

(Thank you for your help, Christian!)
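For anyone who hits the same problem later, here is roughly how the whole 3D texture setup looks with the signed channel format, using the tex declared above (the extent and array names are placeholders, not my actual code):

// Channel descriptor: one 8-bit component, explicitly marked as signed.
cudaChannelFormatDesc channelDesc =
    cudaCreateChannelDesc(8, 0, 0, 0, cudaChannelFormatKindSigned);

// Allocate a 3D CUDA array (the extent here is just an example).
cudaExtent extent = make_cudaExtent(64, 64, 64);
cudaArray* d_array = 0;
cudaMalloc3DArray(&d_array, &channelDesc, extent);

// (Copying the host data into d_array with cudaMemcpy3D is omitted here.)

// Linear filtering is allowed because the read mode is
// cudaReadModeNormalizedFloat; with the signed channel format,
// tex3D now returns values in [-1,1].
tex.filterMode = cudaFilterModeLinear;
tex.normalized = true; // optional: use normalized texture coordinates
cudaBindTextureToArray(tex, d_array, channelDesc);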