Texture Memory and Emulation Mode


I have code which runs perfectly in device mode, but it doesn’t give the right results in emulation mode.
When I debug my program, it seems that something goes wrong when I read from texture memory.
I want to use emulation to debug, but unfortunately the code doesn’t work :unsure:

I get the problem on several OSes and several NVIDIA cards…

Has anyone got the same problem and knows how to solve it?


There shouldn’t be any problem: emulation mode emulates textures, too. Can you narrow the issue down to a very small reproduction case that you can post here?
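Something like the following would be a good shape for a minimal repro — a sketch only, using the texture reference API; the names (`tex`, `copyThroughTexture`) are made up for illustration:

```cuda
#include <cstdio>

// Hypothetical 1D texture reference bound to a plain linear buffer.
texture<float, 1, cudaReadModeElementType> tex;

// Copies input to output through a texture fetch, so device vs.
// emulation results can be diffed element by element.
__global__ void copyThroughTexture(float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = tex1Dfetch(tex, i);
}

int main()
{
    const int n = 256;
    float h_in[n], h_out[n];
    for (int i = 0; i < n; ++i)
        h_in[i] = (float)i;

    float *d_in, *d_out;
    cudaMalloc((void **)&d_in, n * sizeof(float));
    cudaMalloc((void **)&d_out, n * sizeof(float));
    cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaBindTexture(0, tex, d_in, n * sizeof(float));

    copyThroughTexture<<<n / 64, 64>>>(d_out, n);
    cudaMemcpy(h_out, d_out, n * sizeof(float), cudaMemcpyDeviceToHost);

    // In both device mode and emulation mode this should print a match.
    for (int i = 0; i < n; ++i)
        if (h_in[i] != h_out[i])
            printf("mismatch at %d: %f != %f\n", i, h_in[i], h_out[i]);

    cudaUnbindTexture(tex);
    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}
```

If even this trivial fetch diverges between device and emulation, the problem is in the setup (binding, channel format, read mode); if it matches, the bug is likely elsewhere in your kernel.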

Here is another possibility. Are you relying on implicit, __syncthreads()-less synchronization between threads in a warp anywhere? In emulation mode the warp size is one and threads run sequentially, so warp-synchronous code often produces big differences between device and emulation.
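To illustrate (a sketch, not your code — the kernel name and sizes are invented): the classic warp-synchronous reduction tail drops the barrier for the last steps because a warp executes in lockstep on hardware, but under emulation each "warp" is a single thread, so the unsynchronized reads see stale values.

```cuda
// Hypothetical block reduction over 64 floats.
// The final steps omit __syncthreads() and rely on the 32 threads
// of a warp executing in lockstep on the device. In emulation mode
// the warp size is 1, threads run one after another, and these
// unsynchronized reads return stale shared-memory values, so the
// emulated result differs from the device result.
__global__ void warpSyncReduce(float *data)
{
    __shared__ volatile float s[64];
    unsigned int tid = threadIdx.x;

    s[tid] = data[tid];
    __syncthreads();

    if (tid < 32) {
        // Warp-synchronous tail: no barrier between steps.
        s[tid] += s[tid + 32];
        s[tid] += s[tid + 16];
        s[tid] += s[tid + 8];
        s[tid] += s[tid + 4];
        s[tid] += s[tid + 2];
        s[tid] += s[tid + 1];
    }

    if (tid == 0)
        data[0] = s[0];
}
```

If you have code like this, inserting a __syncthreads() between the steps (and letting all threads execute them behind an `if`-free guard) makes device and emulation agree, at a small performance cost on the device.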