Symbol resolution failure on Tesla K80 with CUDA 7.0; works on GeForce GTX 970 with CUDA 7.5

I am trying to set up cuFFT callbacks. I am using the standard approach of storing the device function pointer in a device global variable and then retrieving it with cudaMemcpyFromSymbol(). I compile with “-gencode arch=compute_37,code=compute_37” (or compute_52 for the GTX 970). On the Tesla K80, cudaMemcpyFromSymbol() returns cudaErrorInvalidSymbol, whereas on the GTX 970 everything works correctly. I also tried compiling with “-gencode arch=compute_37,code=sm_37” to build for a real architecture, but then I get the following messages:
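For context, the “standard way” referred to above looks roughly like the sketch below. This is a hedged illustration, not my actual code; the names myLoadCallback and d_loadCallbackPtr are placeholders:

```cuda
#include <cstdio>
#include <cufft.h>
#include <cufftXt.h>

// A cuFFT load callback: invoked by cuFFT for each input element.
__device__ cufftComplex myLoadCallback(void *dataIn, size_t offset,
                                       void *callerInfo, void *sharedPtr)
{
    return ((cufftComplex *)dataIn)[offset];
}

// Store the device function pointer in a device global variable...
__device__ cufftCallbackLoadC d_loadCallbackPtr = myLoadCallback;

int main()
{
    // ...then copy it back to the host with cudaMemcpyFromSymbol().
    cufftCallbackLoadC h_loadCallbackPtr;
    cudaError_t err = cudaMemcpyFromSymbol(&h_loadCallbackPtr,
                                           d_loadCallbackPtr,
                                           sizeof(h_loadCallbackPtr));
    if (err != cudaSuccess) {
        printf("cudaMemcpyFromSymbol failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    // The host-side pointer would then be attached to a plan via
    // cufftXtSetCallback(plan, (void **)&h_loadCallbackPtr,
    //                    CUFFT_CB_LD_COMPLEX, NULL);
    return 0;
}
```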

nvlink info    : Function '_ZN4gNLS9evolvepsiEPvm7double2S0_S0_' has address taken but no possible call to it
/cineca/prod/compilers/cuda/7.0.28/none/bin/..//lib64/libcufft_static.a(kernel.o): In function `__sti____cudaRegisterAll_53_tmpxft_0000294c_00000000_16_kernel_compute_53_cpp1_ii_e0b5062a()':
tmpxft_0000294c_00000000-13_kernel.compute_35.cudafe1.cpp:(.text+0x2d): undefined reference to `__cudaRegisterLinkedBinary_53_tmpxft_0000294c_00000000_16_kernel_compute_53_cpp1_ii_e0b5062a'
/cineca/prod/compilers/cuda/7.0.28/none/bin/..//lib64/libcufft_static.a(kernel.o): In function `global constructors keyed to lut_dp_2_4':
tmpxft_0000294c_00000000-13_kernel.compute_35.cudafe1.cpp:(.text+0x1ac7d): undefined reference to `__cudaRegisterLinkedBinary_53_tmpxft_0000294c_00000000_16_kernel_compute_53_cpp1_ii_e0b5062a'
collect2: error: ld returned 1 exit status

So my question is: are function pointers supported at all on the K80 with CUDA 7.0? Note that I am already initializing constant memory with cudaMemcpyToSymbol(), and that works on both systems.
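The constant-memory initialization mentioned above, which does work on both systems, follows this pattern (a minimal sketch; c_params is a hypothetical name):

```cuda
#include <cstdio>

__constant__ double c_params[4];

int main()
{
    const double h_params[4] = {1.0, 2.0, 3.0, 4.0};
    // Copying host data into a __constant__ symbol succeeds on both
    // machines, unlike the function-pointer copy in the other direction.
    cudaError_t err = cudaMemcpyToSymbol(c_params, h_params, sizeof(h_params));
    printf("cudaMemcpyToSymbol: %s\n", cudaGetErrorString(err));
    return 0;
}
```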

Yes, cuFFT callbacks are supported on the K80 with CUDA 7.0.

CUDA 7.0 ships with a simpleCUFFT_callback sample project, which you should be able to compile and run on your K80 using the supplied makefile.

You may want to study it for the correct compilation method and overall usage of cuFFT callbacks.
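In particular, cuFFT callbacks require relocatable device code (-dc) and a device-link step against the static cuFFT library, which is what the sample's makefile does. A sketch of the typical commands (myapp.cu is a placeholder file name):

```shell
# Compile with relocatable device code for the K80's real architecture.
nvcc -dc -gencode arch=compute_37,code=sm_37 myapp.cu -o myapp.o
# Device-link and link against the static cuFFT library (culibos is
# needed by cufft_static).
nvcc -gencode arch=compute_37,code=sm_37 myapp.o -lcufft_static -lculibos -o myapp
```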

Yeah, my problem was different and much more trivial. I would delete this thread if I could.