The number of FPUs and shared memory size per SM in Tesla C1060

In the Tesla C1060 GPU, how many FPUs are there per SM, and what is the shared memory size per SM?

I cannot find this information in an official source, or via Google. Can somebody provide it, or a link to it?

Thanks very much!

Like all compute capability 1.x GPUs, the Tesla C1060 has 16 KB of shared memory per multiprocessor. (See the CUDA Programming Guide.)
I believe there is one single-precision FPU per core, with 8 cores per SM.
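If you have the card installed, you can confirm the shared-memory figure (and the SM count) yourself by querying the device properties with the standard CUDA runtime API. A minimal sketch, compiled with nvcc; note that `sharedMemPerBlock` is reported per block, which on compute 1.x hardware matches the 16 KB physically present per SM:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("Device %d: %s\n", dev, prop.name);
        printf("  Compute capability:   %d.%d\n", prop.major, prop.minor);
        printf("  Multiprocessors:      %d\n", prop.multiProcessorCount);
        printf("  Shared mem per block: %zu bytes\n", prop.sharedMemPerBlock);
    }
    return 0;
}
```

The deviceQuery sample shipped with the CUDA SDK prints the same information.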


8 single-precision FPUs and 1 double-precision FPU per SM.

Thanks very much for all the replies!