CUFFT 2.3: relatively low performance of batched 1D FFT of size 80 on a 1 GB dataset

Hi,

- I was looking at the performance of batched 1D FFTs using CUFFT 2.3 for various non-power-of-2 sizes.

- I found that performance is much lower for a batched FFT of size 80 on a 1 GB dataset.

- I see a significant increase in speed with 0.5 GB and 2 GB datasets, but performance is extremely poor on the 1 GB dataset. Overall, the batched FFT of size 80 is also slower than batched FFTs of sizes 36 and 108.

- I am using a Tesla C1060 for my experiments.

- Did anyone see similar behavior? Please let me know if you don't see this performance drop for a batched FFT of size 80 (a minimal benchmark sketch is below).
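For reference, here is a minimal sketch of the kind of benchmark I mean, assuming single-precision complex-to-complex in-place transforms timed with CUDA events (the transform type, precision, and timing method are assumptions and may differ from the exact setup I used):

// Batched size-80 FFT over roughly 1 GB of data (assumed C2C, in-place, event timing).
#include <cstdio>
#include <cuda_runtime.h>
#include <cufft.h>

int main() {
    const int    n          = 80;          // FFT size per transform
    const size_t targetSize = 1ULL << 30;  // ~1 GB of input data
    const int    batch      = (int)(targetSize / (sizeof(cufftComplex) * n));

    cufftComplex *d_data;
    cudaMalloc(&d_data, sizeof(cufftComplex) * n * batch);
    cudaMemset(d_data, 0, sizeof(cufftComplex) * n * batch);

    // CUFFT 2.3 expresses batched 1D transforms through the batch argument of cufftPlan1d.
    cufftHandle plan;
    cufftPlan1d(&plan, n, CUFFT_C2C, batch);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, 0);
    cufftExecC2C(plan, d_data, d_data, CUFFT_FORWARD);  // in-place forward FFT
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("size %d, batch %d: %.3f ms\n", n, batch, ms);

    cufftDestroy(plan);
    cudaFree(d_data);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}

Changing targetSize to 0.5 GB or 2 GB (and n to 36 or 108) reproduces the comparisons described above.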

Thanks

gvsaradhi