I am facing a problem: when I switch to 32-bit integer data (i.e. the int data type), performance drops significantly (~50% slower) compared to the same kernel using 16-bit integer data (i.e. the short data type). I run my application on a GTX 480 card. As far as I know, on compute capability 2.x cards, operations on 8-bit and 16-bit operands use the 32-bit operations at the hardware level. So switching from 16-bit to 32-bit integer operands should not make a big difference in performance, right?
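For reference, here is a minimal sketch of the kind of kernel I mean (simplified placeholder code, not my actual kernel); the only difference between the two versions is the element type:

```
__global__ void scale_short(const short *in, short *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * 3;   // 16-bit elements
}

__global__ void scale_int(const int *in, int *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * 3;   // same arithmetic, 32-bit elements
}
```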
Can anyone enlighten me on this problem? What is the main cause of the performance drop here?