About integer calculation

How fast can a GPU operate on integers with common operations like add, subtract, XOR, AND, etc.?

Is there any comparison of integer-operation performance between CPU and GPU?


See the section in the Programming Guide that discusses the speed of these operations. Note that the number of clock cycles listed in the guide is per warp. A warp is 32 threads, and each multiprocessor on the GPU runs a different warp, so you can take:

[shader clock] / [clock cycles for operation] * 32 threads per warp * [# of multiprocessors]

to figure out how fast the GPU can perform some operation.
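As a worked example of that formula, here is a short sketch in Python. The figures plugged in (1.35 GHz shader clock, 16 multiprocessors, 4 cycles per warp for an integer add) are assumptions chosen for illustration, not numbers from the guide; check your own GPU's specs and the guide's cycle counts before relying on the result.

```python
# Hypothetical figures (assumptions for illustration, not from the guide):
# a GPU with a 1.35 GHz shader clock, 16 multiprocessors, and an integer
# add that takes 4 clock cycles per warp of 32 threads.
shader_clock_hz = 1.35e9
cycles_per_warp = 4
threads_per_warp = 32
num_multiprocessors = 16

# [shader clock] / [clock cycles for operation] * 32 * [# of multiprocessors]
ops_per_second = (shader_clock_hz / cycles_per_warp
                  * threads_per_warp
                  * num_multiprocessors)

print(f"{ops_per_second / 1e9:.1f} billion integer ops/s")
```

This is a peak theoretical throughput; real kernels will fall short of it once memory bandwidth and instruction mix come into play.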