I have a program with many transfers between CPU and GPU.
I do one calculation step on the GPU, then I transfer the calculated values back to the CPU to make some updates there. Then I transfer them to the GPU again for the next calculation step, and so on… (see the sketch below).
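Here is a stripped-down sketch of the pattern I mean (the kernel `calcStep` and the host-side `hostUpdate` are just placeholders, not my real code):

```cpp
#include <cuda_runtime.h>

// Placeholder device step: any per-element computation.
__global__ void calcStep(float4* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i].x += 0.1f;
}

// Placeholder CPU-side update between GPU steps.
void hostUpdate(float4* data, int n) {
    for (int i = 0; i < n; ++i) data[i].y += 0.1f;
}

int main() {
    const int n = 1024;
    float4* hData = new float4[n]();   // zero-initialized host buffer
    float4* dData;
    cudaMalloc(&dData, n * sizeof(float4));

    for (int step = 0; step < 100; ++step) {
        cudaMemcpy(dData, hData, n * sizeof(float4), cudaMemcpyHostToDevice);
        calcStep<<<(n + 255) / 256, 256>>>(dData, n);
        cudaMemcpy(hData, dData, n * sizeof(float4), cudaMemcpyDeviceToHost);
        hostUpdate(hData, n);          // CPU update, then back to the GPU
    }

    cudaFree(dData);
    delete[] hData;
    return 0;
}
```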
I have noticed that the values (floats) I transfer show deviations!
I.e. the more transfers I do, the more the transferred values deviate.
Is that possible and normal? I don’t understand it, because the values are floats on the CPU and floats on the GPU, too (actually float4, but that shouldn’t matter, should it?).
Can these deviations really be caused by many GPU<->CPU transfers, or must there be another error I just haven’t found yet?
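To isolate the transfers themselves, here is a minimal test I could run: copy a buffer host -> device -> host many times with no kernel in between and compare the bits against the original. If `cudaMemcpy` alone changed the values, this should catch it:

```cpp
#include <cuda_runtime.h>
#include <cstdio>
#include <cstring>
#include <cstdlib>

int main() {
    const int n = 1 << 20;                   // 1M floats
    const size_t bytes = n * sizeof(float);
    float* original  = (float*)malloc(bytes);
    float* roundTrip = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i)
        original[i] = (float)rand() / RAND_MAX;
    memcpy(roundTrip, original, bytes);

    float* dBuf;
    cudaMalloc(&dBuf, bytes);
    for (int pass = 0; pass < 1000; ++pass) { // 1000 round trips, no kernel
        cudaMemcpy(dBuf, roundTrip, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(roundTrip, dBuf, bytes, cudaMemcpyDeviceToHost);
    }

    // Bitwise comparison: transfers are plain byte copies,
    // so the buffers should still be identical.
    printf(memcmp(original, roundTrip, bytes) == 0
               ? "bit-identical after all transfers\n"
               : "DEVIATION: bits changed!\n");

    cudaFree(dBuf);
    free(original);
    free(roundTrip);
    return 0;
}
```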