Maybe I’m confused and seeing strange things, but I came across this:
In a global kernel running in emulation mode I have:
printf("%d %d %d", 1, 2, 1 % 2);
gives:
1 2 1
(step is an int parameter of the kernel)
printf("%d %d %d", step, 2, step % 2);
gives:
1 2 1
(Y is: #define Y 32/16)
printf("%d %d %d", 1, Y, 1 % Y);
gives:
1 2 0
which I expected to be:
1 2 1
However, Y does not appear to be a float or anything like that. If I change the #define into a variable:

int Y = 32 / 16;

it works…