Casting char to int

Hello guys!
I have a problem with casting. The following example should explain what I'm trying to do:

char c = 240;
min(230, 240) == 230
min(230, 240+20)==230
min(230, c+20) == 4
min(230, (int)c+(int)20) == 4
min(230, ((int)c)+(int)20) == 4

I expected the result of the “min” operation to always be the same -> 230, at least with the (int) casts, but it isn’t. Does anybody know what I’m doing wrong?

I don’t remember whether char is considered signed or unsigned in C, but if it is signed, then char has the range -128 to 127. This will definitely change the expected behavior.

You are right. My question was quite silly :"> char of course stores a signed value here.

I think I remember reading that it’s not actually specified, i.e. whether char means unsigned char or signed char. Better to just be explicit.

It is unspecified, and it varies depending on platform. (Some things, while unspecified, are consistent nonetheless. This is not one of those.) I’ve been bitten by this before.

This is one of many reasons why it’s better to use explicitly sized types from <stdint.h> such as int8_t and uint8_t.