This is a general question, but let's talk Java for the example.
Primitive types byte and short are signed. If I'm not mistaken, the sign bit is the leftmost (most significant) bit.
Here's some code (in Java) with simple masking:
1 byte b = -100;
2 short s = -100;
5 System.out.println(b & 0xFF);
6 System.out.println(s & 0xFF);
7 System.out.println((byte)(b & 0xFF));
8 System.out.println((byte)(s & 0xFF));
Masking with all 8 bits 'on' for the byte value means (I think) that line 5 should print -100. But it prints 156.
Line 6 prints 156, which is more or less what I expected, since the sign bit is lost(?).
Lines 7 and 8 print -100, which makes it seem as though the short's sign bit was somehow shifted right by 8 positions.
So, what exactly is going on with masking and casting? Why don't I get -100 from line 5, and why do I get -100 from line 8?
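To show what I've tried so far, here's a small sketch of how I currently understand the promotion (I may well be misreading it, which is why I'm asking). The class name MaskDemo is just my own:

```java
public class MaskDemo {
    public static void main(String[] args) {
        byte b = -100;

        // The operand of & is promoted to int first; for a negative
        // byte this sign-extends, so the upper 24 bits become 1s.
        int promoted = b;
        System.out.println(Integer.toBinaryString(promoted));
        // prints 11111111111111111111111110011100

        // Masking with 0xFF keeps only the low 8 bits (0x9C),
        // and the result is still an int, so it reads as 156.
        System.out.println(promoted & 0xFF); // prints 156

        // Casting back to byte discards all but the low 8 bits,
        // which are then reinterpreted as a signed byte again.
        System.out.println((byte) (promoted & 0xFF)); // prints -100
    }
}
```

If that reading is right, the mask doesn't "preserve" the byte's sign bit at all; it just keeps the low 8 bits of the sign-extended int, and only the cast back to byte makes those bits signed again. But I'd like confirmation of whether that's actually what the language does.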