I read that FTP ASCII mode only transfers 7 bits per character. Which of these statements is closer to the truth? (I've sketched both interpretations in code below.)
1) It only puts the lowest 7 bits of each character into an 8-bit byte "container", i.e. the most significant bit (MSB) is simply always forced to 0?
2) It literally bit-shifts the 7-bit characters into 8-bit containers: the 7 bits of the first character plus the first bit of the second character go in the first byte, then the remaining 6 bits of the second character plus the first two bits of the third character go in the second byte, and so on?
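To make the two interpretations concrete, here is roughly what I mean in Python. This is just an illustration of the two ideas with made-up sample data, not a claim about how any actual FTP client is implemented:

```python
def option_1_mask_high_bit(data: bytes) -> bytes:
    """Each character still occupies a full 8-bit byte; the MSB is just forced to 0."""
    return bytes(b & 0x7F for b in data)

def option_2_pack_7_bits(data: bytes) -> bytes:
    """Pack the low 7 bits of each character back to back, so 8 characters fit in 7 bytes."""
    bits = "".join(format(b & 0x7F, "07b") for b in data)
    bits = bits.ljust((len(bits) + 7) // 8 * 8, "0")  # pad the tail to a whole byte
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

sample = b"ABCDEFGH"                    # 8 characters
print(option_1_mask_high_bit(sample))   # still 8 bytes on the wire, MSBs cleared
print(len(option_2_pack_7_bits(sample)))  # 7: eight 7-bit characters squeezed into 7 bytes
```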
Isn't the Internet a "byte-oriented" beast?
In the ancient era of computing, when people used devices called "modems", I can imagine it really did send only 7 bits per character, since you needed to squeeze as much as possible over the phone line. But I find it hard to believe that this kind of bit shifting happens over the Internet.