• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 547

How does the computer interpret symbols and ASCII into binary code?


My question started when I saw that the bytes of a file, encoded with Base64, result in ASCII text, like this:


and once I reverse the process and write the bytes back into a file, it results in the same working file.

Now my question is: how does the computer know what an 'H' or a '5' means, when all it knows are bits like 111010101010101? And how is this so fast?
6 Solutions
Paul Sauvé (Retired) commented:
One ASCII character uses 7 bits (8 in extended ASCII). Unicode characters, in the UTF-16 encoding, use 16-bit code units.

8 bits = 1 octet (an octet is also known as a byte)

Base64 takes 3 bytes (24 bits) and divides them into 4 groups of 6 bits, each of which maps to a printable character.

The computer has to know what coding is used for the characters in order to interpret them.

"And how is this so fast?" A CPU (central processing unit) clocked at 1 GHz runs 1,000,000,000 clock cycles per second!

The operating system of a given computer is normally written in machine language. It consists of MANY routines and subroutines which control input (keyboard, etc.), storage (writing to hard discs, etc.), output (screen, speakers, printer, etc.) and many more operations.
Technically speaking, computers (i.e., the machines) do not know about ASCII characters.
If you type the letter 'A' on your console, the keyboard electronics simply send out a sequence of bits that represent the character 'A'. There is an I/O driver (input/output driver), a program written by a systems programmer, that puts those bits into an 8-bit memory location (i.e., one byte of memory). Since the source was the keyboard, it is highly likely that the applications that use that byte will treat the number stored in it as the ASCII representation of the character 'A'.

Now, when you type, you usually see an echo of what you type; i.e., the letter 'A' appears on your screen. The I/O driver that received the number representing the 'A' simply sends that bit stream back to the screen without interpretation. Again, the I/O driver program doesn't really know that the letter 'A' was typed. These bits go to the screen electronics, which light up pixels on the screen to create an image of the letter 'A'.

Here is a brief discussion of ASCII codes (and extended ASCII codes).

By the way, in the above example, there may actually be some extra bits in addition to the byte that relate to error handling and framing of the character so that the screen knows where one character begins and ends.
AlexMert (Author) commented:
Well, I've found the answer myself so I'll post it here:


So basically it searches in a ByteArray for the encoded value of each value.

>> "how does the computer knows what means an 'H' or an '5'"
The computer knows because computer programmers have come to agreements about what the codes mean. See, for example,
These agreements can be useful when computer programs want to work with other computer programs.
>> "And how is this so fast ?"
I'm not sure what you mean by "this".
The agreements have developed over many years.
But once they have been made, that time does not need to be spent again, and it takes no extra time to follow them.
AlexMertAuthor Commented:
@ozo, regarding my first question: I know the ASCII codes are the same everywhere, but I mean, how does the computer (Windows) translate the ASCII codes, for example the symbol 'A', into binary code? Does it look for 'A' in an array full of those binary codes, or how?

And by "how is this so fast", I mean: how are all those calculations and binary transformations so fast? Thanks a lot!
>> "It looks for A in a array full of those binary codes or how ?"
If you write the character literal 'A' in your program, then, for example, a C or C++ compiler will convert that char literal into the 1's and 0's for you. A compiler could possibly have a table to convert from a literal char to binary.

>> "binary transformations so fast ?"
As ozo explained, "it takes no extra time to follow them."

I mentioned earlier that when you type the letter A in a Windows console, the keyboard sends a stream of bits to the CPU, and some of that stream is sent back to the console screen. For these steps, it is the "agreement" that ozo talked about that ensures the pixels are lit in a way that looks like the letter A. The computer doesn't have to look anything up to echo an ASCII character. (Some control chars are exceptions to this rule. The computer may take special action when you type, for example, Ctrl+C, which usually will abort your program.)
Say you write a program that takes user input from a menu, where the user has to type in a number 1 through 5. If the user types a 1, the program may look like this:

if( user_input == '1' ) {
    // do something
}
else ...

Again, the compiler converts the '1' to an 8-bit binary representation per the agreement of what a '1' should be represented as in ASCII. There are other binary representations as well, for example EBCDIC.

The ASCII table is found here:

Summary: Compilers need to know what table representation to use (i.e., ASCII or some other). Running programs do not need to know, so there is no time needed to do conversions.
Paul Sauvé (Retired) commented:
Alex - in fact it is the other way around. The computer never "sees" an 'A', all it "sees" is code and displays it on the screen or prints this code as an 'A'.

Exactly HOW it interprets the code depends on the operating system, whose machine code is generally produced from assembly language or from a compiled programming language (ex. C, C++, Visual Basic), as opposed to an interpreted programming language (ex. Perl, JavaScript). Versions of BASIC can be either interpreted OR compiled.

Compiled code is FIRST translated by a program called a compiler into the PC's machine code and is much faster than interpreted code, which must be translated to machine language each time it is executed.

How fast it does this depends on the CPU speed. For example, MY PC has a clock speed of 3.1 gigahertz, or 3,100,000,000 clock cycles per second.

Please have a look at this article: How Computers Work: The CPU and Memory
Question has a verified solution.
