How does the computer interpret symbols and ASCII into binary code?

Posted on 2013-06-02
Medium Priority
Last Modified: 2013-06-03

My question started when I saw that the bytes of a file, encoded with Base64, result in ASCII text, and once I reverse the process and write the bytes back into a file, it results in the same working file.

Now my question is, how does the computer know what an 'H' or a '5' means, when all it knows is 111010101010101? And how is this so fast?
Question by:AlexMert
LVL 33

Assisted Solution

by:Paul Sauvé
Paul Sauvé earned 1680 total points
ID: 39214588
One ASCII character uses 7 bits, usually stored in an 8-bit byte. Unicode characters can need more; UTF-16, for example, uses 16-bit code units.

8 bits = 1 octet (an octet is also known as a byte)

Base64 takes 3 bytes (24 bits) and divides them into 4 groups of 6 bits, each of which is mapped to a printable character.

The computer has to know what coding is used for the characters in order to interpret them.

"And how is this so fast ?" A 1 GHz CPU (central processing unit) runs 1,000,000,000 clock cycles per second, and simple operations take only a few cycles each!

The operating system of a given computer is normally written in machine language. It consists of MANY routines and subroutines which control input (keyboard, etc.), storage (writing to hard discs, etc.), output (screen, speakers, printer, etc.), and many more operations.
LVL 32

Assisted Solution

phoffric earned 240 total points
ID: 39214679
Technically speaking, computers (i.e., the machines) do not know about ASCII characters.
If you type the letter 'A' on your console, the keyboard electronics simply sends out a sequence of bits that represents the character 'A'. An I/O driver (input/output driver), a program written by a systems programmer, puts those bits into an 8-bit memory location (i.e., one byte of memory). Since the source was the keyboard, the applications that use that byte will most likely treat the number stored in it as the ASCII representation of the character 'A'.

Now, when you type, you usually see an echo of what you type; i.e., the letter 'A' appears on your screen. The I/O driver that received the number representing the 'A' simply sends that bit stream back to the screen without interpretation. Again, the driver doesn't really know that the letter 'A' was typed. The bits go to the screen electronics, which light up pixels to create an image of the letter 'A'.

Here is a brief discussion of ASCII codes (and extended ASCII codes).

By the way, in the above example, there may actually be some extra bits in addition to the byte that relate to error handling and framing of the character so that the screen knows where one character begins and ends.

Author Comment

ID: 39214771
Well, I've found the answer myself so I'll post it here:


So basically it searches in a ByteArray for the encoded value of a given value.

LVL 84

Assisted Solution

ozo earned 80 total points
ID: 39215208
how does the computer knows what means an 'H' or an '5'
the computer knows because computer programmers have come to agreements about what means what (the ASCII standard, for example). These agreements can be useful when computer programs want to work with other computer programs.
And how is this so fast ?
I'm not sure what you mean by "this".
The agreements have developed over many years.
But once they have been made, that time does not need to be spent again, and it takes no extra time to follow them.

Author Comment

ID: 39216861
@ozo, regarding my first question, I know the ASCII codes are the same everywhere, but I mean how does the computer, or Windows, translate the ASCII codes, for example the symbol A, into binary code? Does it look for A in an array full of those binary codes, or how?

And by "how is this so fast", I mean how are all those calculations and binary transformations so fast? Thanks a lot!
LVL 32

Assisted Solution

phoffric earned 240 total points
ID: 39216946
>> "It looks for A in a array full of those binary codes or how ?"
If you write the character literal 'A' in your program, then, for example, a C or C++ compiler will convert that char literal into the 1's and 0's for you. A compiler could possibly have a table to convert from literal char to binary.

>>binary transformations so fast ?"
As ozo explained, "it takes no extra time to follow them."

I mentioned earlier that when you type the letter A in a Windows console, the keyboard sends a stream of bits to the CPU and some of that stream is sent back to the console screen. For these steps, it is the "agreement" that ozo talked about that causes the pixels to be lit in a way that looks like the letter A. The computer doesn't have to look anything up to echo an ASCII character. (Some control characters are exceptions to this rule. The computer may take special action when you type, for example, Ctrl-C, which usually aborts your program.)
LVL 32

Assisted Solution

phoffric earned 240 total points
ID: 39216970
Say you write a program that takes some user input from a menu, where the user has to type a number from 1 through 5. If the user types a 1, the program may look like this:

if( user_input == '1' ) {
    // do something
} else {
    ...
}

Again, the compiler converts the '1' to an 8-bit binary representation per the agreement of what a '1' should be represented as in ASCII. There are other binary representations as well, for example, EBCDIC.

The ASCII table is found here:

Summary: Compilers need to know what table representation to use (i.e., ASCII or some other). Running programs do not need to know, so no time is spent doing conversions.
LVL 33

Accepted Solution

Paul Sauvé earned 1680 total points
ID: 39216986
Alex, in fact it is the other way around. The computer never "sees" an 'A'; all it "sees" is code, and it displays or prints this code as an 'A'.

Exactly HOW it interprets the code depends on the operating system, which is generally programmed in assembly language or in a compiled programming language (e.g. C, C++) as opposed to an interpreted programming language (e.g. Perl, JavaScript). Versions of BASIC can be either interpreted OR compiled.

Compiled code is FIRST translated by a program called a compiler into the PC's machine code, and is much faster than interpreted code, which must be translated to machine language each time it is executed.

How fast it does this depends on the CPU speed. For example, MY PC has a clock speed of 3.1 GigaHertz, or 3,100,000,000 cycles per second.

Please have a look at this article: How Computers Work: The CPU and Memory


Question has a verified solution.
