What is ASCII, Unicode and Binary Code?

Bits and Bytes

How Computers Represent Data Using Binary Code – Humans have 10 digits, which is why we find the decimal, or base 10, number system to be natural. Remember how you used your fingers and toes to do math when you were a kid? Computers don't have fingers; they have switches, and they use the binary, or base 2, number system, which has only two digits: 0 and 1.

Binary Code

Computers don't speak English, Spanish, Chinese, or Greek, for that matter, so how does a computer understand what you enter?

On a typewriter, when you press the A key, you get an A on the paper, but computers only understand 0s and 1s, so when you press the A key, it must somehow be represented by 0s and 1s.

Digital data is represented using binary code. Binary code works like a bank of light switches. If you have only a single light switch in a room, there are two possible states: The light can be on, or it can be off. This code works in situations with only two possibilities, such as yes/no or true/false, but it fails when there are more than two choices, such as with vanilla/chocolate/strawberry. Adding another switch increases the possible combinations by a factor of two, which equals four possibilities.

A third switch, or bit, gives us eight possibilities, and so on (Table below). A bit, short for binary digit, is the smallest unit of digital information. Eight bits equal a byte, which gives us 256 possibilities. A byte is used to represent a single character in modern computer systems. For example, when you press the A key, the binary code 01000001 (65 in decimal) is sent to the computer.

Table: Binary code using 8 switches, or bits, has 256 different possible combinations.

Number of Bits (switches)   Possibilities   Power of Two
1                           2               2^1
2                           4               2^2
3                           8               2^3
4                           16              2^4
5                           32              2^5
6                           64              2^6
7                           128             2^7
8                           256             2^8
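The doubling pattern in the table, and the binary code for the letter A mentioned above, can be verified in a few lines of Python:

```python
# Each added switch (bit) doubles the number of possible combinations,
# so n bits give 2**n values.
for bits in range(1, 9):
    print(f"{bits} bit(s): 2**{bits} = {2 ** bits} possibilities")

# One byte (8 bits) encodes a single character. Pressing the A key
# sends the byte 01000001, which is 65 in decimal:
print(ord("A"))                  # 65
print(format(ord("A"), "08b"))   # 01000001
```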

ASCII (American Standard Code for Information Interchange) was developed in the 1960s as a 7-bit system that represented 128 characters, including the English uppercase and lowercase alphabet, the numbers 0 through 9, punctuation, and a few special characters. ASCII was later expanded to an 8-bit extended set with 256 characters, but it still had to be adapted for other languages, and many extended sets were developed. The character set most widely used today is Unicode.

Unicode is the standard on the Internet. It includes codes for most of the world's written languages, as well as mathematical systems and special characters, with codes for more than 100,000 characters in all. The first 256 characters are the same in both ASCII and Unicode; however, the characters in the last rows in the Table below include Latin, Greek, and Cyrillic symbols, which are represented only in Unicode.

Table ASCII and Unicode Representations

Character   ASCII (in decimal)   Unicode (in decimal)   Binary Code
A           65                   U+0041 (65)            01000001
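Python's `ord()` function returns a character's Unicode code point, which for the first 128 characters is identical to its ASCII code, so the overlap between the two sets is easy to demonstrate:

```python
# ord() returns a character's Unicode code point; for the first 128
# characters this matches the ASCII code exactly.
for ch in ["A", "9", "Ω", "Д"]:
    print(ch, ord(ch))

# "A" (65) and "9" (57) fit in 7-bit ASCII; the Greek Ω (937) and
# Cyrillic Д (1044) have code points above 255, so they can be
# represented only in Unicode.
```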

Measuring Data

Bits (b) are used to measure data transfer rates such as an Internet connection, and bytes (B) are used to measure file size and storage capacity. The decimal prefixes of kilo, mega, giga, tera, peta, and so on are added to the base unit (bit or byte) to indicate larger values.

The binary prefixes kibi, mebi, and gibi have been adopted, although their use isn't widespread. A megabyte (MB) is equal to 1,000,000 bytes, while a mebibyte (MiB) is equal to 1,048,576 bytes, a slightly larger value. The two tables below compare the two systems.

Table Decimal Storage Capacity Prefixes

Decimal Prefix   Symbol   Decimal Exponent   Decimal Value
kilo             K or k   10^3               1,000
mega             M        10^6               1,000,000
giga             G        10^9               1,000,000,000
tera             T        10^12              1,000,000,000,000
peta             P        10^15              1,000,000,000,000,000

Table Binary Storage Capacity Prefixes

Binary Prefix   Symbol   Binary Exponent   Decimal Value
kibi            Ki       2^10              1,024
mebi            Mi       2^20              1,048,576
gibi            Gi       2^30              1,073,741,824
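The gap between the decimal and binary prefixes is easy to compute directly:

```python
# Decimal (SI) prefixes are powers of 10; binary prefixes are powers of 2.
MB = 10 ** 6        # megabyte:  1,000,000 bytes
MiB = 2 ** 20       # mebibyte:  1,048,576 bytes
print(MiB - MB)     # 48576 -- a mebibyte is about 4.9% larger

GB = 10 ** 9        # gigabyte
GiB = 2 ** 30       # gibibyte
print(GiB / GB)     # 1.073741824
```

The widening gap at larger prefixes is why a drive sold as "1 TB" appears smaller when an operating system reports its capacity in binary units.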

A megabyte (MB) can hold about 500 pages of plain text, but a single picture taken with a digital camera can be more than 25 megabytes in size. As the types of files we save have changed from plain text to images, music, and video, file sizes have become larger, and the need for storage has grown dramatically. Fundamentally, however, all files are still just 0s and 1s.

Career Spotlight

Bioinformatics – Computers have become integral in almost every modern career. Nowhere is this more evident than in the field of biology. Bioinformatics is the application of information technology to the field of biology. Computers are used to analyze data, predict how molecules will behave, and maintain and search massive databases of information. The rapid growth of biological information over the past few decades has created a demand for new technologies and people who know how to use them. This field requires at least a four-year degree. If you have a strong interest in science and technology, bioinformatics might be a good career choice for you.

4 Things You Need to Know
  • Computers use the binary (base 2) number system.
  • ASCII and Unicode are binary code character sets.
  • A bit is the smallest unit of digital information.
  • A byte is equal to 8 bits and represents one character.

Key Terms
ASCII (American Standard Code for Information Interchange)
binary code
binary (base 2) number system