The information revolution is all around us. Everywhere we go today, we cannot escape the variety of computers and other computer-empowered gadgets that continuously store, process and present information to us. But with so much information technology around, it often boggles the mind to wonder just how all these computers store information.

Well, stop all your wondering! In the next few paragraphs you will learn the secret to the foundation of all computer technology!

What is Information?

Let's begin at the top. We know computers (and all of their offspring gadgets) store, process and present information. But how do computers know whether something is information or not? In other words: just what IS information?

Simply put, information is anything that makes some sense to humans. This includes, but is not limited to, sights, sounds, smells, signals and written text such as alphabets, symbols and writings.

Information Inside Computers
Fine, well and dandy, but does that mean computers can smell? Well, not exactly. In order to process information, we need to find a way to represent it inside of computers. Computers represent information in a way that is entirely different than that of our brain. This is because a computer is composed of millions of tiny electrical circuits. These electrical circuits can be in only one of two conditions or states: they are either ON or OFF. This dual state of being is often called “binary”.

The Binary System
The binary state is often represented with numbers (so that IT makes some sense to US). The ON state is represented with 1 and the OFF state with 0 (zero). Thus the computer’s memory is full of 1’s and 0’s. Each of these 1’s and 0’s is called a bit and represents a unit of information. A bit, then, is the smallest representation of information in a computer.
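To make this concrete, here is a tiny Python sketch (the variable names are made up for illustration, not something from the article) that models a row of ON/OFF switches as bits:

```python
# Each bit is either ON (1) or OFF (0).
# Here we model eight tiny switches as a list of bits.
switches = [1, 0, 0, 1, 1, 0, 1, 1]  # ON, OFF, OFF, ON, ON, OFF, ON, ON

# Joining them gives the familiar string of 1's and 0's
# that fills a computer's memory.
bit_string = "".join(str(b) for b in switches)
print(bit_string)  # → 10011011
```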

Bits Get Bigger
By themselves, bits are of no use to humans. But together, they can perform miracles. In order to represent information, bits are grouped together, commonly in groups of 8, 16, 32 or 64 bits, with each group representing a single piece of information.

Bits of Information
But how can a group of bits like 10011011 possibly mean anything? This is where the genius of mathematicians comes in. By assigning a different bit pattern to represent each different piece of information, we can put all of human knowledge into groups of bits. For example, the number 1 is represented in binary as 00000001, the number 2 as 00000010, the number 3 as 00000011, and so on and so forth. This way bits can represent any symbol, letter, number or signal. Thus colors, sound, video and written text are all broken down and stored as groups of bits.
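If you have Python handy, you can see these bit patterns for yourself. This little snippet (an illustration of mine, not part of the article) prints each number as eight binary digits:

```python
# Print the 8-bit patterns for the numbers mentioned above.
for n in [1, 2, 3]:
    print(n, "=", format(n, "08b"))
# 1 = 00000001
# 2 = 00000010
# 3 = 00000011

# Letters are assigned bit patterns too (here via the ASCII table):
print("A =", format(ord("A"), "08b"))  # A = 01000001
```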

How is Information Measured?
Now that we know how information is stored in computers, how do we state how much information is there? In other words, how do we measure information? Obviously, saying that a certain document is 1,234,546 bits long does not give a clear picture. So how then should the amount of information be communicated?

Bits and Bytes
By taking an entire group of eight bits such as 10011011 as a single unit, we have the measurement for a single piece of information. This group of eight bits is called a byte (pronounced “bite”). Therefore, a single piece of information like the number “1” takes one byte. In the same way, the word “moon”, which consists of four letters, is said to be 4 bytes long. Expanding this further, a document that has 1034 letters or characters (including spaces) in it is said to be 1034 bytes long.
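You can even check the “moon is 4 bytes” claim yourself. In this quick Python illustration (mine, not the article's), encoding the word as ASCII text gives exactly one byte per letter:

```python
# Encode the word as ASCII text: one byte per letter.
word = "moon"
data = word.encode("ascii")
print(len(data), "bytes")  # → 4 bytes

# And each byte is itself a group of 8 bits:
for b in data:
    print(chr(b), "=", format(b, "08b"))
```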

Kilobyte, Megabyte and Gigabyte
I would be glad to end this article here, but in this age of information overload, I’m afraid even bytes are not enough to quantify information. Borrowing from the metric system of measurement, large amounts of bytes are now named with prefixes to make them easier to understand. For example, one thousand bytes are called one Kilobyte, or KB for short. On a greater scale, one million bytes are called one Megabyte, or MB for short. These days, Gigabytes or GB are also quite commonly used. One Gigabyte is equivalent to one billion bytes of information. In time to come we will be used to using Terabytes, where one Terabyte, or TB, is approximately a thousand Gigabytes or 1,000,000,000,000 bytes. To simplify:

1 Kilobyte = 1KB = approximately 1,000 bytes.
1 Megabyte = 1MB = approximately 1,000,000 bytes.
1 Gigabyte = 1GB = approximately 1,000,000,000 bytes.
1 Terabyte = 1TB = approximately 1,000,000,000,000 bytes.
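The little table above can even be turned into a helper function. Here is a hypothetical Python sketch (the function name is my own invention) that converts a raw byte count into the nearest prefix unit, using the approximate powers of one thousand:

```python
# A hypothetical helper: convert a raw byte count into the
# nearest prefix unit, using the approximate values above.
def human_size(num_bytes):
    for unit in ["bytes", "KB", "MB", "GB"]:
        if num_bytes < 1000:
            return f"{num_bytes:g} {unit}"
        num_bytes /= 1000
    return f"{num_bytes:g} TB"

print(human_size(1000))        # → 1 KB
print(human_size(2000000000))  # → 2 GB
```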

Megabytes (MB) and Gigabytes (GB) are often used in the computer world to indicate the amount of information in a certain document or file, or the amount of information stored on a certain disk like the floppy disk or the hard disk drive. All disks also come with their capacity in MB or GB clearly stated on the outside as a guide for consumers.

So there you have it. The world of information and computers now makes more sense and all those bits and bytes don’t sound so threatening anymore!