A Brief History Of The Internet
The evolution of the Internet: how a simple experiment to connect U.S. defense research centers mushroomed into today's powerful global communication tool.
A commonly asked question is "What is the Internet?" The reason this question gets asked so often is that there is no agreed-upon answer that neatly sums up the Internet. The Internet can be thought about in relation to its common protocols, as a physical collection of routers and circuits, as a set of shared resources, or even as an attitude about interconnection and intercommunication.
Today's Internet is a global resource connecting millions of users that began over 25 years ago as an experiment by the U.S. Department of Defense. While the networks that make up the Internet are based on a standard set of protocols (a mutually agreed-upon method of communication between parties), the Internet also has gateways to networks and services that are based on other protocols.
The Internet was born about 25 years ago, in an attempt to connect a U.S. Defense Department network called ARPAnet to various other radio and satellite networks. The ARPAnet was an experimental network designed to support military research; in particular, research into how to build networks that could withstand partial outages (such as bomb attacks) and still function. In the ARPAnet model, communication always occurs between a source and a destination computer. The network itself is assumed to be unreliable: any portion of it could disappear at any moment. The design required a minimum of information from the client computers. To send a message on the network, a computer only had to put its data in an envelope, called an Internet Protocol (IP) packet, and "address" the packet correctly. The communicating computers, not the network itself, were given the responsibility of ensuring that the communication was accomplished. The philosophy was that every computer on the network could talk, as a peer, with any other computer.
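As a modern illustration of this "envelope" model, the sketch below packs a minimal IPv4 header in Python. This is only a sketch: the addresses and field values are illustrative, and the checksum is left at zero for brevity rather than computed as a real IP stack would.

```python
import socket
import struct

def build_ipv4_header(src_ip, dst_ip, payload_len):
    """Pack a minimal IPv4 header: the 'envelope' carrying the
    source and destination addresses ahead of the data."""
    version_ihl = (4 << 4) | 5           # IPv4, 5 x 32-bit header words
    total_length = 20 + payload_len      # 20-byte header plus the data
    ttl, proto = 64, 17                  # hop limit; protocol 17 = UDP
    return struct.pack(
        "!BBHHHBBH4s4s",                 # network (big-endian) byte order
        version_ihl, 0, total_length,    # version/IHL, service type, length
        0, 0,                            # identification, flags/fragment
        ttl, proto, 0,                   # TTL, protocol, checksum (0 here)
        socket.inet_aton(src_ip),        # source address
        socket.inet_aton(dst_ip),        # destination address
    )

# Illustrative addresses; any correctly addressed envelope would do.
header = build_ipv4_header("10.0.0.1", "10.0.0.2", len(b"hello"))
print(len(header))  # 20 (the standard IPv4 header size)
```

The point of the exercise is that the sender's only obligation is to fill in this envelope correctly; everything between source and destination is allowed to be unreliable.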
These decisions may sound odd, such as the assumption of an "unreliable" network, but history has proven that most of them were reasonably correct. While the International Organization for Standardization (ISO) spent years designing the ultimate standard for computer networking, people couldn't wait. Internet developers in the US, UK, and Scandinavia, responding to market pressures, began to put their IP software on every conceivable type of computer.
Newer networks developed, among them LANs, UNIX-based networks, and, most important, NSFNET, commissioned by the National Science Foundation (NSF), an agency of the US government. In the mid-1980s the NSF created five supercomputer centers. Up to this point, the world's fastest computers had been available only to weapons developers and a few researchers from very large corporations. The NSF was making these resources available for any scholarly research. Because the centers were so expensive, only five were created, and they had to be shared. This created a communication problem: the NSF needed a way to connect its centers together and to allow the clients of those centers to access them. At first, the NSF tried to use the ARPAnet for communications, but this attempt failed due to bureaucracy and staffing problems.
In response, the NSF decided to build its own network, based on the ARPAnet's IP technology. It connected the centers with 56,000 bit per second (56 kbps) telephone lines. (This is roughly the ability to transfer two full typewritten pages per second.) It was obvious, however, that if the NSF tried to connect every university directly to a supercomputing center, it would go broke: you pay for these lines by the mile. One line per campus, with a supercomputing center at the hub like the spokes of a bike wheel, adds up to a lot of miles of phone lines. Therefore, the NSF decided to create regional networks. In each area of the country, schools would be connected to their nearest neighbor. Each chain was connected to a supercomputing center at one point, and the centers were connected together. With this configuration, any computer could eventually communicate with any other by forwarding the conversation through its neighbors.
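The "two pages per second" figure checks out under a rough, assumed page size (here, about 3,300 characters per typewritten page, with one 8-bit character per byte):

```python
line_bps = 56_000                  # 56 kbps line speed
chars_per_page = 3_300             # assumed: ~55 lines x 60 chars per page
bytes_per_second = line_bps / 8    # 8 bits per character
pages_per_second = bytes_per_second / chars_per_page
print(round(pages_per_second, 1))  # roughly two pages per second
```

For comparison, the same arithmetic explains why the factor-of-20 upgrade mentioned later mattered so much: the replacement lines moved the network into the megabit-per-second range.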
This solution was successful, but a time came when it no longer worked. Sharing supercomputers also allowed the connected sites to share many other things unrelated to the centers. Suddenly these schools had a world of data and collaborators at their fingertips. The network's traffic increased until, eventually, the computers controlling the network were overloaded. The old network was then replaced with faster telephone lines (faster by a factor of 20) and faster computers to control it.
And the jungle fire spreads… with budding technologies and faster speeds.