year      # of subnets forming the Internet
1985         100
1989         500
1990       2,218   (Center for Defense data)
1991       4,000   (approximate figure from NSF)
By 1994 there were approximately 20,000 separate networks in 100 countries, encompassing more than 2.5 million computers on the Internet. The present growth rate is about 100% per year. There are estimates of nearly 15 million users in the USA and nearly 25 million worldwide.
The number of people accessing the Internet will increase faster still, given the easy access that individuals now have via commercial vendors who provide dial-up access through modems and phone lines. The major telephone companies, the cable companies and the traditional purveyors of network access are all competing for this new growth area.
The ARPANET was a link between universities, military centers and defense contractors. It was built to facilitate communications between researchers and, appropriately during the Cold War, to maintain communications in case of a nuclear catastrophe. The best way to achieve the latter was to have dynamic routing from point to point, so that alternate communication paths could always be found, short of complete destruction of all the wires on the network.
ARPA, under the new acronym DARPA (Defense Advanced Research Projects Agency), started a new project called the Internetting Project in 1973. The agency was then looking for ways to connect networks of different architectures; gateways (specialized computers which translate one network ``protocol'' into another) came into being then.
In 1974, Robert Kahn and Vinton G. Cerf established the protocol that we know today as TCP/IP; it stands for Transmission Control Protocol/Internet Protocol. The IP rules standardize the network addressing conventions, while the TCP rules guarantee the proper delivery of the information at the proper location.
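As a rough Python sketch of this division of labor (the address and port below are placeholders, not a real service): the address tuple is the IP side of the story, saying where the data should go, while choosing a stream (TCP) socket is what buys guaranteed, in-order delivery.

    import socket

    HOST = "192.0.2.17"    # placeholder address: IP decides *where* data goes
    PORT = 7               # placeholder port (the classic "echo" service)

    # A stream socket selects TCP, which ensures the bytes arrive
    # complete and in order, retransmitting behind the scenes if needed.
    with socket.create_connection((HOST, PORT), timeout=5) as s:
        s.sendall(b"hello, internet")
        print(s.recv(1024))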
The Internet could be said to be a network of networks that run under the TCP/IP protocol suite.
This last way of looking at the Internet as the ``mother of all networks'' -- a network of networks -- shows its dispersed and decentralized nature. This is what makes it so robust. Cutting a wire somewhere, or taking a gateway off line, may sever a little piece of the Internet, but by no means can it bring down the whole thing! Administering such a system would be a nightmare if it were all centralized; fortunately, much of the administration of the Internet is done at the local network level. Of course, the addressing system must be consistent as a whole.
There are also other protocols in common use; in particular the Open Systems Interconnection (OSI) suite is fast gaining ground over TCP/IP, especially in Europe. Being developed by the International Organization for Standardization (ISO), it may one day become ``the standard''.
Packet switching breaks the data into small pieces, and adds to each piece a header specifying the destination address. The computers, gateways and routers along the way examine each packet coming their way and move it on to the next site, one step closer to the final destination. There are two advantages: alternate routes can often be found by the computers along the way in case of a network failure; and the relative speed of the components along the way and of the communicating computers is somewhat irrelevant, in that the network acts as a buffer to equalize timing differences.
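To make the idea concrete, here is a toy sketch in Python -- purely illustrative, not any real protocol -- of how data might be chopped into packets, each carrying its destination and a sequence number, and reassembled at the far end even if the packets arrive out of order:

    def packetize(data, destination, size=4):
        """Break data into packets, each with a small header."""
        return [{"dest": destination, "seq": i, "payload": data[i:i + size]}
                for i in range(0, len(data), size)]

    def reassemble(packets):
        """Rebuild the message, whatever order the packets arrived in."""
        in_order = sorted(packets, key=lambda p: p["seq"])
        return b"".join(p["payload"] for p in in_order)

    packets = packetize(b"a message split into pieces", "129.25.1.46")
    packets.reverse()              # simulate out-of-order arrival
    print(reassemble(packets))     # b'a message split into pieces'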
In 1988, NSF awarded a contract to MERIT (Michigan Education and Research Infrastructure Triad), MCI and IBM to administer the network. In 1990, NSF announced the formation of ANS (Advanced Network & Services), under Merit, IBM and MCI, to administer the NSFNET backbone. ANS was to form, and has since done so, ANS CO+RE, a commercial subsidiary, to support the commercial use of the network. UUNET and Performance Systems International are other commercial providers of network services.
With this new direction in ANS, the NSFNET is no longer the major backbone of the Internet in the USA. It remains an important network, but nothing more. There are also numerous regional networks which have become very large as well; for instance PREPNET (in Pennsylvania), SURANET (in the southern states) and MIDNET, which are sometimes run by universities and sometimes by commercial organizations.
People do not remember numbers well, so each IP address is granted a mnemonic equivalent: a combination of words separated by dots. For instance, 129.25.1.46 translates into newton.physics.drexel.edu, a name sanctioned by the general authority of the Internet.
A name server is a computer which acts as a ``name resolver'' for a subnet. The command
nslookup newton.physics.drexel.edu
will instruct your computer to ask a name server ``who is this computer?,'' upon which it will be told its IP address. Under TCP/IP this name-to-address resolution is performed dynamically.
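The same question can be asked from a program; in Python, for instance, the standard socket module consults the name server. The host name and address below are the ones quoted above, assuming they still resolve:

    import socket

    # Forward lookup: from mnemonic name to IP address.
    print(socket.gethostbyname("newton.physics.drexel.edu"))  # e.g. 129.25.1.46

    # Reverse lookup: from IP address back to the registered name.
    name, aliases, addresses = socket.gethostbyaddr("129.25.1.46")
    print(name)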
Suffixes indicate various organizations or countries. For instance:
.edu    educational              (e.g. Drexel)
.gov    government               (e.g. the White House)
.mil    military                 (e.g. the Pentagon)
.net    network administration   (e.g. uunet)
.org    organizations            (e.g. National Public Radio)
In addition, each country has its own suffix: .mx for Mexico, .jp for Japan, .uk for the United Kingdom, and so on. Many public schools in this country now add a .us extension onto their names.
The World Wide Web combines a number of powerful characteristics. Among others, let's cite:
- Hypertext
The text is interlaced with links that allow one to read the information not only in a linear fashion, as in a book, but also to jump easily to new entry points by following leads given by highlighted text.
- Hypermedia
The links can point to any type of information, whether it be text, graphics, sound or even movies.
- Graphical
The browsers are best at displaying graphical content; an image is worth a thousand words.
- Global
The links can point to information anywhere on the Internet and be fully cross-referenced across the net.
- Cross-platform
There exist ``browsers'' and ``servers'' for Macintoshes, PCs and UNIX-based computers of most types.
- Distributed
The information is distributed over the thousands of computers across the Internet that participate in it, each contributing space to store the information. As a user of the information you only need to figure out where the required information is located; you then retrieve it through the net, and let your local system reclaim the space once you are done. No need to mount countless disks or CDs to get the information!
- Dynamic
The Web is growing at a rate of about 20% per month, by some estimates. The information at the various sites is easy to update, which often means that up-to-date information is available.
- Interactive
The simple action of going out to new sites to get further information is an interactive action per se. In addition, the browsers often allow ``interactive forms'' to be built. Connection to e-mail is possible, as well as connections to news readers.
The exploration is done via a browser, which runs on your local computer; it starts from a ``home'' page of your choice. Each time you follow a link, the browser is responsible for opening a connection to the site where the information resides, obtaining the information, and then displaying it, whether it be text or graphics. It is the browser which formats the ``page'' on the screen. The information is transferred from the ``server'' to the ``client'' via the HyperText Transfer Protocol (HTTP), a simple protocol invented at CERN, the European Center for Particle Physics in Switzerland.
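HTTP is indeed simple: in its early form the client opens a connection, sends a one-line request naming the document, and reads the reply back. Here is a minimal Python sketch, assuming a reachable server (the host name below is a placeholder):

    import socket

    HOST = "www.example.org"    # placeholder server name

    with socket.create_connection((HOST, 80)) as s:
        # One request line, a Host header, and a blank line end the request.
        s.sendall(b"GET / HTTP/1.0\r\nHost: " + HOST.encode() + b"\r\n\r\n")
        reply = b""
        while chunk := s.recv(4096):   # read until the server closes
            reply += chunk

    print(reply.decode("latin-1")[:200])   # status line and headers first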
The documents are written in the HyperText Markup Language (HTML), also invented at CERN, which is based on the Standard Generalized Markup Language (SGML). The characteristic of these languages is that only the logical structure of the text is marked, via inserted ``tags'', with the burden of formatting the page left to the browsers. No fonts, no color information, no typesetting information need be carried over the network, ensuring optimal speed of transmission.
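For instance, a minimal (hypothetical) HTML document might look as follows; note that the tags say only what each piece of text is, never how it should look, and that the link target here is an invented address:

    <html>
    <head><title>A minimal document</title></head>
    <body>
    <h1>A heading</h1>
    <p>Some text, with an <em>emphasized</em> word and a
    <a href="http://www.example.org/next.html">link to another document</a>.</p>
    </body>
    </html>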
The browsers are what get the information from the servers, format it for display, and call external viewers if necessary. The two most popular now are Mosaic, from NCSA, which was the first really useful user interface to the Web, and Netscape, written by the same group of people who wrote Mosaic, but who have since formed their own company.
You can start your exploration of the WWW from either of two sites:
CERN, the birthplace of the Web in 1990, since both HTTP and the HTML language were invented there. You will find there a worldwide list of the sites on the Web, as well as a primer for the HTML language.
NCSA, the National Center for Supercomputing Applications, the birthplace of Mosaic, the first fully graphical, easy to use, and yet sophisticated interface to HTML texts. You will also find there a wealth of information on the various sites on the Web, and in particular a ``What's New on the Web'' list.
One recent reference on the Internet is:
Paul Gilster, The Internet Navigator, John Wiley & Sons, Inc., 1994.
The statistics quoted in this section are drawn from this reference. However, as with all Internet-related books, the contents were probably obsolete well before the date of publication!