war: distributed computer networks are less susceptible to damage because transmissions can be routed around the damage. Standard protocols ensured that any platform could be connected to the network, which meant that local area networks (LANs) could be linked while retaining their advantages, in particular the freedom from reliance on a single timesharing computer. These developments continued through the 1970s and 1980s and have produced the Internet as we know it.

The Internet is an informal network of networks spanning the globe, with almost 4 million hosts, each of which may serve anywhere between one and two million users. Theorists believe that by the year 2003 everyone in the world could be connected to the Internet (Treese, 1994). This growth has been aided by the availability of low-cost computers, free software and inexpensive telecommunications, but the most important fact is that no single authority controls the Internet. The Internet Society (ISOC) is a voluntary organization responsible for technical standards, while the Internet Engineering Task Force (IETF) handles operational and technical problems, but no single body can be said to control the Internet or what is distributed over it.

There is no overall set of standards applied to the quality of material available over the Internet, quality here meaning factors such as accuracy, currency, and editing and updating policies. At present, quality control is exercised only by the people who create the documents, and as a result standards are sometimes low. There are also problems of currency and revision, as well as of the accuracy of the original material; the most common complaint is that out-of-date items are re-found, sometimes after several years (Cockerill, 1994).

Second, the anarchic nature of the Internet means that there is little or no control over the content of documents posted on it. National governments may try to apply legislation bu...