
History of the Computer Industry in America

, 40). In 1959, Robert Noyce, a physicist at the Fairchild Semiconductor Corporation, invented the integrated circuit, a tiny chip of silicon that contained an entire electronic circuit. Gone was the bulky, unreliable, but fast machine; computers now became more compact, more reliable, and greater in capacity (Shallis, 49). These new technical discoveries rapidly found their way into new models of digital computers. Memory storage capacities in commercially available machines increased 800 percent by the early 1960s, and speeds increased by an equally large margin. These machines were very expensive to purchase or rent, and were especially expensive to operate because of the cost of hiring programmers to perform the complex operations the computers ran. Such computers were typically found in large computer centers--operated by industry, government, and private laboratories--staffed with many programmers and support personnel (Rogers, 77). By 1956, 76 of IBM's large computer mainframes were in use, compared with only 46 UNIVACs (Chposky, 125). In the 1960s, efforts to design and develop the fastest possible computers with the greatest capacity reached a turning point with the completion of the LARC machine for Livermore Radiation Laboratories by the Sperry-Rand Corporation and of the Stretch computer by IBM. ...
