
History of the Computer Industry in America

(Rogers, 98). The trend during the 1970s was, to some extent, away from extremely powerful, centralized computational centers and toward a broader range of applications for less-costly computer systems. Most continuous-process manufacturing, such as petroleum refining and electrical-power distribution systems, began using computers of relatively modest capability to control and regulate their activities. In the 1960s the programming of applications problems was an obstacle to the self-sufficiency of moderate-sized on-site computer installations, but great advances in applications programming languages removed this obstacle. Applications languages became available for controlling a great range of manufacturing processes, for computer operation of machine tools, and for many other tasks (Osborne, 146). In 1971 Marcian E. Hoff, Jr., an engineer at the Intel Corporation, invented the microprocessor, and another stage in the development of the computer began (Shallis, 121). A new revolution in computer hardware was now well under way, involving the miniaturization of computer-logic circuitry and of component manufacture by what are called large-scale integration techniques. In the 1950s it was realized that "scaling down" the size of electronic digital computer circuits and parts would increase speed and ef...
