s more than a mere mortal could fathom, but gathering raw data and "keying" it in so the computer could "crunch the numbers" was a complicated and time-consuming task. Frustrations abounded, computer errors were called "glitches," and the phrases "garbage in/garbage out," "It's a computer mistake," and "Sorry, the computer's down and we can't do anything" entered the lexicon.

On college campuses in the 1960s, students carried bundles of punched computer cards to and from class, hoping that their share of the valuable computer time would not be bumped or allocated to someone else. The phrase "Do not fold, spindle or mutilate" was coined to keep people from damaging the cards so badly that they could no longer be fed through the punch card readers, where their intricate patterns of holes were decoded.

The computer mystique was reinforced every time people heard of some new accomplishment. In 1961, a computer calculated the value of pi to 100,000 decimal places. A computer could play checkers, and in 1967 a chess-playing program was made an honorary member of the United States Chess Federation. Banks began printing checks with magnetic ink so they could be processed by computers.

A Small Change in Thought

Until 1971, nobody thought of a computer as anything but a big, fast, electronic brain that resided in a climate-controlled room and consumed data and electricity in massive quantities. In 1971, an Intel 4004 chip containing roughly 2,300 transistors was programmed to perform complex mathematical calculations; the hand-held calculator was born. Suddenly, scientists and engineers could carry the computational power of a computer with them to job sites, classrooms, and laboratories; but the hand-held calculator, like the ENIAC before it, was not yet a computer. The microprocessor was developed at Intel, the company co-founded by Robert Noyce, one of the inventors of the integrated circuit, a...