The history of the computer stretches back thousands of years. From the abacus through the modern era, the evolution of computers has involved many innovative individuals. It was out of this desire to innovate that many fascinating tabulating machines developed. The modern computer, therefore, evolved from an amalgamation of the genius of many individuals over a long period of history. Many people shaped the world through their efforts to develop this technology. An early counting machine (and relative of the computer) can be traced back to 3000 BC. This device is known as the abacus. Although ancient, the abacus is not archaic.
It is still used in math education and in some businesses for making quick calculations (Long and Long 33C). This ancient device represents how far back in history the human desire to use a machine for calculations goes. Another early relative of the computer was created in the seventeenth century by Blaise Pascal, a French mathematician (Long and Long 33C). Pascal was born in Clermont-Ferrand on June 19, 1623, and his family settled in Paris in 1629 (Fowlie). In 1642 the young prodigy developed what is now known as “Pascal’s Calculator” (or the “Pascaline”) to speed calculations for his father, a tax collector.
Numbers were dialed on metal wheels on the front of the machine and the solution appeared in windows along the top (Kindersley). The “Pascaline” used a counting-wheel design (Long and Long 33C). “Numbers for each digit were arranged on wheels so that a single revolution of one wheel would engage gears that turned the wheel one tenth of a revolution to its immediate left” (qtd. in Long and Long 33C). All mechanical calculators used this counting-wheel design until it was replaced by the electronic calculator in the mid-1960s (Long and Long 33C).
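In modern terms, the counting-wheel design is essentially a decimal counter with carry propagation. The short Python sketch below is a loose software analogy only (it does not model Pascal’s actual gearing): each wheel holds one decimal digit, and a completed revolution advances the wheel to its immediate left by one step.

    # A loose software analogy of the counting-wheel design: each wheel
    # holds one decimal digit, and a full revolution of a wheel turns the
    # wheel to its immediate left by one tenth of a revolution (a carry).
    def add_on_wheels(wheels, digit, position=0):
        wheels = wheels[:]                 # wheels[0] is the ones wheel
        wheels[position] += digit
        while position < len(wheels) - 1 and wheels[position] > 9:
            wheels[position] -= 10         # wheel completes a revolution
            wheels[position + 1] += 1      # ...and nudges its neighbor
            position += 1
        return wheels

    print(add_on_wheels([7, 9, 0], 5))     # 97 + 5 -> [2, 0, 1], i.e. 102

Pascal’s Calculator, however, was only the first step between the abacus and the computer.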
The next step involves a loom. In 1801 the weaver Joseph-Marie Jacquard invented a machine that would make the jobs of overworked weavers tolerable (Long and Long 34C). His invention was known as the Jacquard loom. Jacquard’s loom used holes punched in cards to direct the movement of the needle and thread (Long and Long 34C). Jacquard’s use of punched cards is significant because it is considered the earliest use of binary automation, the same system of mathematics employed by computers today (Long and Long 34C).
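The principle is easy to illustrate with a modern analogy. In the hypothetical sketch below (the card format is invented for illustration and is not Jacquard’s actual encoding), each position in a card row is one binary choice: a hole lets a needle pass and lift its thread, while the absence of a hole blocks it.

    # Hypothetical illustration of binary automation: one bit per needle,
    # where "1" means a hole (needle passes, thread is lifted) and "0"
    # means no hole (needle is blocked, thread stays down).
    card_row = "10110100"

    for needle, hole in enumerate(card_row):
        action = "lift thread" if hole == "1" else "hold thread"
        print(f"needle {needle}: {action}")

Later in the same century, Charles Babbage stepped into the scope of computer history.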
Babbage was born in 1792 in Teignmouth, Devonshire, England. He was educated at Cambridge, was a fellow of the Royal Society, and was active in founding the Analytical, Royal Astronomical, and Statistical societies (“Charles”). In the 1820s Babbage designed the “Difference Engine”, generally considered a direct forerunner of the modern computer. Although he began construction of his machine, he never completed it due to lack of funding and insufficient technology (“Charles”). Nevertheless, in 1991 British scientists constructed the Difference Engine based on Babbage’s designs.
It worked flawlessly, computing up to 31 digits (“Charles”). Although the “Difference Engine” had no memory, a later idea, the “Analytical Engine”, would have been a true programmable computer had it been possible to construct the machine (“Babbage’s”). The Analytical Engine was to be a computer that could add, subtract, multiply, and divide in automatic sequence at a rate of 60 additions per second (Long and Long 34C). “His 1833 design, which called for thousands of gears and drives, would cover the area of a football field and be powered by a locomotive engine” (qtd. in Long and Long 34C).
A woman named Augusta Ada Byron worked alongside Babbage while he was designing the Analytical Engine. It was she who suggested punched cards (like those used for Jacquard’s loom) as a primitive memory for the machine. Augusta Ada Byron (Countess of Lovelace) was born in 1815 to the famed poet Lord Byron. She was one of the few female mathematicians of her time (“Computer”). Her suggestion that punched cards be used as a type of simple programming for the Analytical Engine earned her the title “the first computer programmer” (Long and Long 35C).
In addition, the United States Department of Defense honored Byron by naming its high-level programming language “Ada” in 1977 (“Byron”). In 1890 a man named Herman Hollerith devised a machine to speed up census taking (Long and Long 35C). Hollerith was born in 1860 in Buffalo, New York, and was educated at Columbia University (“Hollerith, Herman”). With the aid of a professor, he got a job helping with tabulation of the 1880 census, a process that took eight years (Long and Long 35C).
After experiencing the 1880 census, Hollerith devised a “Tabulating Machine” in order to speed the 1890 census. This machine used cards encoded with data in the form of punched holes. The machine read the punched holes as the cards were passed through electrical contacts (“Hollerith, Herman”). “Closed circuits, which indicated hole positions, could then be selected and counted” (qtd. in “Hollerith, Herman”). “Hollerith’s Tabulating Machine” cut the time it took to do the census to under three years and saved the Census Bureau 5 million dollars (Long and Long 35C).
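The tabulating idea translates naturally into modern terms. The Python sketch below is only an analogy (the category names are hypothetical, not the actual 1890 census fields): each card is modeled as the set of hole positions punched into it, and every hole, like a closed circuit, advances the counter assigned to that position.

    # A minimal analogy of Hollerith-style tabulation: each hole on a
    # card closes a circuit, and each closed circuit advances a counter.
    from collections import Counter

    cards = [                       # one card per person; the category
        {"male", "age_20_29"},      # names here are hypothetical
        {"female", "age_30_39"},
        {"male", "age_30_39"},
    ]

    counters = Counter()
    for card in cards:
        for hole in card:           # count every closed circuit
            counters[hole] += 1

    print(counters["male"])         # 2
    print(counters["age_30_39"])    # 2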
In addition, the machine represents the first use of punched cards as a set of operation instructions, an idea originating with Jacquard and perpetuated by the Analytical Engine. The innovative Hollerith went on to form the “Tabulating Machine Company”. In 1911 this company merged with several others, forming “IBM”, an enterprise vital to the history of computers (Long and Long 35C). For many years IBM marketed only punched card machines (mechanical computers). All computers were mechanical until the mid-twentieth century. The years between the 1920s and the 1950s are known as the “EAM Era”.
During the EAM (Electromechanical Accounting Machine) Era, punched card technology improved. More punched card devices were invented and added to equipment. This innovation led, of course, to more sophisticated capabilities (Long and Long 36C). In 1942, however, Dr. John Atanasoff, a professor at Iowa State University, along with graduate student Clifford E. Berry, invented the first electronic computing device. They called their computer the ABC (Atanasoff Berry Computer). This invention would set the development of the modern electronic computer into motion.
The ABC used the base 2 numbering system, vacuum tubes, and memory and logic circuits (Long and Long 36C). Iowa State never patented the device, and IBM claimed it would never be interested in an electronic computing device (Long and Long 36C). However, “a 1973 federal court ruling officially credited Atanasoff with the invention of the automatic electronic digital computer” (qtd. in Long and Long 36C). While the EAM Era employed electromechanical accounting machines controlled by punched cards, a significant breakthrough achieved by Atanasoff was the elimination of the need for punched cards.
The first electromechanical computer (without the need for punched cards) was created by Harvard University professor Howard Aiken in 1944 (Long and Long 37C). This computer was known as the “Mark I”. Standing 8 feet high and 51 feet long, the Mark I was a collection of electromechanical calculators similar to the Analytical Engine (Long and Long 37C). Although IBM sponsored development of the Mark I, the company felt electromechanical computers would never replace punched card machines (Long and Long 37C).
In 1946, American physicist John W. Mauchly and American electrical engineer J. Presper Eckert Jr. developed a device for the United States Army (Long and Long 37C). This computer was known as the Electronic Numerical Integrator and Computer (or ENIAC). ENIAC had various functions. Initially it was used to calculate paths of artillery shells. Later it was used in making calculations for nuclear weapons research, weather prediction, and wind tunnel design (“ENIAC”). Working out of the Moore School of Electrical Engineering at the University of Pennsylvania, Eckert and Mauchly demonstrated ENIAC less than three years after its construction was commissioned (“ENIAC”).
In 1947 ENIAC moved from the University of Pennsylvania to the Aberdeen Proving Ground in Maryland, where it operated until October 1955 (“ENIAC”). Mauchly and Eckert designed ENIAC with vacuum tubes for processing data. ENIAC had 19,000 light-bulb-sized tubes, weighed over 30 tons, and occupied 1,800 square feet (“ENIAC”). It could perform about 5,000 calculations per second (“ENIAC”).
Although this was faster than previous computers, it was still 10,000 times slower than the modern personal computer (“ENIAC”). “Initially, scientists programmed and entered data into ENIAC by manually setting switches and rewiring the machine” (qtd. in “ENIAC”). Later, an IBM punch-card reading machine made data input more efficient, while another punch-card system was used to store data (“ENIAC”).
The use of vacuum tubes made ENIAC the precursor to “the first generation of computers”. It was another machine, however, the successor to ENIAC, that is thought to have spawned the vacuum tube generation. This machine was the Universal Automatic Computer (UNIVAC). Also designed by Mauchly and Eckert, the UNIVAC was smaller than ENIAC, containing only 5,000 vacuum tubes, weighing only 8 tons, and occupying only 943 cubic feet (Hudson).
The UNIVAC was the first electronic computer designed and sold to solve commercial problems (Hudson). Government and commercial customers bought a total of 48 UNIVAC computers (Hudson). The machine was used to tabulate the 1950 census, and in 1952 CBS News gave UNIVAC national exposure when it predicted the results of the presidential election with only 5% of the vote counted (Long and Long 37C). It was finally in 1954 that IBM embraced the electronic computer as a viable marketing commodity. The formerly mechanically inclined company’s change of heart came after the success of the UNIVAC (Long and Long 37C).
The IBM 650 was introduced as a logical upgrade to previously installed IBM punched card systems (Long and Long 37C). The next generation of computers was started in part by a man named Walter Houser Brattain. Brattain was born in 1902 in Xiamen, China (“Brattain”). He would become a physicist. After working in the radio division of the National Institute of Standards and Technology, he joined the staff at Bell Laboratories (“Brattain”). At Bell, Brattain worked with the American physicists William Shockley and John Bardeen to develop the “transistor” (“Brattain”), a critical step in the history of the computer.
“For his work on semiconductors and discovery of the transistor effect, Brattain shared the 1956 Nobel Prize in physics with Shockley and Bardeen” (qtd. in “Brattain”). The invention of the transistor ushered in the second generation of computers. Computers using transistors were more powerful, more reliable, and less expensive than those using vacuum tubes (Long and Long 38C). In 1959 the Honeywell 400, a transistor-driven computer, established the company “Honeywell” as a competitive force in the transistor generation (Long and Long 38C).
In addition, the company Digital Equipment Corporation introduced the first “minicomputer”, the “PDP-8”, in 1963 (Long and Long 38C). Minicomputers represented a cheaper alternative to mainframes. Many businesses could afford minicomputers. The PDP-8 was, therefore, a step towards the modern popularity of computers. The mid-1960s saw the introduction of the “integrated circuit” and the third generation of computers. Integrated circuits were tiny transistors and other electrical components arranged on a single silicon chip (“Computers”). They replaced individual transistors in computers (“Computers”).
IBM introduced the first integrated circuit-based computer in 1964, the “IBM System 360” (Long and Long 38C). The System 360 was introduced as a computer with “upward compatibility” (Long and Long 38C). Thus, when a company outgrew one model it could move up to the next without having to convert data (Long and Long 38C). Computers based around integrated circuits made previous systems obsolete (Long and Long 38C). In the 1970s refinements of integrated circuits led to the development of modern microprocessors (“Computer”). Microprocessors are integrated circuits that contain thousands of transistors (“Computers”).
Modern microprocessors can contain as many as 10 million transistors (“Computers”). It was the development of the modern microprocessor that ushered in the fourth generation of computers. The first “fourth generation computer” was marketed by Micro Instrumentation and Telemetry Systems (MITS) (“Computer”). The Altair 8800 was introduced in 1975 (“Computer”). “It used an 8-bit Intel 8080 microprocessor, had 256 bytes of RAM, received input through switches on the front panel, and displayed output on rows of light-emitting diodes (LEDs)” (qtd. in “Computer”). The Altair was the first PC (Personal Computer).
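The front-panel style of input and output is easy to picture with a short sketch. The Python below is an illustration only (actual Altair software would have been 8080 machine code): it shows how an 8-bit value entered on the toggle switches might appear on a row of LEDs, one light per bit.

    # Illustration of front-panel output: an 8-bit value shown as a row
    # of LEDs, "*" for a lit LED (bit = 1) and "." for a dark one (0).
    def led_row(value):
        bits = format(value & 0xFF, "08b")    # clamp to 8 bits
        return "".join("*" if b == "1" else "." for b in bits)

    print(led_row(0b01001101))    # .*..**.*
    print(led_row(200))           # **..*...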
Over time, more technological innovation led to video displays, better storage devices, and CPUs with better computational abilities (“Computer”). The microprocessor generation has experienced a massive growth in the popularity of computers. This is partly the result of the introduction of Graphical User Interfaces (GUIs), popularized especially by the Microsoft operating system “Windows”. GUIs are defined as user-friendly interfaces used to interact with the computer system by pointing to processing options with an input device such as a mouse (Long and Long G7).
Initially, Xerox invented the GUI. It was Apple, however, that brought the graphical user interface to the mass market in 1984 with the Macintosh desktop computer (Long and Long 40C). It is clear the development of the modern computer is an amalgamation of many centuries of amassed human knowledge. From the abacus, through the tabulating machines of the past, through the EAM Era and the four generations, one breakthrough led to another. The sum of this amassed human knowledge has grown into what we know as the modern computer.