The History of the Computer

Long before computers were invented, humans realized the need for them. The history of the computer begins about 2,000 years ago with the abacus, a wooden rack holding two horizontal wires with beads strung on them, which remained one of the best calculating devices until the seventeenth century (PBS, 1). In 1835, English inventor Charles Babbage came up with the idea of the Analytical Engine, a general-purpose, fully program-controlled, automatic mechanical digital computer consisting of two parts: a calculating section and a storage section. His machine was designed to read instructions from punched cards, just as the Jacquard loom did (Campbell-Kelly, 15-17).

In 1890 Herman Hollerith and James Powers, who worked for the U.S. Census Bureau, developed devices that could read information punched into cards. The information was read by a machine fitted with many metal pins that passed through the punched holes but stopped where no holes existed. The 1890 census was completed in one-third the time taken in 1880, reading errors were reduced, and work flow increased (Campbell-Kelly, 20-21). Commercial companies saw these advantages, which soon led to improved punched-card machines from IBM and Remington Rand. These machines used electromechanical devices in which electric power provided mechanical motion. They could automatically be fed a specified number of cards, add, multiply, and divide, and punch the results onto output cards. For more than fifty years after their first use, punched-card machines did most of the world's business computing, along with a considerable amount of the computing work in science (Ceruzzi, 16-17).

The start of World War II produced a large need for computing capacity, especially for the military. New weapons were being made for which trajectory tables and other essential data were needed. Two men, John W. Mauchly and J. Presper Eckert, Jr., built a monstrous computer named ENIAC in Philadelphia, completing it in 1945. "It weighed thirty tons and contained eighteen thousand bulky electric switches called vacuum tubes," costing $400,000 (Campbell-Kelly, 87-99). In 1944, Howard Aiken and others completed the fifty-foot-long Mark I at Harvard University (Ceruzzi, 81-82). Both the Mark I and ENIAC were built to improve the way weapons worked during World War II. Not all computers of the early postwar era were built for military purposes, however. IBM, UNIVAC, and Control Data all sold huge and expensive computers to the civilian government and to big businesses, which used them to keep track of every worker's hours, what he or she was paid, and how much Social Security and income tax was being withheld.

The first transistor was created in 1947 at Bell Telephone Laboratories by William Shockley and others. Transistors could perform the crucial on-off switching in a tiny fraction of a second. They were one hundred times smaller than vacuum tubes, worked faster, used less electricity, and broke down less frequently (PBS, 2). Advances in technology then made it possible to produce circuits containing many transistors, called integrated circuits. In 1971, Intel introduced an integrated circuit containing a complete central processing unit, the part that makes all the decisions in a computer; it became known as the microprocessor, or microchip (Campbell-Kelly, 222). Concentrating so many transistors on a single chip has several advantages, including the ability to process information at higher speed. A microchip the size of a penny could hold more than two hundred transistors, and this helped reduce the size, weight, and cost of computers. Around this time, Digital Equipment Corporation, founded by an engineer named Kenneth Olsen, took advantage of these smaller components to produce the first minicomputers. Olsen and others worked several years to create their small machine, which sold for thousands of dollars, a fraction of what mainframes cost (Ceruzzi, 264-268).

Once inventors figured out that many transistors and their connections could be etched onto a piece of silicon, computers began to shrink in size. A fingernail-sized sliver of silicon became a complete processing unit, able to do more than entire sections of the big computers of only a few years earlier. While Intel was the first to sell a microprocessor, in 1971, firms such as Motorola and Rockwell soon began manufacturing their own chips. These tiny chips were soon running video games, making the video game industry

...