A Brief History of Computer Technology

Overview

Nothing epitomizes modern life better than the computer. For better or worse, computers have infiltrated every aspect of our society. Today computers do much more than simply compute: supermarket scanners calculate our grocery bill while keeping store inventory; computerized telephone switching centers play traffic cop to millions of calls and keep lines of communication untangled; and automatic teller machines (ATMs) let us conduct banking transactions from virtually anywhere in the world. But where did all this technology come from, and where is it heading? To fully understand and appreciate the impact computers have on our lives and the promises they hold for the future, it is important to understand their evolution.

The computer was not created in a single step, but was the result of a slow development process taking hundreds of years. Human imagination very often takes only small evolutionary steps, and the notion of a general-purpose machine that could perform an unlimited range of tasks was not one that came in a blinding flash. The technical expertise to build computers existed many years before they actually came into being. However, the overall idea of the computer could be understood only after its separate elements had been developed. Two important steps were the development of simple machines for calculation (the calculator) and for industrial control (the program-controlled machine).

A complete history of computing would include a multitude of diverse devices such as the ancient Chinese abacus, the Jacquard loom (1805) and Charles Babbage's "analytical engine" (1834). It would also include discussion of mechanical, analog and digital computing architectures. As late as the 1960s, mechanical devices, such as the Marchant calculator, still found widespread application in science and engineering. During the early days of electronic computing devices, there was much discussion about the relative merits of analog vs. digital computers. In fact, as late as the 1960s, analog computers were routinely used to solve systems of finite difference equations arising in oil reservoir modeling. In the end, digital computing devices proved to have the power, economics and scalability necessary to deal with large-scale computations. Digital computers now dominate the computing world in all areas, ranging from the hand calculator to the supercomputer, and are pervasive throughout society. Therefore, this brief sketch of the development of scientific computing is limited to the area of digital, electronic computers.

The evolution of digital computing is often divided into generations. Each generation is characterized by dramatic improvements over the previous generation in the technology used to build computers, the internal organization of computer systems, and programming languages. Although not usually associated with computer generations, there has been a steady improvement in algorithms, including algorithms used in computational science. The following history has been organized using these widely recognized generations as mileposts.

The Mechanical Era (1623-1945)
First Generation Electronic Computers (1937-1953)
Second Generation (1954-1962)
Third Generation (1963-1972)
Fourth Generation (1972-1984)
Fifth Generation (1984-1990)
Sixth Generation (1990 - Beyond)
