No. 1059:
Inventing the Computer

Today, let's watch the microcomputer invent itself. The University of Houston's College of Engineering presents this series about the machines that make our civilization run, and the people whose ingenuity created them.

That extension of your brain, the computer on your desk, is changing human history as dramatically as harnessing fire once did. The programmable computer, first conceived by Charles Babbage in the 1830s, wasn't finally built until the 1930s. At first, we used fragile radio tubes in its logic circuits. Soon after WW-II, we figured out how to replace those bulky and failure-prone tubes with the new transistors. Then the real fun could begin.

Those computers were huge, isolated machines. In 1943 Thomas Watson, chairman of IBM, said, "I think there's a world market for maybe five computers." With that kind of thinking, no one paid much attention in 1952 when a British scientist named Dummer wrote:

It seems now possible to envisage electronic equipment in a solid block with no connecting wires. The block may consist of layers of insulating, conducting, rectifying and amplifying materials, [and] electrical junctions.

The subtle meaning of that remark came clear as computers grew more complex. When an electronic element in a computer had, say, one chance in ten thousand of failing during a day's use, and the computer had ten thousand elements, maintenance became a nightmare.
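To see why, a little arithmetic helps (a rough sketch, assuming each element fails independently of the others):

\[
P(\text{at least one failure in a day}) = 1 - \left(1 - \tfrac{1}{10\,000}\right)^{10\,000} \approx 1 - e^{-1} \approx 0.63
\]

On average one element fails every day, and roughly two days in three bring a breakdown somewhere in the machine.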

Dummer's idea of casting a set of electronic functions into one monolithic electronic element stood to vastly reduce the rate of failures. In the summer of 1958, Jack Kilby of Texas Instruments finally created such an integrated circuit. A few months later, Robert Noyce, head of Fairchild Semiconductor Corporation, independently created a slightly better version. And a patent war was underway.

After dumping money into the courts for years, Fairchild and TI saw how foolish combat was. They agreed to forget the lawyers and share the idea. Kilby and Noyce acknowledged one another's contributions, and life went on. It was a very wise thing to do.

By 1969 both Fairchild and TI were racing to put complete central processing units on single chips. Then Noyce formed a new company, INTEL, for INTegrated ELectronics, and in 1971 Intel produced the first commercial single-chip processor, the 4004 microprocessor. Costs plunged, but we still didn't see where all this was going.

In 1977 Ken Olsen, president of the Digital Equipment Corporation, could still say, "There's no reason people would want computers in their homes." Then new kinds of software made it possible for you and me to use our computers without writing their programs. And computers promptly did enter our homes -- and the closest quarters of our daily lives, as well.

So the computer was the fruit of a wisdom formed neither by the industry, nor the inventor, nor the consumer alone. It is a wisdom that rises out of all three -- in concert with the machine itself.

I'm John Lienhard, at the University of Houston, where we're interested in the way inventive minds work.

(Theme music)


Reid, T.R., The Chip: How Two Americans Invented the Microchip and Launched a Revolution. New York: Simon and Schuster, 1984.

Malone, M.S., The Microprocessor: A Biography. New York: Springer-Verlag/TELOS, 1995.

I am grateful to Jeffery Scoggins of The Detering Book Gallery in Houston for flagging the Malone source for me.