This week in history: The integrated circuit is patented

Robert Noyce's design for an integrated circuit helped usher in the modern era of computing, and the invention was later recognised with a Nobel Prize in Physics.


Robert Noyce and a Nobel-winning idea

In the early days of computing, one of the biggest problems was the sheer size of the machines. This was largely due to the bulky vacuum tubes used to control the flow of electricity through each circuit, which meant that even the simplest computers could fill an entire room.

The invention of the transistor at Bell Labs in 1947 solved part of the problem, as transistors were far smaller and more reliable than vacuum tubes. However, even with this advance, engineers quickly found that the size of the transistors, as well as the wires and other parts of the circuits, still limited the complexity of the circuits they could build.


In 1958, Jack Kilby at Texas Instruments and Robert Noyce of Fairchild Semiconductor independently came up with the idea of the 'integrated circuit'. The idea was to produce the whole circuit, including the transistors, on a single block of semiconductor material (now known as a computer chip).

Kilby was the first to come up with the idea, but Noyce's design, which used silicon rather than germanium, was seen as more efficient and better suited to mass production. Kilby was awarded a share of the Nobel Prize in Physics in 2000 for the invention; Noyce, who had died in 1990, could not be honoured alongside him, as the prize is not awarded posthumously.

Since the development of the integrated circuit, computer companies have focused on increasing the number of transistors on each chip. One of the top chip manufacturers is Intel, which Noyce co-founded with Gordon E Moore in 1968.

Moore is known for the observation that the number of transistors on a single chip tends to double every two years. This rule of thumb, known as 'Moore's Law', has generally held true since 1971, resulting in a huge increase in computing speed and power over time.
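To see the arithmetic behind that claim, the short Python sketch below projects transistor counts under an idealised two-year doubling, starting from the roughly 2,300 transistors of Intel's first microprocessor, the 4004, released in 1971. This is an illustration of the compounding, not a model of any real product line; actual chips only loosely track the curve.

# A minimal sketch of Moore's Law arithmetic: transistor counts double
# every two years, starting from the ~2,300 transistors of Intel's
# 4004 chip (1971). Real chips deviate from this idealised curve.

def projected_transistors(year, base_year=1971, base_count=2_300):
    """Idealised Moore's Law projection: one doubling per two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

Run over five decades, the projection climbs from thousands of transistors to tens of billions, which is why a steady-sounding "doubling every two years" translates into such a dramatic increase in computing power.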

Dr Matthew Partridge
MoneyWeek Shares editor