This week in history: The integrated circuit is patented
Robert Noyce's design for an integrated circuit helped usher in the modern era of computers; the invention later earned fellow pioneer Jack Kilby a Nobel Prize in Physics.
In the early days of computing, one of the biggest problems was the huge size of the machines. This was largely due to the bulky vacuum tubes that switched the current flowing through each of the computer's circuits. As a result, even the simplest computers could take up a whole room.
A partial solution arrived in 1947 with the invention of the transistor, which was smaller and more reliable than the vacuum tube and drastically reduced the size of computers.
However, even with this advance, engineers quickly found that the size of the transistors, along with the wires and other components, still limited the complexity of the circuits they could build.
In 1958, Jack Kilby at Texas Instruments and, shortly afterwards, Robert Noyce of Fairchild Semiconductor independently came up with the idea of the "integrated circuit": producing the whole circuit, including the transistors, on a single block of semiconductor material (known as a computer chip).
Kilby was first to come up with the idea, but Noyce's design, which used silicon rather than germanium, was seen as more efficient and better suited to mass production. Kilby was awarded a share of the Nobel Prize in Physics in 2000 for his part in the invention; Noyce, who died in 1990, could not be honoured, as the prize is not awarded posthumously.
Since the development of the integrated circuit, computer companies have focused on increasing the number of transistors on each chip. One of the top chip manufacturers is Intel, which Noyce co-founded with Gordon E Moore in 1968.
Moore is known for the observation that the number of transistors on a single chip tends to double every two years. Experts agree that "Moore's Law" has generally held true since 1971, resulting in a huge increase in computer speed and power over time.
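To get a feel for what doubling every two years implies, here is a rough back-of-the-envelope sketch in Python. It assumes the Intel 4004's roughly 2,300 transistors as the 1971 baseline; the function name and figures are illustrative assumptions, not precise engineering data.

```python
# Back-of-the-envelope illustration of Moore's Law: transistor counts
# doubling every two years from a 1971 baseline. The ~2,300-transistor
# figure for the Intel 4004 is approximate; all outputs are indicative.

BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300  # approximate transistor count of the Intel 4004


def estimate_transistors(year: int) -> float:
    """Estimate transistors per chip, assuming a doubling every two years."""
    doublings = (year - BASELINE_YEAR) / 2
    return BASELINE_TRANSISTORS * 2 ** doublings


if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{estimate_transistors(year):,.0f} transistors per chip")
```

By this crude rule of thumb, the 25 doublings between 1971 and 2021 would imply chips with tens of billions of transistors, which is broadly the order of magnitude of today's largest processors.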
Matthew graduated from the University of Durham in 2004; he then gained an MSc, followed by a PhD at the London School of Economics.
He has previously written for a wide range of publications, including the Guardian and the Economist, and also helped to run a newsletter on terrorism. He has spent time at Lehman Brothers, Citigroup and the consultancy Lombard Street Research.
Matthew is the author of Superinvestors: Lessons from the greatest investors in history, published by Harriman House, which has been translated into several languages. His second book, Investing Explained: The Accessible Guide to Building an Investment Portfolio, is published by Kogan Page.
As senior writer, he writes the shares and politics & economics pages, as well as the weekly Blowing It and Great Frauds in History columns. He also writes a fortnightly reviews page and trading tips, as well as regular cover stories and multi-page investment focus features.