Remembering the Legacy of Trailblazing Technologist Gordon Moore

The Intel cofounder charted the growth of transistors

Intel cofounder Gordon E. Moore. Photo: Intel

Intel cofounder Gordon E. Moore, the man behind Moore’s Law, died on 24 March at the age of 94.

The IEEE Fellow was awarded the 2008 IEEE Medal of Honor for “pioneering technical roles in integrated-circuit processing, and leadership in the development of MOS memory, the microprocessor computer, and the semiconductor industry.”

Moore founded Intel in 1968 with computing pioneer Robert Noyce. Through their semiconductor work, Moore, Noyce, and other Intel engineers are credited with bringing laptop computers and countless other electronics to millions of people. Intel microprocessors now power personal computers from major manufacturers including Dell, HP, and Lenovo.

Moore is best known for his 1965 prediction, which would become known as Moore’s Law: the observation that the number of transistors on an integrated circuit would grow exponentially while the retail cost of computers would decrease.

His original hypothesis, published in a 1965 Electronics magazine article, was that the number of transistors on a chip would double each year. The projection held over the decade that followed. In 1975 he revised the forecast, predicting that transistor counts would double roughly every two years, a pace that held for several decades. Moore’s Law set the bar for semiconductor manufacturers and still drives computing innovation today.
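
In modern terms, the prediction is a simple exponential: a transistor count N0 grows to N0 · 2^(t/T) after t years, where T is the doubling period. The sketch below is only an illustration, not something from the article; the Intel 4004 starting point of roughly 2,300 transistors in 1971 and the two-year doubling period are assumptions used for the example.

```python
# Illustrative model of Moore's Law as exponential doubling.
# The 4004 starting point (~2,300 transistors, 1971) and the two-year
# doubling period are assumptions used here only for illustration.

def projected_transistors(year, base_year=1971, base_count=2_300,
                          doubling_period=2.0):
    """Project a transistor count, assuming one doubling every `doubling_period` years."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(year):,.0f}")
```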

“Gordon Moore, with his prediction that turned to law, captured the very gestalt of the semiconductor industry as an exponential ambition,” says IEEE Fellow Aart de Geus, CEO of Synopsys. “He became not only a visionary but also our coach, pushing us to build the impossible. Now, 58 years later, classic Moore’s Law has morphed into SysMoore—systemic complexity with a Moore’s Law ambition. His legacy fuels our aspirations and inspirations to further decades of exponential impact.

“Gordon, thank you for being the motivating coach in our field and on my own professional path!”

From researcher to entrepreneur

Moore received a bachelor’s degree in chemistry in 1950 from the University of California, Berkeley. After earning his Ph.D., also in chemistry, in 1954 from Caltech, he began his career as a researcher in the Applied Physics Laboratory at Johns Hopkins University, in Baltimore.

After two years, he moved back to California and joined Shockley Semiconductor Laboratory, a division of Beckman Instruments founded by William Shockley after he left Bell Labs to develop an inexpensive silicon transistor. Unhappy with Shockley’s leadership, Moore, Noyce, and six other Shockley associates left the company on the same day in 1957. Dubbed the “traitorous eight,” they formed Fairchild Semiconductor, a division of Fairchild Camera and Instrument in Sunnyvale, Calif. The company became a pioneer in the manufacturing of transistors and ICs.

The cornerstone of Silicon Valley

Moore [right], along with Andy Grove [left] and Robert Noyce, founded Intel in 1968. Photo: Intel

Moore and Noyce decided in 1968 to leave Fairchild and start their own company dedicated to semiconductor memory. The two engineers, along with Andrew Grove, an IC engineer and former assistant director of development at Fairchild, founded Integrated Electronics (later shortened to Intel). Moore served as the company’s executive vice president.

The founders experimented with silicon-gate metal-oxide-semiconductor (MOS) technology, depositing aluminum wiring to connect multiple transistors on the surface of a thumbnail-size piece of silicon. The process was key to the development of ever-smaller electronic circuitry that could run at increasingly higher speeds.

Intel’s first product, the 3101 64-bit SRAM, was released in 1969. It was nearly twice as fast as existing memory products from competitors, including Fairchild and the Electrotechnical Laboratory of Tsukuba, Japan. The 1103 DRAM followed in 1970 and by 1972 had become the world’s best-selling semiconductor memory chip.

The company created the first commercially available microprocessor, the 4004, in 1971. It miniaturized the central processing unit, enabling small electronics to perform calculations that only large machines had been capable of doing.

Moore served as Intel’s president from 1975 to 1979, and then became CEO and chairman of the board. In the early 1980s, inspired by the success of the 4004, Moore decided to shift the company’s focus from memory chips to microprocessors.

Intel supplied microprocessors to several companies, including IBM, helping the chipmaker capitalize on the rapidly growing PC market and ushering in a decade of unprecedented growth.

Moore stepped down as CEO in 1987 but remained chairman until he retired in 1997. He served as chairman emeritus until 2006.

Under his leadership, Intel didn’t just fuel the growth of personal computing; it also provided the foundation of what became known as Silicon Valley, as detailed in his Washington Post obituary. Intel helped cement the region as a global center for technological innovation, the article says.

Moore’s Law: a self-fulfilling prophecy

In Moore’s now-famous article for Electronics, he predicted the trajectory of how powerful microchips would become over time, while costs to the consumer would continue to drop.

“At the time I wrote the article, I thought I was just showing a local trend,” he told IEEE Spectrum in 2015. “The integrated circuit was changing the economy of the whole [electronics] industry, and this was not yet generally recognized. So I wrote the article to try to get the point across: This is the way the industry is going to get things really cheap.”

His theory grew out of his observations of the planar transistor, designed in 1957 by Fairchild physicist Jean Hoerni, in which the oxide layer is left in place on the silicon wafer to protect the sensitive semiconductor material underneath.

“I noticed that the [number of components] had about doubled every year. And I just did a wild extrapolation, saying it’s going to continue to double every year for the next 10 years,” he told IEEE Spectrum.

Nearly 60 years later, his prediction is still driving the industry forward. As of December 2022, the largest transistor count on a commercial processor—Apple’s M1 Ultra chip—was 114 billion.
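
As a rough back-of-the-envelope check (the 4004’s count of roughly 2,300 transistors and the exact dates are assumptions added here for illustration, not figures from the article), going from about 2,300 transistors in 1971 to 114 billion in 2022 works out to approximately one doubling every two years:

```python
import math

# Rough sanity check with illustrative figures (not from the article):
# Intel 4004 (1971): ~2,300 transistors; Apple M1 Ultra (2022): ~114 billion.
base_year, base_count = 1971, 2_300
year, count = 2022, 114_000_000_000

doublings = math.log2(count / base_count)             # about 25.6 doublings
years_per_doubling = (year - base_year) / doublings   # about 2.0 years

print(f"{doublings:.1f} doublings, one every {years_per_doubling:.1f} years")
```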

Although Moore’s Law will inevitably slow and come to an end, Intel predicts that chip density will continue to increase to 3 trillion transistors by 2030.

A lasting legacy

Moore received several IEEE recognitions for his pioneering innovations. In addition to the 2008 Medal of Honor, he received the IEEE Computer Society’s 1978 Goode Memorial Award and its 1978 McDowell Award, and he and Noyce received its 1986 Computer Entrepreneur Award.

In 2002 Moore received a U.S. Presidential Medal of Freedom—the country’s highest civilian honor. He also was awarded a National Medal of Technology and Innovation in 1990.

Moore was a dedicated philanthropist who donated to charities devoted to environmental conservation, science, and improved health care. In 2000 he and Betty, his wife of 72 years, established the Gordon and Betty Moore Foundation, which has donated more than US $5.1 billion to charitable causes.

“Gordon Moore’s contributions to society went far beyond semiconductors and Moore’s Law,” says IEEE Member Siavash Alamouti, cofounder of computing company Mimik and the 2022 Marconi Prize recipient. “He was a champion for digital inclusion and supported our initiatives for affordable and open mobile Internet and many other impactful technologies with a direct impact on our lives. He will be sorely missed.”
