Moore’s Not Enough: 4 New Laws of Computing

Moore’s and Metcalfe’s conjectures are taught in classrooms every day—these four deserve consideration, too

I teach technology and information-systems courses at Northeastern University, in Boston. The two most popular laws that we teach there—and, one presumes, in most other academic departments that offer these subjects—are Moore’s Law and Metcalfe’s Law. Moore’s Law, as everyone by now knows, predicts that the number of transistors on a chip will double every two years. One of the practical values of Intel cofounder Gordon Moore’s legendary law is that it enables managers and professionals to determine how long they should keep their computers. It also helps software developers to anticipate, broadly speaking, how much bigger their software releases should be.
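
Stated as a formula, Moore’s observation implies exponential growth: a chip with N transistors today should have roughly N × 2^(t/2) transistors t years from now. Here is a minimal Python sketch of that projection; the 10-billion-transistor baseline is an illustrative figure of my own, not a sourced datum.

```python
# Moore's Law as stated above: transistor counts double roughly every
# two years. The baseline count below is illustrative, not a sourced figure.
def projected_transistors(base_count: float, years: float,
                          doubling_period: float = 2.0) -> float:
    """Project a chip's transistor count `years` into the future."""
    return base_count * 2 ** (years / doubling_period)

# A hypothetical 10-billion-transistor chip, eight years out: 16x growth.
print(f"{projected_transistors(10e9, 8):.3e}")  # -> 1.600e+11
```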

Metcalfe’s Law is similar to Moore’s Law in that it, too, enables one to predict the direction of growth for a phenomenon. Robert Metcalfe, co-inventor of Ethernet and a pioneering innovator in the early days of the Internet, postulated that the value of a network grows in proportion to the square of the number of its users. One limitation of this law is that a network’s value is difficult to quantify. Furthermore, it is not clear that every network’s value actually grows quadratically. Nevertheless, this law, like Moore’s Law, remains a centerpiece of both the IT industry and academic computer-science research. Both laws provide tremendous power to explain and predict the behavior of seemingly incomprehensible systems and phenomena in the sometimes inscrutable world of information technology.
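
To make the quadratic relationship concrete, here is a toy calculation in the spirit of Metcalfe’s formulation; the constant k stands in for whatever per-connection value one assumes, which the law itself does not specify.

```python
# Metcalfe's Law as described above: network value grows with the square
# of the user count. The constant k is an assumed per-connection value.
def metcalfe_value(users: int, k: float = 1.0) -> float:
    return k * users ** 2

for n in (10, 100, 1000):
    print(n, metcalfe_value(n))  # 10x the users -> 100x the value
```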

I contend, moreover, that there are still other regularities in the field of computing that could also be formulated in a fashion similar to that of Moore’s and Metcalfe’s relationships. I would like to propose four such laws.

Law 1. Yule’s Law of Complementarity

I named this law after George Udny Yule, the statistician who in 1912 proposed the seminal equation for explaining the relationship between two attributes. I formulate this law as follows:

If two attributes or products are complements, the value/demand of one of the complements will be inversely related to the price of the other complement.

In other words, if the price of one complement is reduced, the demand for the other will increase. There are a few historical examples of this law. One of the most famous is the marketing of razor blades. The legendary King Camp Gillette gained market domination by applying this rule: He reduced the price of razors, and the demand for razor blades increased. The history of IT contains numerous examples of this phenomenon, too.
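
As a rough illustration of the law’s logic, consider a toy demand model in which demand for blades falls linearly as the price of razors rises. The functional form and the constants are my own simplifying assumptions, chosen only to make the inverse relationship visible.

```python
# A toy model of Yule's Law of Complementarity: demand for one complement
# (blades) moves inversely with the price of the other (razors). The linear
# form and the constants a and b are illustrative assumptions.
def blade_demand(razor_price: float, a: float = 1000.0, b: float = 50.0) -> float:
    """Demand for the complement declines as the other good's price rises."""
    return max(a - b * razor_price, 0.0)

for price in (1.0, 5.0, 10.0):
    print(f"razor ${price:.2f} -> blade demand {blade_demand(price):.0f}")
```

In this toy model, cutting the razor price from $10 to $1 nearly doubles blade demand, which is exactly the pattern Gillette exploited.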

The case of the Atari 2600 is one notable example. Atari video games consisted of the console system hardware and the read-only memory cartridges that contained a game’s software. When the product was released, Atari Inc. marketed three products, namely the Atari Video Computer System (VCS) hardware and the two games that it had created, the arcade shooter game Jet Fighter and Tank, a heavy-artillery combat title involving, not surprisingly, tanks.

Crucially, Atari engineers decided that they would use a general-purpose microprocessor for the VCS instead of a custom chip. They also made sure that any programmer hoping to create a new game for the VCS would be able to access and use all the inner workings of the system’s hardware. And that was exactly what happened. In other words, the designers reduced the barriers and the cost necessary for others to develop VCS game cartridges. More than 200 such games have since been developed for the VCS—helping to spawn the sprawling US $170 billion global video game industry of today.

A similar law of complementarity governs computer printers: The lower the price of a printer, the higher the demand for that printer’s ink cartridges. Managing complementary components well was also crucial to Apple’s winning the MP3 player wars of the early 2000s, with its now-iconic iPod.

From a strategic point of view, technology firms ultimately need to know which complementary element of their product to sell at a low price—and which complement to sell at a higher price. And, as the economist Bharat Anand points out in his celebrated 2016 book The Content Trap, proprietary complements tend to be more profitable than nonproprietary ones.

Law 2. Hoff’s Law of Scalability

This law is named after Marcian Edward (Ted) Hoff Jr.—the engineer who convinced the CEO of Intel to apply the law of scalability to the design and development of processors. The phenomenon of scalability was, of course, well known in the automobile industry before it made a significant impact on computing. Henry Ford’s company was perhaps the first to apply this law on a grand scale. Ford produced the Model T, the first mass-produced car, and at the core of his achievement was the design of an automobile made for mass production. Ford’s engineers broke down the assembly process of the Model T into 84 discrete steps. The company standardized all the tasks and assigned each worker just one of them, thus standardizing the work each worker performed as well. Ford further built machines that could stamp out parts automatically. Together with Ford’s innovative development of the first moving assembly line, this production system cut the time to build a car from 12 hours to about 1.5 hours. The Model T is probably the paradigmatic example of how standardization enables designing processes for scalability.

Intel also mastered the law of scalability early in its history. In 1969, Busicom, a Japanese company, approached Intel about building custom chips for use in its programmable calculators. Gordon Moore was not interested in a custom chip because he knew that it would not be scalable. It was the quest to create a scalable product that led Intel’s Ted Hoff to partition the design into a general-purpose logic processor chip and a separate read-only memory (ROM) chip that stored the application program. As Albert Yu shows in his history of Intel, Creating the Digital Future, the fledgling semiconductor company’s resulting general-purpose processor, the 4004, was scalable and pretty much bequeathed the world the hardware architecture of the modern computer. Hoff’s Law of Scalability could thus be described as follows:

The potential for scalability of a technology product is inversely proportional to its degree of customization and directly proportional to its degree of standardization.

In sum, the law predicts that a technology component or process with a high degree of customization and/or a low degree of standardization will be a poor candidate for scaling.
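
One way to make this proportionality concrete is to score a component’s standardization and customization on a common scale and take their ratio. The 0-to-1 scoring scale and the simple ratio below are illustrative assumptions of mine, not part of Hoff’s formulation.

```python
# A toy quantification of Hoff's Law of Scalability: scalability rises with
# standardization and falls with customization. The 0-1 scores and the
# ratio itself are illustrative assumptions.
def scalability_score(standardization: float, customization: float) -> float:
    """Both inputs on a 0-1 scale; a higher result suggests easier scaling."""
    return standardization / max(customization, 1e-9)

print(round(scalability_score(0.9, 0.1), 2))  # standardized part: 9.0
print(round(scalability_score(0.2, 0.8), 2))  # bespoke custom part: 0.25
```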

Law 3. Evans’s Law of Modularity

This law derives its name from Bob Overton Evans, the engineer who in the early 1960s persuaded IBM’s chairman, Thomas J. Watson Jr., to discontinue IBM’s technology design approach, which had produced a hodgepodge of incompatible computers. Evans advocated that IBM should instead embark on the development of a family of modular computers that would share peripherals, instructions, and common interfaces. IBM’s first product family under this new design rubric was called System/360.

Prior to this era, IBM and other mainframe computer manufacturers produced systems that were unique: Each system had its own distinct operating system, processor, peripherals, and application software. After the purchase of a new IBM computer, customers had to rewrite all their existing code. Evans convinced Watson that a line of computers should be designed to share many of the same instructions and interfaces.

This new approach of modular design meant that IBM’s engineers developed a common architecture (a specification of which functions and modules would be part of the system), common interfaces (a description of how the modules would interact, fit together, connect, and communicate), and common standards (a definition of the shared rules and methods that would be used to achieve common functions and tasks). This bold move on Big Blue’s part created a new family of computers that revolutionized the computer industry. Customers could now protect their investments because the instructions, software, and peripherals were reusable and compatible within each computer family.

Evans’s Law could be formulated as follows:

The inflexibilities, incompatibilities, and rigidities of complex and/or monolithically structured technologies can be reduced through the modularization of the technology’s structures (and processes).

This law predicts that the application of modularization will reduce incompatibilities and complexities.
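
In software terms, the heart of Evans’s insight is the shared interface: once modules conform to a common contract, they can be swapped without rewriting the rest of the system. The sketch below illustrates the idea with hypothetical Printer and TapeDrive modules; it is a modern analogy of my own, not IBM’s actual System/360 design.

```python
# A minimal sketch of modularity via a common interface: any module that
# conforms to the Peripheral contract can be swapped in without changing
# the calling code. Printer and TapeDrive are hypothetical examples.
from abc import ABC, abstractmethod

class Peripheral(ABC):
    """Common interface shared across a product family."""
    @abstractmethod
    def write(self, data: bytes) -> None: ...

class Printer(Peripheral):
    def write(self, data: bytes) -> None:
        print(f"printing {len(data)} bytes")

class TapeDrive(Peripheral):
    def write(self, data: bytes) -> None:
        print(f"spooling {len(data)} bytes to tape")

def save_report(device: Peripheral, report: bytes) -> None:
    device.write(report)  # works unchanged for any conforming module

save_report(Printer(), b"quarterly totals")
save_report(TapeDrive(), b"quarterly totals")
```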

One further example of Evans’s Law can be seen in the software industry’s shift from the “waterfall” to the agile software development methodology. The former is a linear and sequential model stipulating that each project phase can begin only after the previous phase has ended. (The name comes from the fact that water flows in only one direction down a waterfall.) By contrast, the agile development approach applies the law of modularization to software design and the software development process. Agile development efforts tend to be more flexible, more responsive, and faster.

In other words, modularization of software projects and the development process makes such endeavors more efficient. As outlined in a helpful 2016 Harvard Business Review article, the preconditions for an agile methodology are as follows: The problem to be solved is complex; the solutions are initially unknown, with product requirements evolving; the work can be modularized; and close collaboration with end users is feasible.

Law 4. The Law of Digitiplication

The concept of digitiplication is derived from two concepts: digitalization and multiplication. The law stems from my own study and observations of what happens when a resource is digitized or a process is digitalized.

The law of digitiplication stipulates that whenever a resource or process is digitalized, its potential value grows in a multiplicative manner.

For example, if a paper is copied four times, one can now share the resource with five people. But digitize the document and the value-creation opportunities are multiplicative rather than additive.
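
A back-of-the-envelope comparison makes the contrast explicit. Treating each digital enhancement as an independent multiplier is my own simplifying assumption, and the multiplier values below are made up for illustration.

```python
# Additive value: a paper document copied four times reaches five readers.
paper_copies = 4
physical_reach = 1 + paper_copies  # 5

# Multiplicative value: each digital enhancement compounds the others.
# These multiplier values are illustrative assumptions.
digital_multipliers = {"online access": 100, "audio format": 2, "search": 3}
digital_reach = 1
for factor in digital_multipliers.values():
    digital_reach *= factor

print(physical_reach)  # 5   (additive)
print(digital_reach)   # 600 (multiplicative)
```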

Consider the example of a retail store. The store’s sales reps, tasked with selling physical products to individual people, are able to serve only one customer at a time. However, if the same retail environment is placed online, many customers can view the store’s products and services simultaneously. Digital text can also easily be transformed into an audio format, providing a different kind of value to customers. Search functionality within the store’s inventory of course adds another layer of value for the customer. The store’s managers can also monitor how many customers are viewing each page of the store’s website, and for how long. As these examples show, the digitalization of a resource, asset, or process creates multiplicative rather than additive value.

As a further example, Amazon founder Jeff Bezos first began digitizing data about books as a way to facilitate more and greater book sales online. Bezos quickly transformed Amazon into a digitiplication engine by making it a data-centric e-commerce company. The company now benefits from the multiplicative effects of digitalized processes and digitized information. Amazon’s search, selection, and purchase functions allow the company to record and produce data that can be leveraged to predict what a customer wants to buy—and thus to select which products it should show to customers. The digitization of customer feedback, seller ratings, and seller feedback creates its own dimension of multiplicative value.

Conclusion

These four laws can help engineers and designers pose useful questions as they begin to develop a product. For example: Do customer requirements lend themselves to a product design that could be scaled (or mass-produced)? Might the functional requirements they’re working with be satisfied through a modular product design? Could Yule’s Law of Complementarity provide cues toward mass-production or modular design alternatives? Should product complements be developed in-house or outsourced? Software engineers might likewise be led to productive questions about how data could be digitized, or how specific processes could be digitalized, to leverage the law of digitiplication.

The fields of IT and electrical engineering and computer science (EECS) have become critical disciplines of the digital age. To pass the most succinct and relevant formulations of accumulated knowledge along to the next generation, it’s incumbent on academics and thought leaders in these essential technical fields to translate lessons learned into more formalized sets of theorems and laws. Such formulations would, I hope, enable current and future generations of IT and EECS professionals to develop the most useful, relevant, impactful, and sometimes even disruptive technologies. I hope the four laws proposed in this article help trigger a larger discussion about the need for, and relevance of, new laws for our disciplines.

The Conversation (7)
Jeremy Chabot, 04 Feb 2022

I'm not sure the modding community would at all agree with the formulation of this 'law of scalability'.

Extensible platforms definitely need careful standardization to be successful but I would argue their entire premise is that they break this 'law of scalability'.

The most famous example would be Minecraft, however there are many other heavily modded platforms which are far more customizable than Minecraft even down at a systems level. For example Warcraft III, Civilization IV, Starcraft 2 in that order.

Duncan Walker, 05 Feb 2022

Law 3 most overlaps with Dave Parnas' Information Hiding Principle, used throughout software development. In System/360, it was used in the computer architecture and the hardware interfaces, to hide the wide variation in system implementations.

Rochish Manda, 21 Feb 2022

In economics -

Law 1 could be translated as 'price elasticity of supply'.

Law 2 could be inferred from 'economies of scale'.

Law 3,4 could determine the slope of the 'yield curve', and hence ways to mitigate long-term recession/losses.

Nonetheless, a nice article with real-life examples.