We’d all like more battery life from our phones, but the situation could have been much worse without John Hennessy and David Patterson. In the 1980s, as mainframes were yielding ground to microcomputers, the two computer scientists invented a new way for software to talk to the CPU, allowing chips to run much faster and use less power. The pair were formally recognized for that work today, sharing the $1 million annual ACM A.M. Turing Award, often billed as the “Nobel Prize of Computing.”
Years after their pioneering research, their invention, the reduced instruction set computer (aka RISC), became the standard for virtually all mobile device processors. (They were also honored for writing the influential textbook Computer Architecture: A Quantitative Approach.) Before Hennessy and Patterson, chips were designed to understand many complex instructions, which made life easier for programmers but also cost time and energy. Hennessy and Patterson’s RISC approach pared each instruction down to something a processor could execute as efficiently as possible. “You needed maybe 20% more instructions, but you could run each instruction four times faster. That’s a great trade-off,” says Patterson, who went on to chair the UC Berkeley Computer Science Division. (Hennessy later became president of Stanford University.)
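Patterson’s trade-off can be sanity-checked with simple arithmetic. A rough sketch, using only the illustrative figures from his quote (not benchmark data), of what the net speedup works out to:

```python
# Back-of-the-envelope check of Patterson's RISC trade-off.
# Numbers come from his quote and are illustrative, not measured:
# ~20% more instructions, each running ~4x faster.

def net_speedup(instruction_ratio: float, per_instruction_speedup: float) -> float:
    """Overall speedup = old time / new time.

    If the new design needs `instruction_ratio` times as many instructions,
    but each one is `per_instruction_speedup` times faster, the program's
    run time shrinks by their quotient.
    """
    return per_instruction_speedup / instruction_ratio

print(round(net_speedup(1.2, 4.0), 2))  # ~3.33x overall
```

Even after paying the 20% instruction-count penalty, the program as a whole runs roughly 3.3 times faster, which is why Patterson calls it a great trade-off.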
RISC first took on the server market, in chips like Sun Microsystems’ SPARC and IBM’s PowerPC, but Intel, which didn’t use RISC, virtually wiped them out with the brute computing force of its conventional processors. The tables turned in this century, when smartphones and other tiny battery-powered devices became indispensable. Major chipmakers like Motorola, Samsung, Qualcomm, and Apple picked up ARM’s RISC technology, and it now dominates the mobile industry.