The Innovation Engine • Government-funded Academic Research
- Media: article
- Title: The Innovation Engine • Government-funded Academic Research
- Author: David Patterson
- Edited by: Communications of the ACM
David Patterson needs no introduction. He has been behind many of the inventions that have repeatedly shaped the computing industry over the past 40 years, so when he published an opinion article in Communications of the ACM addressing the current political waves in the USA, I could not resist choosing it for this review.
Patterson worked at a public university (the University of California, Berkeley) from 1976 to 2016, and in this article he argues that government-funded academic research (GoFAR) allows for faster, more effective, and freer development than private sector-funded research would, offering his own career milestones as an example of how the public money that went into his research has easily been amplified by a factor of 10,000:1 for the country’s economy, and 1,000:1 for the government in particular.
Patterson illustrates this by describing five of the “home-run” research projects he started and pursued with government funding, eventually spinning them off into successful startups:
- RISC (Reduced Instruction Set Computing): Microprocessor architecture that reduces the complexity and power consumption of CPUs, yielding much smaller and more efficient processors.
- RAID (Redundant Array of Inexpensive Disks): Patterson experimented with presenting an array of independent hard drives as if they were a single, larger one, gaining capacity and reliability beyond what the industry could offer in single drives, at a fraction of the price (see the sketch after this list).
- NOW (Network Of Workstations): Introduced what we now know as computer clusters (in contrast to the large-scale, cache-coherent, massively multiprocessed systems known as “supercomputers”), which nowadays power over 80% of the Top500 supercomputer list and are the computing platform of choice for practically all data centers.
- RAD Lab (Reliable Adaptive Distributed Systems Lab): Pursued technology for data centers to be self-healing and self-managing, testing and pushing early cloud-scalability limits.
- ParLab (Parallel Computing Lab): With massively parallel processing arriving even in simple microprocessors, this lab explored how to improve the design of parallel software and hardware, laying the groundwork that showed inherently parallel GPUs outperform CPUs at machine learning tasks. It also developed the RISC-V open instruction set architecture.
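To make the RAID idea more concrete, here is a minimal Python sketch of block striping with a parity block, purely my own illustration rather than Patterson's actual design; the disk count, block size, and function names are hypothetical.

```python
# Toy sketch of the RAID idea: stripe logical blocks across several small
# disks for capacity, and keep an XOR parity block per stripe so the data
# of any single failed disk can be reconstructed. All constants are made up.
from functools import reduce

NUM_DATA_DISKS = 4   # hypothetical: 4 data disks plus 1 parity disk
BLOCK_SIZE = 4096    # bytes per block (illustrative)

def locate(logical_block: int) -> tuple[int, int]:
    """Map a logical block number to (data disk index, block offset on that disk)."""
    return logical_block % NUM_DATA_DISKS, logical_block // NUM_DATA_DISKS

def parity(blocks: list[bytes]) -> bytes:
    """XOR the data blocks of one stripe; the result is stored on the parity disk."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def reconstruct(surviving_blocks: list[bytes], parity_block: bytes) -> bytes:
    """Recover the block of a single failed disk from the survivors plus parity."""
    return parity(surviving_blocks + [parity_block])

if __name__ == "__main__":
    stripe = [bytes([i]) * BLOCK_SIZE for i in range(NUM_DATA_DISKS)]
    p = parity(stripe)
    # Pretend disk 2 failed: rebuild its block from the other disks and parity.
    assert reconstruct(stripe[:2] + stripe[3:], p) == stripe[2]
    print("logical block 10 lives at (disk, offset):", locate(10))
```

The point of the toy example is only to show why an array of cheap drives can beat a single large one: striping multiplies capacity and bandwidth, while one extra parity disk restores the reliability lost by having more devices that can fail.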
Patterson identifies principles, shared by the projects he has led, that are especially compatible with the way research works in university systems: multidisciplinary teams, demonstrable usable artifacts, seven- to ten-year impact horizons, five-year sunset clauses (to create urgency and to lower opportunity costs), physical proximity of collaborators, and leadership measured by team success rather than individual recognition.
While it could be argued that Patterson’s work is an easy success story to point at, since he is far from the average academic, the points he makes about how GoFAR has been fundamental to the advance of science and technology, not only in computing but also in biology, medicine, and several other fields, are very clear.