Professor Jan Rabaey of UC Berkeley gave the keynote address, “Scaling the Power Wall: Revisiting Low-Power Design Rules,” at the International Symposium on System-on-Chip in Tampere, Finland. He noted that power consumption has been addressed as an issue in the past, and that two solutions were proposed: allowing power-supply voltage to be a design variable, and using new low-power design techniques. Supply voltages have indeed been reduced dramatically, and most designs now run below 1 V. However, Rabaey suggests that we have made only spotty progress with new low-power techniques such as matching computation to architecture, preserving data locality, exploiting signal statistics, and supplying energy on demand.

Instead of matching computation to architecture, most chip designers fit computation problems to fixed architectures. Global data buses are used inappropriately to move data around the chip, and signal statistics have not been exploited at all. The most widely used technique has been supplying energy to circuits on demand, mostly through clock gating; it has succeeded because automated EDA tools are available to do the job. The remaining techniques require manual intervention and are not automated, hence their lack of adoption. Moreover, we have missed the importance of standby power: supply voltages now approach transistor threshold voltages, and as a result, large numbers of leaky transistors inhabit big nanometer chip designs.
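The appeal of clock gating can be seen in the standard first-order model of dynamic CMOS power, P = αCV²f: gating the clock to an idle block drives its switching activity α to zero, so its dynamic power vanishes (leakage remains). The sketch below is a toy model with made-up numbers, purely for illustration; the capacitance, voltage, and frequency values are assumptions, not figures from the talk.

```python
# Toy model of dynamic CMOS power: P = alpha * C * V^2 * f,
# where alpha is the switching-activity factor.
def dynamic_power(alpha, c_farads, v_volts, f_hz):
    return alpha * c_farads * v_volts**2 * f_hz

# Hypothetical block parameters (illustrative only).
C = 1e-9   # 1 nF effective switched capacitance
V = 1.0    # 1 V supply
F = 500e6  # 500 MHz clock

busy = dynamic_power(0.20, C, V, F)          # block actively computing
idle_ungated = dynamic_power(0.05, C, V, F)  # idle, but clock still toggles registers
idle_gated = dynamic_power(0.0, C, V, F)     # clock gated off: no switching at all

print(busy, idle_ungated, idle_gated)  # gated idle power drops to zero (dynamic only)
```

Even the modest residual activity of an ungated idle clock tree wastes power continuously, which is why EDA tools that insert clock gates automatically have been so widely adopted.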
There is some good news: EDA companies are starting to form a low-power design methodology, driven by the fact that power is now the dominant design constraint. For example, data-center costs for Web-centric companies such as Google are now dominated not by equipment or plant costs but by energy costs. At the small end of the design spectrum, mobile devices are completely defined by their available power budgets. In the past, technology scaling was a solution, but according to Rabaey it will no longer help, because of silicon’s fundamental limits. Although active power density continues to be somewhat limiting, leakage power, which is growing at the same rate as computational power, is the killer. At the same time, technology scaling is actually causing more design pain in the form of process variability, which drives chip designers to adopt wider design margins or face lower manufacturing yields. Smaller circuits are also subject to more soft errors from many causes.
Rabaey suggests the following techniques for a successful electronic design:
- Concurrency galore
- Always-optimal design (no energy waste, ever)
- Better-than-worst-case design (accommodate computational and memory failures)
- Ultra-low supply voltages
- “Exploration of the unknown”
Concurrency is a good idea because it drives clock rates down, which saves energy by allowing a further reduction of supply voltage. Always-optimal design attempts to optimize at design time, at run time, and during sleep modes; this approach requires additional circuitry. Better-than-worst-case design (also called aggressive deployment) moves design away from “design corners,” which are essentially worst-case design, toward circuits that operate in statistical operating regions where some errors are tolerated; this approach requires error-detection and error-correction circuits.

Ultra-low-voltage design recognizes that the true operating limit for a MOSFET is actually about 35 mV. Rabaey suggests that we can more closely approach this limit by rethinking all of digital logic, and he cited stacked transistors and logic design based on transmission gates as likely areas for productive research. Exploring the unknown employs radically new architectures that might array millions of small processors on a chip in collaborative networks. Experiments with search and recognition algorithms suggest that such imprecise networks of estimating processors might produce excellent results with very low energy consumption.
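The concurrency argument can be made concrete with the usual first-order rule that energy per operation scales as E = CV². A rough worked example, with assumed numbers (the 0.7 V figure is a hypothetical voltage reached by halving the clock, not a value from the talk): two parallel units at half the clock rate keep throughput constant while the lower supply roughly halves energy per operation. The same sketch also checks the roughly 35 mV figure, which matches the commonly cited minimum supply at which a CMOS inverter still has gain greater than one, about 2·ln(2)·kT/q at room temperature.

```python
import math

# First-order CMOS energy per operation: E = C * V^2.
def energy_per_op(c_farads, v_volts):
    return c_farads * v_volts**2

C = 1e-12  # 1 pF switched per operation (assumed, for illustration)

# One unit at full clock rate and 1.0 V supply:
e_serial = energy_per_op(C, 1.0)

# Two units in parallel, each at half the clock rate; assume the relaxed
# timing lets the supply drop to 0.7 V while total throughput is unchanged.
e_parallel = energy_per_op(C, 0.7)

ratio = e_parallel / e_serial  # ~0.49: roughly half the energy per operation

# The ~35 mV floor Rabaey mentions matches 2 * ln(2) * kT/q at 300 K:
kT_over_q = 0.02585                    # thermal voltage at 300 K, in volts
v_min = 2 * math.log(2) * kT_over_q    # ~0.036 V
print(ratio, v_min)
```

The quadratic dependence on V is what makes parallelism pay off: frequency drops only linearly with the extra hardware, but the voltage reduction it enables cuts energy per operation quadratically.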