Anyone old enough to associate the word "computer" with 1950s-era images of the original UNIVAC, with its 5200 tubes cooled by water drawn from a river, probably won't be shocked by the news that a computer could inflict a second-degree burn. Indeed, the fabled machine once failed spectacularly when a wayward fish obstructed the water's flow. Nevertheless, engineers, lulled by the ubiquitous hum of their workstations' fans, can be forgiven for thinking that the heat thrown off by a computer's innards is no longer a burning issue.
But it is. Chip designers, computer makers, assorted university researchers, and chip-packaging specialists are uniting to tackle one of the most urgent, but overlooked, of the litany of potential showstoppers looming for the global semiconductor industry: the soaring densities of heat on integrated circuits, particularly high-performance microprocessors. Researchers are studying exotic new kinds of heat-conducting "goop" that suck the heat out of a chip and convey it to heat sinks, which radiate it into the air. Still, it is a measure of the seriousness of the problem that engineers are also pursuing concepts that have been considered too elaborate and far too expensive for such a mass-produced consumer product as a personal computer. Possibilities on the horizon include tiny, self-contained evaporative cooling systems and even devices that capture the heat and turn it directly into electricity.
What has led researchers to such measures? Basic physics: virtually all the power that flows into a chip comes out of it as waste heat. Today's standard-issue Pentium 4 throws off 100 watts, the same as the bulb in a child's Easy-Bake Oven and, as the hapless Swede learned, more than enough to cook meat. Divide by the area and you get a heat flux of around 30 watts per square centimeter--a power density several times higher than that of a kitchen hot plate.
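For the curious, here is a quick back-of-the-envelope check of that figure, in Python, using only the numbers already quoted; the roughly 3 square centimeters of die area is simply what 100 W and 30 W/cm2 imply, an assumption for illustration rather than a published specification:

    # Back-of-the-envelope heat-flux check (illustrative numbers only;
    # the die area is assumed, inferred from the article's 100 W and ~30 W/cm2).
    power_w = 100.0        # total power dissipated by the chip, in watts
    die_area_cm2 = 3.3     # assumed die area, in square centimeters
    heat_flux = power_w / die_area_cm2
    print(f"Heat flux: {heat_flux:.0f} W/cm2")   # roughly 30 W/cm2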
Addressing engineers at the 2001 IEEE International Solid-State Circuits Conference, Patrick P. Gelsinger, the then chief technology officer at Intel Corp., Santa Clara, Calif., said that if the trend toward ever more fiery chips were to continue unchecked, and surely it will not, microprocessors a decade from now would be pouring out as much power per square centimeter as the surface of the sun, some 10,000 W/cm2. "We need a fresh approach," Gelsinger concluded.
Heat Hurts Performance because transistors run faster when they're cool rather than hot. That's why power-mad "overclockers," in search of an additional 20–30 percent of switching speed, clap custom heat sinks and cryogenic refrigeration systems onto the microprocessors in their souped-up PCs. Heat, or rather repeated cycling from hot to cool, also shortens the life of the chip. One way it does this is by inducing mechanical stress that can literally tear a chip apart. "Typically, it's not the silicon but the package that fails," says Avram Bar-Cohen, an IEEE fellow and professor of mechanical engineering at the University of Maryland in College Park. But the silicon suffers, too. Hot copper and aluminum interconnects on the chip are also more susceptible to disintegration in a phenomenon called electromigration, a serious reliability issue. Supercomputer designers think nothing of adding chilled-water cooling and other refinements to their systems, but mass-market manufacturers have so far been unwilling to pay for such things. Garden-variety desktop computers today come with cooling equipment worth just US $3 to $5--basically a fan and a heat sink.
Computing has coasted on the fan and heat sink for quite some time. Indeed, for many in the electronics industry during much of the last decade, there was little urgency in the quest for new thermal management technology. That was thanks to the switch, in the 1980s, from ICs built using bipolar transistors to chips using today's technology, CMOS. CMOS set the clock back on the heat problem because, unlike transistors in bipolar technology, CMOS transistors draw power only when they switch from one state to another. "But by the late 1990s, we got to the same power dissipation levels we'd had with bipolars," says Bar-Cohen. "We had a 10-year free ride, using the technology we'd developed before. Now we need new ideas."
Perhaps the Biggest Bottleneck in air-cooling technology is getting the heat from the chip to the heat sink. Blocking the flow of heat are the interface between the chip itself and the lid of the chip package, if there is one, and the interface between the lid and the heat sink. Merely pressing the heat sink against the package lid won't do the trick, because microscopic roughness on both components makes for a joint full of air pockets, highly resistant to the flow of heat.
Historically, a common solution has been to fill one or both interfaces with solder, which is what the makers of power electronics systems still do. But this solution is not without its drawbacks. For one, you can't break a soldered connection without breaking the chip, which makes prototyping difficult. Even more troubling, a hard connection is liable to fail after a few thousand cycles of heat-induced expansion and contraction.
That's why most manufacturers resort to a thin layer of grease or "goop"--shorthand for thermal paste--as an interface that is soft enough to withstand expansion and contraction. Thermal paste consists of a bonding agent, say, mineral oil or epoxy, and a filler, such as silica or some more exotic substance. The filler does most of the job of conducting heat; the bonding agent holds it together and ensures that no microscopic air gaps remain between the chip and the lid or the lid and the heat sink. The problem is that the more filler you add to improve conductivity, the thicker the goop becomes, making it unable to fill all the gaps.
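To see why that tradeoff matters, consider a minimal one-dimensional conduction sketch, in Python; the 50-micrometer layer thickness and the conductivity of 3 watts per meter-kelvin are assumed, illustrative values, not the properties of any particular paste:

    # Temperature drop across a thin interface layer: delta_T = flux * thickness / conductivity.
    # All numbers are illustrative assumptions, not measured properties of any product.
    heat_flux_w_per_m2 = 30.0 * 1e4   # 30 W/cm2 expressed in W/m2
    thickness_m = 50e-6               # assumed 50-micrometer bond line of paste
    conductivity = 3.0                # assumed paste conductivity, W/(m*K)
    delta_t = heat_flux_w_per_m2 * thickness_m / conductivity
    print(f"Temperature drop across the paste: {delta_t:.1f} K")   # about 5 K

A thicker layer or a less conductive filler raises that temperature drop in direct proportion, which is why formulators keep chasing pastes that conduct well yet stay thin and compliant.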
The Bane Of The Thermal Engineer is the cost of cooling. Designers of laptops and PCs are under extreme pressure to keep costs down and are unwilling to spring for much more than a heat sink and a fan. But you don't hear the supercomputer guys complaining about the heat--their customers are happy to pay for exotic technologies.
Another form of cooling, evaporative (or phase-change) cooling, was implemented by Cray Inc., Seattle, in its X1 supercomputer, and today it is used in the SV2 model as well. The system, from Parker Hannifin Corp., Cleveland, the main supplier of jet-fuel delivery systems to the aviation industry, sprays a fluorocarbon fluid, made by 3M, St. Paul, Minn., that has a boiling point of 56 °C. "As the microscopic droplets boil off, the bubbles create nucleation points" for more bubbles to form, says Greg Pautsch, a thermal-packaging engineer at Cray. Result: even faster boiling, letting the system sweat off 45 W/cm2. Cray recently settled an intellectual property tiff over the technology with Isothermal Systems Research, Spokane, Wash.
A project being worked on at the Institute for Complex Engineered Systems at Carnegie Mellon University, in Pittsburgh, is a miniaturized evaporative system that Cristina Amon, the institute's director, hopes can eventually be produced for $20 to $30 per machine (compared with the $5 or so it costs to air-cool today's standard-issue PCs). Her project, funded by the U.S. Defense Advanced Research Projects Agency (DARPA), uses microelectromechanical systems (MEMS) fabrication techniques to fashion a plate not much bigger than the chip itself but employing many tiny spray guns, bonded directly to the chip.
Each nozzle shoots 100-micrometer droplets of a fluorine-based dielectric fluid at the chip's hot spots, metering the flow according to the temperature inferred from the switching speed of local transistors. The liquid boils, carrying off a big dollop of energy in the gas, which flows to a condenser. The condensate is then pumped back to the spray nozzles by a micropump. "We are removing 300 to 400 watts per square centimeter with our current prototype, all locally, on the chip," says Amon, an IEEE fellow. "But if you spread the heat a bit with conducting plates, you can easily double the amount." That would mean dissipating as much heat as even the high-performance chips are expected to produce in the foreseeable future. Best of all, as a cooling system, the technology is self-governing, working especially well precisely where it is needed most. That's because a dielectric fluid with the proper boiling point provides cooling at just the right temperature, and because it boils off faster in the hotter areas, reducing the temperature differential across the chip. Such differentials cause some parts of the chip to expand more than others, pulling the circuitry apart at the seams. Moreover, surface tension tends to suck liquid to the hotter, faster-drying parts.
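A rough energy budget, in Python, shows why boiling carries off so much heat; the latent heat of roughly 100 kilojoules per kilogram used below is a ballpark assumption for fluorocarbon coolants, not a measured property of the fluid Amon's group uses:

    # How much coolant must boil off to carry away a given heat load?
    # mass flow = heat load / latent heat of vaporization (assumed ballpark value).
    heat_to_remove_w = 300.0       # per square centimeter of chip, as cited above
    latent_heat_j_per_kg = 1.0e5   # assumed latent heat, ~100 kJ/kg for fluorocarbons
    mass_flow_kg_per_s = heat_to_remove_w / latent_heat_j_per_kg
    print(f"Coolant boiled off: {mass_flow_kg_per_s * 1000:.1f} g/s per cm2")   # about 3 g/s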
Amon's system would, however, require some basic rethinking. For one thing, to preserve the coolant, the package must be hermetically sealed. The slightest leak would cause the remaining coolant to boil off even faster, and the chip would fail catastrophically. For another, the nozzle array would have to be designed concurrently with the chip, both to ensure that the chip's hot spots are spread out and to optimize the control of each nozzle.
The system, together with other heat-conducting concepts, was backed by DARPA in part because the military wants wearable computers that won't get fouled by mud or dust, as they would if they depended on a fan. And what's good for your PC may be good for you, too, someday: a few of the concepts DARPA is studying may even pave the way to air-conditioned uniforms for desert commandos or urban firefighters, and air-conditioned clothing for hot, cranky city dwellers.
While chip designers and university researchers have been busy developing and refining their approaches, overclocking enthusiasts have also been exploring new ways to dissipate the extra heat that comes with faster performance.