It was very interesting to read this article about the work IBM are undertaking together with École Polytechnique Fédérale de Lausanne (EPFL) and the Swiss Federal Institute of Technology Zurich (ETH) on a 3D stacked architecture for multiple cores. The four-year collaborative project, called CMOSAIC, promises to deliver an interconnection density of 100 to 10,000 connections per square millimetre – 10 to 1,000 times what was previously possible. Wow!
One of the main challenges the team face is removing heat from the structure. Each layer in the stack is anticipated to dissipate 100-150 W/cm². With the overall structure measuring just 1-3 cm³, the total heat dissipation is likely to be in the region of several kW.
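To put that figure in perspective, here is a quick back-of-envelope check in Python. Only the 100-150 W/cm² flux comes from the article; the die footprint and layer count are illustrative assumptions of mine, not project specifications.

```python
# Back-of-envelope estimate of total heat dissipation in a 3D stack.
# Only the per-layer flux is from the article; the rest are assumptions.

flux_per_layer = 150.0   # W/cm^2, upper end of the quoted 100-150 W/cm^2
footprint_cm2 = 1.0      # cm^2, assumed die footprint
n_layers = 20            # assumed layer count for a 1-3 cm^3 stack

total_power_w = flux_per_layer * footprint_cm2 * n_layers
print(f"Estimated total dissipation: {total_power_w / 1000:.1f} kW")
# -> Estimated total dissipation: 3.0 kW, i.e. "several kW" as stated
```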
Consequently, a novel cooling system is required. The team plans to use hair-thin, 50-micron cooling channels within the structure, employing both single-phase and two-phase cooling. How they fit these cooling channels between the dies when there are die-to-die interconnections exceeding 100 per square millimetre will be a tremendous engineering design and manufacturing challenge!
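A rough geometry sketch shows why. Assuming the interconnections are through-silicon vias on a square grid (the 20-micron via diameter is my assumption; the densities and channel width are from the article):

```python
import math

# Rough check: can a 50 micron channel thread between die-to-die vias?
# Via diameter is an assumption; densities and channel width are quoted.

channel_width_um = 50.0    # quoted channel width
via_diameter_um = 20.0     # assumed through-silicon via diameter

for via_density_per_mm2 in (100, 10_000):
    pitch_um = 1000.0 / math.sqrt(via_density_per_mm2)  # square-grid pitch
    clear_span_um = pitch_um - via_diameter_um           # gap between vias
    fits = clear_span_um >= channel_width_um
    print(f"{via_density_per_mm2:>6}/mm^2: pitch {pitch_um:.0f} um, "
          f"gap {clear_span_um:.0f} um, channel fits: {fits}")

# ->    100/mm^2: pitch 100 um, gap 80 um, channel fits: True
# ->  10000/mm^2: pitch 10 um, gap -10 um, channel fits: False
```

At the lower bound a 50-micron channel just squeezes through; towards the upper bound the vias would have to pass through the channels themselves, which is exactly why this is such a hard co-design problem.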
Energy efficiency is at the heart of the approach: the liquid cooling system will both require less energy to run and provide high-grade heat as a by-product, delivering coolant at around 65°C while still keeping chip operating temperatures below 85°C. This makes liquid cooling far more financially viable for data centre cooling. Today, the annual cooling costs for a 1U server are approaching the cost of the server itself.
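As a final sanity check, the coolant flow needed to carry that heat away is modest. A minimal sketch, assuming water as the coolant, the several-kW load estimated earlier, and a 10 K temperature rise of my own choosing:

```python
# Water flow required to remove the stack's heat while delivering
# ~65 C coolant for reuse. Heat load and temperature rise are assumptions.

heat_load_w = 3000.0    # W, "several kW" from the earlier estimate
cp_water = 4186.0       # J/(kg*K), specific heat of water
delta_t_k = 10.0        # K, assumed rise (e.g. 55 C in, 65 C out)

mass_flow = heat_load_w / (cp_water * delta_t_k)   # kg/s, Q = m_dot*cp*dT
volume_flow_lpm = mass_flow * 60.0                 # L/min (~1 kg/L water)

print(f"Required flow: {mass_flow:.3f} kg/s (~{volume_flow_lpm:.1f} L/min)")
# -> Required flow: 0.072 kg/s (~4.3 L/min)
```

A few litres per minute of 65°C water is a genuinely usable by-product, for example for building heat recovery, which is where the data centre economics come in.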
Liquid-cooled applications are routinely solved with FloTHERM and FloEFD, ranging from conventional cold plates to the liquid cooling of high-brightness LEDs in automotive applications.
As a footnote, Bruno Michel, Manager of Advanced Thermal Packaging at IBM Research in Zurich, won the 2008 Harvey Rosten Award for Excellence for his work on hierarchically nested microchannels to reduce thermal interface resistance.