Bruce Guenin, PhD
Editor-in-Chief, Summer 2010 Issue
The dramatic trajectory of Moore’s law has produced changes in key areas of technology that are critical to our industry: 1) greater packaging and system complexity; 2) increased power dissipation; and 3) enhanced software tools and computing performance. The first two fall into the challenge category: they continually test our ingenuity and work ethic. The third falls into the solution category. The computing power available to engineers has grown dramatically, accompanied by a significant evolution in the functionality and automation of design and analysis software tools and in the data exchange between them. Together, these developments have changed the way we perform thermal analyses.
In previous decades, thermal analysis was performed with dramatically less computing power than is available today. Before an analysis could begin, engineers needed considerable insight into the heat transfer processes at work. To limit the required calculations to a manageable scope, they also had to give priority to those associated with the greatest design risk. Only then could they set up the mathematics to calculate the required outputs. The insight this process engendered often proved valuable in enabling engineers to “size up” new designs with respect to their ability to achieve thermal performance goals. Their facility with analytical calculations, in turn, supplemented their intuition by rapidly quantifying the impact of critical design features.
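A classic calculation of this kind, offered here purely as an illustrative sketch with assumed values, is the series resistance estimate of junction temperature:

    T_J = T_A + P · (θ_JC + θ_CA)

where T_A is the ambient temperature, P the dissipated power, and θ_JC and θ_CA the junction-to-case and case-to-ambient thermal resistances. With assumed values of P = 2 W, θ_JC = 5 °C/W, θ_CA = 20 °C/W, and T_A = 50 °C, the estimate is T_J = 100 °C, obtained in seconds and repeatable for any number of early design variants.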
In today’s time-constrained, turnkey software analysis environment, there is a tendency to focus on expediting the import of design databases and other inputs and on optimizing the computation process to maximize the production of results. This emphasis can stunt an engineer’s intellectual growth, hindering the development of a more nuanced understanding of heat transfer in a proposed design and a greater ability to assess the relative risk of different design approaches.
Without a doubt, when a thermal analysis using state-of-the-art tools is executed effectively, the end result is more accurate and more detailed than is possible with traditional calculation methods. However, the greater complexity of these approaches also brings certain liabilities. For example, they can be less efficient than traditional methods in evaluating early design concepts. It can also be more difficult to confirm that a complex analysis is error free. In this regard, the traditional methods, because of their simplicity, are easier to error check. They can supplement the state-of-the-art methods by providing reference solutions that aid in the detection of gross errors, such as those resulting from an incorrect input.
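As a minimal sketch of such a cross-check, assuming the simple resistance model above and entirely hypothetical function names and tolerances, a few lines of Python suffice to flag a detailed simulation result that strays far from the reference:

    # Illustrative sanity check: compare a detailed simulation result against
    # a simple series thermal-resistance estimate to catch gross input errors.
    # All names and numerical values here are hypothetical examples.

    def junction_temp_estimate(power_w, theta_jc, theta_ca, ambient_c):
        """Resistance-network estimate: T_J = T_A + P * (theta_JC + theta_CA)."""
        return ambient_c + power_w * (theta_jc + theta_ca)

    def sanity_check(simulated_tj_c, power_w, theta_jc, theta_ca, ambient_c,
                     tolerance_frac=0.3):
        """Flag a detailed result that strays far from the simple reference.

        The reference model ignores spreading and convection detail, so a
        wide tolerance band is used on the temperature rise above ambient.
        """
        reference_c = junction_temp_estimate(power_w, theta_jc, theta_ca, ambient_c)
        rise_ref = reference_c - ambient_c
        rise_sim = simulated_tj_c - ambient_c
        if abs(rise_sim - rise_ref) > tolerance_frac * rise_ref:
            print(f"WARNING: simulated rise {rise_sim:.1f} C vs reference "
                  f"{rise_ref:.1f} C -- check inputs.")
        else:
            print(f"OK: simulated rise {rise_sim:.1f} C within "
                  f"{tolerance_frac:.0%} of reference {rise_ref:.1f} C.")

    # Example: a detailed run reporting T_J = 135 C for the 2 W part above
    # would be flagged against the roughly 100 C reference estimate.
    sanity_check(simulated_tj_c=135.0, power_w=2.0,
                 theta_jc=5.0, theta_ca=20.0, ambient_c=50.0)

Agreement within the band does not validate the detailed model, of course; a large miss, however, usually points to a gross input error worth investigating before any results are released.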
Since its inception, this publication has striven to present articles that provide insight into the fundamental heat transfer processes operative in many application environments, along with the mathematics needed to characterize them. We hope that they will promote an appropriate balance between simple, conceptually satisfying models and state-of-the-art, computationally complex methods.
Futurists point to a time when human intelligence will be dwarfed by that of computers. It therefore becomes all the more important that we continue to develop our capacity for critical thinking, so that we can anticipate and manage the ever-growing risks that are not accounted for in computer models.