Quote:
Originally Posted by BGP Spook
However, every time there is a die shrink there is an increasing likelihood that products will push the thermal density up.
Even if the power draw goes down, if the die area goes down disproportionately, then the next-generation chip is in a worse situation than the last-generation chip.
Yeah, this is true unless the die shrink also comes with a more efficient architecture. I remember the Thunderbird CPUs were pretty hot, yet the Athlon XP came out after and was considerably cooler, yet faster and smaller. And now they have an IHS, which helps some with the heat density as well.
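To see the point about thermal density in numbers, here's a quick sketch (the wattages and die areas are made up for illustration, not real chip specs):

```python
# Illustrating the thermal-density argument: even if total power drops,
# shrinking the die area faster raises power density (W/mm^2).

def power_density(watts, area_mm2):
    """Power per unit of die area, in W/mm^2."""
    return watts / area_mm2

# Hypothetical older chip: 70 W spread over a 180 mm^2 die
old = power_density(70, 180)

# Hypothetical die-shrunk chip: only 60 W, but on a 100 mm^2 die
new = power_density(60, 100)

print(f"old: {old:.2f} W/mm^2, new: {new:.2f} W/mm^2")
# Less total power, yet more heat concentrated per mm^2 of silicon,
# which is what makes the newer chip harder to cool.
```

That's why an architecture change (or an IHS to spread the heat) matters as much as the raw power number.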
I have been looking at this as well:
http://www.intel.com/research/platfo...supercomputing
Also, I think it is just a matter of time before GPU manufacturers considerably lower their power requirements. They are using a pretty old process node from what I understand.
This thread is a little out of context, though, as I posted it almost a year ago. GPUs have stepped up a lot since then, and multi-core CPUs had yet to appear.