
Why do Desktop Processors need/have Heat Spreaders?

4 hours ago, M.Yurizaki said:

It's questionable that they do. People are still confusing TDP with power consumption, and I'm afraid it's even seeping into NVIDIA's product literature. Also, is that the TDP of the GPU itself, or of all the parts of the video card? The RAM and VRMs generate a non-trivial amount of heat too. So I'd wager the GTX 1060 is no hotter than, say, a Core i7-6700K once you remove those things.

 

EDIT:

As some measure of proof, let's take a look at Tom's Hardware's power consumption measurements on the i7-6800K. Their test setup measures the 12V CPU power cables using a current probe. Here are their power consumption measurements of the GTX 1060 using the same method (current-probing the power lines, including the PCI Express slot). And according to http://www.anandtech.com/show/9266/amd-hbm-deep-dive/2, GDDR5, at least on the R9 290X, accounted for 15%-20% of the card's power.

 

So if we take the GTX 1080's power consumption (110W for the entire card) and subtract what the VRAM might be taking up in the worst case, we end up with about 88W. I'm not sure what the VRMs, fan, and other housekeeping circuitry draw, so let's say the GPU itself takes up about 80W. That is within 30W of a stock i7-6800K, or within 15W of one overclocked to 4.0GHz (and matching it in the worst case).
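Written out as a quick back-of-envelope calculation (the worst-case VRAM share comes from the AnandTech article above; the housekeeping draw is just an assumed round number, not a measurement):

```python
# Rough estimate of the GPU-only share of board power.
# The VRAM share is the AnandTech R9 290X figure quoted above; the
# housekeeping overhead (VRMs, fan, etc.) is an assumed value.
board_power_w = 110          # whole-card draw used in the post above
vram_share_worst = 0.20      # worst-case GDDR5 share of board power
housekeeping_w = 8           # assumed VRM/fan/misc draw (illustrative only)

gpu_only_w = board_power_w * (1 - vram_share_worst) - housekeeping_w
print(f"GPU die alone: ~{gpu_only_w:.0f} W")  # prints ~80 W
```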

 

EDIT 2:

Oh, before I get called out for misunderstanding: I'm saying GPUs aren't as hot as you think they are.

Actually, it's the exact temperature I think it is, because it gives a readout of the temperature. I think you're confusing temperature with heat output.

 

The 1080 uses FAR more than 110W; it averages around 170W in gaming and can use up to 310W. Lastly, as the cooling solution for the GPU die is just about always the same one that's cooling the VRM and VRAM etc., it's a bit unfair to just exclude these from your calculations. You're also using a mid-range GPU in your example; try doing these figures again with top-end cards (1080 Ti, Titan XP).

5 hours ago, Riley-NZL said:

Why do desktop CPUs have heat spreaders? I work in the portable computer market, and none of the processors in laptops etc. have heat spreaders; the thermal compound and cooling solution are mounted directly onto the CPU die.

 

I'm sure there is a good reason, or they wouldn't bother.

 

They don't need heat spreaders and would run cooler without them.

 

The issue is that you can easily destroy a CPU die without a heat spreader by scratching or chipping the corners away while installing a heatsink.

 

It's there to make them less prone to physical damage during shipping, installation, maintenance, etc.


The reason they moved to a heat spreader is that the dies themselves are relatively easy to crack, which was a huge issue some years ago with aftermarket coolers. The die also gets hotter in certain areas, since each core runs at a different temperature depending on how heavy a load it is under. The heat spreader spreads that heat a little better across the cores, so the contact surface for the cooler sits at a more uniform temperature and doesn't have hot spots. Ergo, heat spreader.
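To put that spreading effect in rough numbers, here's a minimal sketch using the 1D conduction relation dT = q·t/(k·A). The power, thickness, and areas below are hypothetical values chosen only to illustrate the point, not figures for any actual CPU:

```python
# Minimal sketch of why spreading heat over a larger area lowers the
# temperature rise, using the 1D conduction relation dT = q * t / (k * A).
# All geometry and power numbers below are hypothetical.
k_copper = 400.0             # W/(m*K), thermal conductivity of copper
thickness = 3e-3             # m, assumed spreader thickness
power = 30.0                 # W, assumed heat from one heavily loaded core

hot_spot_area = (5e-3) ** 2  # a 5 mm x 5 mm core-sized hot spot
ihs_area = (30e-3) ** 2      # a 30 mm x 30 mm spreader face

dt_hot_spot = power * thickness / (k_copper * hot_spot_area)
dt_spread = power * thickness / (k_copper * ihs_area)
print(f"dT through a core-sized patch: {dt_hot_spot:.1f} K")             # ~9.0 K
print(f"dT if the same heat is spread over the IHS: {dt_spread:.2f} K")  # ~0.25 K
```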

9 hours ago, Riley-NZL said:

Actually, it's the exact temperature I think it is, because it gives a readout of the temperature. I think you're confusing temperature with heat output.

The 1080 uses FAR more than 110W; it averages around 170W in gaming and can use up to 310W. Lastly, as the cooling solution for the GPU die is just about always the same one that's cooling the VRM and VRAM etc., it's a bit unfair to just exclude these from your calculations. You're also using a mid-range GPU in your example; try doing these figures again with top-end cards (1080 Ti, Titan XP).

I'm not confusing the two. The reason I brought up power consumption is that a part's TDP is often tied to its power consumption, but it is not the power consumption of the part. That is, a 150W TDP does not mean the part consumes 150W of electricity; otherwise that would imply all of the electricity the part uses is converted to heat (watts is watts is watts is watts).

 

Tom's Hardware pegs the worst-case GTX 1080 power consumption at 200W. I use them as a source because, as far as I know, they're the only ones who don't do the total-system power consumption measurement that most other review sites do. They actually probe the power delivery lines to the card itself, which gives a more accurate picture of how much power the video card actually consumes. I only excluded the VRAM and VRMs from my calculation to isolate how much the GPU is using, not the video card as a whole. I was going after the claim that the GPU produces all that heat.

 

I'm also not convinced the main heat sink of video cards contributes a significant amount of cooling to the VRM and VRAM chips, because it doesn't look like the contact between the heat sink and those components is all that good.

The die can crack when installing a heatsink. Remember the Athlon XP days?


6 hours ago, M.Yurizaki said:

I'm not confusing the two. The reason I brought up power consumption is that a part's TDP is often tied to its power consumption, but it is not the power consumption of the part. That is, a 150W TDP does not mean the part consumes 150W of electricity; otherwise that would imply all of the electricity the part uses is converted to heat (watts is watts is watts is watts).

Tom's Hardware pegs the worst-case GTX 1080 power consumption at 200W. I use them as a source because, as far as I know, they're the only ones who don't do the total-system power consumption measurement that most other review sites do. They actually probe the power delivery lines to the card itself, which gives a more accurate picture of how much power the video card actually consumes. I only excluded the VRAM and VRMs from my calculation to isolate how much the GPU is using, not the video card as a whole. I was going after the claim that the GPU produces all that heat.

I'm also not convinced the main heat sink of video cards contributes a significant amount of cooling to the VRM and VRAM chips, because it doesn't look like the contact between the heat sink and those components is all that good.

That exact link you posted (also the one I looked at myself, btw) clearly states 173W average, 311W max (before overclocking). Considering most cards people buy will have at least a small factory overclock, those numbers will be higher for the average user, and that's still far from what higher-end cards use.

1 hour ago, Riley-NZL said:

That exact link you posted (also the one I looked at myself, btw) clearly states 173W average, 311W max (before overclocking). Considering most cards people buy will have at least a small factory overclock, those numbers will be higher for the average user, and that's still far from what higher-end cards use.

Oh yeah, I was looking at the wrong column XP

 

But still, the maximum is only reached some of the time, not all of the time; the average is more realistic.
