Core i9-9900K Power & Thermals, Did Linus (and OC3D TV) Get it Wrong?

schwellmo92
1 minute ago, leadeater said:

 

Default setting is 95W as noted by this reviewer; they also go on to say anything below 200W will reduce performance/drop sustained all-core clocks.

Well, of course the thing is going to run hot if you set the cTDP limit to 200W.

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


2 minutes ago, Spotty said:

Well, of course the thing is going to run hot if you set the cTDP limit to 200W.

See the edit to my last post, the power and temp difference is hilarious lol.


4 minutes ago, Spotty said:

Well, of course the thing is going to run hot if you set the cTDP limit to 200W.

But the CPU will only do 4.2GHz if you set the cTDP for 95W. How many people buy the 9900K for running it at 4.2GHz?


4 minutes ago, leadeater said:

See the edit to my last post, the power and temp difference is hilarious lol.

Yeah, that's quite severe...

 

4 minutes ago, Deli said:

But the CPU will only do 4.2GHz if you set the cTDP for 95W. How many people buy the 9900K for running it at 4.2GHz?

Honestly, that's on Intel. They shouldn't have claimed it was a 95W TDP chip. They should have listed it as a 120W or 130W TDP chip if it can't reach its advertised turbo speeds at its rated TDP of 95W. Did anyone ever believe the 9900k would actually have the same TDP as the i5 9600k, which is also listed as 95W TDP from Intel? It's carrying the i9 branding so it wouldn't be too unrealistic or shocking to anyone if it was rated at 130W TDP.



10 minutes ago, Spotty said:

They shouldn't have claimed it was a 95W TDP chip. They should have listed it as a 120W or 130W TDP chip if it can't reach its advertised turbo speeds at its rated TDP of 95W

Technically it can, for the Turbo Boost Power Time Window, which I believe defaults to 16 seconds (you can even increase it), with a default limit of 120W, which is high enough to reach the spec all-core boost.

 

The CPU will also hit 5GHz using these defaults, and it will sustain that so long as other cores are not active and package power is not above 95W; otherwise it will power-limit down to 95W at whatever clocks result, which depends on core loading etc.
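For illustration, the limiting behaviour described above can be sketched as a toy model using the figures quoted in this thread (PL1 = 95W sustained, PL2 = 120W during the boost window). Real firmware enforces PL1 against a moving average, so the hard per-second cap here is a simplification:

```python
# Toy model of Intel's PL1/PL2 power limits, using the numbers quoted
# in this thread (PL1 = 95 W, PL2 = 120 W, tau = 16 s). Real hardware
# enforces PL1 via a moving average; this just caps per-second power.

PL1 = 95.0    # sustained limit (W), i.e. the rated TDP
PL2 = 120.0   # short-term turbo limit (W)
TAU = 16      # Turbo Boost Power Time Window (s)

def limited_power(requested_w, t):
    """Power allowed at second t of a sustained all-core load."""
    cap = PL2 if t < TAU else PL1
    return min(requested_w, cap)

# A CPU whose all-core load "wants" ~140 W:
trace = [limited_power(140.0, t) for t in range(20)]
print(trace[0], trace[-1])   # 120.0 95.0 -- boosts, then settles at PL1
```

A lightly loaded CPU that requests less than 95W is never limited at all, which is why single-core 5GHz boost can be sustained indefinitely under these defaults.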

 

10 minutes ago, Spotty said:

Did anyone ever believe the 9900k would actually have the same TDP as the i5 9600k, which is also listed as 95W TDP from Intel? It's carrying the i9 branding so it wouldn't be too unrealistic or shocking to anyone if it was rated at 130W TDP.

Intel should have just slapped 200W TDP on it and rode that i9 branding hard; get a marketing team on it and spin it as a good thing. It's Intel; people will accept it if the performance is there, which it is.


6 minutes ago, leadeater said:

Intel should have just slapped 200W TDP on it and rode that i9 branding hard; get a marketing team on it and spin it as a good thing. It's Intel; people will accept it if the performance is there, which it is.

It'll be interesting to see if the "95W TDP" i9 9900k makes it into the 2019 Mac Pro iMac Pro... 

Edited by Spotty



1 hour ago, leadeater said:

Yep, and just when you think you've got it all sorted and have your testing procedures perfected, you get slapped in the face by someone else's decision. For someone like Hardware Unboxed this is like a worst case, because they do those horrifically long 35-game benchmarks, and if you have to add in just one extra data point that could be day(s) of extra work.

Hardware Unboxed's strength is their testing, although I find myself not agreeing with their spin on interpreting the results many times. I haven't had a chance to catch up with this particular video yet.

 

I think it is clear: products are more complicated than most people think. When things aren't stressed, many things can be overlooked as insignificant. As we start to stretch some limits, we encounter situations like this. Judging by the comments, it doesn't help that many people don't fully understand the usage of TDP, myself included.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


3 hours ago, Spotty said:

Response issued by LTT is pinned to the top of the comments for their video.

I've been seeing this 'pinned comment' thing all night and I couldn't figure out where the hell it was; I checked all over the forums and on Floatplane... argh, YouTube, you stupid fool lol. Took me hours to figure that out, so used to not watching LTT stuff on there.


1 hour ago, leadeater said:

Technically it can, for the Turbo Boost Power Time Window, which I believe defaults to 16 seconds (you can even increase it), with a default limit of 120W, which is high enough to reach the spec all-core boost.

 

The CPU will also hit 5GHz using these defaults, and it will sustain that so long as other cores are not active and package power is not above 95W; otherwise it will power-limit down to 95W at whatever clocks result, which depends on core loading etc.

 

Intel should have just slapped 200W TDP on it and rode that i9 branding hard; get a marketing team on it and spin it as a good thing. It's Intel; people will accept it if the performance is there, which it is.

If they did that, most people would be screaming FX-9590.

32 minutes ago, porina said:

Hardware Unboxed's strength is their testing, although I find myself not agreeing with their spin on interpreting the results many times. I haven't had a chance to catch up with this particular video yet.

 

I think it is clear: products are more complicated than most people think. When things aren't stressed, many things can be overlooked as insignificant. As we start to stretch some limits, we encounter situations like this. Judging by the comments, it doesn't help that many people don't fully understand the usage of TDP, myself included.

Well, Intel has basically been lying about their CPUs' TDP ever since boost was released. If you look at base frequencies, they almost haven't changed since then and only the boost frequencies have; they do this to be able to still claim low TDP numbers and look good. With each generation it's getting worse, with the difference between the all-core boost and the base clock increasing.


43 minutes ago, porina said:

Hardware Unboxed's strength is their testing, although I find myself not agreeing with their spin on interpreting the results many times. I haven't had a chance to catch up with this particular video yet.

 

I think it is clear: products are more complicated than most people think. When things aren't stressed, many things can be overlooked as insignificant. As we start to stretch some limits, we encounter situations like this. Judging by the comments, it doesn't help that many people don't fully understand the usage of TDP, myself included.

Intel also benefits from obscuring information. Thus there are benchmarks out there crowning it the "best gaming CPU", alongside productivity results at much lower power. People will conveniently remember the "best" results, generally.

 

Though it does need to be noted that at "stock" settings, the 9900k is only between 5-15% better than the 2700X at stock in compute/render tasks. So that's the small IPC advantage and about 10% higher all-core clocks. And it's going to be double the price through the holidays.

 

Gaming build recommendations are clearly still the 8700 (non-K) on a Z-series board for expensive GPUs; Ryzen 2600 for practically everything else; 9900k for the budding Adobe CC user.


23 minutes ago, cj09beira said:

Well, Intel has basically been lying about their CPUs' TDP ever since boost was released. If you look at base frequencies, they almost haven't changed since then and only the boost frequencies have; they do this to be able to still claim low TDP numbers and look good. With each generation it's getting worse, with the difference between the all-core boost and the base clock increasing.

TDP is correct by Intel's definition. The turbo states, I believe, are covered by something called PL1 and PL2, however they are defined.

 

How does AMD handle this?

 

5 minutes ago, Taf the Ghost said:

Intel also benefits from obscuring information. Thus there's benchmarks out there as "best gaming CPU", along with the productivity as much lower power. People will conveniently remember the "best" results, generally.

 

Though it does need to be noted, at "stock" settings, the 9900k is only between 5-15% better than the 2700X at stock in compute/render tasks. So that little IPC advantage and about 10% all-core clocks. And it's going to be double the price through the holidays.

I haven't looked yet; is there much difference between strict TDP and relaxed TDP for gaming on the 9900k? To me, it is missing the point to say the 9900k is poor value. Its existence is to be the best-performing consumer-level CPU. I certainly agree that if you do consider value, it wouldn't be at the top of anyone's list.

 

On a side note, I've done some IPC-like testing comparing Skylake (which should be representative of the -lakes), Zen and Zen+ over the weekend. Results are in the CPU forum if anyone is interested.



1 minute ago, porina said:

TDP is correct by Intel's definition. The turbo states, I believe, are covered by something called PL1 and PL2, however they are defined.

 

How does AMD handle this?

 

I haven't looked yet; is there much difference between strict TDP and relaxed TDP for gaming on the 9900k? To me, it is missing the point to say the 9900k is poor value. Its existence is to be the best-performing consumer-level CPU. I certainly agree that if you do consider value, it wouldn't be at the top of anyone's list.

 

On a side note, I've done some IPC-like testing comparing Skylake (which should be representative of the -lakes), Zen and Zen+ over the weekend. Results are in the CPU forum if anyone is interested.

There would be a gaming difference, but you'd probably need a 2080 Ti to tease it out. And it depends on the game.


I saw this one coming. I wonder how the 9900k compares to the 8700k with regards to boost clocks though. We didn't see this problem last gen, which suggests that the 8700k was able to hit its turbo speeds within TDP spec. This raises the question of what Intel's own TDP spec even means. Does it only apply to consumers but not the reviewers who get seeded the best boards with the TDP limits removed?

 

It was only a matter of days before this topic reared its head again, right?
@leadeater

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


18 minutes ago, Carclis said:

I saw this one coming. I wonder how the 9900k compares to the 8700k with regards to boost clocks though. We didn't see this problem last gen, which suggests that the 8700k was able to hit its turbo speeds within TDP spec. This raises the question of what Intel's own TDP spec even means. Does it only apply to consumers but not the reviewers who get seeded the best boards with the TDP limits removed?

 

It was only a matter of days before this topic reared its head again, right?
@leadeater

Someone who's on GN Steve's Discord really needs to ask him to test this, or point me to some power graphs from his tests with the 8700k like the ones I posted of the 9900k. I'm very interested to see the power-over-time and frequency-over-time graphs for that CPU, along with full details of the power settings in the BIOS, the configured cTDP, and a screenshot of Intel XTU.

 

Failing that, anyone with a bone-stock 8700k could post the above; Intel XTU tracks all of it, so run CB or something and take a screenshot of it.

[attached screenshot: Intel XTU power/clock log]

Like this; second run with a 125W TDP limit just to see what it would do. It's 140W when the limit is set to 1000W; you can see a very slight dip in power draw and clocks with the 125W TDP limit.
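To sanity-check a log like that without eyeballing the graph, a small helper can summarise a trace of per-second package power samples. The sample data below is made up for illustration, not a real 8700k/9900k log:

```python
# Summarise an XTU-style power log: peak draw, plus the sustained
# average after the boost window, to compare against a cTDP limit.
# The sample trace below is illustrative, not real measurement data.

def summarize(samples_w, boost_window_s=16):
    sustained = samples_w[boost_window_s:] or samples_w
    return {
        "peak_w": max(samples_w),
        "sustained_avg_w": sum(sustained) / len(sustained),
    }

# e.g. a 16 s boost near 140 W, then capped by a 125 W cTDP limit:
trace = [140.0] * 16 + [125.0] * 44
print(summarize(trace))   # peak 140.0 W, sustained average 125.0 W
```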


I feel people have misunderstood this subject.

 

Intel's definition of TDP:

 

 

"Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload. Refer to Datasheet for thermal solution requirements."

 

 



1 minute ago, asus killer said:

I feel people have misunderstood this subject.

 

Intel's definition of TDP:

 

 

"Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload. Refer to Datasheet for thermal solution requirements."

 

 

Yeah, that's all good, but they keep extending their turbo boost power requirements; it used to be barely above their rated TDP, but now it's like triple if you want to hit the all-core boost.


6 minutes ago, leadeater said:

Someone who's on GN Steve's Discord really needs to ask him to test this, or point me to some power graphs from his tests with the 8700k like the ones I posted of the 9900k. I'm very interested to see the power-over-time and frequency-over-time graphs for that CPU, along with full details of the power settings in the BIOS, the configured cTDP, and a screenshot of Intel XTU.

 

Failing that, anyone with a bone-stock 8700k could post the above; Intel XTU tracks all of it, so run CB or something and take a screenshot of it.

I've posted in there hoping for a response. Maybe I'll get lucky.



3 hours ago, Spotty said:

Yeah, that's quite severe...

 

Honestly, that's on Intel. They shouldn't have claimed it was a 95W TDP chip. They should have listed it as a 120W or 130W TDP chip if it can't reach its advertised turbo speeds at its rated TDP of 95W. Did anyone ever believe the 9900k would actually have the same TDP as the i5 9600k, which is also listed as 95W TDP from Intel? It's carrying the i9 branding so it wouldn't be too unrealistic or shocking to anyone if it was rated at 130W TDP.

That's what I've been saying the whole time, since the MacBook debacle...

The TDP definition from Intel is just totally useless because the boost can totally ignore it...

 

1 hour ago, porina said:

TDP is correct by Intel's definition. 

That doesn't make it right, does it?
Especially since the "industry standard" is that TDP is at the upper end of what a CPU should consume, and that is how it is usually understood...

Intel is the odd one out in this case...

1 hour ago, porina said:

How does AMD handle this?

The way it is usually understood by everyone:
TDP is at the upper end for AMD processors, and the CPUs don't violate it much.

Threadripper, for example, even has a hard TDP limit, meaning the CPU can't consume more than 180W (unless you specify otherwise)...

 

1 hour ago, porina said:

I haven't looked yet, is there much difference between strict TDP and relaxed TDP for gaming on the 9900k?

People have measured up to 260W CPU power consumption for the 9900K; see for yourself:

 

https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9900k-i7-9700k-i5-9600k-review/21

What AnandTech mentions there is something I also suggested: that Intel should specify a "bTDP" and an "rTDP" for base and real (= with turbo, default configuration).

 

1 hour ago, cj09beira said:

If they did that, most people would be screaming FX-9590.

They are doing it anyway, when they see the measured values.

And Intel should be facing some backlash for their bullshit TDP "specification"

 

1 hour ago, cj09beira said:

Well, Intel has basically been lying about their CPUs' TDP ever since boost was released. If you look at base frequencies, they almost haven't changed since then and only the boost frequencies have; they do this to be able to still claim low TDP numbers and look good. With each generation it's getting worse, with the difference between the all-core boost and the base clock increasing.

I agree with you.

And that's kinda what I'm saying all the time.

 

But there are some people claiming Intel did nothing wrong and their TDP spec is correct...

"Hell is full of good meanings, but Heaven is full of good works"


4 minutes ago, asus killer said:

I feel people have misunderstood this subject.

Intel's definition of TDP:

Is irrelevant.


And they should be called out for cheating, as their "specification" has no real-world value or implication.

So basically they could do a 1.4GHz base, 4-core/8-thread CPU and call it 10W TDP, when it boosts to 4GHz and consumes 100W at that clock rate.

 

But the CPU is still specified at 10W TDP....

 


The worst thing:
Remember the AMD "ACP" thing? Average CPU Power.

And the outrage over that?

Intel is doing exactly that right now, only worse than AMD...


10 minutes ago, Stefan Payne said:

Especially since the "Industry Standard" is that TDP is at the upper end of what a CPU should consume. And that is how it is usually understood...

A MOSFET in an audio amplifier or VRM can't turbo boost though, can it? Those are static loads, whereas an Intel CPU is not. You can very clearly see in the graph GN Steve posted the CPU staying at the 95W TDP as per spec, with the allowed short-term 120W boost.

 

Edit:

Go complain to motherboard makers; they are the issue, not Intel.

 

10 minutes ago, Stefan Payne said:

The Threadripper for example even have a hard TDP Limit. Means that the CPU can't consume more than 180W (unless you specify otherwise)....

You mean exactly like Intel does, which motherboard makers override, and which you keep ignoring even though I told you this days ago and showed you direct evidence of it (Gamers Nexus 9900K review).

 

The only thing I agree with is adding a full all-core boost power draw spec; the package TDP listed, however, is completely correct unless configured higher (cTDP).


8 minutes ago, leadeater said:

A MOSFET in an audio amplifier or VRM can't turbo boost though, can it? Those are static loads, whereas an Intel CPU is not. You can very clearly see in the graph GN Steve posted the CPU staying at the 95W TDP as per spec, with the allowed short-term 120W boost.

The cooling of cars in Germany is designed to be used at full load and maximum speed for long periods of time, which can cause other issues in colder conditions. Your cars are not made for this high-load scenario, as you can't legally use a car at 160km/h for longer periods outside of race tracks.

 


And it seems like Ian Cutress agrees with me:

 

So is TDP Pointless? Yes, But There is a Solution


The solution here is to offer two TDP ratings: a TDP and a TDP-Peak. In Intel lingo, this is PL1 and PL2, but basically the TDP-Peak takes into account the ‘all-core’ turbo.

He agrees with what I'm saying. So why are you still defending Intel so much when even other reviewers agree with me that Intel should have a second TDP?!

 

You might be fine with the American coolers in cars; I am not.


I was also confused after der8auer's video. He was like, yeah this is hot AF. Then Linus be like, yeah dude, great thermals. 


29 minutes ago, Stefan Payne said:

The cooling of cars in Germany is designed to be used at full load and maximum speed for long periods of time, which can cause other issues in colder conditions. Your cars are not made for this high-load scenario, as you can't legally use a car at 160km/h for longer periods outside of race tracks.

 


And it seems like Ian Cutress agrees with me:

 

So is TDP Pointless? Yes, But There is a Solution

He agrees with what I'm saying. So why are you still defending Intel so much when even other reviewers agree with me that Intel should have a second TDP?!

I'm not; I'm pointing out what is actually correct, which EVERYONE has been wrong about or fails to understand. I don't specifically blame anyone for not getting it; cTDP values have been messed with this whole time it seems, so every review you have ever seen by anyone that shows sustained CPU package power draw over the rated TDP has been overriding Intel's TDP, which is allowed but is not the stock configuration.

 

The TDP has a very specific purpose; it's not to tell you what the power draw is, certainly not at boost either. It's there for designing a cooler, and unless the cTDP has been increased, after 16 seconds (on current CPUs) the CPU will not draw any more than 95W.
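That sustained behaviour can be illustrated with a rough sketch: PL1 is enforced against a running (exponentially weighted) average of package power rather than instantaneous draw, so a short PL2-level burst need not violate the 95W figure. The time constant and numbers below are illustrative, not Intel's exact algorithm:

```python
import math

# Illustrative sketch (not Intel's exact algorithm): compare PL1
# against an exponentially weighted moving average of package power.

def ewma(samples_w, tau_s=16.0, dt_s=1.0):
    alpha = 1.0 - math.exp(-dt_s / tau_s)   # smoothing from time constant
    avg, out = 0.0, []
    for p in samples_w:
        avg += alpha * (p - avg)   # pull the average toward each sample
        out.append(avg)
    return out

# A 10 s burst at 120 W followed by near-idle at 20 W:
trace = ewma([120.0] * 10 + [20.0] * 20)
print(round(max(trace), 1))   # the averaged power peaks well under 95 W
```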

 

[attached image: package power over time holding the 95W limit]

 

So explain to me exactly how the 95W TDP Intel puts on the 9900K is pointless, when this graph shows it has a point.

 

Edit:

Btw, your example is totally wrong for the point. Again, turbo is NOS; your German car will overheat with infinite NOS being pushed into the engine. And my cars? Read my location tag; I'm not American and drive a Japanese car anyway, so...

