
Intel 9900KS = 5.0 GHz on ALL cores

porina
7 hours ago, Drak3 said:

Even though that's pretty typical and has been for a long time in the high end?

The 88W TDP 4790K can draw 120W at load.

The 84W TDP 4770K can draw just over 100W.

The 77W TDP 3770K can draw 140W.

The 95W TDP 2700K can draw 145W.

The 125W TDP Phenom II X6 1100T can draw 180W.

 

https://www.tomshardware.com/reviews/ivy-bridge-benchmark-core-i7-3770k,3181-23.html

https://www.tomshardware.com/reviews/core-i7-4790k-devils-canyon-overclock-performance,3845-9.html

*120W under extreme AVX loads.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


8 hours ago, porina said:

High end GPUs have been around 250W for many generations and that would seem to be the practical limit for consumer facing stuff. I think everyone is used to it. CPU wise I think around 200W (overclocked) is a practical limit. That's still manageable with air cooling.

What are you running to get 600W through a GPU?

 

I haven't run a UPS in a long time, but a typical high-end non-OC gaming system can comfortably run off a 550W PSU, which isn't too bad for a UPS.

600 watts is what nVidia quotes as the recommended PSU for its "250 watt" 1080 Ti (the recommendation is based on a PC configured with an Intel Core i7 3.2 GHz processor). Multiply that by two for a typical SLI configuration, which I don't have. I bought a 750 watt PSU last time I replaced one. All but one PCIe slot is filled, and all PCH SATA ports are filled.

 

Anyway, the point comes back to the power draw not aligning with what is expected.

7 hours ago, mr moose said:

The anandtech test was at all turbo frequencies, not base. So their power draw results with regard to the TDP spec are not a meaningful observation.

It's meaningful since it's achievable without overclocking. If the CPU normally throttled back to never exceed TDP, they wouldn't have the performance stated. Intel CPUs rarely operate all cores at the same speed (see the title of this thread). If the stated TDP is only for the minimum (not power-saving) clock speed, then turbo is meaningless. The PL2 values should be stated.

 

This is not the first fraudulent performance spec to see use on CPUs. Remember the "MHz myth" and "PR ratings"?

 

If my system kept entirely to the TDP, 84 watts for the CPU and 180 watts for the GPU, it should never exceed 264 watts at full load, but the difference between idle and full load for me is almost that entire headroom (145 watts idle at the UPS, 390 watts full tilt). GPU Caps Viewer actually shows what's happening here: the GPU's TDP maximum isn't 100%, but 120.2%. So the GPU tops out at 216 watts. 216 + 84 = 300 W. What about the CPU? Intel says the default PL2 is 1.25 x TDP, so 105 watts. At most (not taking into account hard drives or other PCIe cards) that's 216 + 105 = 321 watts if the system is hitting both the GPU and CPU maximum default power draw.

https://www.intel.com/content/dam/www/public/us/en/documents/datasheets/4th-gen-core-family-desktop-vol-1-datasheet.pdf
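
A quick sanity-check of that arithmetic (a rough sketch; the 120.2% power target and the 1.25 x TDP PL2 default are the figures from this post and the datasheet, everything else is just the sum):

# Back-of-the-envelope check of the numbers above (figures from this post, not universal constants).
CPU_TDP = 84               # W, the 4770K's rated TDP
GPU_TDP = 180              # W, the GPU's rated board power in this system
GPU_POWER_TARGET = 1.202   # 120.2% power target reported by GPU Caps Viewer
CPU_PL2_FACTOR = 1.25      # Intel's default PL2 = 1.25 x TDP (see the datasheet quote below)

naive_max = CPU_TDP + GPU_TDP              # 264 W if both parts never exceeded "TDP"
gpu_max   = GPU_TDP * GPU_POWER_TARGET     # ~216 W
cpu_pl2   = CPU_TDP * CPU_PL2_FACTOR       # 105 W
real_max  = gpu_max + cpu_pl2              # ~321 W, excluding drives and other PCIe cards

print(f"naive TDP sum: {naive_max:.0f} W")
print(f"GPU at power target + CPU at TDP: {gpu_max + CPU_TDP:.0f} W")
print(f"GPU at power target + CPU at PL2: {real_max:.0f} W")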

 

Quote

PL2 establishes the upper power limit of turbo operation above TDP, primarily for platform power supply considerations. Power may exceed this limit for up to 10 ms. The default for this limit is 1.25 x TDP; however, the BIOS may reprogram the default value to maximize the performance within platform power supply considerations. Setting this limit to TDP will limit the processor to only operate up to the TDP. It does not disable turbo because turbo is opportunistic and power/temperature dependent. Many workloads will allow some turbo frequencies for powers at or below TDP.

 

I'm actually surprised PL2 isn't something CPU-Z can read out. XTU (Intel Extreme Tuning Utility) can, though.

 

 


34 minutes ago, Kisai said:

If the stated TDP is only for the minimum (not power-saving) clock speed, then turbo is meaningless. The PL2 values should be stated.

TDP is not a statement or spec for power draw; it's a thermal specification that cooler and system designers use, along with other specs such as PL2, to create or implement cooling solutions. This is quite a common thing to mistake.

 

TDP is not supposed to be used the way you're trying to use it.

 

Moreover, most high-end motherboards by default override the Intel spec for both PL2 maximum power and duration.

 

Edit:

And if you raise Tcase/Tjunction to maximum, you'll be operating at base clocks and drawing the same amount of power as the TDP spec. Intel's TDP spec is taken at TjMax; reviewers don't test that way, because why would you?


12 minutes ago, leadeater said:

TDP is not a statement or spec for power draw; it's a thermal specification that cooler and system designers use, along with other specs such as PL2, to create or implement cooling solutions. This is quite a common thing to mistake.

 

TDP is not supposed to be used the way you're trying to use it.

 

Moreover, most high-end motherboards by default override the Intel spec for both PL2 maximum power and duration.

 

Edit:

And if you raise Tcase/Tjunction to maximum, you'll be operating at base clocks and drawing the same amount of power as the TDP spec. Intel's TDP spec is taken at TjMax; reviewers don't test that way, because why would you?

My CPU is a non-K. I checked with XTU; it says PL2 is 105 W. Sure, motherboards can override this, but they shouldn't by default, and benchmarks are run at default settings, except when people are trying to see how far they can overclock something before they destroy it.

 

Why bother having Turbo Boost at all if using it will exceed the TDP when that stock cooler is only designed for the base clock?


2 minutes ago, Kisai said:

Why bother having Turbo Boost at all if using it will exceed the TDP when that stock cooler is only designed for the base clock

Aftermarket cooling, like what some system integrators and custom builders use.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


Just now, Kisai said:

but they shouldn't by default

They do, which is why it's such a big problem and why so many reviews are invalid or differ from each other. Each motherboard vendor, and even different models from the same vendor, are configured differently; then you have other issues, like no two boards acting the same when using the auto settings, which are again the default.

 

Unless you go into the BIOS and ensure MCE is off, set CPU power to the Intel default, and manually set the vcore, comparing across reviews is not accurate. Some people already knew it was a problem, but it got highlighted during the 9900K reviews, and now most reviewers are aware of it, and more general consumers are too.


11 minutes ago, Kisai said:

Why bother having Turbo Boost at all if using it will exceed the TDP when that stock cooler is only designed for the base clock?

The other factor is that under the proper Intel spec the boost duration is short enough not to saturate a correctly spec'd cooler. Boosting even on the stock Intel cooler works fine, though toasty, so long as it's not sustained longer than the spec. Slap a stock cooler onto a Z370 board and you're going to have problems, because those all run the CPU out of spec.
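
To picture why, here's a toy lumped thermal model (my own illustration with made-up numbers, not anything from Intel): the cooler's thermal mass soaks up a short PL2 burst, but holding PL2 indefinitely pushes the temperature past the throttle point.

# Toy thermal model with made-up numbers, only to illustrate why a short PL2 burst
# is fine on a TDP-rated cooler while an indefinite PL2 is not.
AMBIENT, TJ_MAX = 25.0, 100.0        # deg C
C_TH = 60.0                          # J/K, assumed thermal mass of die + cooler
R_TH = 0.7                           # K/W, assumed cooler resistance, sized for ~95 W TDP
PL1, PL2, TAU = 95.0, 119.0, 28.0    # W, W, s (spec-style example values, not a real part's)

def final_temp(seconds, respect_tau=True, dt=0.1):
    temp, t = AMBIENT, 0.0
    while t < seconds:
        power = PL2 if (t < TAU or not respect_tau) else PL1
        # dT/dt = (heat in from CPU - heat out through cooler) / thermal mass
        temp += (power - (temp - AMBIENT) / R_TH) * dt / C_TH
        t += dt
    return temp

print(f"5 min, boost per spec:   {final_temp(300):.0f} C")        # settles near the PL1 steady state
print(f"5 min, PL2 held forever: {final_temp(300, False):.0f} C") # heads past TjMax -> throttling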

 

After the 9900K came out along with Z390, and the whole "MCE" issue came to light, vendors started making the defaults actually match the Intel spec.


1 minute ago, Drak3 said:

Aftermarket cooling, like what some system integrators and custom builders use.

Believe it or not, I used the stock cooler on mine, and that's primarily because it's a non-K and there wasn't any point. The choice between the K and the non-K came down to VT support at the time.

 

 

1 minute ago, leadeater said:

They do, which is why it's such a big problem and why so many reviews are invalid or differ from each other. Each motherboard vendor, and even different models from the same vendor, are configured differently; then you have other issues, like no two boards acting the same when using the auto settings, which are again the default.

 

Unless you go into the BIOS and ensure MCE is off, set CPU power to the Intel default, and manually set the vcore, comparing across reviews is not accurate. Some people already knew it was a problem, but it got highlighted during the 9900K reviews, and now most reviewers are aware of it, and more general consumers are too.

Benchmarks should maybe log the PL1 and PL2 values.
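
They're readable in software, too. On Linux, a minimal sketch along these lines could log them (assuming the msr kernel module is loaded and root access; MSR 0x606 holds the RAPL power units and MSR 0x610 the package power limits per Intel's SDM, but treat the bit layout here as illustrative rather than authoritative):

import struct

def read_msr(reg, cpu=0):
    # Requires 'modprobe msr' and root; each MSR is a 64-bit value at offset 'reg'.
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        f.seek(reg)
        return struct.unpack("<Q", f.read(8))[0]

MSR_RAPL_POWER_UNIT = 0x606   # bits 3:0 = power units, i.e. 1 / 2^n watts
MSR_PKG_POWER_LIMIT = 0x610   # bits 14:0 = PL1, bits 46:32 = PL2 (both in power units)

unit_w = 1.0 / (1 << (read_msr(MSR_RAPL_POWER_UNIT) & 0xF))
limits = read_msr(MSR_PKG_POWER_LIMIT)
pl1_w = (limits & 0x7FFF) * unit_w
pl2_w = ((limits >> 32) & 0x7FFF) * unit_w
print(f"PL1 = {pl1_w:.1f} W, PL2 = {pl2_w:.1f} W")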

 


Just now, Kisai said:

Believe it or not, I used the stock cooler on mine

Believe it or not, not everyone does.



7 minutes ago, Kisai said:

Benchmarks should maybe log the PL1 and PL2 values.

Yeah, they really should, because there's a pretty big difference between a CPU set to default multipliers but with PL2 set to 1000 W and the duration set to indefinite, and one actually adhering to the Intel spec: probably around 100 points in CB.

 

From what I know, however, if you manually set the all-core multiplier the boost duration is disabled, or at least not used, because every core is now configured with the same static value.

 

Edit:

Yep about 100 points ish

[Chart: 9900K vs. other CPUs, Cinebench multi-threaded scores]


1 hour ago, Kisai said:

600 watts is what nVidia quotes as the recommended PSU for its "250 watt" 1080 Ti (the recommendation is based on a PC configured with an Intel Core i7 3.2 GHz processor). Multiply that by two for a typical SLI configuration, which I don't have. I bought a 750 watt PSU last time I replaced one. All but one PCIe slot is filled, and all PCH SATA ports are filled.

That's a general guide for someone who knows little about computers, and it's a "safe" choice even if a low-quality PSU is chosen. It also allows for the rest of the system, so doubling it for SLI is plainly nonsense.

 

23 minutes ago, Kisai said:

Why bother having Turbo Boost at all if using it will exceed the TDP when that stock cooler is only designed for the base clock?

TDP is only of vague interest to low-value box shifters like Dell, or to mobile devices. They skimp on cooling and actually have to think about how much power to pump through it. If the load is transient, you can get high turbo for short times; that's the intent. If the load is sustained, only then will it cut back to TDP, or even lower during thermal throttling if cooling is particularly poor, like on Apple devices.

 

Enthusiast level desktops otherwise aren't so thermally constrained and more of the potential performance can be realised.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


5 hours ago, Kisai said:

600 watts is what nVidia quotes as the recommended PSU for its "250 watt" 1080 Ti (the recommendation is based on a PC configured with an Intel Core i7 3.2 GHz processor). Multiply that by two for a typical SLI configuration, which I don't have. I bought a 750 watt PSU last time I replaced one. All but one PCIe slot is filled, and all PCH SATA ports are filled.

 

Anyway, the point comes back to the power draw not aligning with what is expected.

It's meaningful since it's achievable without overclocking. If the CPU normally throttled back to never exceed TDP, they wouldn't have the performance stated. Intel CPUs rarely operate all cores at the same speed (see the title of this thread). If the stated TDP is only for the minimum (not power-saving) clock speed, then turbo is meaningless. The PL2 values should be stated.

 

This is not the first fraudulent performance spec to see use on CPUs. Remember the "MHz myth" and "PR ratings"?

 

If my system kept entirely to the TDP, 84 watts for the CPU and 180 watts for the GPU, it should never exceed 264 watts at full load, but the difference between idle and full load for me is almost that entire headroom (145 watts idle at the UPS, 390 watts full tilt). GPU Caps Viewer actually shows what's happening here: the GPU's TDP maximum isn't 100%, but 120.2%. So the GPU tops out at 216 watts. 216 + 84 = 300 W. What about the CPU? Intel says the default PL2 is 1.25 x TDP, so 105 watts. At most (not taking into account hard drives or other PCIe cards) that's 216 + 105 = 321 watts if the system is hitting both the GPU and CPU maximum default power draw.

https://www.intel.com/content/dam/www/public/us/en/documents/datasheets/4th-gen-core-family-desktop-vol-1-datasheet.pdf

 

 

I'm actually surprised PL2 isn't something CPU-Z can read out. XTU (Intel Extreme Tuning Utility) can, though.

 

 

All the stuff that leadeater mentioned aside, the bit I refer to as being meaningless is talking about the power draw being above TDP as if TDP were wrong or pointless. The linked article did a poor job of presenting what its power draw figures actually mean.

 

The important bit for end users is to see how much power a CPU will draw under known loads and clocks (as the average consumer/enthusiast won't generally know this or be able to work it out prior to purchase). This helps in choosing a PSU and, in extreme cases, an aftermarket cooler; other than that, referring to TDP is meaningless.
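
As a rough illustration of sizing a PSU from reviewed power draw rather than TDP, something like this rule-of-thumb sketch (the figures and the ~60% load target are my own assumptions, not numbers from this thread):

# Rough PSU sizing from reviewed full-load draws; all figures here are illustrative assumptions.
measured_draw_w = {
    "CPU (review, all-core load)": 145,
    "GPU (review, gaming load)":   250,
    "drives, fans, board, RAM":     50,
}
total_w = sum(measured_draw_w.values())
recommended_w = total_w / 0.6      # aim to sit around 60% of the PSU's rating at full tilt
print(f"measured full load ~{total_w} W -> look at quality PSUs around {recommended_w:.0f} W")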

 

Here's some interesting reading on thermals and coolers with regard to Intel Processors and throttling.

 

https://www.pugetsystems.com/labs/articles/Impact-of-Temperature-on-Intel-CPU-Performance-606/

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


4 hours ago, porina said:

Enthusiast level desktops otherwise aren't so thermally constrained and more of the potential performance can be realised.

And even then the TDP spec is of little use to us. If you want to push your CPU as hard as you can, then you already know you need a decent aftermarket cooler and good power delivery, and the size of the PSU would be calculated from power draw reviews.


