Intel 9th Gen Paid Benchmarks Take Advantage of NDA Periods

Carclis
1 hour ago, Stefan Payne said:

Because Intel doesn't follow the TDP spec that is, more or less, used in the same way throughout all electronic devices - and then there is Intel right now...

And nowhere do they mention the "average usage power" vs. the "standard definition TDP". 

Yea no, you're not right about this. Intel's spec is actually in line with industry norms if you care to read the definition and understand that it's a thermal specification rated at specific frequencies and temperatures. 

 

But hey, like I said, it's a contentious issue because some just don't like that the CPUs can and will go over the rated TDP for short periods of time, package and core temperatures permitting.

 

It's not 'an average' and never was; the measurements are averaged because power is never constant, but deviation under the defined parameters is very minimal (a few watts). Really, what is so hard to understand about the TDP being rated at Tcase max and base clocks? It's no different from nitrous in engines, which allows the engine to exceed its rated power output for short periods - or do you have an issue with that too?
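
To put rough numbers on the "short periods" part, here's a minimal Python sketch of the general idea behind a running-average power limit (the same flavour as Intel's PL1/PL2/Tau scheme): the chip can sit above its rated figure briefly and then settles back down. All of the values and the exact averaging are made-up assumptions for illustration, not anything taken from Intel documentation:

```python
# Illustrative sketch of short-term turbo power limiting.
# All numbers are invented for the example, not real Intel values.

PL1 = 95.0   # long-term limit, roughly the rated TDP (W)
PL2 = 145.0  # short-term boost limit (W)
TAU = 28.0   # time window for the running power average (s)
DT = 1.0     # simulation step (s)

avg_power = 0.0  # exponentially weighted average of package power

for t in range(120):
    # Draw PL2 while the running average is still under PL1,
    # otherwise fall back to sustained operation at PL1.
    power = PL2 if avg_power < PL1 else PL1
    alpha = DT / TAU
    avg_power += alpha * (power - avg_power)  # EWMA update
    if t % 10 == 0:
        print(f"t={t:3d}s  draw={power:5.1f} W  avg={avg_power:5.1f} W")
```

With those invented numbers the draw sits at the short-term limit for roughly the first 30 seconds and then settles at the rated figure, which is exactly the behaviour being argued about here.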


10 minutes ago, leadeater said:

Yea no, you're not right about this. Intel's spec is actually in line with industry norms if you care to read the definition and understand that it's a thermal specification rated at specific frequencies and temperatures. 

Industry standard is average maximum under normal (useful) usage, at default configuration.

That does not apply to Intel.

 

And thus they should mention the higher, absolute maximum with Turbo - which is nowhere to be found...

 

10 minutes ago, leadeater said:

But hey, like I said, it's a contentious issue because some just don't like that the CPUs can and will go over the rated TDP for short periods of time, package and core temperatures permitting.

Short periods of time are OK, and that is allowed by the specification.

I'm not talking about that.

 

I'm talking about long periods of time, or throttling due to thermal issues.

 

10 minutes ago, leadeater said:

It's not 'an average' and never was; the measurements are averaged because power is never constant, but deviation under the defined parameters is very minimal (a few watts). Really, what is so hard to understand about the TDP being rated at Tcase max and base clocks? It's no different from nitrous in engines, which allows the engine to exceed its rated power output for short periods - or do you have an issue with that too?

Do I have to get out the old Intel documents where they were bashing AMD's ACP definition? And now they are basically doing the same thing.

When AMD did that back in 2009 or so, everyone gave them crap - so why not when Intel does basically the same thing?

 

And why is there not a second TDP definition in the CPU specs? Why only the lower one that makes their CPUs look better?

One with Turbo, one without - that would also be fine. But right now we only have the lower, non-Turbo one...

 

Yeah, because with a real TDP, we would be talking about 125-150W TDP or more, wouldn't you agree?

"Hell is full of good meanings, but Heaven is full of good works"


10 minutes ago, Stefan Payne said:

Industry standard is average maximum under normal (useful) usage, at default configuration.

That does not apply to Intel.

The defined default configuration is base clocks, so it does apply to Intel. You're quite welcome to install XTU and completely disable all boost technologies to satisfy your obsession with not exceeding the spec sheet TDP if you so wish. The rest of us will continue to be happy that the CPUs can and will deliver more performance when thermal conditions permit.

 

10 minutes ago, Stefan Payne said:

Yeah, because with a real TDP, we would be talking about 125-150W TDP or more, wouldn't you agree?

No, never. Because an Intel CPU with a 65W TDP, at base clocks and Tcase max, will never go above 65W, exactly as the TDP is defined. The real TDP is 65W. Edit: there are only two reasons the CPU would drop below base clocks: the cooling solution is not actually a 65W cooler, or the ambient temperature is above the defined spec of the cooling solution.
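
If it helps, that "two reasons" point can be written as a one-line steady-state check: case temperature is roughly ambient plus package power times the cooler's case-to-ambient thermal resistance. A throwaway sketch with placeholder numbers (the thermal resistance, ambient and Tcase max values are assumptions for illustration, not from any Intel or cooler datasheet):

```python
def holds_base_clocks(tdp_w, theta_ca, t_ambient, t_case_max):
    """Steady-state check: does a cooler with case-to-ambient thermal
    resistance theta_ca (deg C per watt) keep the package at or below
    Tcase max while it dissipates the full TDP at base clocks?"""
    t_case = t_ambient + tdp_w * theta_ca
    return t_case <= t_case_max, t_case

# Assumed example values, not from any datasheet.
ok, t_case = holds_base_clocks(tdp_w=65, theta_ca=0.5, t_ambient=35, t_case_max=72)
print(f"Tcase = {t_case:.1f} C -> {'base clocks hold' if ok else 'throttles below base'}")

# Same cooler in a hotter room than it was specified for:
ok, t_case = holds_base_clocks(tdp_w=65, theta_ca=0.5, t_ambient=45, t_case_max=72)
print(f"Tcase = {t_case:.1f} C -> {'base clocks hold' if ok else 'throttles below base'}")
```

A weaker cooler shows up as a higher thermal resistance and a hot room as a higher ambient; either one pushes Tcase past the limit, and those are exactly the two cases described above.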

 

P.S. Don't bother trying to convince me otherwise; it's off topic and you'll never change my mind about it.


So...is it not going to be the fastest gaming CPU in the world?

i9-9900k @ 5.1GHz || EVGA 3080 ti FTW3 EK Cooled || EVGA z390 Dark || G.Skill TridentZ 32gb 4000MHz C16

 970 Pro 1tb || 860 Evo 2tb || BeQuiet Dark Base Pro 900 || EVGA P2 1200w || AOC Agon AG352UCG

Cooled by: Heatkiller || Hardware Labs || Bitspower || Noctua || EKWB


2 minutes ago, leadeater said:

The defined default configuration is base clocks, so it does apply to Intel. You're quite welcome to install XTU and completely disable all boost technologies to satisfy your obsession with not exceeding the spec sheet TDP if you so wish.

See, that is the problem.

You have to run the CPU outside of its default configuration to stay inside the specified TDP envelope.
Why are you fine with that?

When the CPU could consume 150W under normal operation, with commercially available and useful software (which excludes stuff like Prime95 and the like)...

 

It would have been fine if:

a) Intel had mentioned the maximum power with Turbo somewhere, or

b) they had called it "base clock TDP" -> bTDP or whatever.

 

1 minute ago, TahoeDust said:

So...is it not going to be the fastest gaming CPU in the world?

Depends...

"Hell is full of good meanings, but Heaven is full of good works"


1 hour ago, Stefan Payne said:

The thing isn't just about the paid benchmarks; that isn't the real issue...

That is totally fine.

The issue is that they used the values on their website for advertising without double checking, and then doubled down when they were caught.

They still defended the shit...

I would add to that trying to pass it off as 3rd party independent testing too; it sort of was, but not fully, and I'm sure Intel had some hand in designing the test configurations etc. From memory, PT carried out all the tests in a single day, and I don't see that being possible without input from Intel beforehand. Game testing is not PT's wheelhouse.


6 minutes ago, Stefan Payne said:

You have to run the CPU outside of its default configuration to stay inside the specified TDP envelope.

No you don't. With a cooler rated exactly for the CPU and ambient temperatures exactly matching the cooling manufacturer's design spec, the CPU won't go below base clocks and will be capable of boosting, but only for very short times. Putting a better cooler on only allows for longer or possibly sustained boosting. No change or tweaking is required for the CPU at all; if you see base clocks being dropped it's one or both of the reasons I mentioned in my edit, not because the CPU is breaching TDP.


10 hours ago, Jito463 said:

While I honestly believe that Intel recognized how deceptive these benchmarks were prior to releasing them, one could make the argument that they were referring to the 9900k benchmarks specifically, not their comparison to the Ryzen/Threadripper benchmarks.

While that may be the case, that is negligence, which is completely unacceptable. You're paying for testing, so you'd be an idiot to give it a pass without validating it. I also don't believe for a second that they would have tested only their own hardware without comparing it to the competition; otherwise they would have no idea how to price the thing. Adding to that, Steve was unable to match the Intel 8700K performance results using PT's own extensive testing methodology, often seeing worse performance than they had even after enabling 3200MHz memory instead of PT's 2666MHz. Steve also noted that the 8700K performance improved over the original results that he could not match, after the benchmarks were updated to include Ryzen creator mode performance.

6 hours ago, mr moose said:

That's what covering your arse looks like. And the results probably were identical to their own results; anyone can get the same results if they do the test the same way.

 

I had a manager once who tried to call me mentally deficient because he wanted me to have my team square out the rounded corners of a rectangular hole so a steel frame (rectangular in shape) could be installed. The hole was already larger than the frame, so nothing needed to be done. I tried to inform him we didn't need to waste labor squaring out the corners as the frame could just be installed, and he argued I needed to see a psychologist because you can't put a rectangle in a round hole. Nearly every foreman, supervisor and team leader in that room collectively rolled their eyes. That is the kind of product ignorance you can get in management, and the inability of those with more knowledge to change things (I put the frame in the hole with ample clearance right in front of him), so it would be no surprise to me at all if this is exactly how Linus claims it could be.

But that's the thing. If their results are the same as PT's, they used the same nonsensical methodology. Just because Intel paid somebody else to do "the bad thing" so that you can't criticise Intel for doing it doesn't make them innocent. If they did that and wanted it done by a third party so that the results looked more credible, then that is absolutely malice.

4 hours ago, pas008 said:

Didn't they update the article with game mode off?

Yeah. I haven't updated the main post as it's already been covered on the WAN Show. The original link to the paper is for the updated benchmarks though.

46 minutes ago, TahoeDust said:

So...is it not going to be the fastest gaming CPU in the world?

Well, Intel's claim was the best gaming CPU in the world. Given that "best" is subjective, their claim is often untrue. If the new PT results are accurate then "best" is absolutely not true from a price/performance perspective compared to the Ryzen 7 2700X or even their own i7 8700K.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


3 minutes ago, Carclis said:

otherwise they would have no idea how to price the thing.

In a way, or at least for their higher end parts, Intel doesn't actually know how to price them lol. I think there are a lot of people out of touch with reality, like the 32GB/64GB "being normal" thing for example.


5 minutes ago, Carclis said:

Well, Intel's claim was the best gaming CPU in the world. Given that "best" is subjective, their claim is often untrue. If the new PT results are accurate then "best" is absolutely not true from a price/performance perspective compared to the Ryzen 7 2700X or even their own i7 8700K.

I'm pretty sure they did not claim it to be the best gaming processor  value in the world.

i9-9900k @ 5.1GHz || EVGA 3080 ti FTW3 EK Cooled || EVGA z390 Dark || G.Skill TridentZ 32gb 4000MHz C16

 970 Pro 1tb || 860 Evo 2tb || BeQuiet Dark Base Pro 900 || EVGA P2 1200w || AOC Agon AG352UCG

Cooled by: Heatkiller || Hardware Labs || Bitspower || Noctua || EKWB


2 minutes ago, TahoeDust said:

I'm pretty sure they did not claim it to be the best gaming processor  value in the world.

My point was that "best" is subjective: best for the budget, best without any price considerations, or best performance per dollar. In the real world people have budgets and they consider performance per dollar, so "best" is not the word they should have used. They should have said "fastest", like you did.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


6 minutes ago, leadeater said:

In a way, or at least for their higher end parts, Intel doesn't actually know how to price them lol. I think there are a lot of people out of touch with reality, like the 32GB/64GB "being normal" thing for example.

I think they're desperate to cling to the profit margins they currently have, especially since they're unable to keep up with demand anyways.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


7 hours ago, Stefan Payne said:

That is a very bad idea...

But we should start with people who call for bans, shouldn't we? ;)

 

You'll note I said people who "refuse" to accept it. That means ban people like yourself who just intentionally and absolutely refuse to accept the facts about TDP no matter how many times you are told.

 

I personally think if someone keeps flat out lying about a tech spec they don't understand and they refuse to stop, then they should be banned for trolling.

 

This is not a spec that is up for debate, it is not a matter of perspective, it is a black and white fact.

5 hours ago, Carclis said:

 

But that's the thing. If their results are the same as PT's, they used the same nonsensical methodology. Just because Intel paid somebody else to do "the bad thing" so that you can't criticise Intel for doing it doesn't make them innocent. If they did that and wanted it done by a third party so that the results looked more credible, then that is absolutely malice.

 

No one is saying you shouldn't criticize Intel for doing it, criticize away. I'm just saying they aren't lying and it's marketing guff 101: if the product doesn't look good, make it look good. I don't like it any more than the next person, but the fact is it's here to stay and it's in every industry.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


22 hours ago, Carclis said:

For now it is. That was my point though. Despite the fact that we have this healthy industry where all the results disagree with the posted paper, a lot of the publications didn't know better or were unwilling to dispute the results. Plus, I would have expected that someone with as significant a presence in the industry as Linus would have taken the opportunity to stand up for customers as well, especially since he gets a lot of viewers that the more advanced tech channels do not. In the end, though, we only had two people cover the stuff as it unfolded.

 

I still don't buy the "accidental" angle. Their public statement said that their results aligned with PT's, and they went with a third party for testing. I mean, they obviously wanted to make their results look legitimate by going with a third party; there's no other reason for it. Given their recent behavior concerning the 5GHz 28-core CPU, I would characterize their motives as deception. It makes sense though. They're under pressure from the 2700X and they're unwilling to compete on price or drop their profit margins. It just seems like a desperate situation.

I agree [Emphasis mine]. I found out today that the news/media/reporting is stupid: 

 

 


7 hours ago, TahoeDust said:

So...is it not going to be the fastest gaming CPU in the world?

It's probably still going to be. 

 

It's just not going to be the best value. 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Define "Gaming". Is it a fast CPU, yes. Is it the fastest "blankpunktechniqueMcGuffin2000" CPU in the world? Yep. ;)

 

Marketing is all about defining a new system/product/niche and saying you are the best at it. I could, for example, release the best "Pork Steak cutting Knife ever!". How would we measure it?

 

Is it the fastest single core, multi core, or a combination of both? Does it have the largest cache? Does it have the quickest memory access? Does it boost the highest? Is it the most overclockable?

 

Lots and lots of ways to measure it here.


2 hours ago, mr moose said:

No one is saying you shouldn't criticize Intel for doing it, criticize away. I'm just saying they aren't lying and it's marketing guff 101: if the product doesn't look good, make it look good. I don't like it any more than the next person, but the fact is it's here to stay and it's in every industry.

That's not the impression I got from the WAN Show. Here is something I posted before:

Quote

Which is all well and good to say, except the tech publications that didn't ignore it gave it coverage and failed to properly disclose the flawed methodology of the benchmarks, as well as failing to realise how far off the numbers were compared to the independent results already out there for the 2700X. Even LTT covered it on the WAN Show (as well as clickbaiting it), yet failed to properly research and acknowledge what the problems were. Just a few timestamps here:

  • 45:02 Claims that handicap is not an appropriate word to describe the approach to the Ryzen system configuration.

  • 45:36 Claims the cooler doesn’t matter for performance testing.

  • 45:41 Skips over the part about game mode without acknowledging how seriously this could impact performance.

  • 47:32 Misunderstands why people are upset. Chalks it up to the differences between the AMD and Intel products being misrepresentative, i.e. cherry picked, when it's actually related to them disabling half the product.

  • 47:45 Claims people are making a mountain out of a mole hill and that we don’t know what the intent was.

  • 50:22 Compares it to the Hawaii launch (feel free to inform me why that one was bad), the Ryzen launch where more RAM was used in the Intel system (not great, but definitely not akin to disabling half a CPU), or benchmarks showing cherry picked results (standard behaviour).

  • 52:26 Suggests that it could be a mistake despite Intel statements declaring the results as legitimate and reflective of what they saw in their own lab.

  • 53:48 Blames it on a poor choice of third party tester, yet that company came up with the same results as them.

That's almost 12 minutes of sidestepping the problem and claiming that Intel did nothing particularly bad. I don't understand the rationale there.

Honestly, take a look at the general sentiment of the community there. I know it's YouTube and all, but man... they sure missed the mark. I don't know if it was just a bad day for Linus or what, but that response was quite disheartening.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


13 hours ago, Carclis said:

That's not the impression I got from the WAN Show. Here is something I posted before:

Honestly, take a look at the general sentiment of the community there. I know it's YouTube and all, but man... they sure missed the mark. I don't know if it was just a bad day for Linus or what, but that response was quite disheartening.

There is a difference between something subjective like "is what they did wrong/upsetting?" and something objective like "did they lie, and/or are the results accurate as presented?". If they'd lied, then what did they say that was a lie? And if you are not happy with Linus's opinion, then so be it. The only difference between you, me and him is that he has industry experience and his business model requires he remain reputable. I can't see anyone in his position risking revenue and reputation for the sake of a few free processors and a custom PC (none of which he actually needs for personal use or to make further videos).

 

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


41 minutes ago, mr moose said:

There is a difference between something subjective like "is what they did wrong/upsetting?" and something objective like "did they lie, and/or are the results accurate as presented?". If they'd lied, then what did they say that was a lie? And if you are not happy with Linus's opinion, then so be it. The only difference between you, me and him is that he has industry experience and his business model requires he remain reputable. I can't see anyone in his position risking revenue and reputation for the sake of a few free processors and a custom PC (none of which he actually needs for personal use or to make further videos).

Well, half of those are not subjective. I'm not sure if he had to go somewhere or what, as he got interrupted just as he introduced the topic. It just seems like he was either rushed or so jaded by the industry that he doesn't care anymore.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


Just now, Carclis said:

Well, half of those are not subjective. I'm not sure if he had to go somewhere or what, as he got interrupted just as he introduced the topic. It just seems like he was either rushed or so jaded by the industry that he doesn't care anymore.

And none of them are proof of Intel's motivation, how it came to be, or even that his appraisal of what happened on the Intel side is wrong. 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


5 minutes ago, mr moose said:

And none of them are proof of Intel's motivation, how it came to be, or even that his appraisal of what happened on the Intel side is wrong. 

Well, you can argue all you like that Intel's motivation was not to handicap and deceive, but recent history would suggest otherwise. Even so, what they did was double down on the claims when they were challenged; something that was proven wrong later when PT redid their tests. If their motivation was to be transparent and give the public the truth, then that would have been reflected in their public statement. Something like this would have sufficed:

"We are actively working with PT to verify and address your concerns."

 

I also believe his appraisal of the situation was affected by a poor understanding of many of the earliest points introduced from what was also a condensed list.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


18 hours ago, mr moose said:

You'll note I said people who "refuse" to accept it. That means ban people like yourself who just intentionally and absolutely refuse to accept the facts about TDP no matter how many times you are told.

The facts are that Intel changed the definition, and that the industry agreed on the older Intel definition that they used for decades.

Look at this definition of TDP:

https://www.intel.com/content/dam/doc/white-paper/resources-xeon-measuring-processor-power-paper.pdf

 

Back in the day they took a shit on AMD's ACP definition, while using basically the same kind of definition right now!


And it should be allowed to criticize what manufacturers do, shouldn't it?
Or are you saying one should be banned for criticizing Intel for the shit they're doing?

 

And I am using Intel's own definition of TDP, as shown in that document!

 

 

Quote

I personally think if someone keeps flat out lying about a tech spec they don't understand and they refuse to stop, then they should be banned for trolling.

It's a manufacturer's definition, not a "tech spec", and it should be criticized and talked about.

 

Because right now, using the Intel definition, I could claim that my Ryzen 7 1700X has a TDP of 50W. AMD mostly includes the Turbo mode of the Ryzen desktop processors in their TDP; look up reviews.

nVidia also uses TDP or TBP in a way similar to the one that was standard in the industry for decades...

 

Quote

This is not a spec that is up for debate, it is not a matter of perspective, it is a black and white fact.

No, it's not.

It's a manufacturer's definition, one that they themselves changed a while back, and that should and must be talked about and criticized. And that is what a forum should be for.

 

Quote

No one is saying you shouldn't criticize Intel for doing it, criticize away. I'm just saying they aren't lying and it's marketing guff 101: if the product doesn't look good, make it look good. I don't like it any more than the next person, but the fact is it's here to stay and it's in every industry.

Then why did they redefine their "TDP" from average power with useful software to average power at base frequency, and not mention the maximum power of the Turbo mode?

Especially since it isn't really disclosed that well, is it?

 

It's not like this redefinition of "TDP" has caused problems in the past, has it?
Like in notebooks, the ones that are throttling.

 

And read the Intel document I've linked. They say so themselves!

And there should be information about the CPU's maximum power consumption under normal operation, within the manufacturer's defined operating conditions.

 

And everything a manufacturer says and defines should be up for scrutiny!

That is what forums are for: to disagree about things like that!

 

 

But since you said that manufacturers' specifications aren't up for debate, then an AMD FX-8350 is an 8-core processor, because that's what AMD calls it.

And right now I'm using a 4-core A10-7850K, because the manufacturer specified it to have 4 cores.

 

There was also some criticism towards Intel about the TDP in the past.

I would have been totally fine if they had called it "SDP":
https://www.anandtech.com/show/6655/intel-brings-core-down-to-7w-introduces-a-new-power-rating-to-get-there-yseries-skus-demystified

 

Well, it looks like their TDP definition makes their processors look more power efficient than they really are...

"Hell is full of good meanings, but Heaven is full of good works"


17 minutes ago, Stefan Payne said:

snip

If you want to criticise Intel for their use of TDP, your easiest route is right here:

How can a 6c/6t CPU, an 8c/8t CPU and an 8c/16t CPU all have the same TDP at virtually the same clock speed?

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


21 minutes ago, Carclis said:

How can a 6c/6t CPU, an 8c/8t CPU and an 8c/16t CPU all have the same TDP at virtually the same clock speed?

Actually rather easily: both 8-core products are 3.6GHz base and the 6-core product is 3.7GHz, and you'll find that works out to the same TDP. Someone will actually have to test this to prove it, but this is just another example of someone not understanding that TDP is rated at base clocks, so many products can in fact have the same TDP even though the number of cores and/or boost clocks can be different.
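
As a rough sanity check on that, dynamic power scales roughly with cores × frequency × voltage squared, so a small base clock and voltage/binning difference is enough to land different core counts on the same TDP number. A toy calculation follows; the voltages and the fudge factor are invented for illustration, and real parts also differ in cache, uncore and leakage power, which this ignores:

```python
def dynamic_power_w(cores, freq_ghz, voltage, k=4.3):
    """Very rough dynamic power model: P ~ k * cores * f * V^2.
    k lumps switched capacitance and activity into one fudge factor."""
    return k * cores * freq_ghz * voltage ** 2

# Invented operating points, loosely shaped like the 6c/8c/8c+HT parts above.
parts = {
    "6c/6t  @ 3.7 GHz, 1.000 V": dynamic_power_w(6, 3.7, 1.000),
    "8c/8t  @ 3.6 GHz, 0.880 V": dynamic_power_w(8, 3.6, 0.880),
    "8c/16t @ 3.6 GHz, 0.875 V": dynamic_power_w(8, 3.6, 0.875),
}
for name, watts in parts.items():
    print(f"{name}: ~{watts:.0f} W")
```

With a slightly lower voltage on the better-binned 8-core dies, all three land in the same ~95W bucket at their respective base clocks, which is the point being made; it says nothing about what they draw once they boost.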


Just now, leadeater said:

Actually rather easily: both 8-core products are 3.6GHz base and the 6-core product is 3.7GHz, and you'll find that works out to the same TDP. Someone will actually have to test this to prove it, but this is just another example of someone not understanding that TDP is rated at base clocks, so many products can in fact have the same TDP even though the number of cores and/or boost clocks can be different.

But hyperthreading also affects the TDP fairly significantly, doesn't it? Each core would spend more time actually doing work.

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |

