
Intel 9th Gen Paid Benchmarks Take Advantage of NDA Periods

Carclis
5 hours ago, Suika said:

Oh my, I just noticed game mode results...

 

 

 

Y I K E S

 

Deceit at its finest.

This is disgusting. Makes me glad that the laptop I just bought uses a Ryzen 7 2700U.

 

Did Intel think that people wouldn't rip this apart?

Current Build:

CPU: Ryzen 7 5800X3D

GPU: RTX 3080 Ti FE

RAM: 32GB G.Skill Trident Z CL16 3200 MHz

Mobo: Asus Tuf X570 Plus Wifi

CPU Cooler: NZXT Kraken X53

PSU: EVGA G6 Supernova 850

Case: NZXT S340 Elite

 

Current Laptop:

Model: Asus ROG Zephyrus G14

CPU: Ryzen 9 5900HS

GPU: RTX 3060

RAM: 16GB @3200 MHz

 

Old PC:

CPU: Intel i7 8700K @4.9 GHz/1.315v

RAM: 32GB G.Skill Trident Z CL16 3200 MHz

Mobo: Asus Prime Z370-A


The way I see it, Intel has approved this document with its flawed testing results, so the responsibility now lies with Intel, as they approved the document's release.

Intel made its bed; now it's time to sleep in it.


On a side note, the fact that they used the 2700X with a CCX disabled tells me one of two things. 

  • It was done intentionally to make Intel's offerings look much better 
  • They are complete dumbasses who don't know what's going on

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


It seems like as AMD's chips have become more and more competitive, Intel has just become more and more scared and desperate. They seem so unconfident right now, not to mention that AMD is going to beat them to 7nm. How things can change: when I built my PC back in 2013, no one recommended AMD CPUs; now it's certainly not so one-sided.


43 minutes ago, D13H4RD2L1V3 said:

On a side note, the fact that they used the 2700X with a CCX disabled tells me one of two things. 

  • It was done intentionally to make Intel's offerings look much better 
  • They are complete dumbasses who don't know what's going on

I'd say it's the first one; I think Intel, and in turn Principled Technologies, knew what they were doing. A 50% gain over the 2700X (8c/16t) does not sound right. I could be wrong, I'm still learning all the hardware and what makes a CPU perform well. Obviously the 9900K has a sizeable turbo advantage, but still, it's not 50% better, really?

[attached image: IntelBenchmarks.png]


21 minutes ago, ZacoAttaco said:

I'd say it's the first one; I think Intel knew what they were doing. A 50% gain over the 2700X (8c/16t) does not sound right. I could be wrong, I'm still learning all the hardware and what makes a CPU perform well. Obviously the 9900K has a sizeable turbo advantage, but still, it's not 50% better, really?

[attached image: IntelBenchmarks.png]

In gaming, the 8700K is ~15% faster than the 2700X. I expect that to increase by maybe 5% with the 9900K, which would mean ~20% better gaming performance. The only difference this gen is that it'll also have almost as large a margin in heavily multithreaded workloads.


1 hour ago, schwellmo92 said:

the only difference this gen is that it'll also have almost as large a margin in heavily multithreaded workloads.

Forgive my ignorance, but why would that be? They both have the same core and thread counts, so is it just the frequency difference?

 

Side note: WikiChip is so useful for someone like me, way clearer than AMD or Intel's website.

 

Edit: They ran benchmarks 3 times, and instead of averaging the scores...they just picked the middle benchmark? What?


10 hours ago, YouSirAreADudeSir said:

Steve has just dropped a vid about this

(...)

I'm watching it now

It doesn't happen very often (I think it's the second time since I joined this forum), and it's not central to his arguments, but my ears hurt when he brings statistics to the table... I mean, he's obviously done some reading on the subject, and probably has a functional knowledge for what he does, but it's painful to listen to him slamming others with such confidence when he simply lacks the understanding of statistics that would save him the embarrassment. 

Sure, they had a sample of size 3, which makes the whole median thing pretty superficial (but here's a hint: it makes taking an average equally superficial). But then he claims that using the median is some "odd choice", that it's somehow inferior to the average and, adding insult to injury, that using the average allows you to "detect and get rid of outliers" (outliers being the main drawback of using an average vs. the median, which is robust to them), or that using the average instead of the median is somehow connected with the benchmarks "having variance"; "there's a reason we have standard deviation markers in our chart" (?), the latter said in an "enough said" tone while actually not meaning anything. Furthermore, his rant led me to think that he may not understand the difference between "standard deviation" and "standard error" and, more importantly, which one to report to convey a confidence interval.
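To make that concrete, here's a minimal sketch with made-up numbers (not PT's actual results) showing why the median of three runs resists an outlier that drags the mean, and the difference between standard deviation and standard error:

```python
import math
import statistics

# Three hypothetical benchmark runs in FPS; the third is an outlier.
# Illustrative numbers only, not anyone's real results.
runs = [98.0, 100.0, 143.0]

mean = statistics.mean(runs)      # pulled up by the outlier (~113.7)
median = statistics.median(runs)  # robust to it: just the middle run (100.0)

# Sample standard deviation: spread of the individual runs.
sd = statistics.stdev(runs)
# Standard error: sd / sqrt(n), the uncertainty of the *mean* itself,
# which is what a confidence interval around the mean is built from.
se = sd / math.sqrt(len(runs))

print(f"mean={mean:.1f} median={median:.1f} sd={sd:.1f} se={se:.1f}")
```

With only three runs neither summary is very informative, which is the point above: the sample is too small for the mean-vs-median debate to matter much.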

 

As I said, none of it invalidates his other points, but when you see him being so assertive and so dismissive while simultaneously being so wrong and, dare I assume his role, pretty clueless, it does make a dent in his credibility (how many people without statistics knowledge will just take his word for it? How often might I be taking his word on issues I don't know about that he speaks equally strongly about? Am I being misled in those cases too?).

 

 

7 hours ago, leadeater said:

Intel: Open BIOS, load XMP

AMD: Open BIOS, load D.O.C.P

7 hours ago, Sakkura said:

Often enough it's just called XMP in an AMD board BIOS anyway.

I always thought D.O.C.P. was an Asus thing, since in other (ASRock) AMD motherboards I've always seen XMP as well.

 


GamersNexus just did an interview with Principled Technologies if anyone's interested:


 

 


32 minutes ago, ZacoAttaco said:

GamersNexus just did an interview with Principled Technologies if anyone's interested:


 

 

Nice one, I'll have to watch it later though, I'm going out in a bit

 With all the Trolls, Try Hards, Noobs and Weirdos around here you'd think i'd find SOMEWHERE to fit in!


1 hour ago, SpaceGhostC2C said:

I always thought D.O.C.P. was an Asus thing, since in other (ASRock) AMD motherboards I've always seen XMP as well.

They shouldn't use the XMP name/terminology on AMD-based products, since it's an Intel technology and trademark, but if no one's cared yet, what does it matter? I'd rather have one name for it anyway.


45 minutes ago, ZacoAttaco said:

GamersNexus just did an interview with Principled Technologies if anyone's interested:

 

Ohh, this is going to be good; just need to grab a drink and some snacks and get a-watchin'


12 minutes ago, YouSirAreADudeSir said:

Nice one, I'll have to watch it later though, I'm going out in a bit

spoiler: it's a complete waste of time. The company just didn't know enough about what they were doing.



1 minute ago, asus killer said:

spoiler: it's a complete waste of time. The company just didn't know enough about what they were doing.

Thought as much but i'm still gonna watch it ;) 



I hope Tech Jesus now questions Intel as to why they did things this way: why use PT and not let independent reviews go out?

 

 



3 hours ago, ZacoAttaco said:

Forgive my ignorance, but why would that be? They both have the same core and thread counts, so is it just the frequency difference?

Because Intel has roughly a 5-7% clock-for-clock advantage in gaming/productivity (non-AVX) workloads, and on top of that has higher clocks, now with the same number of cores and threads.

 

So if Intel has ~15% higher clocks and ~5% better "IPC", as people say, that's approximately 20% better performance. It's not an exact science, but you should get the idea.
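That estimate compounds multiplicatively rather than adding up, which a quick sketch makes clear (the 15% and 5% figures are the rough guesses above, not measurements):

```python
# Rough relative-performance model: performance scales with clock * IPC.
# The 15% clock and 5% IPC advantages are rough guesses, not measurements.
clock_advantage = 1.15
ipc_advantage = 1.05

relative_perf = clock_advantage * ipc_advantage  # ~1.21, not 1.20
print(f"roughly {(relative_perf - 1) * 100:.0f}% better performance")  # roughly 21%
```

This is only a first-order sketch; real scaling depends on the workload, memory, and boost behaviour, which is why it lands near "approx 20%" rather than an exact figure.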


2 hours ago, ZacoAttaco said:

GamersNexus just did an interview with Principled Technologies if anyone's interested:

Finished watching now.

 

First off, big respect to the guy for sitting down and doing an on-camera interview, and for not requiring review or editing of the footage either. I do feel for the guy; the sheer volume of internet commentators and reviewers is likely something they have never experienced before. Not only that, but I doubt they have done many technical investigations during a time when AMD was competitive and today's level of internet commentary and social media existed; back in the Pentium 4 era and earlier, all people could really do was yell at the magazine page or the computer monitor, with zero feedback.

 

My gut feeling is that PT actually had little time to carry out the testing, and that is why 16 systems in total were used: to accelerate the process and meet the deadline set by Intel. Given more time, I suspect they would have done all tests with Game Mode both on and off and, if they do in fact care so much about correctness and facts, I would hope they'd have checked their numbers against multiple other sources where possible, i.e. existing products on the market and their reviews.

 

Like Steve, I disagree a lot with the memory configuration and choices. 32GB would have been better if keeping the exact same capacity across the systems was deemed a requirement, though I disagree even with that. For the dual-channel systems they should have used 2x 8GB modules for 16GB, and for the quad-channel systems 4x 8GB modules, as that is more technically correct in terms of the memory controller and auto timings.

 

On the point about going with manufacturer specifications: should you not do that for the memory too? It's rated for something, that rating is built into the product's auto configuration, and the motherboard is rated for it as well. You are dropping below the manufacturer specification by loading XMP/DOCP and then manually reducing the frequency, which is also something no one would ever do, at all.

 

Also, Game Mode enabled on the 2700X? That is completely illogical. The point was to test Intel's new 8-core CPUs against AMD's 8-core, right? Well, they didn't; they compared it not just to a 4-core, but to a 4-core with a CCX configuration that is not available in that product line; only the APUs have a single CCX and the microcode to match.

 

I don't know how much computer performance testing they do, but if they do a lot of it they really need to get a standardized, performance-optimized specification documented and set for the major/common builds that are used or would be required for comparisons. Lack of experience with a new system/product is not a good enough excuse.


14 minutes ago, leadeater said:

Finished watching now.

 

 

I didn't get how the guy says "I've been doing this for longer than you've been alive" and then a lot of it clearly went over his head. 

But I also get some of his points. I mean, when Steve said people would never game with a stock cooler and have 64GB of RAM: say what!? People do a lot worse, Steve, a lot worse. Even average gamers don't go to the extent of customization where Steve seems to set the bar with PT.

 

The hate should be directed at Intel, not these guys.



3 minutes ago, asus killer said:

The hate should be directed at Intel, not these guys.

Considering that we don't know who decided on the final configurations, we can't say that for certain. Intel certainly has the lion's share of the blame (in the end, they accepted this and released it), but we don't know yet how much blame should be attributed to this guy and his company for the methodology used.


The i9-9900K is like an RTX 2080 Ti: it costs way more than it performs. The best gaming CPU, which performs exactly the same as the i7-8700K, for $150 more.

 

Nobody should buy that piece of shit; it is not worth it.

 

Intel acts like there's no competition if you look at the prices.


8 minutes ago, asus killer said:

I didn't get how the guy says "I've been doing this for longer than you've been alive" and then a lot of it clearly went over his head. 

But I also get some of his points. I mean, when Steve said people would never game with a stock cooler and have 64GB of RAM: say what!? People do a lot worse, Steve, a lot worse. Even average gamers don't go to the extent of customization where Steve seems to set the bar with PT.

 

The hate should be directed at Intel not this guys.

Well, I know for sure next to nobody buying a 2700X will get 64GB of RAM though. Sure, people do a lot worse, but there were some configuration choices that realistically wouldn't happen, like manually lowering the frequency after applying XMP, unless there is a stability problem.

 

I don't think Steve's point was that no one would game on the stock cooler; it was about reducing variables in the tests, and by having different coolers you have introduced a variable. Many do game with the stock cooler, and many also buy aftermarket, so either configuration is representative of the average consumer; however, in this situation testing methodology should take precedence.


BTW, I don't get Steve's point about this FRAPS vs. in-game benchmark distinction. I mean, as long as every CPU is tested the same way, why does it matter if built-in benchmarks were used where possible and FRAPS where it wasn't?

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


Watching the interview now, and I don't think the interviewed dude will be able to clarify anything at all. Overall I think he might end up persuading me even more of what I already suspect: Intel hired them and not only told them what they wanted tested but also how, and probably very strongly suggested fucking over AMD with the memory timings intentionally.

 

Yes, unlike Steve, nobody will sue me for calling Intel what we all know they are (and what I am sure many will claim is just industry standard): fucking cheaters and liars.



19 minutes ago, Lathlaer said:

BTW, I don't get Steve's point about this FRAPS vs. in-game benchmark distinction. I mean, as long as every CPU is tested the same way, why does it matter if built-in benchmarks were used where possible and FRAPS where it wasn't?

His main point, which I think was mentioned in the earlier video, was that in-game benchmark tools can be a little inconsistent when it comes to the actual FPS counter. FRAPS is more uniform and purpose-built, so it's generally more widely accepted. I don't think he mentioned it in the interview though. Essentially, if every in-game benchmark is a little different, at least FRAPS can be a universal standard.

 

I guess LMG use a combination of FRAPS and in-game benchmarks? Can anyone give me some clarity on that?

50 minutes ago, leadeater said:

First off, big respect to the guy for sitting down and doing an on-camera interview, and for not requiring review or editing of the footage either. I do feel for the guy; the sheer volume of internet commentators and reviewers is likely something they have never experienced before.

Yeah, I think that's pretty underappreciated, and it deserves respect for the company to come out and admit their mistake. This guy John clearly doesn't know the technical side of things; he's probably just taking the brunt of the backlash to back up his team.

 

So while I'll admit it's admirable for them to do an interview, it doesn't change the fact that they were woefully uninformed. To me, the cooling disparity between AMD and Intel is what killed it; you can't put a stock cooler up against a quality Noctua cooler. Argh... I digress.

36 minutes ago, asus killer said:

I didn't get how the guy says "I've been doing this for longer than you've been alive" and then a lot of it clearly went over his head. 

Yeah, I think he has been in the 'business' 'longer than you've been alive', but there's no way someone who has spent 20+ years doing hands-on benchmarking would not know that a stock AMD cooler does not compare to an aftermarket one, let alone a Noctua cooler.

 

I think he has been in more of a management role, certainly not a technical role, and I think he admits that.


5 minutes ago, ZacoAttaco said:

His main point, which I think was mentioned in the earlier video, was that in-game benchmark tools can be a little inconsistent when it comes to the actual FPS counter.

Yeah, but for instance my in-game Tomb Raider benchmark runs are consistent with each other (1-2 FPS variance). They might differ from the FPS I get from FRAPS or from MSI Afterburner, but it doesn't matter as long as everything gets tested the same way.

 

I guess what I mean is that it only matters if you compare a built-in benchmark result against a FRAPS result. As long as you compare benchmark vs. benchmark or FRAPS vs. FRAPS, everything is cool.

 

Unless it's somehow been proven that in-game benchmarks favor one platform over the other for some reason.
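A quick sketch of that argument, with hypothetical CPUs and a hypothetical constant over-read: a systematic offset in a counter cancels out as long as both CPUs are measured with the same tool, but shows up as a fake performance gap when tools are mixed:

```python
# Hypothetical "true" FPS for two CPUs (illustrative numbers only).
true_fps = {"cpu_a": 120.0, "cpu_b": 100.0}

def ingame_counter(fps: float) -> float:
    """Hypothetical in-game benchmark that over-reads by a constant 8%."""
    return fps * 1.08

def fraps_counter(fps: float) -> float:
    """Hypothetical external counter with no systematic offset."""
    return fps

# Same tool on both CPUs: the offset cancels in the ratio (both ~1.2).
ratio_ingame = ingame_counter(true_fps["cpu_a"]) / ingame_counter(true_fps["cpu_b"])
ratio_fraps = fraps_counter(true_fps["cpu_a"]) / fraps_counter(true_fps["cpu_b"])

# Mixed tools: the 8% offset masquerades as extra performance (~1.296).
ratio_mixed = ingame_counter(true_fps["cpu_a"]) / fraps_counter(true_fps["cpu_b"])

print(round(ratio_ingame, 3), round(ratio_fraps, 3), round(ratio_mixed, 3))
```

The caveat above still applies: this only holds if the offset really is systematic; if an in-game benchmark favours one platform, the cancellation breaks down.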


