
AMD responds to 1080p gaming tests on Ryzen. Supports ECC RAM. Win 10 SMT bug

3DOSH
5 minutes ago, Dietrichw said:

That is taking things a bit out of context (or can be taken out of context; that is how wild rumors start). They did not ask for a GPU bottleneck to be created for all tests; they wanted higher-res benchmarks included along with 1080p, not GPU-bottlenecked tests only. Jay was asked to disable a feature that showed up on later X99 boards which helps the chips reach single-core turbo across all cores, a sort of auto-overclock that isn't really the stock performance of X99 chips but is on by default on every new X99 board nowadays. Since it is such a default feature, Jay has a good reason not to disable it, as no real user will disable it or even know it exists, but it is a feature that would not be found on an original X99 board or an Intel-made motherboard.

 

I was not taken out of context. Look at what they posted and listen to what they actually told him.

4 minutes ago, Tomsen said:

We are 7 pages deep, and I see a bunch of videos posted. Could you do me the favor of linking to them (or the comments), so I don't have to go through all the pages and videos?

 


2 minutes ago, Dietrichw said:

That is taking things a bit out of context (or can be taken out of context; that is how wild rumors start). They did not ask for a GPU bottleneck to be created for all tests; they wanted higher-res benchmarks included along with 1080p, not GPU-bottlenecked tests only. Jay was asked to disable a feature that showed up on later X99 boards which helps the chips reach single-core turbo across all cores, a sort of auto-overclock that isn't really the stock performance of X99 chips but is on by default on every new X99 board nowadays. Since it is such a default feature, Jay has a good reason not to disable it, as no real user will disable it or even know it exists, but it is a feature that would not be found on an original X99 board or an Intel-made motherboard.

No, that's what AMD said in an AMA, but Steve has clarified that was not the case; it was not at all as innocent as simply requesting different resolutions like 1440p and 4K. (See 18:06 in their most recent video re-reviewing Ryzen.)

That's called MCE (Multi-Core Enhancement), and it is on by default; it was originally an ASUS feature, I believe. That doesn't mean it's OK to disable, because doing so still gimps Intel's performance.

"If you ain't first, you're last"


11 minutes ago, Memories4K said:


That's called MCE (Multi-Core Enhancement), and it is on by default; it was originally an ASUS feature, I believe. That doesn't mean it's OK to disable, because doing so still gimps Intel's performance.

You overlooked the third part, where I pointed out that Jay has the more reasonable view on it. I was not fully defending AMD; I was just pointing out that there is some logic behind disabling the feature if one were to do a side-by-side comparison of the processors with, let's say, the AMD and Intel reference motherboards, where the only differences are the most basic features and design of the platforms. In the real world, with what consumers will actually get, the feature should be enabled, because everyone would get the added performance of MCE.

 

I was just clarifying before someone assumed something as extreme as disabling Hyper-Threading or other standard core features. People can really do a lot with a small forum post.


1 hour ago, Tomsen said:

Can you name a few of those practices?

AMD requested reviewers to alter their testing methodology to skew the results in AMD's favor.

 

In the launch event they underclocked the Intel CPUs and ran the X99 systems in dual channel instead of quad channel.

 

AMD insisted on creating a GPU bottleneck to make CPU benchmarking look like an even playing field.

 

They also used different gameplay footage, with more skybox usage, etc., to favor the results.

 

And those are just the major things.


2 minutes ago, Alexokan said:

AMD requested reviewers to alter their testing methodology to skew the results in AMD's favor.

 

In the launch event they underclocked the Intel CPUs and ran the X99 systems in dual channel instead of quad channel.

 

AMD insisted on creating a GPU bottleneck to make CPU benchmarking look like an even playing field.

 

And those are just the major things.

And IPC is on par with Haswell, not Broadwell:

In Cinebench's Single Threaded test it is 5% slower than the 6900K (both at 4GHz), so it is on par with Haswell...
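For what it's worth, the "both at 4GHz" framing is the usual IPC proxy: with frequency equalized, the ratio of single-threaded scores reduces to per-clock throughput. A minimal sketch of that arithmetic, using made-up placeholder scores (not measured data), just to show how a "5% slower" figure falls out:

```python
# Hypothetical Cinebench single-thread scores with both CPUs locked
# at the same 4 GHz clock. The numbers are illustrative placeholders,
# NOT measured results; only the arithmetic matters here.
scores = {
    "Ryzen 1800X @ 4GHz": 152,             # assumed score
    "i7-6900K (Broadwell-E) @ 4GHz": 160,  # assumed score
}

ryzen = scores["Ryzen 1800X @ 4GHz"]
broadwell = scores["i7-6900K (Broadwell-E) @ 4GHz"]

# With clocks equal, the score deficit is a per-clock (IPC) deficit.
deficit_pct = (broadwell - ryzen) / broadwell * 100
print(f"Ryzen trails Broadwell-E by {deficit_pct:.1f}% per clock")  # 5.0%
```

The same normalization is why comparing at stock clocks says little about IPC on its own.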

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


39 minutes ago, Memories4K said:

Steve, from GamersNexus [Source] Over the phone/e-mails

Jay, from JayzTwoCents [Source] Over e-mails

So, Steve was told to include 1440p results because that was what AMD projected consumers to run. What exactly is wrong with that?

 

Jay was told to disable multicore enhancement, since that is quite literally an overclock. What exactly is wrong with that?

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Just now, PCGuy_5960 said:

And IPC is on par with Haswell, not Broadwell:

In Cinebench's Single Threaded test it is 5% slower than the 6900K (both at 4GHz), so it is on par with Haswell...

I would consider that more typical marketing bullshit, which nearly all launches include some of. (He was asking what unethical practices AMD engaged in.)


1 minute ago, Alexokan said:

I would consider that more typical marketing bullshit, which nearly all launches include some of. (He was asking what unethical practices AMD engaged in.)

Yes, but claiming IPC on par with Broadwell was a lie ;)



1 minute ago, PCGuy_5960 said:

And IPC is on par with Haswell, not Broadwell:

In Cinebench's Single Threaded test it is 5% slower than the 6900K (both at 4GHz), so it is on par with Haswell...

 

Must be why the 1700X loses to the 6900K, both at stock speeds, at AnandTech then.

 

1700X = 3.4GHz base, 3.8GHz single-core boost
6900K = 3.2GHz base, 3.7GHz on all cores, 4GHz single-core boost

https://ark.intel.com/products/94196/Intel-Core-i7-6900K-Processor-20M-Cache-up-to-3_70-GHz
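Those stock numbers show why stock comparisons conflate clocks with IPC: dividing a single-threaded score by the boost clock actually sustained gives a rough per-clock figure. A sketch under stated assumptions (the boost clocks are the stock single-core figures above; the scores are placeholders, not benchmark data):

```python
def score_per_ghz(score: float, boost_ghz: float) -> float:
    """Rough per-clock throughput: single-thread score / sustained boost clock."""
    return score / boost_ghz

# Placeholder single-thread scores; boost clocks per the spec sheets above.
r7_1700x = score_per_ghz(score=148, boost_ghz=3.8)  # assumed score
i7_6900k = score_per_ghz(score=160, boost_ghz=4.0)  # assumed score

# If the per-GHz figures come out close, a stock-speed loss is mostly
# explained by clocks, not IPC.
print(round(r7_1700x, 1), round(i7_6900k, 1))
```

With these placeholders the gap per GHz is much smaller than the raw-score gap, which is the point of clock-normalized comparisons.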

[AnandTech benchmark charts]

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


7 minutes ago, PCGuy_5960 said:

In Cinebench's Single Threaded test it is 5% slower than the 6900K (both at 4GHz), so it is on par with Haswell...

Broadwell, in the best case scenario, is 1% better than Haswell. Skylake/Kaby Lake IPC is roughly 5% better than Haswell.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


7 minutes ago, Alexokan said:

AMD requested reviewers to alter their testing methodology to skew the results in AMD's favor.

 

In the launch event they underclocked the Intel CPUs and ran the X99 systems in dual channel instead of quad channel.

 

AMD insisted on creating a GPU bottleneck to make CPU benchmarking look like an even playing field.

 

And those are just the major things.

First of all, all hardware manufacturers include a reviewer's guide. This is simply a guide for how to showcase the product the way the manufacturer wants. Intel and Nvidia do it too, and often go further.

 

They underclocked it to match Ryzen's clocks, IIRC. Dual channel was just to duplicate the Ryzen setup, but that is a fair point, since the Intel X99 platform runs quad channel.

 

AMD insisted on showing actual use cases. Or are you one of the few people pairing a $329.99-$499.99 processor with a 1366x768 screen?



2 minutes ago, Tomsen said:

First of all, all hardware manufacturers include a reviewer's guide. This is simply a guide for how to showcase the product the way the manufacturer wants. Intel and Nvidia do it too, and often go further.

 

They underclocked it to match Ryzen's clocks, IIRC. Dual channel was just to duplicate the Ryzen setup, but that is a fair point, since the Intel X99 platform runs quad channel.

 

AMD insisted on showing actual use cases. Or are you one of the few people pairing a $329.99-$499.99 processor with a 1366x768 screen?

You're obviously entitled to your opinion. But I think they went too far.

 

They should have just foregone the direct comparison if they had to jerry rig the testing to show the results they wanted, IMO. 

 

(I also added an additional point about the side-by-side gameplay footage and the skybox usage to improve frame rates.)


4 minutes ago, Tomsen said:

First of all, all hardware manufacturers includes a reviewers guide. This is a simply guide for reviewers how to showcase the product the ways the manufacturer wanted. Intel and Nvidia are also doing it, and often doing it more.

 

AMD insisted on showing actual user cases. Are are you one of the few people with a $329.99-$499.99 processor with 1366x768 resolution screen?

A reviewers guide is one thing, but asking a reviewer to DISABLE features on X99 (that benefit X99 and that would never be disabled by anyone) is another thing entirely. 

 

You have to remember one very important point. While someone who's going to buy a $500 CPU to game isn't likely to be on 1080p, any performance issues at 1080p will eventually occur at 4K once GPUs are more powerful. So, while a 1080 Ti might be the bottleneck at 4K today (and thus a 7700K ~ 1800X), in the future a hypothetical 1280 Ti could be ten times as powerful, and then the 1800X would be a significant bottleneck. That is, of course, assuming that single-threaded performance remains king for gaming. And while we can hope for games becoming more multithreaded, at the end of the day we don't know what the future will look like.

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


3 minutes ago, Valentyn said:

6900K = 3.2GHz base, 3.7GHz on all cores, 4GHz single-core boost

No... The 6900K at stock runs around 3.2-3.3GHz on all 8 cores and boosts one core to 3.7GHz... For 4GHz, you need the Turbo Boost Max 3.0 driver...

5 minutes ago, Valentyn said:

1700X = 3.4GHz base, 3.8GHz single-core boost

Remember, though... AMD's boosting technology will boost all 8 cores to a higher frequency, so it's probably not 3.4GHz, but closer to 3.6...



3 minutes ago, Tomsen said:

AMD insisted on showing actual use cases. Or are you one of the few people pairing a $329.99-$499.99 processor with a 1366x768 screen?

High refresh rate 1080p monitors exist, FYI...



5 minutes ago, Drak3 said:

Broadwell, in the best case scenario, is 1% better than Haswell. Skylake/Kaby Lake IPC is roughly 5% better than Haswell.

Really? Isn't Broadwell a 5% improvement over Haswell?



1 minute ago, djdwosk97 said:

A reviewers guide is one thing, but asking a reviewer to DISABLE features on X99 (that benefit X99 and that would never be disabled by anyone) is another thing entirely. 

 

You have to remember one very important point. While someone who's going to buy a $500 cpu to game isn't likely to be on 1080p, any performance issues at 1080p will eventually occur at 4k once GPUs are more powerful. So, while a 1080Ti might be the bottleneck at 4k today (and thus a 7700k ~ 1800x), in the future a 1280Ti could be ten times as powerful and then the 1800x would be a significant bottleneck. 

I.e., that's why people are comparing this landscape to that of the Bulldozer release.

 

The CPU earns passing grades now, but does not fare well when considering future improvements.


Just now, djdwosk97 said:

A reviewers guide is one thing, but asking a reviewer to DISABLE features on X99 (that benefit X99 and that would never be disabled by anyone) is another thing entirely. 

 

You have to remember one very important point. While someone who's going to buy a $500 cpu to game isn't likely to be on 1080p, any performance issues at 1080p will eventually occur at 4k once GPUs are more powerful. 

A feature that quite literally overclocks the CPU... That is what multicore enhancement does.

 

That is on the premise you will run into the same issue when scaling up to 4K, which I personally doubt, since games will be designed around DX12/Vulkan.



16 minutes ago, Tomsen said:

So, Steve was told to include 1440p results because that was what AMD projected consumers to run. What exactly is wrong with that?

 

Jay was told to disable multicore enhancement, since that is quite literally an overclock. What exactly is wrong with that?

1 minute ago, Tomsen said:

A feature that quite literally overclocks the CPU... That is what multicore enhancement does.

 

That is on the premise you will run into the same issue when scaling up to 4K, which I personally doubt, since games will be designed around DX12/Vulkan.

 

You obviously chose to see what you wanted in Steve's video.

 

Also, why is it cool with you for AMD to tell them to disable Multi-Core Enhancement on the Intel chip but leave XFR enabled on the AMD chip? You're right, what's wrong with that? You should disable the overclock on one and not the other, because that's the fairest way to do it. While you're at it, run the X99 (6900K) in a dual-channel memory configuration, which nobody with X99 does. Why? To keep it fair.


1 minute ago, Tomsen said:

A feature that quite literally overclocks the CPU... That is what multicore enhancement does.

And AMD's XFR doesn't?



1 minute ago, PCGuy_5960 said:

And AMD's XFR doesn't?

 

That doesn't count man.  xD

 

I should add: because it's worthless. But it is still an overclock, just like Intel's. They just couldn't get it to work as well.


3 minutes ago, Tomsen said:

A feature that quite literally overclocks the CPU... That is what multicore enhancement does.

 

That is on the premise you will run into the same issue when scaling up to 4K, which I personally doubt, since games will be designed around DX12/Vulkan.

*cough* XFR *cough*

And again, no one is going to disable that. 

 

Of course you'll run into the same issue. At some point, you'll end up with a powerful enough GPU that will cause the CPU to be the bottleneck. Whether or not a 7700k or 1800x ends up being better in the long run is a complete guess. Single threaded performance is still a critical factor as a lot of tasks simply cannot be parallelized -- even with DX12/Vulkan. So, you can bet on multi-core becoming more important than single-threaded performance, but that is just as much of a guess as the alternative.

 

It's a complete crapshoot as to which will be better in the long run (7700k vs. 1800x). So with that being the case, you're better off spending less money and getting more performance TODAY. Now, you can PERSONALLY choose to take the risk and go with whichever you PERSONALLY think is more likely, but that doesn't change the fact that a 7700k is a better cpu for gaming thereby making the 1800x a bad choice. 
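The "a lot of tasks simply cannot be parallelized" point is Amdahl's law: with a parallelizable fraction p spread over n cores, speedup is capped at 1 / ((1 - p) + p / n). A quick sketch (the p = 0.7 figure is an assumption for illustration, not a measured game profile):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: speedup when a fraction p of the work parallelizes over n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a fairly well-threaded game (assume p = 0.7) sees rapidly
# diminishing returns from extra cores; the serial 30% dominates.
# Speedup caps at 1 / 0.3 ≈ 3.33 no matter how many cores you add.
for cores in (4, 8, 16):
    print(cores, round(amdahl_speedup(0.7, cores), 2))
```

That cap is why more cores alone cannot guarantee the 1800X wins in future games unless the parallel fraction itself grows.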



Just now, done12many2 said:

 

That doesn't count man.  xD

Oh, sry. I didn't know that. That's to keep it fair, right?



2 minutes ago, PCGuy_5960 said:

Oh, sry. I didn't know that. That's to keep it fair, right?

 

Not only did the hype train completely derail, but now folks are grasping to defend pure bullshit.


Just now, done12many2 said:

Not only did the hype train completely derail, but now folks are grasping to defend pure bullshit.

Ikr! But be careful.... AMD Fanboys will call you an Intel Fanboy because of what you just said! :D


