AMD responds to 1080p gaming tests on Ryzen. Supports ECC RAM. Win 10 SMT bug

3DOSH
Just now, PCGuy_5960 said:

Really? Isn't Broadwell a 5% improvement over Haswell?

Not in the CPU department. In the iGPU, which isn't a factor on X99, Broadwell is a pretty damn good improvement. But that still doesn't have much effect on workloads that don't benefit from an iGPU, and those that do benefit more from a dGPU anyway.

 

1 minute ago, done12many2 said:

While you're at it, run the X99 (6900K) in a dual-channel memory configuration, which nobody with X99 does. Why? To keep it fair.

@PCGuy_5960 The idea was to run the X99 chip at stock settings as they came out. MCE isn't a stock setting, XFR is. Also, running the X99 chips in dual channel removes a variable for the fairest test possible (and a decent RAM kit in dual channel won't affect most of these benchmarks to an extent that would change the end results).

 

It might not represent real-world X99 scenarios, and we do need those benchmarks as well, but we also need benchmarks that demonstrate whether Ryzen is capable of being an inexpensive option alongside X99 when X99's specific features and specs aren't needed, just the cores.
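For scale, the dual- vs quad-channel gap being argued about is easy to put numbers on. A back-of-envelope sketch of theoretical peak bandwidth for DDR4-2133 (real workloads see far less than these peaks):

```python
# Theoretical peak DRAM bandwidth: channels * transfer rate (MT/s) * 8 bytes per transfer.
def peak_bandwidth_gb_s(channels: int, mt_per_s: int) -> float:
    """Peak memory bandwidth in GB/s (decimal GB)."""
    return channels * mt_per_s * 8 / 1000

dual = peak_bandwidth_gb_s(2, 2133)   # dual-channel DDR4-2133
quad = peak_bandwidth_gb_s(4, 2133)   # quad-channel DDR4-2133 (stock X99)
print(f"dual: {dual:.1f} GB/s, quad: {quad:.1f} GB/s")  # ~34.1 vs ~68.3 GB/s
```

Whether that 2x peak difference shows up in a given benchmark is exactly what's in dispute; these are ceilings, not measured results.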

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


9 minutes ago, PCGuy_5960 said:

High refresh rate 1080p monitors exist, FYI...

So, like tested here: at 1080p with a GTX 1080 at 144Hz, with V-Sync and FPS limits off.
 

 

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


3 minutes ago, Drak3 said:

MCE isn't a stock setting

It is enabled by default on most X99 motherboards. So why is it not a stock setting?

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


3 minutes ago, Valentyn said:

So, like tested here: at 1080p with a GTX 1080 at 144Hz, with V-Sync and FPS limits off.
 

 

 



8 minutes ago, Drak3 said:

Not in the CPU department. In the iGPU, which isn't a factor on X99, Broadwell is a pretty damn good improvement. But that still doesn't have much effect on workloads that don't benefit from an iGPU, and those that do benefit more from a dGPU anyway.

 

@PCGuy_5960 The idea was to run the X99 chip at stock settings as they came out. MCE isn't a stock setting, XFR is. Also, running the X99 chips in dual channel removes a variable for the fairest test possible (and a decent RAM kit in dual channel won't affect most of these benchmarks to an extent that would change the end results).

 

It might not represent real-world X99 scenarios, and we do need those benchmarks as well, but we also need benchmarks that demonstrate whether Ryzen is capable of being an inexpensive option alongside X99 when X99's specific features and specs aren't needed, just the cores.

So you don't see an issue with gimping a platform in order to make another platform look better....even though that gimped platform would never be gimped by ANYONE? And if quad channel memory wouldn't make a difference to the end results, then why bother disabling it? 

 

But you could draw those same conclusions by running the X99 CPUs as they were intended to be run: Ryzen performs within 10% in these tasks and 5% better in those tasks with the two platforms running at their full/normal potential. Is that performance improvement/deficit worth the cost savings/extra cost? You don't need to gimp a platform to draw any of those conclusions. The ONLY reason to gimp a platform is to make the other platform look better.

 

I'm all for including a test with the CPUs at the same clock in order to show the IPC, as that is something which will vary on a user-by-user basis (based on their overclocks/cooling). But disabling MCE or running dual-channel memory will never be a thing.
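The "is the delta worth the cost" question raised above is simple arithmetic once you pick numbers. A sketch using the launch prices and a hypothetical normalized multi-threaded score (the scores are placeholders for illustration, not measured benchmark results):

```python
# Launch prices (USD) and HYPOTHETICAL normalized multi-threaded scores --
# the scores are illustrative placeholders, not real benchmark data.
parts = {
    "Ryzen 7 1800X": {"price": 499, "score": 100},
    "Core i7-6900K": {"price": 1089, "score": 105},
}
for name, p in parts.items():
    value = p["score"] / p["price"] * 100  # points per $100 spent
    print(f"{name}: {value:.1f} points per $100")
```

With numbers anywhere near these, the value conclusion holds whether or not the Intel chip runs MCE and quad channel, which is the point being argued.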

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


11 minutes ago, Drak3 said:

Not in the CPU department. In the iGPU, which isn't a factor on X99, Broadwell is a pretty damn good improvement. But that still doesn't have much effect on workloads that don't benefit from an iGPU, and those that do benefit more from a dGPU anyway.

 

@PCGuy_5960 The idea was to run the X99 chip at stock settings as they came out. MCE isn't a stock setting, XFR is. Also, running the X99 chips in dual channel removes a variable for the fairest test possible (and a decent RAM kit in dual channel won't affect most of these benchmarks to an extent that would change the end results).

 

It might not represent real-world X99 scenarios, and we do need those benchmarks as well, but we also need benchmarks that demonstrate whether Ryzen is capable of being an inexpensive option alongside X99 when X99's specific features and specs aren't needed, just the cores.

 

There are some problems with your logic on fairness.

 

You claim that the X99 platform should be run in dual channel instead of quad, which happens to boost Cinebench nicely on X99, due to fairness, right?

 

You also say that Multi-Core Enhancement is not a built-in CPU feature, so it should be disabled. Yet XFR is okay because it is built in.

 

Last time I checked, quad channel was built into the 6900K's IMC, so I'm not sure why we're picking and choosing built-in features for one, but not the other?


16 minutes ago, Drak3 said:

Also, running the X99 chips in dual channel removes a variable for the fairest test possible

That is utter bullshit. You might as well say "when comparing the 1800X to the 7700K, you should disable 4 cores on the AMD chip, so that we remove variables and make the fairest test possible". It is NOT a fair test if you kneecap the competitor.

Does quad channel RAM make a difference? Then maybe AMD should have implemented that. Don't cripple your competitors because they have a feature you don't.

Does it not make a difference? Then why remove it from the Intel platform?

 

 

By the way, it appears that some motherboards not only overclock Ryzen chips when benchmarks are detected (but not when other programs are run) but also overvolt them.

But hey, let's just disable Intel features because clearly it would be "unfair" if we didn't. Don't you dare touch AMD's extremely misleading features though which were specifically designed to inflate benchmark scores without providing any extra performance in real world performance.

 

 

The lengths people go to in order to defend AMD's shady deeds is beyond me. I honestly would not be surprised if quite a few people on this forum were actual AMD shills who got paid to defend them.

 

 

Signed

-Someone who ordered a 1700X build today.

 

 

 

15 minutes ago, Valentyn said:

So, like tested here: at 1080p with a GTX 1080 at 144Hz, with V-Sync and FPS limits off.
-video-

I've seen you link that video in every god damn thread today. Stop. Just stop. That video shows results that contradict essentially every other benchmark out there. If 10 reviewers say one thing, and one reviewer says the opposite, then you would have to be mentally challenged to just go "well, let's ignore these 10 reviewers and just focus on the one who agrees with my predefined conclusion!"

Good to see that I was right though:

Quote

Ryzen will not be as good as people expect, but instead of going "oh that was a shame" people will cherry pick benchmarks like crazy to justify their expectations.
I expect a lot of people to ignore benchmarks from some sites such as Anandtech, and instead focus on outliers or niche benchmarks from other websites.

I hope I am wrong, but I probably am not.

 


3 minutes ago, PCGuy_5960 said:

The videos aren't using the Gigabyte boards; most of them use ASUS ones.
 

 

Already pointed out by Gamer Nexus and AMD on a phone call, and from Golem.de

 

 

ASUS in particular, and MSI to some extent. It explains why reviewers such as Joker, Crit, UFDiciple, and TechDeals had far better gaming performance.

Golem.de in Germany had this to say in regards to their MSI motherboard.

 

https://translate.google.co.uk/translate?hl=en&sl=de&u=https://www.golem.de/news/ryzen-7-1800x-im-test-amd-ist-endlich-zurueck-1703-125996-4.html&prev=search

 

The MSI board was delivered with BIOS version 113, until last Friday, when a new one appeared.

Version 117, which is still current, improves speed and stability. Where the older UEFI still produced sporadic bluescreens, the board now runs stably. Much more important, however, is the drastically higher performance in games and in real-world packing with 7-Zip. The release notes mention, among other things, a fixed problem with memory clocks and timings, as well as voltage.

Compared to the original BIOS, the new UEFI raises the frame rate across our gaming test course by between 4 and 26 percent, on average by a full 17 percent!

Gamers Nexus's phone call with AMD confirms ASUS had performance issues, and MSI as well, but the latter got a last-minute BIOS update to help remedy it, just as AMD said it should, and just as Golem.de observed.
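A note on that "+4 to +26 percent, average +17" spread: frame-rate uplifts are ratios, so the geometric mean is the safer way to summarize them than a plain average. A sketch with hypothetical per-game ratios (not Golem.de's actual data):

```python
import math

# HYPOTHETICAL new/old frame-rate ratios after a BIOS update -- illustrative only.
ratios = [1.04, 1.12, 1.19, 1.26]

arith = sum(ratios) / len(ratios) - 1             # arithmetic mean uplift
geo = math.prod(ratios) ** (1 / len(ratios)) - 1  # geometric mean uplift
print(f"arithmetic: {arith:.3f}, geometric: {geo:.3f}")
```

The two disagree slightly (the geometric mean is never higher), which matters when a review summarizes wildly different per-game gains as one number.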


 


1 minute ago, Valentyn said:

Already pointed out by Gamer Nexus and AMD on a phone call, and from Golem.de

Hardware Unboxed was using an AsRock board....



4 minutes ago, LAwLz said:

I've seen you link that video in every god damn thread today. Stop. Just stop. That video shows results that contradict essentially every other benchmark out there. If 10 reviewers say one thing, and one reviewer says the opposite, then you would have to be mentally challenged to just go "well, let's ignore these 10 reviewers and just focus on the one who agrees with my predefined conclusion!"

Good to see that I was right though:

 

 

It's the only one like it.  :D  It's going to get circulated a lot!!

 

Glad to hear you've got your chip ordered. I backed out of my 1800X pre-order, but you'd better believe that I'm very interested in how all of this turns out. Still one badass chip for the money.


1 minute ago, PCGuy_5960 said:

Hardware Unboxed was using an AsRock board....

Never heard of them before, got a link please?

 

From all the information out so far, it seems Gigabyte boards are getting much better gaming performance compared to the latter two mentioned. Haven't seen anything from ASRock, but they are under ASUS.


 


1 minute ago, Valentyn said:

From all the information out so far it seems Gigabyte boards are getting much better performance in gaming compared to the latter two mentioned. Haven't seen anything from Asrock, but they are under Asus.

Still, gaming performance suffers:

This is WITH a Gigabyte board ;)

1 minute ago, Valentyn said:

Never heard of them before, got a link please?

13 minutes ago, PCGuy_5960 said:

 

 



1 minute ago, Valentyn said:

Never heard of them before, got a link please?

 

From all the information out so far it seems Gigabyte boards are getting much better performance in gaming compared to the latter two mentioned. Haven't seen anything from Asrock, but they are under Asus.

Well I got a Gigabyte board ordered. Do you know which reviewers used that? It's the Gaming 5 to be precise.

I am very skeptical that future updates will all of a sudden make Ryzen much better. I bought it because I was satisfied with the performance it would give me for my particular workload. Future speculation is nice and all, but you should never, under any circumstances, buy a product based on future promises. Do you know what people said when Bulldozer was a huge pile of shit? They said it would get better as time went on. Well, spoiler: it didn't.

 

You should always buy the performance you need at the moment you need it, because nobody has any idea what will happen in the future. Absolutely no clue at all. People who say "it will get better with updates" might as well say "the jackpot will be lottery numbers 4, 39, 11, 16 and 22". Both statements hold as much weight.

 

Ryzen should be evaluated at the exact performance it gets today, not what someone with a crystal ball says it might get several months from now.


3 minutes ago, PCGuy_5960 said:

Still, gaming performance suffers:

This is WITH a Gigabyte board ;)

 

One Gigabyte board out of all the others; like many, he might not have kept the BIOS updated.

 

Got any published articles with ASRock? Most used the other boards; nothing from Biostar yet either.


 


The whole "updates will make Ryzen better" thing reminds me of when AMD claimed the same about Bulldozer. And then I recall Tom's Hardware's conclusion when they retested Bulldozer with the patches:

 

No amount of software changes are going to make up for poor hardware design.

 

EDIT: Just take that to mean that software cannot fix everything.


Just now, Valentyn said:

Got any published articles with ASRock? Most used the other boards; nothing from Biostar yet either.

http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/

Just now, Valentyn said:

One Gigabyte board out of all the others; like many, he might not have kept the BIOS updated.

I don't think that this is the case... Can you link a review with a Gigabyte board?

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


6 minutes ago, PCGuy_5960 said:

It is enabled by default on most X99 motherboards. So why is it not a stock setting?

It's an aftermarket setting that mainboard manufacturers added to their highest-end boards after X99 launched, and it has worked its way into most boards since. But when X99 launched, along with the first boards, MCE wasn't there. It's a function of the mainboard's UEFI, not the CPU. XFR is an actual feature of the CPU.

 

In the absolute interest of total fairness, there should be tests with MCE both disabled and enabled, to cover all bases. But if that's not possible, then test the scenario that covers the minimum that all processors carrying the i7-6900K name can do per actual spec. XFR is part of the 1700X's and 1800X's actual spec.

 

10 minutes ago, djdwosk97 said:

So you don't see an issue with gimping a platform in order to make another platform look better....even though that gimped platform would never be gimped by ANYONE? And if quad channel memory wouldn't make a difference to the end results, then why bother disabling it? 

I have a problem with one platform being painted as worthless because it doesn't sport all the bells and whistles of the other, especially when the target audience is one that doesn't need them (otherwise, they'd already be on X99 with a 5820K, 5930K, 6800K, or 6850K). That's the outcome of listing spec that X99 can do, but it doesn't make a fucking difference to the target audience of Ryzen in the real world.

 

4 minutes ago, djdwosk97 said:

But you could draw those same conclusions by running X99 CPUs as they were intended to be run. Ryzen performs within 10% in these tasks and 5% better in those tasks with the two platforms running at their full/normal potential. Is that performance improvement/deficit worth the cost savings/extra cost? You don't need to gimp a platform to draw any of those conclusions. 

For a purely scientific testing method, every variable that can be made the same must be made the same. It doesn't matter if the outcome changes or not. As it stands, only two variables cannot be made the same: the mainboard and CPU, due to the differences in chipsets and sockets.

 

We're here to observe these chips at their stock settings, what they will do regardless of whether they're on a launch-day board or one rich in aftermarket features that came out a year later. As it stands, Ryzen has a RAM bottleneck issue with a fix coming in a few weeks, so what we've got is not 100% indicative of what Ryzen will do once fixed, and we've got to settle for slightly less than accurate benchmarks for the time being.

 

But in case you, @done12many2, and @LAwLz missed it, or y'all have serious problems with reading comprehension:

 

14 minutes ago, Drak3 said:

we do need those benchmarks as well,

 

Why? Because X99 does have advantages that can benefit certain workloads.

But not every workload, where Ryzen can pick up the slack in the market, or where mainstream chips like the 7700K are superior for key tasks (hint: Ryzen octocores are set to fall between those segments where a void has developed).



3 minutes ago, LAwLz said:

Well I got a Gigabyte board ordered. Do you know which reviewers used that? It's the Gaming 5 to be precise.

I am very skeptical that future updates will all of a sudden make Ryzen much better. I bought it because I was satisfied with the performance it would give me for my particular workload. Future speculation is nice and all, but you should never under any circumstances buy a product based on future promises. Do you know what people said when Bulldozer was a huge pile of shit? They said it would get better as time went on. Well, spoiler, it didn't.

 

You should always buy the performance you need at the moment you need it, because nobody has any idea what will happen in the future. Absolutely no clue at all. People who say "it will get better with updates" might as well say "the jackpot will be lottery numbers 4, 39, 11, 16 and 22". Both statements hold as much weight.

 

Ryzen should be evaluated at the exact performance it gets today, not what someone with a crystal ball says it might get several months from now.

 

Joker Productions had the Gigabyte Gaming 5, just as you ordered. He's put out benchmarks with the 1700 at 3.9GHz vs the 7700K at 5GHz, raw gaming at 1080p and 720p.

Also 1800X vs 6800K in gaming and production workloads.

 

I completely agree with that; it's why I've been on a 5820K since it launched. I would greatly benefit from an 8-core processor, but I'm not jumping on Ryzen. I'll most likely look for a nice second-hand 5960X or 6900K to just plop into my motherboard.

 

You should always buy the performance you need at the moment you need it, because nobody has any idea what will happen in the future. Absolutely no clue at all.


 


 


1 minute ago, Drak3 said:

It's an aftermarket setting that mainboard manufacturers added to their highest-end boards after X99 launched, and it has worked its way into most boards since. But when X99 launched, along with the first boards, MCE wasn't there. It's a function of the mainboard's UEFI, not the CPU. XFR is an actual feature of the CPU.

If many motherboards support it, I fail to understand why you should disable it. It is not a feature of the CPU, but it is a feature of the platform...



2 minutes ago, PCGuy_5960 said:

http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/

I don't think that this is the case... Can you link a review with a Gigabyte board?

Thanks, I hadn't looked at TechSpot before.

Sadly, the only reviewers with Gigabyte boards seem to be YouTube reviewers.

 

Although it seems Wendell from Level 1 Techs got both Gigabyte and ASRock boards, they're mostly just talking about them and haven't done raw gaming numbers for us to see just yet.

Worth keeping an eye on.

 

This entire launch is all over the place though. Hope it clears up soon so we know the real results all across the board.

 

 


 


Just now, PCGuy_5960 said:

If many motherboards support it, I fail to understand why you should disable it. It is not a feature of the CPU, but it is a feature of the platform...

It's not an absolute feature of the platform. It's a later addition that not everyone has, and again, it's not part of the platform or CPU spec. Hence why I say we should test with it both enabled and disabled.



5 minutes ago, Drak3 said:

For a purely scientific testing method, every variable that can be made the same must be made the same. It doesn't matter if the outcome changes or not. As it stands, only two variables cannot be made the same: the mainboard and CPU, due to the differences in chipsets and sockets.

OK, so why not run the Ryzen system with 4 sticks of RAM? That way the RAM has not changed: the Intel system gets quad channel and the AMD system gets dual channel.

 

I am surprised you haven't been banned for posting porn, because your posts contains an obscene amount of AMD dick sucking.

 

 

4 minutes ago, Valentyn said:

Joker Productions had the Gigabyte Gaming 5, just as you ordered. He's put out benchmarks with the 1700 at 3.9GHz vs the 7700K at 5GHz, raw gaming at 1080p and 720p.

I am starting to wonder if you have even seen any benchmark other than what Joker posted, because that seems to be the only one you ever post or bring up.

Like I have told you over and over again, Joker's results are outliers and should therefore be ignored. If 10 people get one result, and one person gets a completely different one, then you ignore the abnormality. Do you have any other reviewers with Gigabyte boards who posted similar results?


2 minutes ago, Valentyn said:

Joker Productions had the Gigabyte Gaming 5, just as you ordered. He's put out benchmarks with the 1700 at 3.9GHz vs the 7700K at 5GHz, raw gaming at 1080p and 720p.

Yes, but the benchmarks that aren't wrong still show that the 7700K beats the 1700 in games...

BTW, Intel does still win in his 1800X review (in games):

 



Just now, Drak3 said:

I have a problem with one platform being painted as worthless because it doesn't sport all the bells and whistles of the other, especially when the target audience is one that doesn't need them (otherwise, they'd already be on X99 with a 5820K, 5930K, 6800K, or 6850K). That's the outcome of listing spec that X99 can do, but it doesn't make a fucking difference to the target audience of Ryzen in the real world.

 

For a purely scientific testing method, every variable that can be made the same must be made the same. It doesn't matter if the outcome changes or not. As it stands, only two variables cannot be made the same: the mainboard and CPU, due to the differences in chipsets and sockets.

 

We're here to observe these chips at their stock settings, what they will do regardless of whether they're on a launch-day board or one rich in aftermarket features that came out a year later. As it stands, Ryzen has a RAM bottleneck issue with a fix coming in a few weeks, so what we've got is not 100% indicative of what Ryzen will do once fixed, and we've got to settle for slightly less than accurate benchmarks for the time being.

 

But in case you, @done12many2, and @LAwLz missed it, or y'all have serious problems with reading comprehension:

 

 

Why? Because X99 does have advantages that can benefit certain workloads.

But not every workload, where Ryzen can pick up the slack in the market, or where mainstream chips like the 7700K are superior for key tasks (hint: Ryzen octocores are set to fall between those segments where a void has developed).

It looks like you're the one who has reading comprehension problems. Your argument is that things like MCE/quad-channel memory/etc. don't matter for everyone... but those people would also be looking at benchmarks where those elements don't play a role, so there is no reason to disable them.

 

So if I wanted to compare a Porsche Cayman S to a Mustang GT, the only FAIR way to do that would be to add ~1,000 lbs and ~100 HP to the Cayman S? Because that's what you're saying... Platforms should be compared as they are expected to be run.



2 minutes ago, LAwLz said:

Like I have told you over and over again, Joker's results are outliers and should therefore be ignored. If 10 people get one result, and one person gets a completely different one, then you ignore the abnormality. Do you have any other reviewers with Gigabyte boards who posted similar results?

The first time, his benchmarks were wrong... The two newer videos have correct benchmarks...

However, I trust Gamers Nexus more than Joker, and from what AMD said, Gamers Nexus' results were accurate.


