AMD Ryzen R7 1800X performance review - TechPowerUp

6 minutes ago, PCGuy_5960 said:

In their tests, the 1800X is around 20-25% slower on average than the 7700K.... 

Here is a written version of this review:

http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/

That's one of the huge issues with Ryzen, and TPU highlighted it in their conclusion: benchmarks vary from reviewer to reviewer because it's not easy to set the Ryzen platform up properly for testing. They specifically mention it, and that's probably why their review of the 1800X came out so late; the review you mention probably has some settings that don't work optimally on Ryzen. The architecture is also brand-new, and it was released by AMD, which doesn't have much money compared to Intel, so it still needs tweaking, sadly... :/

 

[Image: screenshot of TechPowerUp's review conclusion, with the relevant parts highlighted]

This, IMO, is the reason benchmarks differ from reviewer to reviewer (the highlighted parts).

 


Can some of it be improved by updates?


Just now, Morgan MLGman said:

This, IMO, is the reason benchmarks differ from reviewer to reviewer (the highlighted parts).

Definitely. But this is also something to consider... If you want to get Ryzen, do you want to waste hours upon hours trying to get the memory to its maximum frequency so that you don't run into performance issues? If you leave it at stock, performance will be considerably worse...


Just now, PCGuy_5960 said:

Definitely. But this is also something to consider... If you want to get Ryzen, do you want to waste hours upon hours trying to get the memory to its maximum frequency so that you don't run into performance issues? If you leave it at stock, performance will be considerably worse...

That's a non-issue for us, but what about an average consumer? That's what pains me the most :/


Just now, Zackbare said:

Can some of it be improved by updates?

Not really... The CCX issue is an architectural issue... It is impossible to fix unless games are coded to use only one CCX (and are thus limited to 8 threads).
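
In practice, "using only one CCX" just means keeping the game's threads on one CCX's cores, which an OS-level affinity mask can also do. A rough Linux sketch, assuming purely for illustration that logical CPUs 0-7 map to CCX0 (the real mapping depends on the BIOS/kernel, so check `lscpu -e` first):

```python
# Minimal sketch of the "keep the game on one CCX" idea, done from the OS side.
# Linux-only, and it ASSUMES logical CPUs 0-7 all belong to CCX0.
import os
import sys

CCX0_CPUS = set(range(8))  # assumption: first 8 logical CPUs = one CCX (with SMT on)

def pin_to_one_ccx(pid: int) -> None:
    """Restrict an already-running process (e.g. a game) to CCX0's CPUs."""
    os.sched_setaffinity(pid, CCX0_CPUS)
    print(f"PID {pid} now limited to CPUs {sorted(os.sched_getaffinity(pid))}")

if __name__ == "__main__":
    pin_to_one_ccx(int(sys.argv[1]))  # usage: python pin_ccx.py <game_pid>
```

On Windows the rough equivalent is Task Manager's CPU affinity setting, or launching with `start /affinity 0xFF` to limit a process to the first eight logical CPUs.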


1 minute ago, PCGuy_5960 said:

Not really... The CCX issue is an architectural issue... It is impossible to fix unless games are coded to use only one CCX (and are thus limited to 8 threads).

Which Intel will design against, beating AMD into the ground again?


1 minute ago, PCGuy_5960 said:

Not really... The CCX issue is an architectural issue... It is impossible to fix unless games are coded to use only one CCX (and are thus limited to 8 threads).

But since most games only use 4 threads, removing the latency could probably yield a little bit of improvement, while the rig can still do things like streaming in the background, benefiting from the extra cores.

 

I mean, it's probably not gonna be worth the trouble for 99% of devs, but from a technical standpoint I believe it should help. It's not really feasible, but it would help.


Just now, Morgan MLGman said:

That's a non-issue for us, but what about an average consumer? That's what pains me the most :/

Exactly. AMD has to iron out this issue ASAP, because for the average consumer it will be a HUGE deal-breaker...

I mean, many people are afraid to build their own PC; imagine if you told them that they have to overclock the memory to get decent performance...


Just now, Misanthrope said:

I mean, it's probably not gonna be worth the trouble for 99% of devs, but from a technical standpoint I believe it should help. It's not really feasible, but it would help.

You just explained why developers won't do it.... :(


2 minutes ago, Zackbare said:

Which Intel will design against, beating AMD into the ground again?

What? xD


5 minutes ago, Zackbare said:

Can some of it be improved by updates?

 

4 minutes ago, PCGuy_5960 said:

Not really... The CCX issue is an architectural issue... It is impossible to fix unless games are coded to use only one CCX (and are thus limited to 8 threads).

 

1 minute ago, Misanthrope said:

But since most games only use 4 threads, removing the latency could probably yield a little bit of improvement, while the rig can still do things like streaming in the background, benefiting from the extra cores.

 

I mean, it's probably not gonna be worth the trouble for 99% of devs, but from a technical standpoint I believe it should help. It's not really feasible, but it would help.

IMO the CCX issue might be fixed, in a way. As we know, a RAM speed fix will come out in around a month or so; if overclocking the RAM from 2133MHz to 3200MHz raises Infinity Fabric's clock (and with it its bandwidth), then it wouldn't be as big of a bottleneck between the CCXs and performance should go up, in theory, right? Patches should also fix the weird issue where turning SMT off improves performance in games, and definitely the HPET issue that lowers performance when it's turned on.
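
For a rough sense of the numbers: on first-gen Ryzen the Infinity Fabric clock is widely reported to run 1:1 with the memory clock (half the DDR4 transfer rate), so cross-CCX bandwidth should scale roughly in step with the RAM you fit. A back-of-the-envelope sketch under that assumption, with illustrative numbers rather than measurements:

```python
# Back-of-the-envelope sketch, assuming the first-gen Ryzen Infinity Fabric
# clock runs 1:1 with the memory clock, i.e. half the DDR4 transfer rate.
def fabric_clock_mhz(ddr4_rating: int) -> float:
    """DDR4-<rating> -> memory clock (MHz), taken here as the fabric clock too."""
    return ddr4_rating / 2

baseline = fabric_clock_mhz(2133)
for kit in (2133, 2666, 2933, 3200):
    clk = fabric_clock_mhz(kit)
    print(f"DDR4-{kit}: fabric ~{clk:.0f} MHz ({clk / baseline:.2f}x the 2133 baseline)")
# DDR4-3200 comes out to roughly 1.5x the cross-CCX bandwidth of DDR4-2133.
```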

 

There are too many of those issues, sadly. I wonder when exactly, and IF, they're gonna get fixed... :/


1 minute ago, PCGuy_5960 said:

What? xD

I mean, Intel has around 90% of the CPU market share; if they develop their CPUs in a way that goes against AMD's architecture, it'll push almost every dev to make their stuff support Intel, indirectly moving them away from AMD. (MAYBE)


I'd happily take that dip in gaming performance for the vastly superior productivity performance of Ryzen 7 CPUs. Also, a lot of those gaming benchmarks are within the margin of error, but it would be fantastic if reviewers stopped comparing them to the 7700K.


Just now, Zackbare said:

I mean, Intel has around 90% of the CPU market share; if they develop their CPUs in a way that goes against AMD's architecture, it'll push almost every dev to make their stuff support Intel, indirectly moving them away from AMD. (MAYBE)

Intel's architecture is superior right now because they don't use CCXs (they use a "simple" 8-core CPU).

They just have to improve IPC to widen their advantage over AMD...

BTW, Intel, while you are at it, lower your prices ;)


The more Ryzen info and benchmarks come out, the closer I am to adding an i5 Kaby Lake to my cart.

Let's see how the R5 and even the R3 series perform. That will be the deciding factor.


2 minutes ago, ashypanda said:

I'd happily take that dip in gaming performance for the vastly superior productivity performance of Ryzen 7 CPUs. Also, a lot of those gaming benchmarks are within the margin of error, but it would be fantastic if reviewers stopped comparing them to the 7700K.

It would be great if reviewers stopped comparing it to a similarly priced gaming chip during gaming benchmarks... why? Just to deceive people who might not know it's just not a good gaming chip?

 

That would be deceitful and intellectually dishonest.


3 minutes ago, ashypanda said:

I'd happily take that dip in gaming performance for the vastly superior productivity performance of Ryzen 7 CPUs. Also, a lot of those gaming benchmarks are within the margin of error, but it would be fantastic if reviewers stopped comparing them to the 7700K.

That's the whole dilemma: should it be compared with the 7700K or not?


3 minutes ago, Morgan MLGman said:

then it wouldn't be as big of a bottleneck between the CCXs and performance should go up, in theory, right?

The only problem is that this will still not be enough... The cross-CCX latency is 2x higher than Intel's core-to-core latency and 3x higher than AMD's own latency within the same CCX... It will improve performance, but don't expect the 1800X to beat the 6900K and the 7700K in games.


1 minute ago, Misanthrope said:

It would be great if reviewers stopped comparing it to a similarly priced gaming chip during gaming benchmarks... why? Just to deceive people who might not know it's just not a good gaming chip?

 

That would be deceitful and intellectually dishonest.

No it wouldn't, as AMD have clearly stated ever since the first Ryzen 7 leaks that it's a HEDT processor meant to take on Broadwell-E, not to compete with Skylake/Kaby Lake. It may as well be a workstation lineup, as it supports ECC memory, which Broadwell-E and X99 don't even support; other than Ryzen, you have to jump to Xeon for that.

 

 

5 minutes ago, Zackbare said:

That's the whole dilemma: should it be compared with the 7700K or not?

It shouldn't, unless you're going to compare it with the 6800K and 6900K at the same time, to show that the 7700K beats those in gaming as well, not to show that AMD has just released another Bulldozer.


17 minutes ago, PCGuy_5960 said:

Exactly. AMD has to iron out this issue ASAP, because for the average consumer it will be a HUGE deal-breaker...

I mean, many people are afraid to build their own PC; imagine if you told them that they have to overclock the memory to get decent performance...

How many "average" consumers use ram with speeds greater than 2400 MHz?Keep in mind that H110 and A320 support only 2133 MHz.


33 minutes ago, N3v3r3nding_N3wb said:

The Samsung/GloFo 14nm LPP+ process doesn't lend itself well to 4 GHz+ overclocks, at least not in its current form.

Definitely. I recall GloFo said they are planning 7nm in 2018 or 2019, so I'm curious as to what that could mean for AMD.


Just now, ashypanda said:

No it wouldn't, as AMD have clearly stated ever since the first Ryzen 7 leaks that it's a HEDT processor meant to take on Broadwell-E, not to compete with Skylake/Kaby Lake. It may as well be a workstation lineup, as it supports ECC memory, which Broadwell-E and X99 don't even support; other than Ryzen, you have to jump to Xeon for that.

A reviewer has a moral duty to inform the consumer of all of this, since someone looking at the review probably doesn't know it. It should also be shown in the tests, so that the potential buyer knows "this is not the best idea for a pure gamer", with numbers to back it up.

 

So actually I completely disagree: it would be intellectually dishonest NOT to include it, because the consumer is looking to inform himself and you would not be providing the full picture, all under the excuse of "oh well, geeks like me should already know this." Well, no: most people buying these products aren't anywhere near as invested as you are and won't know all of that in advance.


7 minutes ago, Zackbare said:

That's the whole dilemma: should it be compared with the 7700K or not?

It should be compared to X99 blah blah blah.

All X99 CPUs beat Ryzen in games, so whether it should be compared to the 7700K or not doesn't matter. Ryzen is an amazing CPU for productivity and a decent CPU for gaming. End of story :)


Just now, MyName13 said:

How many "average" consumers use ram with speeds greater than 2400 MHz?Keep in mind that H110 and A320 support only 2133 MHz.

But Ryzen needs high-speed memory to be good at gaming...


6 minutes ago, ashypanda said:

No it wouldn't, as AMD have clearly stated ever since the first Ryzen 7 leaks that it's a HEDT processor meant to take on Broadwell-E, not to compete with Skylake/Kaby Lake. It may as well be a workstation lineup, as it supports ECC memory, which Broadwell-E and X99 don't even support; other than Ryzen, you have to jump to Xeon for that.

It shouldn't, unless you're going to compare it with the 6800K and 6900K at the same time, to show that the 7700K beats those in gaming as well, not to show that AMD has just released another Bulldozer.

Both the 6800K and the 6900K beat Ryzen in games....

