
Can AMD RIP the Core i9? - Ryzen Threadripper

AMD's finally back in the HEDT space with Threadripper - But can they go toe to toe with Team Blue?

 

There is a table showing the Threadripper 1900X as having a 4-core boost clock of 3.7 GHz despite having a base clock of 3.8 GHz. This is an error; the boost clock should be 4.0 GHz. Its price should also be $549. Further details are unknown at this time. Sorry about that.

 

 

Buy the Threadripper 1950X:
Amazon: http://geni.us/WuK1aQ
Newegg: http://geni.us/NYAPoe8

 

Buy the Threadripper 1920X:
Amazon: http://geni.us/LIO7Va
Newegg: http://geni.us/tN78M

 

Buy the ASUS ROG Zenith Extreme:
Amazon: http://geni.us/LOSaF
Newegg: http://geni.us/rb4bo6

Emily @ LINUS MEDIA GROUP                                  

congratulations on breaking absolutely zero stereotypes - @cs_deathmatch


Apparently it can :D.

 

But not in CS:GO xp.


TBH I'm not impressed. Yes, Threadripper is faster than the 7900X in most applications, and yes, I'm still going to recommend it. But the gains it had in rendering applications were not the massive, crushing defeat they should have been, considering it has double the cores. Once again, AMD's poorer IPC compared to Intel is showing.


What about the EPYC CPUs? They have 32 cores and 64 threads. I understand that they are server chips, but they would outperform Intel's Core i9 lineup when it comes to multi-core workloads.


Now, what would be really amazing for team blue is if they developed a processor that processes data faster while staying energy efficient.


So, Hardware Unboxed, with the same CPUs tested, shows the 1950X using 20-30 W less power than the 7900X, and Gamers Nexus shows roughly the same difference, yet LTT shows the 1950X/1920X using more power? Can someone clarify this?

The Subwoofer 

Ryzen 7 1700  /// Noctua NH-L9X65 /// Noctua NF-P14s Redux 1200PWM

ASRock Fatal1ty X370 Gaming-ITX/ac /// 16GB DDR4 G.Skill TridentZ 3066Mhz

Zotac GTX1080 Mini 

EVGA Supernova G3 650W 

Samsung 960EVO 250GB + WD Blue 2TB


Are you guys serious? You tested CPUs at 4K, something that is almost 100% GPU-bottlenecked, where you can get the same performance on low-end Nehalem from 2009. Give me a break.

Main Gaming PC - i9 10850k @ 5GHz - EVGA XC Ultra 2080ti with Heatkiller 4 - Asrock Z490 Taichi - Corsair H115i - 32GB GSkill Ripjaws V 3600 CL16 OC'd to 3733 - HX850i - Samsung NVME 256GB SSD - Samsung 3.2TB PCIe 8x Enterprise NVMe - Toshiba 3TB 7200RPM HD - Lian Li Air

 

Proxmox Server - i7 8700k @ 4.5Ghz - 32GB EVGA 3000 CL15 OC'd to 3200 - Asus Strix Z370-E Gaming - Oracle F80 800GB Enterprise SSD, LSI SAS running 3 4TB and 2 6TB (Both Raid Z0), Samsung 840Pro 120GB - Phanteks Enthoo Pro

 

Super Server - i9 7980Xe @ 4.5GHz - 64GB 3200MHz Cl16 - Asrock X299 Professional - Nvidia Telsa K20 -Sandisk 512GB Enterprise SATA SSD, 128GB Seagate SATA SSD, 1.5TB WD Green (Over 9 years of power on time) - Phanteks Enthoo Pro 2

 

Laptop - 2019 Macbook Pro 16" - i7 - 16GB - 512GB - 5500M 8GB - Thermal Pads and Graphite Tape modded

 

Smart Phones - iPhone X - 64GB, AT&T, iOS 13.3 iPhone 6 : 16gb, AT&T, iOS 12 iPhone 4 : 16gb, AT&T Go Phone, iOS 7.1.1 Jailbroken. iPhone 3G : 8gb, AT&T Go Phone, iOS 4.2.1 Jailbroken.

 


16 minutes ago, Hunter259 said:

Are you guys serious? You tested CPUs at 4K, something that is almost 100% GPU-bottlenecked, where you can get the same performance on low-end Nehalem from 2009. Give me a break.

 

Agreed. RIP methodology. Slap another 1080 Ti in there for SLI at least, so you'd be less bottlenecked at 4K, and as a bonus you'd be able to see that PCIe lanes actually do matter and x16/x16 is better than x8/x8.

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


1 minute ago, Lathlaer said:

 

Agreed. RIP methodology. Slap another 1080 Ti in there for SLI at least, so you'd be less bottlenecked at 4K, and as a bonus you'd be able to see that PCIe lanes actually do matter and x16/x16 is better than x8/x8.

It doesn't. 

 

http://www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus
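For scale, the raw bandwidth gap being argued about is easy to put numbers on. This is only a back-of-envelope sketch of theoretical PCIe 3.0 per-direction bandwidth, not a claim about real-world game impact:

```python
# Theoretical PCIe 3.0 payload bandwidth per direction.
# Each lane runs at 8 GT/s with 128b/130b line encoding,
# giving roughly 0.985 GB/s of usable bandwidth per lane.
def pcie3_bandwidth_gbps(lanes):
    raw_transfers_per_s = 8e9        # 8 GT/s per lane
    encoding_efficiency = 128 / 130  # 128b/130b overhead
    bits_per_byte = 8
    return lanes * raw_transfers_per_s * encoding_efficiency / bits_per_byte / 1e9

print(pcie3_bandwidth_gbps(8))   # ~7.9 GB/s for x8
print(pcie3_bandwidth_gbps(16))  # ~15.8 GB/s for x16
```

Whether any game actually saturates even the x8 figure during normal rendering is exactly what the benchmarks linked in this thread are disputing.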



1. It literally says so in the article: "we are not sure how this scales with SLI."

 

2. Watch this:

 

 

 

and this:

 

 

Of course, I am not the maker of those videos, but I have seen them and I have talked to the author. I am willing to learn, so if you could point out errors in the methodology there, I would be grateful.

 

 



Just now, Lathlaer said:

-snip-

I'll trust someone with a much larger track record than some Russian YouTuber.

 

SLI : https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/

 

Again, it doesn't matter. They had to go to 4K Surround AND run the lowest settings possible to produce any serviceable difference.



1. Polish, not Russian. One of the people behind Extreme Hardware is a guy whose nick is Xtreme Addict, and if you knew a bit about the overclocking community, you'd know that he is a top-tier overclocker (or used to be; I think he is retired now or something). That's as far as credentials go. Also, they buy their own hardware, so there's no way anything is skewed towards any company.

 

2. That second link doesn't show any game performance.

 

I will even quote from the article:

 

"Whether you will see lower performance with x8 versus x16 is going to highly depend on the application. Some may see a difference, others won't."

 

So I am waiting for those SLI-in-games performance results that refute what I posted.

 

If you could show actual game performance in more than one game, like I did, that shows different results, I'd be grateful.



2 minutes ago, Lathlaer said:

-snip-

Uh. There were two games tested with single and dual GPUs...



46 minutes ago, Lathlaer said:

 

Agreed. RIP methodology. Slap another 1080 Ti in there for SLI at least, so you'd be less bottlenecked at 4K, and as a bonus you'd be able to see that PCIe lanes actually do matter and x16/x16 is better than x8/x8.

 

44 minutes ago, Hunter259 said:

GN has a recent video showing x16/x16 makes fuck-all difference. Can't link from this PC.


1 hour ago, ColonelThunder said:

So, Hardware Unboxed, with the same CPUs tested, shows the 1950X using 20-30 W less power than the 7900X, and Gamers Nexus shows roughly the same difference, yet LTT shows the 1950X/1920X using more power? Can someone clarify this?

Their reviews are 25-30 minutes long and have written articles to accompany them. Now compare that to the 8-minute LTT flash in the pan that contains 4K testing and has less depth than the kiddie pool. Who do you think is right?


23 minutes ago, tom_w141 said:

Their reviews are 25-30 minutes long and have written articles to accompany them. Now compare that to the 8-minute LTT flash in the pan that contains 4K testing and has less depth than the kiddie pool. Who do you think is right?

Well, the answer is already obvious. I'm asking because I'm curious what could cause results 20 W or more apart. It can't be the CPU alone, can it?



Can someone explain why the value chart is curved for Intel but straight for AMD?

In the vid at 6:29:

 

https://youtu.be/9voQqU73-Mg?t=6m29s

 

Edit: Or just tell me how to read that graph in general? Both lines are Intel, with one curving and one flat?


2 hours ago, ColonelThunder said:

So, Hardware Unboxed, with the same CPUs tested, shows the 1950X using 20-30 W less power than the 7900X, and Gamers Nexus shows roughly the same difference, yet LTT shows the 1950X/1920X using more power? Can someone clarify this?

Gamers Nexus recorded the power draw at the 12 V line to the CPU, while ours was measured at the wall. That means they have a reading showing the CPU's power draw alone, while ours shows the whole system's. With otherwise identical hardware, the difference in power draw between idle and a CPU-only synthetic load should still be accurate.
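To illustrate the wall-measurement approach, here is a minimal sketch with made-up wattages and an assumed PSU efficiency, showing how a CPU-only figure can be estimated from two wall readings:

```python
# Sketch: estimating CPU power draw from at-the-wall measurements.
# Wall readings include the whole system plus PSU conversion losses,
# so we take the idle-to-load delta and scale by PSU efficiency.
def cpu_power_estimate(wall_idle_w, wall_load_w, psu_efficiency=0.90):
    """Approximate CPU package power from two wall readings (watts)."""
    return (wall_load_w - wall_idle_w) * psu_efficiency

# Hypothetical numbers: 80 W at idle, 260 W under a CPU-only synthetic load
print(cpu_power_estimate(80, 260))  # ~162 W attributable to the CPU
```

A clamp meter on the 12 V EPS cables, as Gamers Nexus uses, skips the efficiency guesswork and isolates the CPU directly.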

1 hour ago, Hunter259 said:

Are you guys serious? You tested CPUs at 4K, something that is almost 100% GPU-bottlenecked, where you can get the same performance on low-end Nehalem from 2009. Give me a break.

RotTR and DXMD testing these days is there to compare DX11 and DX12 performance deltas, which is something that's tough to convey, but pay attention to the 97th percentiles, because they tell the bigger story, even at 4K.
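For anyone unfamiliar with the metric, a 97th-percentile figure comes straight from the recorded frame times. A minimal sketch using the nearest-rank method and hypothetical frame times:

```python
# Sketch: 97th-percentile FPS from per-frame render times in milliseconds.
# 97% of frames rendered at least this fast; it captures stutter that
# an average-FPS number hides.
def percentile_fps(frame_times_ms, pct=97):
    times = sorted(frame_times_ms)                 # worst (longest) frames last
    rank = int(round(pct / 100 * len(times))) - 1  # nearest-rank index
    rank = min(len(times) - 1, max(0, rank))
    return 1000.0 / times[rank]

# 97 smooth frames at ~60 FPS plus 3 spikes at ~30 FPS
frames = [16.7] * 97 + [33.3] * 3
print(percentile_fps(frames))  # ~59.9 FPS
```

With even a handful more spikes, the 97th-percentile number drops toward 30 FPS while the average barely moves, which is why reviewers lean on it.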

 

Normally, we would also have included For Honor and Ghost Recon: Wildlands at 1080p and 4K, but issues with our Uplay accounts stymied us. To compensate, we ran more productivity benches.

 

In scenarios like this, we SHOULD have gone with 1080p, but we'd already completed 4K testing at that time. Time constraints dictated that we couldn't go back and redo it, so here we are.

8 minutes ago, Alpha297 said:

Can someone explain why the value chart is curved for Intel but straight for AMD?

In the vid at 6:29:

 

https://youtu.be/9voQqU73-Mg?t=6m29s

 

Edit:  Or just tell me how to read that graph in general?  Both lines are Intel with one curving and one flat?  

The curved line is the 1920X vs. the 7900X, while the flat line is the 1950X vs. the 7900X. Basically, since they're the same price, the value proposition never changes, making the 1950X always a better value overall than the 7900X in our testing. As for gaming, it's basically never a better value in our testing, as mentioned before.



So what were your actual RAM speed and timings? Did you use Samsung B-die, assuming TR loves it as much as Ryzen? Did you test using UMA or NUMA memory settings?

I get that y'all maybe just want to put out a quick, under-10-minute video on everything, but for a product like this I think you should strive for more info and much deeper dives.

Did you go into Game Mode for your gaming tests? Gamers Nexus showed a decent gap using it at times, especially in 1% and 0.1% frame times, since Windows seems to be jumping around all of the threads not knowing what to do. Not that it would matter much at 4K.

I edit my posts a lot, Twitter is @LordStreetguru just don't ask PC questions there mostly...
 



4 minutes ago, GabenJr said:

The curved line is 1920X vs 7900X, while the flat line is the 1950X vs 7900X. Basically, since they're the same price, the value proposition never changes, making the 1950X always a better value overall than the 7900X in our testing. As for gaming, it's basically never a better value from our testing, but as mentioned before.

I appreciate the reply, but I'm still totally confused by the Overall Value graph and the Gaming Value graph a little later in the video. I don't see how the "value" is being calculated in these charts. The gaming chart even says the Threadripper 1920X is technically a better value at a total system cost of $100 USD, but considering it costs more than that on its own, I don't see how a value at that amount can be obtained.

 

Did I miss the specific methodology for these value calculations, or are these summations based on the previous charts, mathemagically compiled into one line for each product?


7 minutes ago, Alpha297 said:

I appreciate the reply, but I'm still totally confused by the Overall Value graph and the Gaming Value graph a little later in the video. I don't see how the "value" is being calculated in these charts. The gaming chart even says the Threadripper 1920X is technically a better value at a total system cost of $100 USD, but considering it costs more than that on its own, I don't see how a value at that amount can be obtained.

Did I miss the specific methodology for these value calculations, or are these summations based on the previous charts, mathemagically compiled into one line for each product?

It's calculated based on the weighted results across all testing (for gaming, specifically average FPS plus 3DMark and Superposition). It's based on the methods @Enderman and @patthehat pointed out in the Core i9 correction thread. The "technically" comment is just calling out the oddity there at <$100. :P
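One way to picture why the 1950X line stays flat: if value is a weighted score divided by total platform cost, two CPUs at the same price always divide by the same denominator. A sketch with hypothetical benchmark scores (only the launch prices below are real):

```python
# Sketch: performance-per-dollar "value" curves vs. total system cost.
# The scores here are hypothetical weighted benchmark composites.
def value(score, cpu_price, rest_of_system_cost):
    return score / (cpu_price + rest_of_system_cost)

def relative_value(score_a, price_a, score_b, price_b, rest):
    """Value of CPU A divided by value of CPU B at a given build cost."""
    return value(score_a, price_a, rest) / value(score_b, price_b, rest)

# 1950X ($999) vs 7900X ($999): identical denominators, so the ratio is
# constant (a flat line at score_a / score_b) at any build cost.
# 1920X ($799) vs 7900X ($999): the $200 gap matters more in a cheap build,
# so that ratio curves, settling toward the raw score ratio as cost grows.
for rest in (500, 1500, 3000):
    print(relative_value(3000, 999, 2200, 999, rest))  # flat
    print(relative_value(2600, 799, 2200, 999, rest))  # curved
```

The same mechanism explains the odd <$100 crossover: at an implausibly low rest-of-system cost, the cheaper chip's price advantage dominates the denominator.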


