
Ryzen possibly held back by Nvidia's drivers

Adored means well, but they're one of those sources I'll take with a grain of salt unless multiple others (not Joker) agree.

They're very good at putting out a theory and looking through other people's data; they just lack the resources to prove things conclusively themselves. Which means if they say something, someone else is going to have to test it for me to 100% believe it.

That said, they have put a ton of footwork into this. Interesting, and worth looking into.


2 hours ago, Terryv said:

Isn't Pascal also just Maxwell with a node shrink?

 

I've long suspected that Nvidia doesn't actually do asynchronous compute at the hardware level. This is further evidence of that.

 

 

Also, DX12 being Windows 10 exclusive is crap; I do hope Vulkan becomes the API of choice. I don't like being forced into Microsoft's ecosystem.

Mostly, yes. Most of the actual execution hardware (ROPs, cores, etc.) wasn't really touched, though memory compression was improved and some extra features were added, including SMP (though that didn't go anywhere).

 

EDIT: Pascal is sort of like Maxwell refined. The basic stuff is the same, but some things were changed, including the memory controller (for memory compression and to support HBM2 and GDDR5X on some Pascal cards), the display controller (for DP 1.4 support), and a bit of the ROPs (Pascal does tile-based rasterization in a slightly different pattern than Maxwell). The relationship between Pascal and Maxwell, IMO, is more like that between the Fury X and the 290X.

1 hour ago, Terryv said:

Agreed, I've been watching his videos (AdoredTV) for a while. I like the way he does things and his effort to understand why certain things happen.

 

Unfortunately his results don't always match up with popular opinion (which is why he's getting so much hate ATM).

 

I've yet to see bias in his reviews; he always shows good and bad from both sides and uses sound logic to come to conclusions.

 

 

I still think he tilts things a bit in some reviews, and it's quite obvious he's somewhat biased, as evidenced by his master plan and GameWorks videos. In general his reviews are pretty good, though I make sure to keep my mind open when watching.

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: HyperX Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Seagate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


16 hours ago, Mothballs said:

They're very good at putting out a theory and looking through other people's data; they just lack the resources to prove things conclusively themselves. Which means if they say something, someone else is going to have to test it for me to 100% believe it.

Except that the data shown is entirely his own. Not sure how creating your own data is somehow more credible, though; he could completely fabricate results and you'd be none the wiser. This is the first set of data with these parameters, so of course you should withhold judgement until more tests roll out under the same conditions. But being the first set of data, there is no way you can flat-out claim it's false or inaccurate.

 

The only data you shouldn't really trust is data gathered under the same parameters that yields completely different results than the current consensus. Or at the very least, scrutinize it heavily.


20 hours ago, Humbug said:

In legacy APIs (i.e. up to DirectX 11, or any version of OpenGL), Nvidia traditionally had lower CPU overhead and better multithreading.

 

There is a claim, however, that Nvidia's GPU architecture is less suitable for modern APIs like Vulkan/DX12, and that, compared to AMD, Nvidia is attempting to emulate more functions in software. That looks like the accusation AdoredTV is trying to expand on: he is saying that this newer part of Nvidia's software suite is not yet well optimized or multithreaded.

That makes sense. Even Pascal ("Paxwell") is kinda crippled in DX12/Vulkan.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


21 hours ago, EminentSun said:

This could simply be due to the fact that AMD GPUs get far more out of Vulkan and DX12, as the RX 480s saw massive gains with either CPU.

Yeah, but that's not about the GPU per se. The results with the Nvidia card would lead you to believe that Ryzen is bottlenecking the 1070, when that's clearly not the case with the AMD cards; the processor is still able to push more. If it weren't, it would show the same kind of results as it did with the Nvidia card.

17 hours ago, Mothballs said:

Adored means well, but they're one of those sources I'll take with a grain of salt unless multiple others (not Joker) agree.

They're very good at putting out a theory and looking through other people's data; they just lack the resources to prove things conclusively themselves. Which means if they say something, someone else is going to have to test it for me to 100% believe it.

That said, they have put a ton of footwork into this. Interesting, and worth looking into.

AdoredTV is a "newcomer"; he doesn't have the outreach that other YouTubers do. The whole reason he used RX 480s in Crossfire is that his contact(s) refused to send him a Fury X lol


These guys also carried out the test after seeing Adored's, and the same thing is happening in The Division.

 

 

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Looks like the German magazine c't was able to observe similar behaviour comparing Titan XP and Fury X performance when paired with the R7 1800X. Look at the 720p benchmarks.

[c't 720p benchmark chart]

 

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


 

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


Seen it. AdoredTV's results were valid, but at the moment they're an outlier. That said, Steve did mention he rushed it, which means shortcuts were made.

Like how two Hawaii chips are not equivalent to two Polaris chips, and the Fury X was being fully maxed out whereas the 1070 wasn't, meaning the Fury X CPU results were GPU-bottlenecked.

 

Guess we'll need to wait for Vega and more DX12 games to have this verified further.

 

I don't know why he said Tomb Raider was poorly optimized, though; it is definitely not. Nixxes did a great job.


On 31.3.2017 at 6:19 PM, ivan134 said:

This is what has been infuriating me when people say DX12 is crap so far and they only want Vulkan. Most benchmarks are done with Nvidia cards, which are garbage at low-level APIs. Fact is, DX12 has a better track record so far than Vulkan when using AMD GPUs: about 80% of DX12 games work properly and give better performance than DX11 using AMD cards. Vulkan is only at 50% (everyone seems to forget The Talos Principle).

 

This isn't just AMD catching up to Nvidia's DX11 performance. The 480 beats the 1060 no matter which API the 1060 uses in all DX12 games except BF1 and Quantum Break. The 480 is also faster in DX12 than DX11 in all these games, including games that had bad DX12 launches like Deus Ex: MD and The Division. Reviewers aren't retesting these games after patches, and we have to rely on Reddit and tech forums to get this info.

 

Also, for people who still don't believe that Pascal is only slightly less garbage at low-level APIs than Maxwell, there you go. Software emulation will never match proper hardware.

DX12 IS crap, but not really from a performance standpoint: you are completely locked in to Windows 10. If you want to use DX12, you must use Windows 10. Vulkan may have only 50%; so be it, I'll take 50% and cross-platform compatibility any day over those (rationally speaking, pretty irrelevant) 20%. For me personally, that's an instant game over for DX12.

 

Vulkan is also open source. Nvidia's Linux drivers are IMHO absolute garbage (although I grudgingly admit they have much better OpenGL performance), Nvidia doesn't really care about open source, and they will lock you into their ecosystem no matter what: GameWorks, G-Sync, and so on.

 

About AdoredTV: yes, he's a bit biased, he's admitted to being an AMD fan, but he does darn good work, and the Intel CPUs still won. He could have manipulated those last few percent in favour of AMD, but he didn't. Also, it has been shown by others that his numbers are "the real deal", since theirs are pretty much the same.

 

 

Good news everyone...!


36 minutes ago, David89 said:

DX12 IS crap, but not really from a performance standpoint: you are completely locked in to Windows 10. If you want to use DX12, you must use Windows 10. Vulkan may have only 50%; so be it, I'll take 50% and cross-platform compatibility any day over those (rationally speaking, pretty irrelevant) 20%. For me personally, that's an instant game over for DX12.

Vulkan is also open source. Nvidia's Linux drivers are IMHO absolute garbage (although I grudgingly admit they have much better OpenGL performance), Nvidia doesn't really care about open source, and they will lock you into their ecosystem no matter what: GameWorks, G-Sync, and so on.

About AdoredTV: yes, he's a bit biased, he's admitted to being an AMD fan, but he does darn good work, and the Intel CPUs still won. He could have manipulated those last few percent in favour of AMD, but he didn't. Also, it has been shown by others that his numbers are "the real deal", since theirs are pretty much the same.

 

 

Cool story



1 hour ago, Mihle said:

 

Watched this, and it made me wonder if SLI would see gains similar to what Crossfire gets going from DX11 to DX12.

 

And ROTR is not a consistent benchmark


Hardware Unboxed tested this too, but he doesn't think Nvidia is to blame though... o3o

Lake-V-X6-10600 (Gaming PC)

R23 score MC: 9190pts | R23 score SC: 1302pts

R20 score MC: 3529cb | R20 score SC: 506cb

Spoiler

Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: Intel Core i5-10600, 6-cores, 12-threads, 4.4/4.8GHz, 13,5MB cache (Intel 14nm++ FinFET) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1501MHz (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: ASUS PRIME B460 PLUS, Socket-LGA1200 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W / RAM A1, A2, B1 & B2: DDR4-2666MHz CL13-15-15-15-35-1T "Samsung 8Gbit C-Die" (4x8GB) / Operating System: Windows 10 Home / Sound: Zombee Z300 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Seagate® Barracuda 2TB HDD / Storage 4: Seagate® Desktop 2TB SSHD / Storage 5: Crucial P1 1000GB M.2 SSD/ Storage 6: Western Digital WD7500BPKX 2.5" HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter (Qualcomm Atheros)

Zen-II-X6-3600+ (Gaming PC)

R23 score MC: 9893pts | R23 score SC: 1248pts @4.2GHz

R23 score MC: 10151pts | R23 score SC: 1287pts @4.3GHz

R20 score MC: 3688cb | R20 score SC: 489cb

Spoiler

Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Ryzen 5 3600, 6-cores, 12-threads, 4.2/4.2GHz, 35MB cache (T.S.M.C. 7nm FinFET) / Display: HP 24" L2445w (64Hz OC) 1920x1200 / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: ASUS Radeon RX 6600 XT DUAL OC RDNA2 32CUs @2607MHz (T.S.M.C. 7nm FinFET) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: ASRock B450M Pro4, Socket-AM4 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W / RAM A2 & B2: DDR4-3600MHz CL16-18-8-19-37-1T "SK Hynix 8Gbit CJR" (2x16GB) / Operating System: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Western Digital My Passport 2.5" 2TB HDD / Storage 4: Western Digital Elements Desktop 2TB HDD / Storage 5: Kingston A2000 1TB M.2 NVME SSD / Wi-fi & Bluetooth: ASUS PCE-AC55BT Wireless Adapter (Intel)

Vishera-X8-9370 | R20 score MC: 1476cb

Spoiler

Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Case Fan VRM: SUNON MagLev KDE1209PTV3 92mm / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: AMD FX-8370 (Base: @4.4GHz | Turbo: @4.7GHz) Black Edition Eight-Core (Global Foundries 32nm) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1501MHz (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: MSI 970 GAMING, Socket-AM3+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W PSU / RAM 1, 2, 3 & 4: Corsair Vengeance DDR3-1866MHz CL8-10-10-28-37-2T (4x4GB) 16.38GB / Operating System 1: Windows 10 Home / Sound: Zombee Z300 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Seagate® Barracuda 2TB HDD / Storage 3: Seagate® Desktop 2TB SSHD / Wi-fi: TP-Link TL-WN951N 11n Wireless Adapter

Godavari-X4-880K | R20 score MC: 810cb

Spoiler

Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 95w Thermal Solution / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Athlon X4 860K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / CPU: AMD Athlon X4 880K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / Display: HP 19" Flat Panel L1940 (75Hz) 1280x1024 / GPU: EVGA GeForce GTX 960 SuperSC 2GB (T.S.M.C. 28nm) / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: MSI A78M-E45 V2, Socket-FM2+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W PSU / RAM 1, 2, 3 & 4: SK hynix DDR3-1866MHz CL9-10-11-27-40 (4x4GB) 16.38GB / Operating System 1: Ubuntu Gnome 16.04 LTS (Xenial Xerus) / Operating System 2: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Western Digital My Passport 2.5" 2TB HDD / Storage 3: Western Digital Elements Desktop 2TB HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter

Acer Aspire 7738G custom (changed CPU, GPU & Storage)
Spoiler

CPU: Intel Core 2 Duo P8600, 2-cores, 2-threads, 2.4GHz, 3MB cache (Intel 45nm) / GPU: ATi Radeon HD 4570 515MB DDR2 (T.S.M.C. 55nm) / RAM: DDR2-1066MHz CL7-7-7-20-1T (2x2GB) / Operating System: Windows 10 Home / Storage: Crucial BX500 480GB 3D NAND SATA 2.5" SSD

Complete portable device SoC history:

Spoiler
Apple A4 - Apple iPod touch (4th generation)
Apple A5 - Apple iPod touch (5th generation)
Apple A9 - Apple iPhone 6s Plus
HiSilicon Kirin 810 (T.S.M.C. 7nm) - Huawei P40 Lite / Huawei nova 7i
Mediatek MT2601 (T.S.M.C 28nm) - TicWatch E
Mediatek MT6580 (T.S.M.C 28nm) - TECNO Spark 2 (1GB RAM)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (orange)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (yellow)
Mediatek MT6735 (T.S.M.C 28nm) - HMD Nokia 3 Dual SIM
Mediatek MT6737 (T.S.M.C 28nm) - Cherry Mobile Flare S6
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (blue)
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (gold)
Mediatek MT6750 (T.S.M.C 28nm) - honor 6C Pro / honor V9 Play
Mediatek MT6765 (T.S.M.C 12nm) - TECNO Pouvoir 3 Plus
Mediatek MT6797D (T.S.M.C 20nm) - my|phone Brown Tab 1
Qualcomm MSM8926 (T.S.M.C. 28nm) - Microsoft Lumia 640 LTE
Qualcomm MSM8974AA (T.S.M.C. 28nm) - Blackberry Passport
Qualcomm SDM710 (Samsung 10nm) - Oppo Realme 3 Pro

 


Am I the only one thinking the comparison between Crossfire GPUs and a single GPU (i.e. 2x480 vs 1x1070) is the wrong way to look at CPU performance? I understand the reasoning behind the video, but multi-GPU setups are supposed to dramatically increase CPU overhead, no matter the API/GPU/CPU, thus creating a CPU bottleneck and making this test less than ideal. I might be completely wrong, but it seems really "un-scientific" to run this kind of test with a multi-GPU configuration, especially if you compare the results to a single-GPU config. Not saying the conclusions are wrong, but it feels weird.

CPU : i7 8700k @5GHz, GPU : ASUS GTX 1080 Ti STRIX, RAM : 2x8Go 3000MHz Corsair Vengeance, MB : ASUS Prime Z370-A, PSU : CM V850, Case :  NZXT S340, CPU Cooler : NZXT Kraken x62, Monitor : Acer Predator XB271HU 27" 1440p 165Hz, OS : Windows 10 Home 64 bits  

 


2 hours ago, Mihle said:

-snip-

Boy, that Fury X is working hard in DX12: GPU usage almost 100% all the time, but it probably renders the scene inefficiently with its inferior geometry engine. Polaris improved that quite significantly, and I hope Vega will improve it even more.

 

[slide: Polaris primitive discard accelerator]



1 hour ago, roylapoutre said:

Am I the only one thinking the comparison between Crossfire GPUs and a single GPU (i.e. 2x480 vs 1x1070) is the wrong way to look at CPU performance? I understand the reasoning behind the video, but multi-GPU setups are supposed to dramatically increase CPU overhead, no matter the API/GPU/CPU, thus creating a CPU bottleneck and making this test less than ideal. I might be completely wrong, but it seems really "un-scientific" to run this kind of test with a multi-GPU configuration, especially if you compare the results to a single-GPU config. Not saying the conclusions are wrong, but it feels weird.

Yes, you're the only one who doesn't get it. There's nothing wrong with comparing them both under the same conditions; CPU overhead should only affect absolute numbers, not relative numbers.


2 hours ago, Nena360 said:

Hardware Unboxed tested this too, but he doesn't think Nvidia is to blame though... o3o

Unfortunately, he only ran a few useful tests (295X2 vs Titan X).

The Fury X / 1070 tests were kind of useless, since he kept settings constant, as if it were a GPU review. As a consequence, there were circumstances in which one system was CPU-bound while the other was GPU-bound. Overall, I think he missed the main point: in CPU-bound scenarios the relative change between DX11 and DX12 depends on the GPU used, which may call for different settings for different GPUs. By chance, the 295X2 vs Titan comparison suffered less from this problem, but he didn't use those cards with as many games.
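The CPU-bound vs GPU-bound point can be sketched with a toy frame-rate model. All the numbers and the `effective_fps` helper below are made up purely for illustration, not anything from the videos:

```python
# Toy bottleneck model: each frame is produced no faster than the slower of
# the CPU side (game logic + driver) and the GPU side (rendering).
# All numbers are hypothetical and only illustrate the methodology point.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Observed frame rate is capped by whichever side is the bottleneck."""
    return min(cpu_fps, gpu_fps)

# Constant settings mean a roughly constant GPU-side limit for a given card.
gpu_limit = 90.0

# Two hypothetical CPU/driver combinations feeding that same card:
rig_a = effective_fps(cpu_fps=120.0, gpu_fps=gpu_limit)  # GPU-bound
rig_b = effective_fps(cpu_fps=80.0, gpu_fps=gpu_limit)   # CPU-bound

print(rig_a, rig_b)
```

In this model, rig A's number says nothing about its CPU/driver ceiling because the GPU masks it; you'd have to lower settings (raising `gpu_limit`) before the CPU-side difference shows up. That's why constant-settings, GPU-review-style testing can hide exactly the effect being argued about here.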


It's looking more like Ryzen works better with Crossfire, or possibly a glitch? We're not sure yet, there's a ton of inconsistency.


Just now, Mothballs said:

It's looking more like Ryzen works better with Crossfire, or possibly a glitch? We're not sure yet, there's a ton of inconsistency.

Was already posted, mate. I'll add it to the OP, I guess.


Just now, Majestic said:

Was already posted, mate. I'll add it to the OP, I guess.

Ah, sorry, missed it by going too fast. Whoops.


16 hours ago, Majestic said:

Yes, you're the only one who doesn't get it. There's nothing wrong with comparing them both under the same conditions; CPU overhead should only affect absolute numbers, not relative numbers.

I know we're not looking at relative numbers here, and that was not the point I was trying to make. My point was that the CPU workload is pretty different between single-GPU and dual-GPU setups, and it seems possible that with a dual-GPU setup one would see limitations on both the Intel and AMD CPUs, closing the performance gap between them when using 2x RX 480.
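The disagreement between the two posts can be illustrated with a toy sketch (made-up numbers; `fps_with_overhead` and `fps_with_ceiling` are hypothetical helpers): uniform per-frame CPU overhead scales every CPU equally and cancels out in relative comparisons, but a hard CPU-side ceiling, such as one created by feeding two GPUs, does not.

```python
# Toy illustration: how extra CPU cost affects relative CPU comparisons.
# All numbers are made up purely to show the two possible behaviours.

def fps_with_overhead(base_fps: float, overhead_factor: float) -> float:
    """Uniform extra per-frame CPU cost slows every CPU down equally."""
    return base_fps * overhead_factor

def fps_with_ceiling(base_fps: float, cpu_ceiling: float) -> float:
    """A fixed CPU-side limit clips fast CPUs harder than slow ones."""
    return min(base_fps, cpu_ceiling)

intel, ryzen = 120.0, 100.0  # hypothetical CPU-limited frame rates

# Multiplicative overhead: the 1.2x Intel/Ryzen ratio is preserved.
ratio_mult = fps_with_overhead(intel, 0.8) / fps_with_overhead(ryzen, 0.8)

# Hard ceiling at 90 fps: both CPUs hit it and the gap closes to 1.0x.
ratio_ceiling = fps_with_ceiling(intel, 90.0) / fps_with_ceiling(ryzen, 90.0)

print(ratio_mult, ratio_ceiling)  # ratios: 1.2 vs 1.0
```

If multi-GPU overhead behaves like the ceiling case rather than the multiplicative case, a 2x RX 480 test really could compress the Intel/Ryzen gap, which is the worry expressed above.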


 


Oh look, another bollocks video from AdoredTV where he posts outrageous claims.

AMD not performing as expected? Gotta be Nvidia crippling AMD! Surely it can't be that the hype train (which AdoredTV certainly played a big role in) got out of hand and set expectations too high, and AMD simply isn't that good. Nope, gotta be a conspiracy theory that Nvidia is behind it all!

 

One game that is known to perform poorly shows somewhat inconsistent results, actually showing Nvidia cards performing worse than they should? Yep, I am sure this is Nvidia crippling their own cards, making AMD cards look better, just so that Intel can get more sales!

Because crippling your own performance, so that both Intel CPUs and AMD GPUs appear to be better is a great strategy...

 

I wonder how much time AdoredTV spent on finding this one game where this happens, and how many games he had to ignore because it didn't align with the theory he wanted to push.


3 hours ago, LAwLz said:

Oh look, another bollocks video from AdoredTV where he posts outrageous claims.

AMD not performing as expected? Gotta be Nvidia crippling AMD! Surely it can't be that the hype train (which AdoredTV certainly played a big role in) got out of hand and set expectations too high, and AMD simply isn't that good. Nope, gotta be a conspiracy theory that Nvidia is behind it all!

 

One game that is known to perform poorly shows somewhat inconsistent results, actually showing Nvidia cards performing worse than they should? Yep, I am sure this is Nvidia crippling their own cards, making AMD cards look better, just so that Intel can get more sales!

Because crippling your own performance, so that both Intel CPUs and AMD GPUs appear to be better is a great strategy...

 

I wonder how much time AdoredTV spent on finding this one game where this happens, and how many games he had to ignore because it didn't align with the theory he wanted to push.

 

I didn't even know who the hell AdoredTV was until seeing this thread, but after looking at his home page, I can understand your post a bit more. Seems like a very biased reviewer.

 

 

[screenshot of AdoredTV's home page]

