
Frustration over the slow death of multi GPU

Zodwraith

Wanting 4K performance without spending enough for a car led me here.

 

SLI was never a bad idea. It died because no one supported it. GPU makers gained 30-50% performance each gen, so there was little reason for customers to embrace it; GPU makers wanted you buying their new cards, and game devs didn't want to waste resources optimizing configurations for a single-digit percentage of customers. You got a 50-70% boost at best and a negative impact at worst. Double the power consumption and thermals. Stuttering in some games. Straight-up crashes in others. But at least you had a stopgap measure you could employ to breathe new life into a system. Every single con in "why SLI died" is real, but aside from thermals and power, every one of them could have been solved with support, and that support never came.
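
To put rough numbers on that (a quick back-of-envelope sketch using the same ballpark percentages quoted above; nothing here is a benchmark):

```python
# Quick scaling math for the argument above. The percentages are the ballpark
# figures from this post (30-50% per generation, 50-70% SLI scaling), not measurements.
baseline = 1.00                      # one card of the current generation

sli_good = baseline * (1 + 0.70)     # second card with decent support
sli_poor = baseline * (1 + 0.50)     # second card with mediocre support
next_gen = baseline * (1 + 0.40)     # a typical single-card generational jump

print(f"Two cards, good support : {sli_good:.2f}x")
print(f"Two cards, okay support : {sli_poor:.2f}x")
print(f"Next-gen single card    : {next_gen:.2f}x")
# With real support a second card clearly beats waiting a generation;
# without it you can land at 1.0x or below, which is exactly the complaint.
```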

 

Fast forward to today. My, how times have changed. Many-core CPUs are now heralded as premium when they looked just as foolish before, because they're finally getting SUPPORTED. GPU growth has stagnated while prices have doubled for marginal performance boosts. Enthusiasts are under an Nvidia monopoly. Mid-range cards are $400-500. The GPU market is in a sad state when the new 5700xt is AMD's flagship, applauded for only costing $450, yet it struggles to keep up with a 2.5-year-old 1080ti. This is unacceptable. GPUs are where CPUs were when Intel was trying to push the P4 as hard as possible and pretend multi-core was not the future.

 

Twin 5700xts would make the 2080ti's price seem even more ludicrous and would easily annihilate it if there were real support. I understand Nvidia's lack of support; they can point at their more profitable cards as your option. The higher you move up the stack, the more you hit diminishing returns for your money. AMD dropping CF entirely is completely stupid when they have ZERO options for enthusiasts. Should I be madder at Nvidia for price gouging but at least giving me an option, or at AMD for saying they don't even want my business?

 

Imagine many lower-power GPU cores spread across a large heat sink instead of one insanely hot one. It all comes down to support and design, and no one is even looking down that multi-GPU street while they keep paving the single-GPU one that's climbing an increasingly steep and expensive hill. The anemic trickle of SLI support looks like it was tacked on by an intern six months after launch.

 

Moore's law fell apart long ago, but for the life of me I can't understand why people are blind to multi-GPU while embracing multi-core CPUs. I've run SLI on VooDoo2s, 8800GTs, and GTX 970s, and now I'm considering a 2nd 1080ti, combing forums to see the tricks and workarounds enthusiasts are employing to get better performance. There was at least mediocre support before, with a lot of games seeing well above a 50% boost, and two good-tier cards annihilated the flagship for similar cost. I have zero doubt that as we hit physics and heat limitations multi-GPU will return and finally be embraced and supported, so we old-school SLI/CF guys will be vindicated. Until that day, enjoy that 2nd mortgage if you're an enthusiast, because I'm not sure AMD is competent enough to compete or wouldn't charge just as much or more, given how their current pricing compares to their own Polaris.

 

Just my .02


I thought both AMD and Nvidia are testing multi-chip-module GPUs? That'd be similar to what multi-core processors are.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


This is obviously not written by anyone in the industry

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


The problem with multi video card setups is they can't combine their VRAM pools. The first (and only) time I did SLI was with two GTX 560 Tis. Sure, I got the performance of a GTX 580 for less money, but the effective 1GB of VRAM between the two of them started to make games stutter more often. I swapped both out for a single GTX 670. I didn't gain anything performance-wise, but the experience was much smoother. This is one of the major reasons multi-GPU setups aren't very practical.
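
A tiny sketch of that capacity problem, assuming classic alternate-frame rendering where each card keeps its own full copy of every resource (the 1.2 GB working set is a made-up example):

```python
# Effective VRAM with two cards under alternate-frame rendering (AFR).
# Each GPU renders whole frames, so each needs its own copy of all assets.
cards = 2
vram_per_card_gb = 1.0                           # e.g. two GTX 560 Ti 1GB cards

advertised_total_gb = cards * vram_per_card_gb   # 2.0 GB on the spec sheets
usable_pool_gb = vram_per_card_gb                # still 1.0 GB in practice

game_working_set_gb = 1.2                        # hypothetical game that wants 1.2 GB
spills_to_system_ram = game_working_set_gb > usable_pool_gb
print(advertised_total_gb, usable_pool_gb, spills_to_system_ram)  # 2.0 1.0 True -> stutter
```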

 

The biggest reason I can think of for why this problem hasn't been solved yet is that GPUs operate on a workload that's considered soft real-time. The work has to complete within a certain amount of time, otherwise the user experience degrades. And unless data can move between GPUs at basically the same bandwidth and latency as on-board VRAM, combining each GPU's VRAM pool is likely going to make things worse due to any external communication being a lot slower.
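
Some rough numbers to put a scale on that; every figure below is an assumed ballpark, not a measurement:

```python
# Why pooling VRAM across cards fights the frame deadline: off-card reads are
# orders of magnitude slower than local VRAM. All figures below are rough assumptions.
frame_budget_ms = 1000 / 60            # ~16.7 ms per frame at 60 fps

working_set_mb = 500                   # suppose a frame touches ~500 MB of resources
local_vram_gbs = 480                   # local GDDR bandwidth, GB/s (ballpark)
pcie3_x16_gbs = 16                     # PCIe 3.0 x16, roughly 16 GB/s per direction
sli_bridge_gbs = 4                     # high-bandwidth SLI bridge, low single digits GB/s

def transfer_ms(size_mb: float, bandwidth_gbs: float) -> float:
    return size_mb / 1024 / bandwidth_gbs * 1000

for name, bw in [("local VRAM", local_vram_gbs),
                 ("PCIe 3.0 x16", pcie3_x16_gbs),
                 ("SLI bridge", sli_bridge_gbs)]:
    print(f"{name:12s}: {transfer_ms(working_set_mb, bw):7.2f} ms (budget {frame_budget_ms:.1f} ms)")
# Local VRAM handles the working set in about a millisecond; pulling the same data
# over PCIe or a bridge blows past the frame budget, so each card mirrors everything.
```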

 

The only reason this isn't a problem on systems with NUMA nodes or clusters is that those workloads have no real-time constraint. We just care that they get something done, and the faster they do it, the better.

 

Outside of that, there's the practical side of things. When I had my SLI setup, it consumed more power than a single GTX 580 would have, and when I wanted to upgrade, I had two video cards to get rid of. To me, it only makes sense to double up on graphics cards if you're already at the top end. Otherwise, just get a single card.


16 minutes ago, Jurrunio said:

This is obviously not written by anyone in the industry

OK, please enlighten me, because I would love to research something that has obviously gotten past me, instead of just being scoffed at. I was under the misconception that the forums were for helping each other and spreading ideas. Are GPU makers NOT working towards multi-chip as Acid said and I predicted?


9 minutes ago, Mira Yurizaki said:

combining each GPU's VRAM pool is likely going to make things worse due to any external communication being a lot slower

THIS is what I always understood to be the real problem that caused the stuttering. I understand the increased bandwidth in Nvidia's new bridge helps but doesn't negate it completely, which only goes back to having proper support in creating a broader link between GPUs. Thank you for giving a well-thought-out answer instead of a snide remark.


1 hour ago, xAcid9 said:

I thought both AMD and Nvidia are testing multi-chip-module GPUs? That'd be similar to what multi-core processors are.

Just checked on that, and yes, I remember hearing something about it a couple of years ago; both camps have been quietly experimenting with it for a while. Their big hurdle?

 

“You’re talking about doing CrossFire on a single package, the challenge is that unless we make it invisible to the ISVs [independent software vendors] you’re going to see the same sort of reluctance.” - David Wang, senior VP of Radeon Technologies Group. You know, someone who actually is in the industry.

 

So I guess my point stands: SUPPORT is still the hurdle to multi-GPU. If devs weren't reluctant to support it, Nvidia or AMD would invest more in the problem of fatter pipes between GPUs. Heck, look how long it's taken devs to support more than 2 cores of a CPU. I still say it's got to happen eventually, but probably not until you see a console embrace it.


Even if multi-GPU were to come back, the user base wouldn't be more than 1-2% of users. It's simply not worth it for devs to do the extra work or for vendors to ship extra driver profiles (with gamers left waiting on those drivers). There are just way too many hurdles for multi-GPU, and at the end of the day the demand isn't there. I'm confident enough to say it'll never happen; it's dead.

 

The majority of gamers play at 1080p or 1440p; 4K is where we're gonna settle for a decade+, and single GPUs are already slowly getting there (even IF 4K 144Hz becomes standard).

 

Necessity, demand, memory bandwidth, support: none of it's there. I'm glad I tried SLI, but I don't miss it.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


3 hours ago, Zodwraith said:

SLI was never a bad idea

It is a bad idea, or rather a bad idea when things are going well for the company, because it's a desperate measure to make their products compare to something worth comparing to, e.g. RX 480 2-way CF vs. GTX 1080. Multi-GPU scaling is like RT on RTX cards and DLSS in that it takes a huge effort and a lot of resources to code for; if it's not something your customers can use most of the time, then it's not a good thing to have, let alone a major marketing point.

 

3 hours ago, Zodwraith said:

But at least you had a stopgap measure you could employ to breathe new life into a system.

I don't see anything wrong with selling the current card to fund the next. You'll eventually do this even if you choose to go SLI/CF this time anyway.

 

3 hours ago, Zodwraith said:

Fast forward to today. My, how times have changed. Many-core CPUs are now heralded as premium when they looked just as foolish before, because they're finally getting SUPPORTED.

Coding for SLI/CF resembles coding for multi-socket CPU platforms, which is much more difficult (memory contents, communication latency, etc. on top of the scheduling problem), not coding for multi-core CPUs. That's why multi-die GPUs are actively being researched and developed: the dies can share the same memory buffer and communicate directly through the package (e.g. Zen's Infinity Fabric).
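
A rough CPU-side analogy in Python, purely illustrative: threads share one address space the way cores share one memory pool, while separate processes must have data copied to them the way two cards mirror assets over a slow link. It's not GPU code, just a sketch of the shared-memory vs. split-memory distinction:

```python
# Shared memory (multicore-like) vs. separate memory pools (multi-socket / SLI-like).
import threading
import multiprocessing as mp

DATA = list(range(1_000_000))          # stand-in for a big resource (textures, buffers)

def sum_range(start, end):
    # Reads the one shared DATA list in place -- no copy is made.
    return sum(DATA[i] for i in range(start, end))

def sum_chunk(chunk):
    return sum(chunk)

def shared_pool():
    # Threads share one address space, like cores sharing a single memory pool.
    results = []
    t1 = threading.Thread(target=lambda: results.append(sum_range(0, 500_000)))
    t2 = threading.Thread(target=lambda: results.append(sum_range(500_000, 1_000_000)))
    t1.start(); t2.start(); t1.join(); t2.join()
    return sum(results)

def split_pools():
    # Worker processes have separate address spaces: each half is serialized and
    # shipped to its worker first, like mirroring assets across two cards.
    with mp.Pool(2) as pool:
        return sum(pool.map(sum_chunk, [DATA[:500_000], DATA[500_000:]]))

if __name__ == "__main__":
    assert shared_pool() == split_pools() == sum(DATA)
    print("both models give", sum(DATA))
```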

 

3 hours ago, Zodwraith said:

GPU growth has stagnated while prices have doubled for marginal performance boosts.

Nvidia threw all their time into RTX, which sadly wasn't much of a success overall. Now they're making you pay for their investment and the increased production cost (a larger die for the extra hardware). It makes sense, even if you get the bum deal at the end.

 

3 hours ago, Zodwraith said:

The GPU market is in a sad state when the new 5700xt is AMD's flagship, applauded for only costing $450, yet it struggles to keep up with a 2.5-year-old 1080ti. This is unacceptable.

Too bad AMD couldn't make something better, or maybe they could but TSMC doesn't have the capacity to produce it. 7nm production is highly constrained.

 

3 hours ago, Zodwraith said:

GPUs are where CPUs were when Intel was trying to push the P4 as hard as possible and pretend multi-core was not the future.

It's not that they don't want to (since AMD is competitive now), it's that they can't do better. Too bad you want a new GPU at the wrong time.

 

3 hours ago, Zodwraith said:

The higher you move up the stack, the more you hit diminishing returns for your money. AMD dropping CF entirely is completely stupid when they have ZERO options for enthusiasts.

My guess is AMD wants fewer problems to deal with in their software. Software reliability has long been AMD's Achilles' heel; why suffer more when you could just cut it off? Trying to use CF to keep up with Nvidia isn't sensible, just a pointless struggle.

 

3 hours ago, Zodwraith said:

 

Imagine many lower-power GPU cores spread across a large heat sink instead of one insanely hot one. It all comes down to support and design, and no one is even looking down that multi-GPU street while they keep paving the single-GPU one that's climbing an increasingly steep and expensive hill. The anemic trickle of SLI support looks like it was tacked on by an intern six months after launch.

Multi-die GPUs are a thing; both companies have leaks about plans for them (AMD doesn't really need a leak for this to be known after Zen 2, tbh).

 


4 hours ago, Zodwraith said:

Wanting 4K performance without spending enough for a car led me here.

 

<snip>

 

Just my .02

That depends on what an individual considers an expense worth spending their money on. My machine was built around the Intel 5960X chip, and at that time a monitor that could do it justice cost silly amounts of money. So I waited until the RTX cards arrived, and now I can afford a decent monitor. The machine is more than capable of driving a 27" IPS monitor at 144Hz (165Hz OC); admittedly the resolution is only 2560 x 1440 (QHD, often called 2K), not 4K 3840 x 2160 UHD.
But then I consider the cost of a 55" or 65" 4K OLED UHD TV to be equal to the value of a car!

Those who deny freedom to others deserve it not for themselves (Abraham Lincoln, 1809-1865; 16th US president).


 

2 hours ago, xg32 said:

I'm glad I tried SLI, but I don't miss it.

What generation did you try it? I just realized my first VooDoo2 SLI was in 1998. Guess that dates me. The VooDoo2 was awesome, the 8800GTs were awesome, but the GTX 970s definitely showed SLI was losing support. Part of it was that I just loved the tinkering aspect. We were building liquid cooling back around 2000 with a Danger Den block, an automotive transmission radiator, and aquarium pumps. 8D This is where you guys get to say "OK boomer" and I get to reply "you kids wouldn't even have AIOs if it wasn't for us pushing the boundaries 25 years ago. Get off my lawn!"


15 minutes ago, SydneySideSteveSomewheres said:

But then I consider the cost of a 55" or 65" 4K OLED UHD TV to be equal to the value of a car!

Decent name 50" HDRs like Samsung and LG are as cheap as $350. OLED isn't necessary but nice. Not worth tripling the price IMHO. When I'm doing a build for someone the one place I tell them never to skimp is KB/Mouse/Monitor. If they don't understand what's going on inside the magic box choose the pieces they have to touch and see with care. Go hit Fry's and touch everything. I used to roll my eyes at "cAn I gEt RGB?!?!?!" but if that excites someone about their computer more power to them. I'll build the tackiest crap you've ever seen. My daughter's PC looks like a unicorn puked across her desk. I'm the only one in the house that cares about what's inside the PCs and I won't berate someone for using a TV instead of a monitor, especially now that their response times have gotten far better than early LCDs.


4 hours ago, Mira Yurizaki said:

The problem with multi video card setups is they can't combine their VRAM pools. <snip>

Let's just put a GPU socket on the mobo with VRAM slots. Haha.

CPU: Ryzen 2600 GPU: RX 6800 RAM: ddr4 3000Mhz 4x8GB  MOBO: MSI B450-A PRO Display: 4k120hz with freesync premium.


1 hour ago, Jurrunio said:

I don't see anything wrong with selling the current card to fund the next. You'll eventually do this even if you choose to go SLI/CF this time anyway.

When I upgrade, my old parts get handed down to my wife's and kids' computers. Keeping 4 PCs current is why I can't afford to buy the latest & greatest every cycle. That's on top of phones, tablets, cars and clothes. O.o

 

I think you're right about multi-card setups not being utilized anymore, but multi-chip GPUs are inevitable, especially if panel manufacturers can brainwash the public into thinking 8K is necessary in any way/shape/form.


Quote

Nvidia threw all their time into RTX, which sadly wasn't much of a success overall. Now they're making you pay for their investment and the increased production cost (a larger die for the extra hardware). It makes sense, even if you get the bum deal at the end.

That makes no sense. They'll charge whatever price is most profitable. Their investment and cost of inventory are sunk costs. Especially their investment.


9 minutes ago, LOST TALE said:

Let's just put a GPU socket on the mobo with VRAM slots. Haha.

Let's not give Intel any ideas for yet another platform change.


I too lament the stagnation/death/end of twin-GPU setups. It was always something I wanted to do in a special build, a project I had been saving for a rainy day. However, it just isn't going to happen now.

 

There is just something awesome about having two of something performance-related, be it two GPUs, two carbies, two outboard motors or two girls. It really doesn't matter what; just having two always looks and feels awesome.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


5 hours ago, LOST TALE said:

That makes no sense. They'll charge whatever price is most profitable. Their investment and cost of inventory are sunk costs. Especially their investment.

Then why didn't they do this with the 1080ti? Vega cards came 2 years later and only challenged the 1080.


3 hours ago, Jurrunio said:

Then why didn't they do this with the 1080ti? Vega cards came 2 years later and only challenged the 1080.

How do you know whether they did or not? Do you have access to the data and calculations behind their decision and reasoning?


3 hours ago, Jurrunio said:

Then why didn't they do this with the 1080ti? Vega cards came 2 years later and only challenged the 1080.

While I don't think they can charge whatever they want, I will point out the 1080ti really was too good for its generation, but consumers were scoffing at $750-800 flagship GPUs at the time. It was already $100 more than the 980ti. I don't know how old you are, but enthusiasts weren't thrilled with it then either.

 

You seem to forget the crypto-mining debacle shifted everything up in price after that, even entry-level cards. Used 1080tis were going for >$1k. The mining boom ended but prices only partially recovered. People had adjusted to ridiculous prices, so they were willing to pay up after the year of gouging gamers endured. Enter RTX at post-boom prices, and that 1080ti shows just how "too good" it was. It was so good that even with masses of mining farms dumping them on eBay, they still held MSRP value used against RTX pricing. Even NOW they command $500 used with Ampere on the horizon.

 

So if you do the math, they WERE charging more because there was no competition. It just didn't stand out as badly because mining hadn't yet shifted everything to the point where your flagship pierced that magic four figures. Saying they weren't gouging because AMD couldn't match it two years later makes no sense; they can't see two years into the future. That only shows how badly AMD failed at making competitive GPUs. RTX only proved that even Nvidia struggled to beat the 1080ti, so you can only blame AMD so much.
