
RX Vega Review and benchmarks "leaked"

MoonlightSylv
4 hours ago, Liltrekkie said:

And you beat the 1080 and you weren't that far off from the 1080ti.

 

It's a dual high-end card, and dual high-end cards ALWAYS beat single high-end cards for a few years. See the 6990, for example.

The 6990 was VLIW5 and only ever competed against 32nm and 28nm parts. The 295x2 is competing against a 20% refinement on the 28nm node (Maxwell) and a further 30% boost on a 14nm node. TSMC 28nm vs TSMC 16nm is a far greater leap in density, and also in design refinement (the only benefit of the 28nm stall was that it forced designers to properly learn how to improve their designs without relying on manufacturing process progress alone).

 

In fact, the sheer insanity here is that a product which was inferior at launch (2x 780Ti wrecked 2x 290X), and got even more inferior once the next generation arrived (a stock 295x2 is just barely equal to a single Maxwell Titan X), is still, with just a 7% OC on the core and +25MHz on the memory, fighting products that are flat-out faster than the Maxwell Titan X.... The only conclusion is that AMD is supporting an old card for a ridiculously long time, and that GPU technology simply isn't scaling that well with today's games.

 

 


27 minutes ago, PCGuy_5960 said:

It's because of HBM2, AFAIK more bandwidth helps at higher resolutions ;)

 

But as I said, it trades blows with the 1080 depending on the game and the resolution.

No. It is due to the design of the GCN architecture. 1080p simply doesn't allow AMD's pipeline design to be fully utilized, so they have hardware overhead. Already back with the 290 and 290X we saw that once you reach a certain number of shaders, the GCN architecture requires higher resolutions or tougher workloads (async compute) to fully draw out the performance.

 

Another example is the 380 vs 380X at 1080p vs 1440p.

 

And especially the Fury lineup, which, thanks to HBM's bandwidth, keeps scaling with resolution until you reach 4.7-5GB of VRAM usage; that is where you get cut off by the VRAM limit.

 

This is a driver and pipeline "flaw"/tradeoff AMD has made in order to keep their GPU architecture as flexible as possible.
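(A back-of-the-envelope way to see the utilization argument: divide each resolution's pixel count by the shader count. The Python sketch below is a toy illustration, not a measurement; the shader counts are the public specs of the cards mentioned above.)

# Toy illustration: pixels available per shader at common resolutions.
# Wide GCN parts get very few pixels per shader at 1080p, which is one
# intuition for why they need higher resolutions (or async compute) to stay fed.
RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
SHADERS = {"R9 380": 1792, "R9 380X": 2048, "Fury X": 4096, "Vega 64": 4096}

for gpu, count in SHADERS.items():
    for name, pixels in RESOLUTIONS.items():
        print(f"{gpu:8s} @ {name:5s}: {pixels / count:6.0f} pixels per shader")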


7 hours ago, Master Disaster said:

So all the people who said Linus was wrong and jumped to AMD's defense when he sat on the WAN Show and said that RX Vega won't be significantly better than Vega FE, because it's the same silicon, were wrong.....

You should probably go check PCPer's review, as they include Vega FE figures; RX Vega is fairly consistently 15% faster, sometimes up to 20% and as low as 2%, depending on resolution etc. The big downside to their review is power draw: they only do efficiency and don't include Vega FE, which is a little annoying. Gamers Nexus is likely to be one of the best reviews to look at when it comes out.

 

https://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-RX-Vega-Review-Vega-64-Vega-64-Liquid-Vega-56-Tested

 

This is likely just going to turn into a debate about what 'significant' means, though, so if you don't think the difference is significant I'm not really going to argue.

 

I'm actually more interested in seeing Vega FE retested, since AMD said they would unlock the RX features for FE at the launch of RX.


1 minute ago, leadeater said:

You should probably go check PCPer's review, as they include Vega FE figures; RX Vega is fairly consistently 15% faster, sometimes up to 20% and as low as 2%, depending on resolution etc. The big downside to their review is power draw: they only do efficiency and don't include Vega FE, which is a little annoying. Gamers Nexus is likely to be one of the best reviews to look at when it comes out.

 

https://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-RX-Vega-Review-Vega-64-Vega-64-Liquid-Vega-56-Tested

 

This is likely just going to turn into a debate about what 'significant' means, though, so if you don't think the difference is significant I'm not really going to argue.

 

I'm actually more interested in seeing Vega FE retested, since AMD said they would unlock the RX features for FE at the launch of RX.

It's funny when people wholeheartedly say something like that based on a rumor.



16 minutes ago, XenosTech said:

It's funny when people wholeheartedly say something like that based on a rumor.

To me it doesn't really matter; if someone was never going to buy RX Vega, then it doesn't matter if they don't go looking into stuff like that. It kind of matters when they spread false information, but it's still up to buyers to do proper research and not blindly listen to internet commentators; depending on each individual's perceptions, the difference between product performance and value proposition can be quite different. Even the type of product itself can change the view of things, e.g. a CPU that is actually 20% faster than last generation.

 

I just really want to see Vega FE with the 'unlocked features', as I'm quite interested in seeing what the architectural improvements over Fiji are that were previously disabled. I don't quite trust comparing Vega FE vs RX Vega; I'd rather see the same cards retested.


1 minute ago, leadeater said:

To me it doesn't really matter; if someone was never going to buy RX Vega, then it doesn't matter if they don't go looking into stuff like that. It kind of matters when they spread false information, but it's still up to buyers to do proper research and not blindly listen to internet commentators; depending on each individual's perceptions, the difference between product performance and value proposition can be quite different. Even the type of product itself can change the view of things, e.g. a CPU that is actually 20% faster than last generation.

 

I just really want to see Vega FE with the 'unlocked features', as I'm quite interested in seeing what the architectural improvements over Fiji are that were previously disabled. I don't quite trust comparing Vega FE vs RX Vega; I'd rather see the same cards retested.

Agreed, but it's looking like I may end up selling my 1070 for Vega, or selling it and getting a 580; it's not like I need anything above that for 1080p lol. (Before people start being rabid dogs: when I got this 1070 in Feb, like 8 people wanted to buy it from me before I even opened the box to confirm it was functional. Even now, 4 people want to buy it from me at full price.)



1 minute ago, XenosTech said:

Agreed, but it's looking like I may end up selling my 1070 for Vega, or selling it and getting a 580; it's not like I need anything above that for 1080p lol. (Before people start being rabid dogs: when I got this 1070 in Feb, like 8 people wanted to buy it from me before I even opened the box to confirm it was functional. Even now, 4 people want to buy it from me at full price.)

I was going to ask why even sell it but ^. Can't argue with profit lol.


Just now, leadeater said:

I was going to ask why even sell it but ^. Can't argue with profit lol.

I know, right? Funny thing is, I didn't even buy this at full price; a friend of mine found it on sale somewhere on the net at 50% off, so he bought like 5 of them lol.



13 hours ago, Prysin said:

My 295x2 is about the same perf, and at a 7% OC it is running around 540W.... 2x 290X is equal to Vega 64.

 

P A T H E T I C

That's a dual-GPU card that cost around double what this will cost at launch.... it's like comparing a 6-litre V8 to a 'modern' V4 lol. Calm down; it's crappy, but at least compare it to relevant crap.


Well, I wonder how much custom cards will end up costing, despite the mining craze, and how performance develops with more games and driver updates. I plan to upgrade by the end of the year, so hopefully it's priced well.



6 hours ago, hammer3339 said:

That's a dual-GPU card that cost around double what this will cost at launch.... it's like comparing a 6-litre V8 to a 'modern' V4 lol. Calm down; it's crappy, but at least compare it to relevant crap.

Who cares what it DID cost? I only paid around $550 for mine.


9 hours ago, hammer3339 said:

That's a dual-GPU card that cost around double what this will cost at launch.... it's like comparing a 6-litre V8 to a 'modern' V4 lol. Calm down; it's crappy, but at least compare it to relevant crap.

 

2 hours ago, Prysin said:

Who cares what it DID cost? I only paid around $550 for mine.

 

It's relevant in the sense that many people, including me, have dual 290X's or a 295x2 and are looking to upgrade. If you happen to be aiming to go from a dual-GPU configuration to a single-GPU configuration, it actually needs to be faster; nobody downgrades their performance. It also needs to be worth spending whatever amount that single GPU will cost, and right now I'd say no, it's not.

 

Much fear of dual GPU there is; much overstated the issues with dual GPU are. Out of the 301 games I have, the only one I can think of right now where I had to specifically disable Crossfire was Act of Aggression, because it had major texture flickering (on both AMD and Nvidia); it was just poorly made and likely did some non-standard stuff.

 

The list of games with no performance gain is also similarly small; that's not to say all games scale well, but I'd rather have my two 290X's than not have two of them. One 290X can't drive 2560x1600 in any semi-recent game on ultra settings.


17 hours ago, PCGuy_5960 said:

It's because of HBM2, AFAIK more bandwidth helps at higher resolutions ;)

 

But as I said, it trades blows with the 1080 depending on the game and the resolution.

Indeed, this happens for a few reasons, including high memory bandwidth (due to wide memory buses), which applies to the R9 290/390 series, the R9 Fury series and now Vega.
I suppose Vega would be a better buy for the future than a 1080 at the moment. Seeing as Volta is not that far off, I wouldn't recommend getting a 1080 right now with Vega out if the prices are similar (if you don't mind the huge power consumption difference, that is ^_^).

I suppose the power consumption is fairly justified by all the additional compute power and features Vega 64 has, but that is not what the vast majority of gamers are looking for in a strictly gaming card ;) It's also funny how big the difference in power draw is between Vega 56 and Vega 64 Liquid Cooled. I've read somewhere that Vega 56 likely sits at a sweet-spot shader count for this architecture and didn't have as much "juice" squeezed out of it, because it already beats the 1070; Vega 64, however, had to be at least competitive with a 1080 for people to even consider it given its power requirements, so they had to push it.

 

16 hours ago, XenosTech said:

Agreed but it's looking like I may end up selling my 1070 for vega or sell it amd get a 580 not like I need anything above that for 1080p lol (before ppl start being rabid dogs, when I got this 1070 in feb like 8 people wanted to buy it from be before I even opened to box to confirm it was functional. Even now 4 people want to buy it from me at full price.)

Though with your current CPU, you might be better off sticking with the 1070, as it has lower CPU overhead than AMD GPUs (I haven't seen Vega tested in that regard yet, but it's almost certainly still the case).



1 hour ago, Morgan MLGman said:

Indeed, this happens for a few reasons, including high memory bandwidth (due to wide memory buses), which applies to the R9 290/390 series, the R9 Fury series and now Vega.
I suppose Vega would be a better buy for the future than a 1080 at the moment. Seeing as Volta is not that far off, I wouldn't recommend getting a 1080 right now with Vega out if the prices are similar (if you don't mind the huge power consumption difference, that is ^_^).

I suppose the power consumption is fairly justified by all the additional compute power and features Vega 64 has, but that is not what the vast majority of gamers are looking for in a strictly gaming card ;) It's also funny how big the difference in power draw is between Vega 56 and Vega 64 Liquid Cooled. I've read somewhere that Vega 56 likely sits at a sweet-spot shader count for this architecture and didn't have as much "juice" squeezed out of it, because it already beats the 1070; Vega 64, however, had to be at least competitive with a 1080 for people to even consider it given its power requirements, so they had to push it.

 

Though with your current CPU, you might be better off sticking with the 1070, as it has lower CPU overhead than AMD GPUs (I haven't seen Vega tested in that regard yet, but it's almost certainly still the case).

For the future, a 1080 is better. @Dackzy has a Palit JetStream 1080; we tested it in my rig (Link to post here)... Those are the results with a 1080 boosting to nearly 1900MHz on GPU Boost 2.0, all whilst being DEAD QUIET, fans barely turning....

 

Thing is, that thing barely uses ANY power; we are talking somewhere around 150-200W during Firestrike. My 295x2 has a 7% OC on the core and +25MHz on the memory, running at a +50% power limit, and it draws nearly 630W...

VEGA is shown to draw around 400-500W of power.

 

Thus we are talking a HUGE power saving. We argued over the 50W difference between a 390 and a 970; well, over a few years that amounted to only a few dollars. Now we are talking a 250-350W difference, which means there IS a legitimate loss of money when factoring in power usage; so much, in fact, that it is an outright disservice to suggest a VEGA GPU to anyone at all. Simply because in the long run it isn't going to be "fine wine", it is going to "waste" a "fine wine" worth of money just to run it over a 1080.

 

VEGA

IS

A

FLOP

 

I called out VEGA's performance back in February-March. I was right: the best Vega is within 10% of the 1080, and not even close to a 1080Ti. If my 295x2, a soon-to-be 4-year-old GPU core design, is still viable, then AMD has failed. No GPU, even in dual config, should ever be able to survive two generations (you can argue three, since the Polaris core is meant to compete at 290X levels but wasn't an actual "flagship" GPU design) AND a die shrink.

 

There is no justification for a failure of this monumental scale. NONE.

 

Volta will release around summer next year, with the announcement of the Titan XYZ or whatever around Q1-Q2. Volta is easily going to be another 15-20% flat improvement over Pascal, causing AMD to lag even further behind. AMD is now TWO full generations behind Nvidia, and unless Raja does miracles with Navi, it is going to keep on sucking.

 

GCN, flexible as it is, doesn't work above ~3000 shaders. It is amazing up until around that point; beyond it, leakage, pipeline saturation issues and scaling go out the window.


5 minutes ago, Prysin said:

SNIP

Though as JayzTwoCents pointed out, if you're a content creator, Vega might be an appealing choice, since all of its compute is enabled rather than cut down like on GTX GPUs; his point being that if you're a content creator, your PC draws a lot of power anyway, so that shouldn't be that much of a burden.

 

Have we gone back in time? The future was supposed to be fast and efficient... I mean, imagine the power draw of two Vega LC cards in a CF setup paired with an overclocked Intel i9-7900X or the monstrous i9-7980XE... I suppose EVGA and Corsair might start selling more of those 1500W PSUs in the near future :|



Some people may think the 56 is good value, but it really is not at $400; the 1070 is $350 MSRP while having a 35% smaller die. Nvidia can cut prices any time they feel like it.
Nvidia is waiting for the 28th to make a move, and I'm sure they are not gonna leave the Vega 56 alone.
Mining does not matter, since both cards are not gonna be at MSRP anyway.



4 minutes ago, Prysin said:

 

Thus we are talking a HUGE power saving. We argued over the 50W difference between a 390 and a 970; well, over a few years that amounted to only a few dollars. Now we are talking a 250-350W difference, which means there IS a legitimate loss of money when factoring in power usage; so much, in fact, that it is an outright disservice to suggest a VEGA GPU to anyone at all. Simply because in the long run it isn't going to be "fine wine", it is going to "waste" a "fine wine" worth of money just to run it over a 1080.

Unless the thing costs over 60 bucks a year to run, it's really not a concern.


3 minutes ago, Morgan MLGman said:

Though as JayzTwoCents pointed out, if you're a content creator, Vega might be an appealing choice, since all of its compute is enabled rather than cut down like on GTX GPUs; his point being that if you're a content creator, your PC draws a lot of power anyway, so that shouldn't be that much of a burden.

 

Have we gone back in time? The future was supposed to be fast and efficient... I mean, imagine the power draw of two Vega LC cards in a CF setup paired with an overclocked Intel i9-7900X or the monstrous i9-7980XE... I suppose EVGA and Corsair might start selling more of those 1500W PSUs in the near future :|

We have to be realistic here. A 50-100W difference ISN'T a big deal, even with lots of usage. But we are crossing into territory where you are looking at 2x the power draw, and then it does matter: we are now talking ACTUAL financial impact over time. And in theory, with fewer compute-enabled CUDA cores, the GTX cards should draw less during compute than during max gaming load, as they wouldn't need to power up all the shaders.

 

If you are doing video editing, there is ONLY one piece of software worth using with AMD, and that is Sony Vegas; but we also know from the past that a Polaris 480 is JUST as fast as a Fury X in that software, because it relies on the internal encode/decode engine, not the shader clusters.

 

If you actually use compute, you are better off with Nvidia, as almost all professional software is optimized for CUDA. Sure, you can use AMD's translator software, but all these things come at a cost: time, debugging, stability, testing.

 

All in all, Radeon VEGA has no place outside of the mining and "FirePro" markets. People who buy FirePro versions generally have AMD-optimized software, or are willing to take the time and effort to optimize for it.

 

 

Vega has a lot of promise, or rather, HAD. I have even less confidence in the APUs now... Leaked specs show that the Ryzen-based APUs will feature up to 700-something shaders. Well, if VEGA needs this much power, we are looking at 100W or higher APUs... pointless for small form factor, as you'd need a dual-tower air cooler OR an AIO to cool it quietly.


Just now, ravenshrike said:

Unless the thing costs over 60 bucks a year to run, it's really not a concern.

http://www.rapidtables.com/calc/electric/energy-consumption-calculator.htm

 

Assume the 1080 is around 200W, and VEGA is around 375-400W for the AIR-COOLED version (most sales and AIB cards).

 

If used 5 hours a day, we are looking at around 700kWh per year, vs 365kWh for the 1080....

The average cost per kWh in the US is 12 cents (as of 2011; just grabbed the first Google result).

That means the formula is: cents per kWh × kWh / 100 (100 cents per dollar).

The GTX 1080 costs $43.80 per year.

RX Vega costs $84 per year.

 

Over a 5-year period, the difference is $201.

 

Probably worse these days, as energy prices have probably gone up.
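(A minimal Python sketch of that arithmetic, assuming the wattages, 5 hours/day, and the 12 cents/kWh figure above; the post's 700kWh/$84 figure corresponds to roughly 383W:)

# Toy GPU energy-cost calculator using the post's assumptions:
# 5 hours/day under load, $0.12 per kWh (2011 US average).
HOURS_PER_DAY = 5
USD_PER_KWH = 0.12

def annual_cost(watts):
    """Yearly electricity cost in USD for a card drawing `watts` under load."""
    kwh_per_year = watts * HOURS_PER_DAY * 365 / 1000
    return kwh_per_year * USD_PER_KWH

gtx_1080 = annual_cost(200)  # ~$43.80/year, matching the figure above
rx_vega = annual_cost(400)   # ~$87.60/year at 400W (~$84 at ~383W)
print(f"GTX 1080: ${gtx_1080:.2f}/yr, RX Vega: ${rx_vega:.2f}/yr")
print(f"5-year difference: ${(rx_vega - gtx_1080) * 5:.2f}")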

 


 

12 minutes ago, Prysin said:

Over a 5-year period, the difference is $201.

So less than the cost of a cup of starschmucks coffee per month. Not something that the people building computers with these components will be worried about.


54 minutes ago, Prysin said:

If you actually use compute, you are better off with Nvidia, as almost all professional software is optimized for CUDA. Sure, you can use AMD's translator software, but all these things come at a cost: time, debugging, stability, testing.

There are still a lot of pro apps that work very well on AMD GPUs, and a few that work much better. They just aren't typically rendering applications; they're more down the CAD and fluid dynamics/math end.

 

The only time you actually buy an AMD GPU for workstation use, though, is if it's only going to be used for those professional applications and nothing else, and you're working on some seriously large projects. Most hobby/home study work actually runs fine on an Intel iGPU, and for a lot of other apps any dedicated GPU will do, and I do mean any piece-of-crap garbage, which is why there are cheap passively cooled workstation cards.

 

Yes, there are a ton of people out there who would actually benefit quite well from Vega, but Nvidia has a death grip on the professional sector, so it won't get a look-in this generation, even in the cases where it is clearly better in every way. The only people in the professional space using AMD are partners and places with direct relationships with AMD, etc.

 

Professionals aren't 'fanboys'; they simply don't care and buy what gets recommended to them by their vendors, or stick to trusted brands. Once AMD convinces hardware partners that their products are serious professional products with the engineering support behind them, they'll enter the market. Having an AMD part number on a CTO list for workstations and servers doesn't actually mean anyone configures their systems with them.

 

AMD EPYC will do more to make hardware partners look at all AMD products than anything else will; those are actually going to sell, get used and gain mindshare. I would say Threadripper will too, but only if it starts appearing in products like the HP Z600/Z800 and other custom workstation outfits.


4 hours ago, Morgan MLGman said:

Though with your current CPU, you might be better off sticking with the 1070, as it has lower CPU overhead than AMD GPUs (I haven't seen Vega tested in that regard yet, but it's almost certainly still the case).

Switching to a 1600 or 1700 before the end of the year, so it might not be an issue.



2 hours ago, 3DOSH said:

Some people may think the 56 is good value, but it really is not at $400; the 1070 is $350 MSRP while having a 35% smaller die. Nvidia can cut prices any time they feel like it.
Nvidia is waiting for the 28th to make a move, and I'm sure they are not gonna leave the Vega 56 alone.
Mining does not matter, since both cards are not gonna be at MSRP anyway.

MSRP has nothing to do with the current selling price. Show me a 1070 selling for that right now, or a 1060/RX 580/480/570/470.



1 minute ago, XenosTech said:

MSRP has nothing to do with the current selling price. Show me a 1070 selling for that right now, or a 1060/RX 580/480/570/470.

That wasn't my point. You are not gonna find either card at MSRP, looking at the 64's availability right now.



6 hours ago, ravenshrike said:

 

So less than the cost of a cup of starschmucks coffee per month. Not something that the people building computers with these components will be worried about.

That is 1/3 of the price of the GPU "wasted" over 5 years...

 

Just because I used to be able to buy a Titan X each month for lulz didn't mean I did it... money matters, and when you put things in perspective, VEGA just becomes incredibly unappealing to anyone other than the hardest of fanboys... and as @leadeater, @App4that, @Morgan MLGman, @Dackzy, @Notional, @LAwLz, @MageTank and many more can attest, I've been a staunch AMD "fanboy" for ages. I simply moved away from the fanboyism as AMD got progressively worse (which may coincide with the company becoming more and more progressive and "diversity"-fixated; although McDonald's and Bank of America have proven that diversity can increase profits massively, so time will tell, I guess).

 

Truth is, there are some benefits to what AMD is doing, but they are too late to the party, and they chose the wrong path when it comes to memory for the CONSUMER market. The 1080Ti already proves it can compete with HBM2 using its lightning-fast GDDR5X; but more importantly, GDDR5X is EASIER to get ahold of, and costs slightly less.

