Is Vega a failure?

MrAlbertrocks

The 1070 and the 1080 came out over a year ago, and now that Vega has been released the price/performance is virtually the same. So why exactly would anyone buy a Vega GPU? It came out far too late; you can't compare a new GPU with one that launched a year ago, find the performance is almost identical, and call it a win for AMD. It just doesn't make sense. And now Volta is just around the corner, so what is AMD thinking? Can someone please explain why they did this? I just can't understand AMD's decisions regarding Vega.


Vega 64? Eh, it's kinda DoA, primarily because it's basically a 1080 that uses more power.

Vega 56 is a different story. It seems to handily make the 1070 a bad buy, especially with its absurd pricing from sellers at the moment.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


It's mainly there to give AMD-only users an upgrade. The performance is lacking, but the limited-edition white Vega cards do look cool.

 

 


It's a good choice if you have a FreeSync monitor. FreeSync monitors are much cheaper than G-Sync ones, and people who want that buttery-smooth variable refresh rate technology will/should look into a Vega card.


6 minutes ago, MrAlbertrocks said:

The 1070 and the 1080 came out over a year ago, and now that Vega has been released the price/performance is virtually the same. So why exactly would anyone buy a Vega GPU? It came out far too late; you can't compare a new GPU with one that launched a year ago, find the performance is almost identical, and call it a win for AMD. It just doesn't make sense. And now Volta is just around the corner, so what is AMD thinking? Can someone please explain why they did this? I just can't understand AMD's decisions regarding Vega.

Volta won't launch for consumers this year, so I wouldn't call it "around the corner".

And Vega 56 competes well with the 1070 performance-wise, so that's at least something.

 

The thing is, for gaming Vega might look a bit disappointing, but it's an absolute monster when it comes to compute power.

There's a reason Nvidia suddenly released a big driver update that boosted compute performance a lot.

 

I think these benchmarks explain what I mean:

https://techgage.com/article/a-look-at-amds-radeon-rx-vega-64-workstation-compute-performance/2/

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


Vega 64, and especially the liquid-cooled version, is just silly: basically a team-red 1080 with HBM2 instead of GDDR5X that also uses far more power for nearly identical performance. Vega 56 is essentially in a spot similar to my own R9 Fury's. The Fury sat at the 980's price point and was pretty much the better buy as long as you didn't care about heat or power. Now Vega 56 is pretty handily slapping the 1070 around, and provided heat and power are of no concern, there's just no reason to buy a 1070.

CPU: INTEL Core i7 4790k @ 4.7Ghz - Cooling: NZXT Kraken X61 - Mobo: Gigabyte Z97X SLI - RAM: 16GB G.Skill Ares 2400mhz - GPU: AMD Sapphire Nitro R9 Fury 4G - Case: Phanteks P350X - PSU: EVGA 750GQ - Storage: WD Black 1TB - Fans: 2x Noctua NF-P14s (Push) / 2x Corsair AF140 (Pull) / 3x Corsair AF120 (Exhaust) - Keyboard: Corsair K70 Cherry MX Red - Mouse: Razer Deathadder Chroma

Bit of an AMD fan I suppose. I don't bias my replies to anything however, I just prefer AMD and their products. Buy whatever the H*CK you want. 

---QUOTE ME OR I WILL LIKELY NOT REPLY---

 

 


34 minutes ago, MrAlbertrocks said:

The 1070 and the 1080 came out over a year ago, and now that Vega has been released the price/performance is virtually the same. So why exactly would anyone buy a Vega GPU? It came out far too late; you can't compare a new GPU with one that launched a year ago, find the performance is almost identical, and call it a win for AMD. It just doesn't make sense. And now Volta is just around the corner, so what is AMD thinking? Can someone please explain why they did this? I just can't understand AMD's decisions regarding Vega.

Vega is a success... 

And it does NOT matter how much power it draws. Look at the performance, not the power figures; don't trust in-game benchmarks alone.

Volta is NOT around the corner. There has been no announcement of any consumer Volta GPU.

AMD targeted the people who buy mid-to-high-range GPUs like the GTX 1080 and 1070. Not many would pay $700 for a 1080 Ti. At least now you have a choice. Maybe a year late, but there IS competition.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


4 minutes ago, Vegetable said:

Vega 64, and especially the liquid-cooled version, is just silly: basically a team-red 1080 with HBM2 instead of GDDR5X that also uses far more power for nearly identical performance. Vega 56 is essentially in a spot similar to my own R9 Fury's. The Fury sat at the 980's price point and was pretty much the better buy as long as you didn't care about heat or power. Now Vega 56 is pretty handily slapping the 1070 around, and provided heat and power are of no concern, there's just no reason to buy a 1070.

Power consumption is an irrelevant metric for GPUs and CPUs; look at the performance they deliver. Electricity is too cheap for you to care about power consumption. If you bought a $500 card, then you can afford to pay even $20 on the power bill JUST for that PC, or just get a solar panel and feed it.



2 minutes ago, WhiteSkyMage said:

Power consumption is an irrelevant metric for GPUs and CPUs; look at the performance they deliver. Electricity is too cheap for you to care about power consumption.

If you take the glass-half-full approach, sure. But higher power consumption also translates into higher thermals and higher ambient temperatures. People who want things to stay cool and quiet while remaining comfortable in their own room won't go for the higher-TDP equivalent of a lower-TDP card that puts out just as many frames.

 

Gaming Rig
Spoiler

CPU: Intel i7-6850k @ 4.2GHz

GPU: 2x FE GTX 1080Ti

Memory: 16GB PNY Anarchy DDR4 3200MHz

Motherboard: ASRock X99 Extreme 4

 

Encoding Rig
Spoiler

CPU: Ryzen 7 1700 @ 3.7GHz

GPU: GTX 1050

Memory: 8GB Crucial Ballistix DDR4 2133MHz

Motherboard: Gigabyte AB350M-DS3H

 


Another one of those threads proving my point: just because Vega can only match a 1080 at best in gaming, everyone says it's a terrible card. A GPU doesn't have to be the fastest at gaming to be decent. Look at the decent performance and pricing, and the higher memory bandwidth; it's a good option for FreeSync. Last generation, AMD only managed mid-range performance with the RX 480/580.

A lot of people don't even consider that G-Sync monitors cost more than FreeSync ones, which makes AMD an even better deal when it comes to performance plus adaptive sync. Did anyone even bother to consider this before posting about why Vega exists?


Power consumption in the consumer desktop market is usually a non-issue. If power consumption really were an issue, you'd be buying a laptop, because those consume far less power than desktops.

For the business and HPC markets, sure, power is a problem, unless the performance Vega provides outclasses Pascal's. Sometimes more power consumption is fine if the work time can be cut: if you plot power consumed over time, the area under the curve matters more than the raw value.
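To make the area-under-the-curve point concrete, here's a minimal sketch with made-up numbers (the wattages and runtimes are purely illustrative, not measurements of any real card): a higher-power card that finishes a fixed job sooner can still consume less total energy.

```python
def energy_kwh(power_watts: float, hours: float) -> float:
    """Energy is the area under the power-time curve: kWh = W x h / 1000."""
    return power_watts * hours / 1000.0

# Hypothetical cards: the "hungry" one draws more power but finishes sooner.
hungry_card = energy_kwh(300, 1.0)    # 300 W for 1 hour  -> 0.30 kWh
frugal_card = energy_kwh(180, 2.0)    # 180 W for 2 hours -> 0.36 kWh

# Despite the lower power draw, the slower card uses more energy for the job.
print(hungry_card < frugal_card)  # True
```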


I'd mostly consider Vega a flop for two reasons.

  • It doesn't compete with the 1080 Ti, as most people wanted
  • The architecture is clearly being pushed to its limits, given its insanely high TDP

If you look at FPS per dollar, it gives Nvidia some competition, but most people who wanted 1070/1080 levels of performance have likely already upgraded.

 



8 minutes ago, WhiteSkyMage said:

Power consumption is an irrelevant metric for GPUs and CPUs; look at the performance they deliver. Electricity is too cheap for you to care about power consumption. If you bought a $500 card, then you can afford to pay even $20 on the power bill JUST for that PC, or just get a solar panel and feed it.

No, power consumption is a VERY relevant metric, my dude. My old SLI GTX 580 setup pulled 850 W from the wall when paired with my FX-8350 at 4.9 GHz, and the heat output was ABSURD. Seriously, my room gained 5-10 °F from running some games for more than an hour. Never again. My 4790K/Fury pulls 350-400 W from the wall and I get nearly double the FPS of my old rig. RIP to anyone trying to CrossFire Vega is all I have to say...



9 minutes ago, Frankenburger said:

I'd mostly consider Vega a flop for two reasons.

  • It doesn't compete with the 1080 Ti, as most people wanted
  • The architecture is clearly being pushed to its limits, given its insanely high TDP

If you look at FPS per dollar, it gives Nvidia some competition, but most people who wanted 1070/1080 levels of performance have likely already upgraded.

1. AMD did not say anything about Vega competing with the GTX 1080 Ti; they never compared it to the 1080 Ti in the first place, so it was not meant to be a 1080 Ti competitor.

People assumed that, since it came after the release of the 1080 Ti, it would compete with it. I am not blaming AMD for releasing it so late; I am blaming the fans for EXPECTING it to compete with the 1080 Ti.

2. We don't know whether the architecture is being pushed too hard or whether the architecture itself trades efficiency for compute performance. At the end of the day, Vega is just one GPU. AMD doesn't have the cash to make multiple dies like Nvidia and optimize each for efficiency, performance, or servers, so AMD has to deal with that at the software level.

 

5 minutes ago, Vegetable said:

No, power consumption is a VERY relevant metric, my dude. My old SLI GTX 580 setup pulled 850 W from the wall when paired with my FX-8350 at 4.9 GHz, and the heat output was ABSURD. Seriously, my room gained 5-10 °F from running some games for more than an hour. Never again. My 4790K/Fury pulls 350-400 W from the wall and I get nearly double the FPS of my old rig. RIP to anyone trying to CrossFire Vega is all I have to say...

I get this whenever I do video editing. My GTX 980 and my overclocked 5820K are enough to heat my room to summer temperatures. Do you know what I do? Open the window :) Yup....



2 minutes ago, WhiteSkyMage said:

I get this whenever I do video editing. My GTX 980 is enough to heat my room to summer temperatures. Do you know what I do? Open the window :) Yup....

Good luck with that when it's 90+ degrees outside ;)

 



I think it is a bit of a failure so far, as Vega 64 price gouging was in full effect at launch.

 

Vega 64 is NOT worth $599 for the base model.

 

We will have to wait and see how the AIB cards turn out and what the Vega 56 launch looks like.

Desktop:

AMD Ryzen 7 @ 3.9ghz 1.35v w/ Noctua NH-D15 SE AM4 Edition

ASUS STRIX X370-F GAMING Motherboard

ASUS STRIX Radeon RX 5700XT

Corsair Vengeance LPX 16GB (2x 8GB) DDR4 3200

Samsung 960 EVO 500GB NVME

2x4TB Seagate Barracuda HDDs

Corsair RM850X

Be Quiet Silent Base 800

Elgato HD60 Pro

Sceptre C305B-200UN Ultra Wide 2560x1080 200hz Monitor

Logitech G910 Orion Spectrum Keyboard

Logitech G903 Mouse

Oculus Rift CV1 w/ 3 Sensors + Earphones

 

Laptop:

Acer Nitro 5:

Intel Core I5-8300H

Crucial Ballistix Sport LT 16GB (2x 8GB) DDR4 2666

Geforce GTX 1050ti 4GB

Intel 600p 256GB NVME

Seagate Firecuda 2TB SSHD

Logitech G502 Proteus Spectrum

 

 


I haven't read all the comments, but here's what I think. I'd consider myself an Nvidia fanboy; I've owned multiple high-end Nvidia cards (the newest of the new from each generation, though not upgrading until the next one, so 580/680 to a 1080 now).

I am genuinely disappointed with Vega. Nvidia has been resting on its laurels since the 680 days, when they started to pull ahead with AMD about a generation behind (which is very true now), and they have been pumping up prices because they can. So last year they put out the 1080, and fanboy here was quick to spend about 6,000 DKK (nearly 950 USD) on a GTX 1080 Strix. Now they cost about 5,000 DKK, roughly 800 USD. A Vega 64 starts at 4,700 DKK (750 USD).

My issue is that for 50 bucks less you get year-old performance from the green team and higher temps because it's a reference card (and custom Vegas are going to cost about the same as custom 1080s). Then you might think: it's 50 bucks cheaper for more or less the same performance, good deal, right? NO! The Vega 64 has a TDP of 345 W compared to the 1080's 180 W. The rest of a beefy PC might draw another 100 W, so that's 445 W versus 280 W for the same performance; those 50 bucks will be gone in a couple of gaming sessions (at least in Denmark).

 

TL;DR:

For once I hoped AMD could give Nvidia some good competition, not just keep pace with year-old consumer products. For once I hoped Nvidia would shiver a bit, maybe pick up the pace, and deliver some good-performing cards at good prices (the GTX Titan Xp was about 15,000 DKK, roughly 2,370 USD).

That said, I am glad to see that Threadripper is making Intel revisit some of its latest decisions.


26 minutes ago, megaemil said:

I haven't read all the comments, but here's what I think. I'd consider myself an Nvidia fanboy; I've owned multiple high-end Nvidia cards (the newest of the new from each generation, though not upgrading until the next one, so 580/680 to a 1080 now).

I am genuinely disappointed with Vega. Nvidia has been resting on its laurels since the 680 days, when they started to pull ahead with AMD about a generation behind (which is very true now), and they have been pumping up prices because they can. So last year they put out the 1080, and fanboy here was quick to spend about 6,000 DKK (nearly 950 USD) on a GTX 1080 Strix. Now they cost about 5,000 DKK, roughly 800 USD. A Vega 64 starts at 4,700 DKK (750 USD).

My issue is that for 50 bucks less you get year-old performance from the green team and higher temps because it's a reference card (and custom Vegas are going to cost about the same as custom 1080s). Then you might think: it's 50 bucks cheaper for more or less the same performance, good deal, right? NO! The Vega 64 has a TDP of 345 W compared to the 1080's 180 W. The rest of a beefy PC might draw another 100 W, so that's 445 W versus 280 W for the same performance; those 50 bucks will be gone in a couple of gaming sessions (at least in Denmark).

 

TL;DR:

For once I hoped AMD could give Nvidia some good competition, not just keep pace with year-old consumer products. For once I hoped Nvidia would shiver a bit, maybe pick up the pace, and deliver some good-performing cards at good prices (the GTX Titan Xp was about 15,000 DKK, roughly 2,370 USD).

That said, I am glad to see that Threadripper is making Intel revisit some of its latest decisions.

How much is 1 kWh in Denmark? For me it's 0.25 CHF in Switzerland, which is about 0.26 USD. If I did the math, I'd be paying about $7/month for 2 hours of gaming a day (which is all I have time for these days), assuming my PC draws 470 W with an RX Vega. Hell, my lunch is 10 CHF a day, so there is no reason for me to care about power. If you are that afraid your power bill would go up by $50 if you bought an RX Vega, then by all means go with Nvidia. I don't see a problem with paying $10 a month for Vega's high power consumption, because $10 is NOTHING.
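For anyone who wants to check that arithmetic, here's a quick sketch using the poster's own assumed figures (470 W draw, 2 hours a day, 0.26 USD/kWh, a 30-day month); none of these are measured values.

```python
def monthly_cost_usd(watts: float, hours_per_day: float,
                     usd_per_kwh: float, days: int = 30) -> float:
    """Monthly electricity cost: (W x h/day x days / 1000) kWh times the rate."""
    kwh = watts * hours_per_day * days / 1000.0
    return kwh * usd_per_kwh

# 470 W, 2 h/day, 0.26 USD/kWh over a 30-day month.
print(round(monthly_cost_usd(470, 2, 0.26), 2))  # 7.33 -> roughly the quoted $7/month
```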



8 minutes ago, WhiteSkyMage said:

If I did the math, I'd be paying $7/month for 2 hours of gaming a day, assuming my PC draws 470 W with Vega. Hell, my lunch is 10 CHF a day, so there is no reason for me to care about power.

2 hours of gaming a day is pretty much nothing, which is why your figure is so low. I'd venture to say that most gamers play about twice that much on average. I typically only play games 3 days out of the week, and I'm still pulling 30 hours of game time a week.

 



3 minutes ago, WhiteSkyMage said:

How much is 1 kWh in Denmark? For me it's 0.25 CHF in Switzerland, which is about 0.26 USD. If I did the math, I'd be paying about $7/month for 2 hours of gaming a day (which is all I have time for these days), assuming my PC draws 470 W with an RX Vega. Hell, my lunch is 10 CHF a day, so there is no reason for me to care about power. If you are that afraid your power bill would go up by $50 if you bought an RX Vega, then by all means go with Nvidia. I don't see a problem with paying $10 a month for Vega's high power consumption, because $10 is NOTHING.

Your personal opinion and preference don't change the fact that a higher TDP is a big downside for some people. The reality is that Vega came out a whole year after the 1070/1080, only to deliver higher temperatures and a higher TDP for the same price and performance. Even if it had a better TDP and better temps, I would still consider it a failure; did people really wait a whole year for performance they could have gotten a year earlier? The only reason I would pick up an RX Vega is if I were planning on getting a FreeSync monitor, but the recent failures of AMD's GPUs have already forced me to get a G-Sync monitor. (What's the point of 144 Hz FreeSync monitors if their GPUs can't reach those framerates?)


Just now, MrAlbertrocks said:

The only reason I would pick up an RX Vega is if I were planning on getting a FreeSync monitor. But the recent failures of AMD's GPUs have already forced me to get a G-Sync monitor.

There's also the fact that there's no quality standard for FreeSync; FreeSync monitors range from garbage to almost as good as G-Sync. Not that I'm saying there aren't good FreeSync displays out there, but FreeSync is buyer-beware territory, and that's probably not going to change for a while.

 



1 hour ago, Frankenburger said:

If you take the glass-half-full approach, sure. But higher power consumption also translates into higher thermals and higher ambient temperatures. People who want things to stay cool and quiet while remaining comfortable in their own room won't go for the higher-TDP equivalent of a lower-TDP card that puts out just as many frames.

Also consider the electric bill in certain countries. Where I live, electricity is not cheap in the summer, and my bill averaged over USD $150 through June, July, and part of August. And that's with six computers (two AMD FX-8350s, one AMD APU (an A8, I think), one Intel Core i7-6700K overclocked to 4.4 GHz at 1.32 V, my Intel Core i7-5930K machine with a GTX 1080 Ti all at stock, and one old Intel box), two refrigerators, and some fans running constantly.

So power consumption is not a useless metric. It may not be to you, but to a lot of others it is a useful metric to look at, especially if you're trying to maximize performance per watt.
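If performance per watt is the metric you care about, it's easy to compute from review numbers. The FPS and wattage figures below are made up purely for illustration, not taken from any benchmark:

```python
def fps_per_watt(avg_fps: float, board_power_watts: float) -> float:
    """Simple efficiency metric: frames per second delivered per watt drawn."""
    return avg_fps / board_power_watts

# Two hypothetical cards with identical FPS but different power draw.
card_a = fps_per_watt(100, 180)   # lower-power card
card_b = fps_per_watt(100, 295)   # higher-power card, same frames

print(card_a > card_b)  # True: same frames for less power is more efficient
```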

RIGZ

Spoiler

Starlight (Current): AMD Ryzen 9 3900X 12-core CPU | EVGA GeForce RTX 2080 Ti Black Edition | Gigabyte X570 Aorus Ultra | Full Custom Loop | 32GB (4x8GB) Dominator Platinum SE Blackout #338/500 | 1TB + 2TB M.2 NVMe PCIe 4.0 SSDs, 480GB SATA 2.5" SSD, 8TB 7200 RPM NAS HDD | EVGA NU Audio | Corsair 900D | Corsair AX1200i | Corsair ML120 2-pack 5x + ML140 2-pack

 

The Storm (Retired): Intel Core i7-5930K | Asus ROG STRIX GeForce GTX 1080 Ti | Asus ROG RAMPAGE V EDITION 10 | EKWB EK-KIT P360 with Hardware Labs Black Ice SR2 Multiport 480 | 32GB (4x8GB) Dominator Platinum SE Blackout #338/500 | 480GB SATA 2.5" SSD + 3TB 5400 RPM NAS HDD + 8TB 7200 RPM NAS HDD | Corsair 900D | Corsair AX1200i + Black/Blue CableMod cables | Corsair ML120 2-pack 2x + NB-BlackSilentPro PL-2 x3

STRONK COOLZ 9000

Spoiler

EK-Quantum Momentum X570 Aorus Master monoblock | EK-FC RTX 2080 + Ti Classic RGB Waterblock and Backplate | EK-XRES 140 D5 PWM Pump/Res Combo | 2x Hardware Labs Black Ice SR2 480 MP and 1x SR2 240 MP | 10X Corsair ML120 PWM fans | A mixture of EK-KIT fittings and EK-Torque STC fittings and adapters | Mayhems 10/13mm clear tubing | Mayhems X1 Eco UV Blue coolant | Bitspower G1/4 Temperature Probe Fitting

DESK TOIS

Spoiler

Glorious Modular Mechanical Keyboard | Glorious Model D Featherweight Mouse | 2x BenQ PD3200Q 32" 1440p IPS displays + BenQ BL3200PT 32" 1440p VA display | Mackie ProFX10v3 USB Mixer + Marantz MPM-1000 Mic | Sennheiser HD 598 SE Headphones | 2x ADAM Audio T5V 5" Powered Studio Monitors + ADAM Audio T10S Powered Studio Subwoofer | Logitech G920 Driving Force Steering Wheel and Pedal Kit + Driving Force Shifter | Logitech C922x 720p 60FPS Webcam | Xbox One Wireless Controller

QUOTES

Spoiler

"So because they didn't give you the results you want, they're biased? You realize that makes you biased, right?" - @App4that

"Brand loyalty/fanboyism is stupid." - Unknown person on these forums

"Assuming kills" - @Moondrelor

"That's not to say that Nvidia is always better, or that AMD isn't worth owning. But the fact remains that this forum is AMD biased." - @App4that

"I'd imagine there's exceptions to this trend - but just going on mine and my acquaintances' purchase history, we've found that budget cards often require you to turn off certain features to get slick performance, even though those technologies are previous gen and should be having a negligible impact" - ace42

"2K" is not 2560 x 1440 


Can't say yet. I think I commented months ago that AMD didn't need to beat Nvidia; they just needed to find their niche. They did a really good job with their CPU lineup, and despite what benchmarks may show, they won't be able to meet demand because of this crazy market. It's kind of a win-win for them right now, even if Vega isn't spectacular.


6 minutes ago, Biggerboot said:

Can't say yet. I think I commented months ago that AMD didn't need to beat Nvidia; they just needed to find their niche. They did a really good job with their CPU lineup, and despite what benchmarks may show, they won't be able to meet demand because of this crazy market. It's kind of a win-win for them right now, even if Vega isn't spectacular.

That's the thing, though: Vega doesn't really have a niche. There's nothing about Vega that makes it stand out from the competition as a gaming product. Also, the fact that it's bundled with a game that's making retailers sell it for over MSRP has a lot of people scratching their heads. The only reason to go with Vega, considering how old the 1070 and 1080 are, is if someone really, REALLY wants FreeSync over G-Sync.

 



3 minutes ago, Frankenburger said:

That's the thing, though: Vega doesn't really have a niche. There's nothing about Vega that makes it stand out from the competition as a gaming product. Also, the fact that it's bundled with a game that's making retailers sell it for over MSRP has a lot of people scratching their heads. The only reason to go with Vega, considering how old the 1070 and 1080 are, is if someone really, REALLY wants FreeSync over G-Sync.

TBH I think that's been the case for a while, minus one card each generation (R9 390 / RX 480). It could be the Vega 56 this time, but Nvidia's architecture has just always been ahead.

Again, we'll see.

