
Top-tier RX Vega card rumored to be more expensive than the GTX 1080 Ti

16 minutes ago, valdyrgramr said:

Are there people on the forum who bought one? I've only seen people with more money than brains buy them, and they try to use 4 in one rig.

I bought an R9 295X2 before I had my water-cooled SLI Titan X rig. It was a good card in terms of FPS, but the coil whine, huge power draw, and temps were meh. If they make a new one with Vega it'll be terrible: a 120mm AIO wouldn't be enough to cool it, and it would require way too much power. Not to mention, if it costs more than a 1080 you'd be better off just getting a 1080 Ti and upgrading when the next gen comes out, instead of taking a massive hit in resale value once the bottom-tier cards match its single-GPU performance; it's also cheaper to CrossFire two single cards than to buy the dual-GPU one. Case in point: the R9 295X2 was something like $1,000-1,200 when it first released. I picked mine up new for $650, and back then two of its single-GPU counterparts were cheaper; I think you could buy three for the price of one R9 295X2. It wasn't as big of a kick in the dick as the Titan Z, I'll give it that. Lol, you can find a Titan Z for $600 now. RIP monies on resale.

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other stuff: red-sleeved cables, white LED lighting, 2 Noctua fans on the CPU cooler, and Be Quiet! PWM fans on the case.


11 hours ago, Okjoek said:

To be fair, I never had any interest invested in Vega aside from their integrated graphics for notebooks and barebones. Cards over 200 USD are just a waste of money IMO. I can wait for 4K to reach the mainstream. When the time comes to replace my RX 460 I might jump ship to Volta, since a GTX 1050 is already ~10-15% more power efficient. My biggest concern is giving up the Radeon control panel, ReLive, FreeSync and other software-side things I've grown accustomed to. You clearly have experience with what nVIDIA offers, so what would you tell someone with my concerns?

Nvidia has the same functionality or better at every step of the way, but sometimes (FreeSync vs. G-Sync being a good example) access to that functionality is more restrictive/expensive.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


10 hours ago, tom_w141 said:

2 GPUs linked via IF are on the roadmap for Navi, not Vega. If Vega were a performer, "leaks" would be appearing left, right and centre.

There's apparently some unused hardware on the Vega die. For all you know, Vega and Navi are the same die, just with one being connected to another chip via Infinity Fabric and the other isn't.


Thread cleaned.

 

Please keep the fanboy accusations out of it; as per the CS, they're not allowed:

Quote

No harassment, discrimination or abuse of any kind.

  • This includes insults, fanboy, or troll accusations

 

If you need help with your forum account, please use the Forum Support form!


4 hours ago, LAwLz said:

As far as I know, this has never been confirmed. Everything I can find speaks against it, unless you assume that AMD's driver team has just been sitting on their asses plucking lint out of their belly buttons for about half a year. If that's the case, then don't expect RX Vega to have any better drivers.

The entire concept of Vega FE running a "Fiji driver" is kind of ridiculous if you ask me, but I'd rather not go down that rabbit hole again.

 

Yes, TBR appears to be disabled, but you also have to remember that it is not a performance-increasing feature. If it increases performance at all, you should only expect maybe ~5%, and only in scenarios where there is a lot of pressure on the memory, like 4K or above.

TBR actually helps with VRAM bandwidth, going back to the Kyro II (pretty much the first GPU with proper TBR).
This article, while old, is still relevant with regard to TBR: http://www.anandtech.com/show/735/3
 

TL;DR: Tile-based rendering is the efficient approach, which can allow GPUs to reach their theoretical performance, while the current brute-force method relies on fast memory (and lots of it) and plenty of GPU power to spare. (TBR is also used in mobile GPUs and apparently the Xbox One X.)
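To put rough numbers on that bandwidth argument, here is a back-of-envelope sketch in Python. All of the figures (resolution, overdraw, buffer formats) are assumptions chosen for illustration, not measurements of any real GPU; the point is only that a tiler resolves pixels in on-chip memory and writes each one out once, while an immediate-mode renderer keeps touching the framebuffer in VRAM for every overlapping fragment.

# Back-of-envelope comparison of per-frame framebuffer traffic (illustrative numbers only).
# Texture and geometry traffic are ignored on both sides to keep the comparison simple.

WIDTH, HEIGHT = 3840, 2160        # assume a 4K frame
PIXELS = WIDTH * HEIGHT
OVERDRAW = 3.0                    # assumed average number of fragments shaded per pixel
DEPTH_BYTES = 4                   # 32-bit depth
COLOUR_BYTES = 4                  # 32-bit colour

def immediate_mode_mb():
    """'Brute force' immediate mode: every fragment touches depth and colour in VRAM."""
    per_fragment = DEPTH_BYTES * 2 + COLOUR_BYTES   # depth read + depth write + colour write
    return PIXELS * OVERDRAW * per_fragment / 1e6

def tile_based_mb():
    """TBR: each tile is resolved in on-chip memory and written out to VRAM once."""
    per_pixel = COLOUR_BYTES + DEPTH_BYTES          # one final write-out per pixel
    return PIXELS * per_pixel / 1e6

print(f"immediate mode: ~{immediate_mode_mb():.0f} MB of framebuffer traffic per frame")
print(f"tile based:     ~{tile_based_mb():.0f} MB of framebuffer traffic per frame")
# The saving is bandwidth, not shader throughput, which is why TBR mostly matters
# when memory bandwidth is the bottleneck (e.g. 4K and above).

Even with these made-up numbers the gap is a few-fold in framebuffer traffic, which lines up with the article's point: TBR is an efficiency play, not extra horsepower.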

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


30 minutes ago, RagnarokDel said:

There's apparently some unused hardware on the Vega die. For all you know, Vega and Navi are the same die, just with one being connected to another chip via Infinity Fabric and the other isn't.

No... just no. Navi will probably be the same architecture on the 7nm GloFo process, and AMD will probably go for power savings over a performance increase, which means 50-60% less power used for the same performance. Then they will cut the dies down to whatever makes the most sense from an IF-performance and yield perspective and toss them on a single board. So: a hell of a lot less power usage and, in theory at least, the ability to scale to as much performance as needed for any individual market segment, for much cheaper than going for large dies.


I was just reading about this on Reddit, and one commenter said that rumours from Finnish sources put it at around/almost 1,000 euros as well. A quick search reveals that the GTX 1080 Ti also goes for around 1,000€, so both hover around the thousand-euro mark, really.

 

If this is indeed the case, then AMD will have a huge problem, because nobody in their right mind would buy a Vega if they can get the Ti for the same price, or even lower. I hope there is something that we are missing (or AMD is leaving out), or that those prices are just completely off, but damn...


15 hours ago, VagabondWraith said:

Who in the hell is gonna buy a card that expensive except for the most die-hard AMD fankids? I mean, you need a 1500W PSU for that thing.

AMD users are like Apple users: no matter what they come out with, they will buy it.


3 hours ago, RagnarokDel said:

There's apparently some unused hardware on the Vega die.

Where did you hear that?

 

2 hours ago, Dabombinable said:

TBR actually helps with VRAM bandwidth, going back to the Kyro II (pretty much the first GPU with proper TBR).
This article, while old, is still relevant with regard to TBR: http://www.anandtech.com/show/735/3
 

TL;DR: Tile-based rendering is the efficient approach, which can allow GPUs to reach their theoretical performance, while the current brute-force method relies on fast memory (and lots of it) and plenty of GPU power to spare. (TBR is also used in mobile GPUs and apparently the Xbox One X.)

Yes, that's what I said. In scenarios where there is a memory-bandwidth limitation we might see a small increase in performance, but outside of that there most likely won't be any performance increase at all from TBR.

It is much more about efficiency than performance.


WTF? If this is true then they may as well just fucking quit. Vega is gonna be a joke. A bad joke. The only chance it has is if it offers 1080 performance at 1070 prices. If it achieves anything less than that, then AMD failed big time, because we sure as hell know Vega will not hold the performance crown, not even close.

 

Welp, if this turns out to be true then my upcoming rig will definitely be an R7 1700 + GTX 1070. I mean, I had pretty much already settled on that, but now I'll know for sure.

i7 2600k @ 5GHz 1.49v - EVGA GTX 1070 ACX 3.0 - 16GB DDR3 2000MHz Corsair Vengeance

Asus p8z77-v lk - 480GB Samsung 870 EVO w/ W10 LTSC - 2x1TB HDD storage - 240GB SATA SSD w/ W7 - EVGA 650w 80+G G2

3x 1080p 60hz Viewsonic LCDs, 1 glorious Dell CRT running at anywhere from 60hz to 120hz

Model M w/ Soarer's adapter - Logitech G502 - Audio-Technica M20x - Cambridge SoundWorks speakers w/ woofer

 


3 hours ago, RagnarokDel said:

There's apparently some unused hardware on the Vega die. For all you know, Vega and Navi are the same die, just with one being connected to another chip via Infinity Fabric and the other isn't.

It doesn't really matter, and it seems like a waste of fabrication and design cost to put something in that will stay disabled. If you're going to use it in Navi, then add it during Navi development.

 

If there is an IF interconnect on the die, it will get used on Vega products, most likely in servers, and it's tied in with HBCC, which more than likely was developed for HPC rather than for gaming. If all these technologies that AMD has been developing are actually what they say they are, I don't think we will be seeing them in use as advertised until the end of Vega's life at the soonest; what they are talking about is some pretty big design/philosophy changes.


1 hour ago, Max_Settings said:

AMD users are like Apple users: no matter what they come out with, they will buy it.

That's a stupid generalization... You probably meant "fanboys" instead of users. But even then, isn't that what all fanboys do?

 

i7 2600k @ 5GHz 1.49v - EVGA GTX 1070 ACX 3.0 - 16GB DDR3 2000MHz Corsair Vengeance

Asus p8z77-v lk - 480GB Samsung 870 EVO w/ W10 LTSC - 2x1TB HDD storage - 240GB SATA SSD w/ W7 - EVGA 650w 80+G G2

3x 1080p 60hz Viewsonic LCDs, 1 glorious Dell CRT running at anywhere from 60hz to 120hz

Model M w/ Soarer's adapter - Logitech G502 - Audio-Technica M20x - Cambridge SoundWorks speakers w/ woofer

 


5 hours ago, RagnarokDel said:

There's apparently some unused hardware on the Vega die. For all you know, Vega and Navi are the same die, just with one being connected to another chip via Infinity Fabric and the other isn't.

That's pretty much what I'm saying... Vega would be a game changer if it were Navi, but it's not, so it's basically DOA: an expensive 1080 right before Volta, lmao... Navi, however, if it really is two Vega-like GPUs on the same card linked via IF, would be very exciting.


2 hours ago, Max_Settings said:

AMD users are like Apple users: no matter what they come out with, they will buy it.

I have only ever had one AMD product, and that is because they released something worth buying. So there goes your generalisation right there.


20 hours ago, Masada02 said:

Technically, from start to finish: 3870x2, 4870x2, 6990, 7990, (8990 if you count OEMs), 295x2, Pro Duo (Fiji chips), Pro Duo (Polaris chips). 

If the pricing rumour isn't wrong, then a dual-GPU card sounds like a plausible possibility. Two chips each performing around a GTX 1080 on one card should cost more than a GTX 1080 Ti.

 

16 hours ago, Valentyn said:

HardOCP "Feels" test between Vega and GTX 1080Ti

 

 

The two systems were running 3440x1440, and one of the players questioned says that the 1080 Ti stayed between 60 and 120 FPS, while the Vega system sometimes dipped below 60 FPS when there was a lot of stuff on the screen. Neither of those performance descriptions sounds like it should, considering a GTX 1080 gets 80-140 FPS running DOOM at 3440x1440.

 

14 hours ago, LAwLz said:

That test should be seen as an insult to everyone who watches it.

Does HardOCP TV think their viewers are mentally handicapped? What a useless test.

It was all that AMD allowed them to do before taking the Vega GPU back home with them at the end of the day. Kyle says at the end of the video that it is entirely non-scientific and subjective, but it sounds like it was either that or nothing, and it's better than nothing. AMD had also wanted to compare it to a GTX 1080, but Kyle insisted on comparing it to a GTX 1080 Ti. I think of it more as a fun teaser, and also an example of how people's perceptions and game runs can differ.

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


2 hours ago, Delicieuxz said:

The two systems were running 3440x1440, and one of the players questioned says that the 1080 Ti stayed between 60 and 120 FPS, while the Vega system sometimes dipped below 60 FPS when there was a lot of stuff on the screen. Neither of those performance descriptions sounds like it should, considering a GTX 1080 gets 80-140 FPS running DOOM at 3440x1440.

6 people said there was no difference

3 people said RX Vega was better

1 person said the 1080 Ti was better

 

Still though, it feels like AMD is being very indirect.
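As a rough sanity check on how little a sample that small can show, here is a quick Python sketch treating the four testers who reported a preference as a two-sided sign test; the coin-flip null and the 3-1 split are just an illustration of the tallies above, not anything HardOCP computed.

from math import comb

# Of the 10 testers, 4 expressed a preference: 3 for RX Vega, 1 for the GTX 1080 Ti.
# If preferences were pure coin flips, how often would a split at least this lopsided occur?
n, k = 4, 3
p_value = sum(comb(n, i) for i in range(n + 1) if abs(i - n / 2) >= abs(k - n / 2)) / 2 ** n

print(f"two-sided p-value: {p_value:.3f}")   # 0.625
# A 3-1 split among four people is entirely consistent with random guessing,
# so the demo says nothing about which card was actually faster.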


50 minutes ago, Humbug said:

Still though, it feels like AMD is being very indirect.

They pulled this shit with Ryzen too, though. They kept saying 40% IPC improvement right up till launch day, and then they were like, nope, it's 52%, which they had to have known was the case months earlier. If it's actually at that price point, they've got to have something up their sleeve. As for the guy claiming he knew what the FPS was, I call bullshit, especially on FreeSync/G-Sync-enabled monitors.


If it's true I truly feel sorry for the AMD graphics division... I have not bought an AMD card, and unfortunately I have too much invested in Nvidia to consider the change; the only thing that would've piqued my interest would've been performance.


Here are some posts from Hard Forum about the RX Vega pricing rumour:

 

https://hardforum.com/threads/amd-radeon-rx-vega-prices-rumored-to-be-around-850.1940627/

 

Quote

Nvidia 1080s here in Belgium cost 799€ or more; that is $932. Keep that in mind with conversions.

 

Quote

Ex-Swede here.

 

Keep the following in mind.

 

Computer parts cost more in Sweden than they do here, so simply converting at the exchange rate is not going to get you there.

 

Current pricing for GTX 1080s goes up to 7,250 kr for a hybrid AIO model, but they do come cheaper than that as well. See here.

 

So, 7,000 kr still places it above most 1080s, but not by as much as simply converting it to USD at the current exchange rate would imply.

 

In that context, a rumoured Swedish pricing for Vega 64 of 7,000 SEK ($850 USD) doesn't sound extremely unreasonable, IMO, since it is sort-of competitive for the region, when factoring in the async-compute advantage Vega will have over the GTX 1080.

 

I would then expect North American prices to be comparable to local GTX 1080 prices.
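To make the "don't just run it through a currency converter" point concrete, here is a small Python sketch using the 7,000 kr figure above. The exchange rate is an assumed ballpark for the time; Swedish retail prices include 25% VAT, while US list prices are quoted before sales tax, so the pre-tax figure is the fairer one to compare against US MSRPs.

# Strip VAT from a Swedish retail price before comparing it to a US-style pre-tax price.
# The exchange rate is an assumption for illustration; Swedish VAT is 25%.

SEK_PER_USD = 8.2      # assumed SEK/USD rate around the time of the rumour
SWEDISH_VAT = 0.25     # VAT included in Swedish retail prices

def sek_retail_to_usd_pre_tax(price_sek: float) -> float:
    """Convert a VAT-inclusive SEK retail price to a pre-tax USD equivalent."""
    return price_sek / (1 + SWEDISH_VAT) / SEK_PER_USD

for label, sek in [("rumoured RX Vega 64", 7000), ("GTX 1080 hybrid AIO", 7250)]:
    print(f"{label}: {sek} kr incl. VAT ~= ${sek_retail_to_usd_pre_tax(sek):.0f} pre-tax")
# ~7,000 kr works out to roughly $680 before tax rather than the headline $850 from a
# naive conversion, which is why the expectation above is that North American pricing
# lands near local GTX 1080 prices.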

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


13 minutes ago, Delicieuxz said:

Here are some posts from Hard Forum about the RX Vega pricing rumour:

 

https://hardforum.com/threads/amd-radeon-rx-vega-prices-rumored-to-be-around-850.1940627/

 

 

 

In that context, a rumoured Swedish pricing for Vega 64 of 7,000 SEK ($850 USD) doesn't sound extremely unreasonable, IMO, since it is sort-of competitive for the region, when factoring in the async-compute advantage Vega will have over the GTX 1080.

 

I would then expect North American prices to be comparable to local GTX 1080 prices.

To be fair, it's the same price as their 1080 Ti. In France you can get the cheapest Ti at 750€ and the cheapest 1080 at 650€, so it's hard to compare.


I'm thinking distribution or availability problems: not uncommon for any new launch, but given the now-infamous HBM2 problems, Vega probably won't be at a decent price until, hopefully, the holiday season later this year.

-------

Current Rig

-------


If Vega ends up being anywhere near competitive with Nvidia's top end, that'll be a good thing in my eyes. So it being close to a 1080 is promising; hopefully the price is right.

CPU - Ryzen 7 3700X | RAM - 64 GB DDR4 3200MHz | GPU - Nvidia GTX 1660 ti | MOBO -  MSI B550 Gaming Plus


6 hours ago, Delicieuxz said:

The two systems were running 3440x1440, and one of the players questioned says that the 1080 Ti stayed between 60 and 120 FPS, while the Vega system sometimes dipped below 60 FPS when there was a lot of stuff on the screen. Neither of those performance descriptions sounds like it should, considering a GTX 1080 gets 80-140 FPS running DOOM at 3440x1440.

Idk, I kinda feel like "feels" tests don't tell much. They used a 100Hz monitor IIRC, so any card that can maintain 100 FPS will "feel" the same. I bet that if they used a GTX 1080, it would "feel" similar to the 1080 Ti/Vega.

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402

