Technical issues holding back custom RX Vega cards

Questycasos

http://www.tomshardware.com/news/amd-vega-custom-graphics-cards-problems,35514.html

 

It would appear that Vega's issues aren't over just yet.

 

Quote

AMD’s Vega is the first high-end GPU to come out of Team Red in two years, so you would think that its board partners would be quick to jump in with new custom products. However, that doesn’t seem to be the case. We’re hearing from sources that AMD’s AIB partners have found several issues with Vega chips that prevent them from creating custom cards for now.

 

As is often AMD’s strategy, the company rolled out Vega in a staggered release. It first launched the professional workstation-level and air-cooled Vega Frontier Edition in early July, followed by the liquid-cooled version a couple of weeks later. In August, AMD and its partners released the Radeon RX Vega 64 gaming cards, followed shortly after that by the RX Vega 56 cards. All of them were reference designs, and AMD promised custom third-party implementations of Vega later in Q3 ‘17 or early in Q4.

 

Indeed, most of AMD's partners are working on custom Vega cards--but not all--and for those who are, there seem to be issues.

Exclusive board partners are making progress, though facing delays

 

Quote

XFX and Sapphire confirmed that they both have custom boards in the works, but they could not say when they might be ready. PowerColor said that it will have its own custom cards, with mass production scheduled for the beginning of November, but it hasn't yet received the DRAM it needs. (VisionTek didn’t immediately reply to our queries about their future offerings.)

While others aren't as lucky

Quote

AMD also has partnerships with Asus, Gigabyte, and MSI to build Radeon graphics cards, but these three companies don’t have exclusive deals with AMD. As such, they aren’t driven by necessity and have the luxury of choosing which components to support. We spoke with all three companies, and their responses indicated that their support for the Vega architecture is less definitive than that of AMD’s exclusive partners.

ASUS is moving ahead with its plans, though it's delaying its cards until October. Gigabyte and MSI, on the other hand...

Quote

Although a Gigabyte rep said it’s likely the company will produce a custom Vega card, they would not or could not confirm it with 100% certainty. If it does, we likely won’t see it until the end of the year, or later.

MSI’s response surprised us. The company traditionally offers re-engineered graphics cards with custom PCB designs for all high-end GPU platforms, but it appears to be skipping the Vega lineup. A company representative told us that MSI “won’t be making a custom card anytime soon,” but could offer no additional information.

And the problems seem to be quite deep, at least for now.

Quote

So what gives? Sources tell us that there is too much variance in the quality of the chips AMD is providing. AIB partners are unable to figure out a stable overclocked GPU frequency that works for all cards, and therefore cannot provide any sort of warranty on factory-tuned cards. Further, there continue to be discrepancies between the temperatures the GPU is reporting and what AIB partners are finding in actual measurements. This is true of both the GPU itself and the capacitors below it. We have some follow-up testing that will reveal more about these issues.

 

Finally, as we reported last month, there have been issues due to the different packages for Vega, making it difficult to efficiently mass produce custom Vega cards. We were seeing Vega with molded and unmolded packages, which we noted impacted package height. We were even seeing a third package--we assume using SK hynix HBM. As we wrote then:

AIB partners face new challenges, since the HBM2 is about 40 μm lower in the unmolded packages, and the third variant's underfill obviously differs somewhat.

For one, mass production and the use of a common cooler across several models must take the applied heat-conducting material into account. The thickness must be optimized for the unmolded packages, the viscosity must be high enough, and the resulting contact pressure can't damage anything after the cooler is bolted together.
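For a sense of scale, here's a quick back-of-the-envelope on what that extra 40 μm of paste costs thermally. It's a minimal sketch assuming one-dimensional conduction, and every value in it is an assumption for illustration, not a measurement:

# Conduction through the TIM layer: R = t / (k * A)
t_extra = 40e-6   # m, the extra gap on unmolded packages
k_paste = 5.0     # W/(m*K), assumed decent thermal paste
area    = 486e-6  # m^2, roughly a Vega 10-sized footprint (assumed)
power   = 295.0   # W, RX Vega 64 board power, as an upper bound on die heat

r_extra = t_extra / (k_paste * area)  # K/W added by the thicker layer
print(f"added resistance: {r_extra * 1000:.0f} mK/W")
print(f"added temperature rise at {power:.0f} W: {r_extra * power:.1f} K")

Roughly 5 K of extra rise in the worst case, which on a card already running close to its limits is exactly the kind of margin a one-size-fits-all cooler has to budget for.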

And to conclude

Quote

Generally speaking, AIB partners seem optimistic about shipping Vega cards in 2017, and some pointed out that custom Polaris cards came a couple months after the reference card launch. By that timing, we should be seeing some custom Vega cards at the end of September, or at least in October. We’re not getting a strong feeling that will be the case, however.

We’ve reached out to AMD for comment, but the company didn’t immediately reply.
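The chip-variance point is the one that stands out to me. Here's a toy sketch of why a wide quality spread kills factory overclocks; the spread and clocks are my made-up numbers, not anything from AMD:

import random

# Each chip's maximum stable clock varies around some mean. The factory
# OC a vendor can warranty is set by the WORST chip it ships.
random.seed(1)

REFERENCE_MHZ = 1546  # RX Vega 64 air-cooled boost clock, for scale

def max_stable_clocks(n, mean=1650.0, sigma=40.0):
    # Assumed normal spread of per-chip stable clocks (invented numbers)
    return [random.gauss(mean, sigma) for _ in range(n)]

batch = max_stable_clocks(10_000)
warrantable_oc = min(batch)  # every shipped card must hold this clock

print(f"best chip:  {max(batch):.0f} MHz")
print(f"worst chip: {warrantable_oc:.0f} MHz")
print(f"headroom over reference: {warrantable_oc - REFERENCE_MHZ:+.0f} MHz")

With a spread like that, the best chips overclock beautifully, but the one clock every card can be warranted at lands below the reference boost--so there's no factory OC to sell.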

I don't have a horse in this race, folks. I was hoping that Vega would come out swinging so that I could nab a nice upgrade for my 1060 on the cheap. But this whole mess just has me baffled at how it could have gone so wrong. It'll probably clear up, but if that doesn't happen soon, I feel like the next hype cycle for Volta is going to bury any traction Vega could gain once prices settle.

RIP Vega even more.


14 minutes ago, Questycasos said:

But this whole mess just has me baffled at how it could have gone so wrong

I wouldn't go so far as to say it's "bad", but it's not good either. I'm holding out for a custom-cooled V56, personally. Guess I'll be waiting a little longer. Ah well, I'm broke right now anyway.


I too was hoping Vega would hit it out of the park :( so we'd have a repeat of the price drop that was the GTX 285. (The GTX 280 launched at $649, the GTX 285 at $359; I was hoping the GTX 2080 (Ti?) would launch for something like $329.) Other things would be nice too, like Nvidia coming out with a GT 2010 that would blow SLI 1080 Tis out of the water. :P

 

I'm personally comparing Vega with, for example, Bulldozer and Fermi. :/ At least in the power consumption department, and maybe also in performance versus the competition.

 

 


24 minutes ago, PianoPlayer88Key said:

I'm personally comparing Vega with, for example, Bulldozer and Fermi. :/ At least in the power consumption department, and maybe also in performance versus the competition.

 

 

You don't need to say "personally"; that's just the reality.

There's more performance left in Vega if you increase the power limit, but that would push it past the 300W mark, which is basically not done because it would hurt the product a LOT.

The GTX 480 (Fermi) also had a 300W limit, and with that card both Nvidia and AMD basically decided that's the ultimate limit. Nobody wants to go over 300W because, regardless of how good the product actually is, it would become the new "hottest" card ever and inherit the bad reputation the GTX 480 has.

The reason for the high power consumption is the same for all three technologies (Vega, Bulldozer, and Fermi): competition. All three had a very strong competitor to battle against, forcing them to push their tech to the limit, wiping out efficiency and sending power consumption through the roof.
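To put some rough numbers on it (the connector limits are from the PCIe spec; the clock and voltage figures are invented but typical-looking):

PCIE_SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150  # W each, per the PCIe spec

# RX Vega 64 reference uses the slot plus two 8-pin connectors.
physical_budget = PCIE_SLOT + 2 * EIGHT_PIN
print(f"physical budget: {physical_budget} W (convention caps TDP at 300 W)")

# Dynamic power scales roughly with f * V^2, and higher clocks need more
# voltage, so the last few percent of frequency cost disproportionate power.
def relative_power(clock_ratio, voltage_ratio):
    return clock_ratio * voltage_ratio ** 2

print(f"+10% clock at +8% voltage -> {relative_power(1.10, 1.08):.2f}x power")

That's nearly 30% more power for 10% more clock, which is why the last bit of headroom is so expensive.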


1 hour ago, samcool55 said:

You don't need to say "personally"; that's just the reality.

There's more performance left in Vega if you increase the power limit, but that would push it past the 300W mark, which is basically not done because it would hurt the product a LOT.

The GTX 480 (Fermi) also had a 300W limit, and with that card both Nvidia and AMD basically decided that's the ultimate limit. Nobody wants to go over 300W because, regardless of how good the product actually is, it would become the new "hottest" card ever and inherit the bad reputation the GTX 480 has.

The reason for the high power consumption is the same for all three technologies (Vega, Bulldozer, and Fermi): competition. All three had a very strong competitor to battle against, forcing them to push their tech to the limit, wiping out efficiency and sending power consumption through the roof.

That gets me thinking about something ...

 

I really wish we could again have flagship GPUs (__80 Ti / __90 Ti / Titan / etc. class) with power consumption low enough that they could ship with heatsinks and fans like the ones on these cards. :)

 

[Images: an ELSA Erazor X2, an Asus AGP-V3800 32M TV (TNT2 Ultra), and an ATI Rage LT Pro AGP, all with small passive heatsinks or tiny fans]

 

 

Also, back in the days when the upper-end 486s and Pentiums were coming out, I seem to remember talk about wishing we could avoid putting heatsinks and fans on those CPUs, or something like that.


1 hour ago, samcool55 said:

You don't need to say "personally"; that's just the reality.

There's more performance left in Vega if you increase the power limit, but that would push it past the 300W mark, which is basically not done because it would hurt the product a LOT.

The GTX 480 (Fermi) also had a 300W limit, and with that card both Nvidia and AMD basically decided that's the ultimate limit. Nobody wants to go over 300W because, regardless of how good the product actually is, it would become the new "hottest" card ever and inherit the bad reputation the GTX 480 has.

The reason for the high power consumption is the same for all three technologies (Vega, Bulldozer, and Fermi): competition. All three had a very strong competitor to battle against, forcing them to push their tech to the limit, wiping out efficiency and sending power consumption through the roof.

 

The problem for AMD, though, is that Fermi was at the start of its microarchitecture evolution (it was used alongside the Kepler architecture in the 600, 700, and 800 series) and Nvidia managed to move forward from there, while Vega, unfortunately, is at the end of the GCN evolution, and it seems AMD has nowhere to go with it outside of higher power draw.

 

I think this is why so many people didn't find it too hard to predict how well Vega was going to perform.


3 hours ago, Questycasos said:

PowerColor said that it will have its own custom cards, with mass production scheduled for the beginning of November, but it hasn't yet received the DRAM it needs.

I'm definitely missing something here.


16 minutes ago, mr moose said:

 

The problem for AMD, though, is that Fermi was at the start of its microarchitecture evolution (it was used alongside the Kepler architecture in the 600, 700, and 800 series) and Nvidia managed to move forward from there, while Vega, unfortunately, is at the end of the GCN evolution, and it seems AMD has nowhere to go with it outside of higher power draw.

 

I think this is why so many people didn't find it too hard to predict how well Vega was going to perform.

AMD's actual problem is the scaling of the GCN uArch. Nvidia gets some brilliant performance scaling with Maxwell/Pascal because they actually designed their "big chip" first, and everything is cut down from there. They also have some gaming advantages due to some subtle parts of the uArch and the massive amount of money they throw at drivers. (The Intel + Nvidia gaming performance advantage is mostly due to that driver team. They've worked magic there.)

 

Given some of the slides out of AMD's Financial Analyst Day some months ago, it's also pretty clear what AMD is up to. They've ceded the high-end gaming GPU market to Nvidia. They'll still put out cards there, but the ~$200 market is where they'll mostly stay, because they can take out Nvidia in the rest of the market. The Vega SSG is, far and away, the most interesting part of the entire Vega lineup. It's a $7,000 card with 2TB of NVMe storage on it. Why? High-res video editing. Those professional-level cards are all margin, which is where AMD will aim.

 

It makes sense. GCN is the best iGPU uArch around. At low clocks and on a CPU package, it's phenomenal. The problem, ever since Bulldozer, has been that the CPU side was just crap at low clocks. If the Bulldozer disaster hadn't happened, you'd have had great "gaming" laptops 3 years ago. (There's a reason Raven Ridge is really, really interesting for some of us. It's going to be a real game-changer for AMD in the mobile space.) But GCN doesn't scale up to the high-end gaming GPU market. For compute? Sure, it's great, which is exactly where AMD is going with it.

 

The next step for AMD is really going further with Infinity Fabric. Take a 75W GPU, say as an RX 560/570 replacement. Great! But what happens when they put four of those on the same card? That's where Navi is going. AMD will be able to simply put more die area on a compute card than Nvidia can, at something like a third of the cost to produce.
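The cost side of that is just die-yield math. A rough sketch with a standard Poisson yield model; the defect density and wafer cost are my assumptions, not AMD's figures:

import math

DEFECT_DENSITY = 0.2                    # defects per cm^2 (assumed)
WAFER_AREA = math.pi * (30.0 / 2) ** 2  # cm^2, a 300 mm wafer
WAFER_COST = 6000.0                     # USD per wafer (assumed)

def cost_per_good_die(die_area_cm2):
    yield_rate = math.exp(-die_area_cm2 * DEFECT_DENSITY)  # Poisson model
    dies_per_wafer = WAFER_AREA / die_area_cm2             # ignores edge loss
    return WAFER_COST / (dies_per_wafer * yield_rate)

monolithic = cost_per_good_die(5.0)       # one 500 mm^2 die
multi_die = 4 * cost_per_good_die(1.25)   # four 125 mm^2 dies, same silicon
print(f"monolithic: ${monolithic:.0f}  4x small: ${multi_die:.0f}  "
      f"ratio: {monolithic / multi_die:.1f}x")

Same total silicon, but the big die pays an exponential yield penalty, which is the whole multi-die pitch.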


2 minutes ago, Taf the Ghost said:

AMD's actual problem is the scaling of the GCN uArch. Nvidia gets some brilliant performance scaling with Maxwell/Pascal because they actually designed their "big chip" first, and everything is cut down from there. They also have some gaming advantages due to some subtle parts of the uArch and the massive amount of money they throw at drivers. (The Intel + Nvidia gaming performance advantage is mostly due to that driver team. They've worked magic there.)

 

Given some of the slides out of AMD's Financial Analyst Day some months ago, it's also pretty clear what AMD is up to. They've ceded the high-end gaming GPU market to Nvidia. They'll still put out cards there, but the ~$200 market is where they'll mostly stay, because they can take out Nvidia in the rest of the market. The Vega SSG is, far and away, the most interesting part of the entire Vega lineup. It's a $7,000 card with 2TB of NVMe storage on it. Why? High-res video editing. Those professional-level cards are all margin, which is where AMD will aim.

 

It makes sense. GCN is the best iGPU uArch around. At low clocks and on a CPU package, it's phenomenal. The problem, ever since Bulldozer, has been that the CPU side was just crap at low clocks. If the Bulldozer disaster hadn't happened, you'd have had great "gaming" laptops 3 years ago. (There's a reason Raven Ridge is really, really interesting for some of us. It's going to be a real game-changer for AMD in the mobile space.) But GCN doesn't scale up to the high-end gaming GPU market. For compute? Sure, it's great, which is exactly where AMD is going with it.

 

The next step for AMD is really going further with Infinity Fabric. Take a 75W GPU, say as an RX 560/570 replacement. Great! But what happens when they put four of those on the same card? That's where Navi is going. AMD will be able to simply put more die area on a compute card than Nvidia can, at something like a third of the cost to produce.

That's just a fancy way of saying what I said. Except I don't think all of Nvidia's performance advantage comes from better drivers.


1 minute ago, mr moose said:

That's just a fancy way of saying what I said. Except I don't think all of Nvidia's performance advantage comes from better drivers.

That extra 5-10% is their driver team, though. And that matters, as that extra bit is what has pushed AMD to overvolt its recent cards to meet performance targets.

 

There's also the issue of the Sandy Bridge + Nvidia combo, which can simply get more FPS out of a single thread because of some supremely impressive driver tricks. (It's something that's come up in all of the post-Ryzen-launch testing. Nvidia found ways to just squeeze more out of the Intel uArch.)


14 minutes ago, Taf the Ghost said:

That extra 5-10% is their driver team, though. And that matters, as that extra bit is what has pushed AMD to overvolt its recent cards to meet performance targets.

 

There's also the issue of the Sandy Bridge + Nvidia combo, which can simply get more FPS out of a single thread because of some supremely impressive driver tricks. (It's something that's come up in all of the post-Ryzen-launch testing. Nvidia found ways to just squeeze more out of the Intel uArch.)

Nvidia optimizing for Intel processors sounds interesting but I have never seen any solid evidence for this.

Where did you learn about it?


50 minutes ago, LAwLz said:

Nvidia optimizing for Intel processors sounds interesting but I have never seen any solid evidence for this.

Where did you learn about it?

It was something that kept cropping up in all of the Ryzen testing, especially by the time the Ryzen 5 dropped and the early motherboard issues were worked out.

 

And my longer reply got eaten. Doh.

 

Shorter version: think of game performance as A + B + C + D. Nvidia found a way to make the "C" section larger on Intel's uArch. It would take some intense testing to really tease out where this performance lands in total effect, but in games that Ryzen has tested badly in, Intel performance doesn't scale quite as you'd expect. It points to a fixed-portion + scaling-portion differential between the AMD and Intel platforms.

 

It was something I noticed after seeing so many charts and trying to figure out what was going on. It's a compliment to Nvidia's driver team, as they were able to give end users more performance for their dollar.
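If you want the shape of that argument in one place, here's a toy model; every number in it is invented to show the structure, not measured:

# Frame time = a fixed portion (driver/API overhead, game logic) + a
# portion that scales with hardware speed. FPS = 1000 / frame_time_ms.
def fps(fixed_ms, scaling_ms, speedup=1.0):
    return 1000.0 / (fixed_ms + scaling_ms / speedup)

# A driver that shaves 1 ms off the fixed portion is the "extra 5-10%":
print(f"baseline driver: {fps(4.0, 8.0):.0f} FPS")
print(f"leaner driver:   {fps(3.0, 8.0):.0f} FPS")

# And why faster hardware stops helping: the fixed portion dominates.
for s in (1.0, 1.5, 2.0):
    print(f"{s:.1f}x speedup -> {fps(4.0, 8.0, s):.0f} FPS")

If the fixed portion differs between the Intel and AMD platforms, the scaling curves won't line up, which is exactly the pattern in those charts.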

 


The RX Vega launch has really been one giant clusterfuck.

 

The question I'd have is why there's such a huge discrepancy between supposedly identical chips. Is it a design problem, a fab problem, a firmware problem? If it's not easily fixable, does this mean we won't be seeing factory-overclocked Vega GPUs at all?

 

RTG really dropped the ball with Vega, and after Ryzen was such a successful launch too.


3 minutes ago, Taf the Ghost said:

It was something that kept cropping up in all of the Ryzen testing, especially by the time the Ryzen 5 dropped and the early motherboard issues were worked out.

 

And my longer reply got eaten. Doh.

 

Shorter version: think of game performance as A + B + C + D. Nvidia found a way to make the "C" section larger on Intel's uArch. It would take some intense testing to really tease out where this performance lands in total effect, but in games that Ryzen has tested badly in, Intel performance doesn't scale quite as you'd expect. It points to a fixed-portion + scaling-portion differential between the AMD and Intel platforms.

 

It was something I noticed after seeing so many charts and trying to figure out what was going on. It's a compliment to Nvidia's driver team, as they were able to give end users more performance for their dollar.

 

So you're saying there's a general trend of better FPS with Intel + Nvidia than with any other combination?

 

Because to me it just sounds like the old adage about polishing turds. For games, GCN just isn't there and can't be polished as well as Nvidia's arch.


AMD needs to set a standard for how the HBM2 is packaged, tbh...


37 minutes ago, mr moose said:

Because to me it just sounds like the old adage about polishing turds. For games, GCN just isn't there and can't be polished as well as Nvidia's arch.

Well Mythbusters did prove that you can polish a turd, it just has to be the right kind of turd lol.


50 minutes ago, mr moose said:

So you're saying there's a general trend of better FPS with Intel + Nvidia than with any other combination?

 

Because to me it just sounds like the old adage about polishing turds. For games, GCN just isn't there and can't be polished as well as Nvidia's arch.

There are four CPU + GPU combos: Intel + Nvidia, Intel + RTG, AMD + Nvidia, and AMD + RTG. What I'm saying is that in the market-dominant Intel + Nvidia combo, Nvidia's driver team was able to find some very clever tricks to bump up their performance in DX9 and DX11. The issue in a lot of DX12 titles is that this bump seems to go away. It's also likely part of the reason the results are so sensitive to inter-core latency, and thus something Nvidia is working to lessen in their drivers.


8 hours ago, Questycasos said:

ASUS is moving ahead with its plans, though it's delaying its cards until October. Gigabyte and MSI, on the other hand...

Doesn't matter to me if MSI backs out of Vega. Their Gaming X coolers haven't been performing at the higher end of the spectrum anyway, forget about "Armor".

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25

Link to comment
Share on other sites

Link to post
Share on other sites

1 minute ago, JediFragger said:

AMD just give up the high-end, you're just too shit at it.

If this is how you feel, then you're very unaware of the full Vega product stack versus the full Pascal product stack. You're also extremely inept if you think it's fine to have only one manufacturer of high-end gaming GPUs.


3 hours ago, Taf the Ghost said:

There are four CPU + GPU combos: Intel + Nvidia, Intel + RTG, AMD + Nvidia, and AMD + RTG. What I'm saying is that in the market-dominant Intel + Nvidia combo, Nvidia's driver team was able to find some very clever tricks to bump up their performance in DX9 and DX11. The issue in a lot of DX12 titles is that this bump seems to go away. It's also likely part of the reason the results are so sensitive to inter-core latency, and thus something Nvidia is working to lessen in their drivers.

I doubt that very much, unless you have evidence. In the past, it's been shown that there's no particular affinity between Intel and Nvidia, and in fact AMD cards benefitted slightly more from Intel CPUs than Nvidia cards did.

 

http://www.tomshardware.co.uk/crossfire-sli-scaling-bottleneck,review-32668.html

