UPDATED* AMD announces the Radeon VII - but it's $699 | Nvidia calls it "Lousy"

YoloSwag
14 minutes ago, Trixanity said:

They are decent products if you look at specific parameters, but if you look at the whole picture it's not really anything impressive. Mediocre.

 

Gamers gravitate towards Nvidia though. There's no denying it.

The only way they would be impressive is if they had come out at MSRP prices.

@leadeater @Blademaster91 

 

His main source for the leaks did say it would be announced at CES. But for all we know, the source might just have known that something would happen at CES and assumed it would be an announcement, or there was some misunderstanding somewhere. But I don't know. Who knows.

"Announced at CES" was wrong, we can see that now.

 

I don't think those specific leaks said anything about Radeon 7 at all? They were about Navi.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking

54 minutes ago, Blademaster91 said:

Their leaks were claiming "RIP Intel", with false expectations that we would see cheap 12-core CPUs at CES and that Radeon 7 would be more competitively priced than it turned out to be. Interestingly enough, there's never even a hint at who those connected sources might be, but the media all followed his clickbait rumors.

And it doesn't make sense why they would hold back 3rd-gen motherboards and not announce them during all the CES hype.

r/AyyMD might have gone there, but those were never Adored's claims. As I said to someone else: like or dislike the man's work, but don't lie about it.

2 hours ago, Taf the Ghost said:

Considering Turing sales have been slow relative to Pascal, it seems most aren't taking even Nvidia up on "It just works".

For sure, 100% agree. I just don't see many people, except maybe die-hard AMD fans who are REALLY looking to upgrade, buying this over Nvidia at this price.

1 hour ago, Blademaster91 said:

I don't doubt that Nvidia knew AMD had a 4K-capable card; Nvidia would have had to expand their monitor support, and it seemed like a convenient time for it. But decent 4K monitors aren't cheap with FreeSync either.

Didn't he have a video claiming that Vega II and Navi would be at CES? He also lied with the clickbait "leaked" Ryzen 3000 specs.

Believe me, they knew. In these kinds of industries, knowing what the other guy is working on is quite important.

1 hour ago, Misanthrope said:

This is kind of what I expected, and quite disappointing: they're one node ahead at 7nm and they still can't reach the performance of the 2080 Ti on an older node. It shows just how far behind AMD is, and Navi isn't going to help them either, because AFAIK it's just the same architecture; they really need something ground-up to catch up at this point.

 

I get that this is a usable product and AMD purists will not be happy with my comment, but it's a sharp contrast to what they're able to pull off with Ryzen by having the upper hand on the newer(ish) node, versus getting to a node first and still losing the top spot to the competition.

GCN with only 4 shader engines can't scale well at the high end right now, but Navi won't be high end and thus should be able to compete quite a bit better, without even taking into account its new features, whatever they are. This product will be great, though, for extrapolating how Ryzen and Navi will overclock and how much better 7nm is versus Samsung's old 14nm.

3 hours ago, Humbug said:

 

Interesting to see that many times the bigger gains are in games that usually give better results on Nvidia. I think this might be due to the higher clocks, as that is the only way to increase pixel throughput, for example.
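A quick sketch of the pixel-throughput point: peak pixel fillrate is just ROPs × core clock, so with the ROP count fixed, a higher clock is the only lever. The clock figures below are illustrative, not official specs:

```python
# Peak pixel fillrate = ROPs * core clock. With the ROP count fixed,
# raising the clock is the only way to raise pixel throughput.
# Clock figures below are illustrative, not official specs.

def pixel_fillrate_gpixels(rops: int, clock_ghz: float) -> float:
    """Theoretical peak fillrate in Gpixels/s."""
    return rops * clock_ghz

print(pixel_fillrate_gpixels(64, 1.55))  # Vega 64-class: ~99 Gpixels/s
print(pixel_fillrate_gpixels(64, 1.80))  # same ROPs, higher clock: ~115 Gpixels/s
```

Bandwidth-heavy or shader-bound games won't follow this scaling, which would be consistent with the gains being uneven across titles.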

2 hours ago, leadeater said:

Was he saying Ryzen 3000 would be 12/16 cores and that Ryzen 3000 would be unveiled at CES, or was he saying there would be a Ryzen 3000 12/16-core demonstration or product launch details at CES? These are quite different things.

 

Ryzen 3000 could still be 12/16 cores, so if we're talking general leaks and information, what was shown at CES doesn't in any way disprove that it will be the case. I'm sure you've seen the pictures of the blank space for a second chiplet; if there were no intention to fill that space, the first chiplet would be centered, not offset.

 

And yes, I do dislike Adored and I don't like his videos, or his voice for that matter, which is a large reason I don't watch, even just to find out what other people are hearing from him.

 

He specifically said his leak had said they would be providing information on some of the Ryzen 3000 series at CES. Which they did. He never claimed they would be launching at CES; Hardware Unboxed made that one up. He also never claimed we'd get the full info on everything. He did speculate we'd be getting more info than we did, as well as some other speculation.

 

What basically happened with his leak was that a whole bunch of people took his speculation as fact and took "announcing information" as "releasing" (not helped by Hardware Unboxed parroting that).

 

5 hours ago, Misanthrope said:

This is kind of what I expected, and quite disappointing: they're one node ahead at 7nm and they still can't reach the performance of the 2080 Ti on an older node. It shows just how far behind AMD is, and Navi isn't going to help them either, because AFAIK it's just the same architecture; they really need something ground-up to catch up at this point.

 

I get that this is a usable product and AMD purists will not be happy with my comment, but it's a sharp contrast to what they're able to pull off with Ryzen by having the upper hand on the newer(ish) node, versus getting to a node first and still losing the top spot to the competition.

 

The thing you have to understand is that Vega is a datacenter compute-focused card (the datacenter versions have no outputs on the back and aren't even capable of driving a monitor), for doing stuff like AI and GPU-accelerated simulation tasks (amongst many other things).

 

In simple terms, if you were to run a GPU-focused compute task on a Vega VII and a 2080 Ti, the Vega VII would probably end up outperforming the 2080 Ti by as much as the 2080 Ti outperforms the RX 580.

 

The fact that Vega can even get within shouting distance of the 2080, let alone match it, is honestly impressive as all hell. And I suspect there's more than enough grunt under the hood to do raytracing, but enabling it would likely undermine AMD's own plan for providing an RT solution down the road.

3 hours ago, Agost said:


mAkE a cHeApeR GdDr5 VErSioN

The meme is getting old, please

How is it a meme? It's completely legitimate criticism. HBM makes the cards more expensive to manufacture, and most people don't need it. The card could be more competitively priced.

 

AMD blows my mind. Ryzen is killing it, they're checking all the boxes, but they can't get GPUs right. Sure, their budget is tight so it's harder to get new stuff out, but with stuff like HBM, they're doing it to themselves.

 

(Typed from my phone, if I made any typing errors oh well I'm too lazy to fix it.)

i7 2600k @ 5GHz 1.49v - EVGA GTX 1070 ACX 3.0 - 16GB DDR3 2000MHz Corsair Vengeance

Asus p8z77-v lk - 480GB Samsung 870 EVO w/ W10 LTSC - 2x1TB HDD storage - 240GB SATA SSD w/ W7 - EVGA 650w 80+G G2

3x 1080p 60hz Viewsonic LCDs, 1 glorious Dell CRT running at anywhere from 60hz to 120hz

Model M w/ Soarer's adapter - Logitech g502 - Audio-Technica M20X - Cambridge SoundWorks speakers w/ woofer

 

45 minutes ago, CarlBar said:

The thing you have to understand is that Vega is a datacenter compute-focused card (the datacenter versions have no outputs on the back and aren't even capable of driving a monitor), for doing stuff like AI and GPU-accelerated simulation tasks (amongst many other things).

That actually makes sense: I remember that Lisa herself presented the card (followed by the cringey requests for applause) and proceeded to immediately talk first and foremost about the compute and datacenter stuff, and only after that point was abundantly clear did she say "But also, games!" and bring up the gamer dudes for a round of mutual ass-kissing.

-------

Current Rig

-------

@Taf the Ghost I just noticed Vega VII has 128 ROPs. Naniii?! Was it just me that missed that? And what does this mean: did they simply add more ROPs to each shader engine, or is this actually a card with more of them? So many questions.

18 minutes ago, 2Buck said:

How is it a meme? It's completely legitimate criticism. HBM makes the cards more expensive to manafacture, and most people don't need it. The card could be more competitively priced.

 

AMD blows my mind. Ryzen is killing it, they're checking all the boxes, but they can't get GPUs right. Sure, their budget is tight so it's harder to get new stuff out, but with stuff like HBM, they're doing it to themselves.

This discussion is long and old. Enjoy this video
 

 

On a mote of dust, suspended in a sunbeam

16 minutes ago, cj09beira said:

@Taf the Ghost I just noticed Vega VII has 128 ROPs. Naniii?! Was it just me that missed that? And what does this mean: did they simply add more ROPs to each shader engine, or is this actually a card with more of them? So many questions.

And a 4096-bit memory bus.

 

Officially, it's not GCN cores. It's "Next-Generation Compute Unit". We just never saw what that actually meant.

 

Reading between certain lines and what people have gleaned from GPU IDs & drivers, it appears AMD is doing an iterative move with GCN. GCN will remain the ISA, but the architecture will become something different step by step. The assumption, from my end, is that AMD is stuck with GCN and Nvidia is stuck with CUDA. The ecosystems are now so big that they can't really roll out new architectures every 2 years like they used to. This would also explain Raja's statements about drivers being hard: they had to rework everything, in place, for what was to come next.

 

That "next" is actually Navi, but Vega was the transition phase. However, the execution of their GPUs has been pretty rough. No clue why, but hopefully they get it fixed. We need the competition in the market.

27 minutes ago, Misanthrope said:

That actually makes sense: I remember that Lisa herself presented the card (followed by the cringey requests for applause) and proceeded to immediately talk first and foremost about the compute and datacenter stuff, and only after that point was abundantly clear did she say "But also, games!" and bring up the gamer dudes for a round of mutual ass-kissing.

"Creators" was used a lot with Radeon 7. However, there's going to be a section of the GPU market that's going to swallow up these cards. Kind of like how one of the big selling points with Epyc is that you can stick far more memory into 2U than with Intel platforms. For some companies, that matters more.

12 hours ago, VanayadGaming said:

They launched a card that competes with the best from Nvidia except the 2080 Ti. I think they are OK. What did you guys expect, a $300 card that competes with the Titan? It is expensive though, and I would have liked it a bit cheaper... but still, it will be a great productivity/gaming card, just like the 2080 (maybe even better on the productivity side).

There'll probably be a cheaper option eventually.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*

1 hour ago, Taf the Ghost said:

 

Reading between certain lines and what people have gleaned from GPU IDs & drivers, it appears AMD is doing an iterative move with GCN. GCN will remain the ISA, but the architecture will become something different step by step.

I don't get why people say that a ground-up redesign is the only way for AMD to break through the bottlenecks of GCN.

We have already seen so many massive advances, and the new Vega and Polaris stuff is in a totally different league from the original GCN Tahiti parts.

No doubt some subsystem redesigns are required, but I don't see why it cannot be an evolutionary process rather than a revolutionary one. The latter is far more challenging in terms of ecosystem, drivers, etc., and also riskier.

Even Vega VII, which we expected to be just a die shrink, seems to have numerous architectural improvements. The clock speeds are only 15% higher than Vega 64, but the performance is 30% higher.

Navi has supposedly had the majority of RTG's engineering resources for the last few years and will be a bigger architectural departure. We cannot predict how this will turn out; it may be awesome or it may absolutely suck. But the GCN lineage is not necessarily a problem.
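Back-of-envelope on that 15% clocks vs 30% performance point: the implied per-clock gain is just the ratio of the two speedups (rough percentages from the thread, nothing official):

```python
# If performance rises ~30% while clocks rise only ~15%, the residual
# per-clock ("IPC-like") improvement is the ratio of the two speedups.
# Both figures are rough numbers from the discussion, not measurements.

clock_gain = 1.15   # ~15% higher clocks than Vega 64
perf_gain = 1.30    # ~30% higher performance

per_clock_gain = perf_gain / clock_gain - 1
print(f"Implied per-clock improvement: {per_clock_gain:.1%}")
```

That works out to roughly 13% per clock, though some of it could equally come from the doubled memory bandwidth rather than the shader core itself.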

14 minutes ago, Humbug said:

I don't get why people say that a ground-up redesign is the only way for AMD to break through the bottlenecks of GCN.

We have already seen so many massive advances, and the new Vega and Polaris stuff is in a totally different league from the original GCN Tahiti parts.

No doubt some subsystem redesigns are required, but I don't see why it cannot be an evolutionary process rather than a revolutionary one. The latter is far more challenging in terms of ecosystem, drivers, etc., and also riskier.

Even Vega VII, which we expected to be just a die shrink, seems to have numerous architectural improvements. The clock speeds are only 15% higher than Vega 64, but the performance is 30% higher.

Navi has supposedly had the majority of RTG's engineering resources for the last few years and will be a bigger architectural departure. We cannot predict how this will turn out; it may be awesome or it may absolutely suck. But the GCN lineage is not necessarily a problem.

I've only slowly gotten into GPU tech over the last year or so, so don't quote me on this. That being said, I believe the issue is that GCN was going to require a complete redesign of the entire front-end and pipeline systems to expand beyond a certain point. The question then becomes how much money it would cost to build an entirely new driver side of the ecosystem, and that was the sticking point.

 

To an extent, the changes it looks like AMD is making to the architecture suggest they're planning to bring in something akin to Hyper-Threading.

 

Navi was paid for by Sony, according to the rumors. Much like Vega was for Apple. They go in certain directions to fit design profiles to the company's specifications. We'll have to wait a bit longer to see what that means for what the GPUs do.

8 hours ago, Trixanity said:

 

Sometimes people just want to get fucked and fucked good. Maybe even choked a little bit. And that's just when shopping. Now when dating....

Well, since those graphics cards are bigger than last gen, some will have trouble walking in 2019. You're right about that.

5 hours ago, 2Buck said:

How is it a meme? It's completely legitimate criticism. HBM makes the cards more expensive to manufacture, and most people don't need it. The card could be more competitively priced.

 

AMD blows my mind. Ryzen is killing it, they're checking all the boxes, but they can't get GPUs right. Sure, their budget is tight so it's harder to get new stuff out, but with stuff like HBM, they're doing it to themselves.

 

(Typed from my phone, if I made any typing errors oh well I'm too lazy to fix it.)

The thing is though, even assuming AMD could just glue on some GDDR (which they can't), the card would be so bandwidth-starved that you'd be complaining that they didn't use HBM.

As far as I can tell, it took AMD essentially doubling the memory bandwidth of the Vega cards to get 'half decent' performance out of them.

Plus, as has been said many times before, when AMD decided to use HBM over GDDR, it was expected that the price would dramatically decrease, which it obviously hasn't. Nvidia would be in a similar situation if HBM had taken off and the price of GDDR was super high (not that it isn't, but you get the idea).
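To put rough numbers on the bandwidth argument: theoretical peak bandwidth is bus width × per-pin data rate, and HBM gets its headline figures from a very wide, relatively slow bus. The figures below are approximate, for illustration only:

```python
# Theoretical peak memory bandwidth = bus width (bits) / 8 * per-pin rate (Gbps).
# All figures are approximate, for illustration only.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

gddr5_256bit = bandwidth_gb_s(256, 8.0)    # RX 580-class GDDR5: 256 GB/s
vega64_hbm2 = bandwidth_gb_s(2048, 1.89)   # Vega 64, two HBM2 stacks: ~484 GB/s
radeon7_hbm2 = bandwidth_gb_s(4096, 2.0)   # Radeon VII, four stacks: 1024 GB/s

print(gddr5_256bit, vega64_hbm2, radeon7_hbm2)
```

Matching the Radeon VII's ~1 TB/s with 8 Gbps GDDR5 would need a 1024-bit bus, which isn't practical on a consumer board, and that is essentially the argument against a "cheaper GDDR version".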

Laptop:

Spoiler

HP OMEN 15 - Intel Core i7 9750H, 16GB DDR4, 512GB NVMe SSD, Nvidia RTX 2060, 15.6" 1080p 144Hz IPS display

PC:

Spoiler

Vacancy - Looking for applicants, please send CV

Mac:

Spoiler

2009 Mac Pro 8 Core - 2 x Xeon E5520, 16GB DDR3 1333 ECC, 120GB SATA SSD, AMD Radeon 7850. Soon to be upgraded to 2 x 6 Core Xeons

Phones:

Spoiler

LG G6 - Platinum (The best colour of any phone, period)

LG G7 - Moroccan Blue

 

36 minutes ago, yolosnail said:

The thing is though, even assuming AMD could just glue on some GDDR (which they can't), the card would be so bandwidth-starved that you'd be complaining that they didn't use HBM.

As far as I can tell, it took AMD essentially doubling the memory bandwidth of the Vega cards to get 'half decent' performance out of them.

Plus, as has been said many times before, when AMD decided to use HBM over GDDR, it was expected that the price would dramatically decrease, which it obviously hasn't. Nvidia would be in a similar situation if HBM had taken off and the price of GDDR was super high (not that it isn't, but you get the idea).

Let's all blame the FPGAs. Why do they need HBM? The worst part is that some of them use 6 stacks per FPGA; AMD could do 3 Vega cards with that many stacks.

Seriously though, HBM has been through such a ride: HBM1 was all right, then HBM2 took way too long to reach us, then it came in slower than it should have, then the memory market inflated like a face stung by African bees, and HBM took most of the stings due to its high-value, low-volume nature. Now only volume will help things get better. It's the superior tech; it just needs to be nurtured for a while.

34 minutes ago, cj09beira said:

Let's all blame the FPGAs. Why do they need HBM? The worst part is that some of them use 6 stacks per FPGA; AMD could do 3 Vega cards with that many stacks.

Seriously though, HBM has been through such a ride: HBM1 was all right, then HBM2 took way too long to reach us, then it came in slower than it should have, then the memory market inflated like a face stung by African bees, and HBM took most of the stings due to its high-value, low-volume nature. Now only volume will help things get better. It's the superior tech; it just needs to be nurtured for a while.

There's HBM3. Should show up around 2020. AMD is probably going to swear it off unless it's a semi-custom job.

1 hour ago, yolosnail said:

The thing is though, even assuming AMD could just glue on some GDDR (which they can't), the card would be so bandwidth-starved that you'd be complaining that they didn't use HBM.

As far as I can tell, it took AMD essentially doubling the memory bandwidth of the Vega cards to get 'half decent' performance out of them.

I must admit that I was ignorant on this topic.

 

If you'll excuse me...


 

I do stand by my POV that Radeon has been incredibly disappointing, though. Vega came WAY too late, and now, once again, AMD is late to the party. Finally, they might match that pesky 1080 Ti... a generation late. And again, I don't care about the high end, and AMD not having leadership performance doesn't mean much to me, since I can't afford high-end stuff anyway. But it's all about their image, about mind share. They really need something huge to get everyone's attention, and this isn't it. Not that they have a good chance anyway, because even when they had leadership performance, everyone still bought Nvidia... But then again, they did come back from the dead in the CPU market, so maybe it's possible.


1 minute ago, 2Buck said:

I must admit that I was ignorant on this topic.

 

If you'll excuse me...


 

I do stand by my POV that Radeon has been incredibly disappointing, though. Vega came WAY too late, and now, once again, AMD is late to the party. Finally, they might match that pesky 1080 Ti... a generation late. And again, I don't care about the high end, and AMD not having leadership performance doesn't mean much to me, since I can't afford high-end stuff anyway. But it's all about their image, about mind share. They really need something huge to get everyone's attention, and this isn't it. Not that they have a good chance anyway, because even when they had leadership performance, everyone still bought Nvidia... But then again, they did come back from the dead in the CPU market, so maybe it's possible.

I don't think anybody is denying that Radeon has been disappointing lately, in the high end in particular, but in the low-mid range they've been fairly competitive.

Like you say, it's not actually about whose product is better, it's about whose product people think is better. If you think of a quality car, the first cars that come to mind are probably the big German manufacturers like BMW, Audi and Mercedes. But if you actually get into a lot of these cars, they're not really any better than any other brand. If you think of a bad-quality car, you think of French cars, like Peugeot. But Peugeot's new cars are up there in terms of quality, and because for years they've been known (and rightly so) for making absolutely appalling cars, nobody gives them a chance!

That's the problem with AMD at the minute. They've managed it with the CPUs: sure, they don't 'beat' Intel by a mile, but they're up there, and Intel just can't match them on price. Even if the best AMD can do at the minute is match the 2080, that's still a heck of an achievement considering where they were a couple of years ago, especially when most of the R&D went to the CPU side.

10 hours ago, yolosnail said:

I don't think anybody is denying that Radeon has been disappointing lately, in the high end in particular, but in the low-mid range they've been fairly competitive.

Like you say, it's not actually about whose product is better, it's about whose product people think is better. If you think of a quality car, the first cars that come to mind are probably the big German manufacturers like BMW, Audi and Mercedes. But if you actually get into a lot of these cars, they're not really any better than any other brand. If you think of a bad-quality car, you think of French cars, like Peugeot. But Peugeot's new cars are up there in terms of quality, and because for years they've been known (and rightly so) for making absolutely appalling cars, nobody gives them a chance!

That's the problem with AMD at the minute. They've managed it with the CPUs: sure, they don't 'beat' Intel by a mile, but they're up there, and Intel just can't match them on price. Even if the best AMD can do at the minute is match the 2080, that's still a heck of an achievement considering where they were a couple of years ago, especially when most of the R&D went to the CPU side.

Toyota is probably the first one I would think of, Mercedes second (there is nothing after that).

I think AMD would be in a much better position had they launched a Vega-based Polaris replacement; it would have walked all over the 1060. But I guess they are very tight on the R&D side. I would probably have cancelled Vega 12 at the very start and made it into a Polaris replacement card; the cost to do so would have been about the same, unless Apple helped there. Other than that, there aren't many things I would do differently.

11 hours ago, Taf the Ghost said:

There's HBM3. Should show up around 2020. AMD is probably going to swear it off unless it's a semi-custom job.

I was expecting HBM3 to be here by now. Hopefully there will be something in the middle, like a 3 Gbps HBM2; 2.4 Gbps is fine but not a large jump. I also find it strange that Vega VII doesn't use it.

5 hours ago, cj09beira said:

Toyota is probably the first one I would think of, Mercedes second (there is nothing after that).

I think AMD would be in a much better position had they launched a Vega-based Polaris replacement; it would have walked all over the 1060. But I guess they are very tight on the R&D side. I would probably have cancelled Vega 12 at the very start and made it into a Polaris replacement card; the cost to do so would have been about the same, unless Apple helped there. Other than that, there aren't many things I would do differently.

I was expecting HBM3 to be here by now. Hopefully there will be something in the middle, like a 3 Gbps HBM2; 2.4 Gbps is fine but not a large jump. I also find it strange that Vega VII doesn't use it.

So you wouldn't think of Porsche as a quality car brand?

 

I wonder if there will be a gaming version of the Radeon VII with cut down HBM, or if this is what we're getting for this generation.

9 minutes ago, Roen said:

So you wouldn't think of Porsche as a quality car brand?

 

I wonder if there will be a gaming version of the Radeon VII with cut down HBM, or if this is what we're getting for this generation.

When you just say quality, no; if you had said sports cars, then yes.

You probably won't see anything else from Vega; they don't really want to sell many of them, as they can sell them to the server market instead.
