
Lisa Su Confirms 7nm Radeon RX Graphics Cards For Gamers In 2019

7 minutes ago, GMWolf said:

To all who say Vega was a disappointment. 

What?

Seriously, what?

Vega 56 performs better than a GTX 1070.

And Vega 64 slightly better than a 1080.

And always within about the same MSRP (let's ignore miners for a bit)

 

To me that's quite a good GPU! Sure, the 1080 ti is faster... But how many of you are rocking that?!?

 

35% faster 7nm Vega is great! A 7nm Vega 64 should beat out a 1080 ti!

We can't know for sure, but I think that once again, 7nm Vega will be on par with whatever Nvidia is about to show.

 

And miners are 100% affecting Nvidia too!

Not too long ago the 1070 was the most popular GPU on Steam. Now it's the 1060.

After all, altcoins have made Nvidia quite sought after too.

The heat and voltage difference between them doesn't justify getting one. 

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

Storage: 1TB Sabrent NVMe - WD 1TB Black - WD 2TB Green - WD 4TB Blue

MB: Gigabyte B550 Gaming X - RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-D15

 

 

 


2 minutes ago, Humbug said:

So Vega FE boosts up to 1600MHz; 35% more means we are looking at 2100-2200MHz. Impressive.

If AMD is telling the truth about 35%, then yeah, it is.

 

It's hard to tell, since 7nm hasn't even reached CPUs yet.
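For what it's worth, the 2100-2200MHz figure quoted above checks out — here's the arithmetic, assuming (hypothetically, since AMD hasn't said so) that the 35% applies to clock speed rather than overall performance:

```python
# Sanity check on the 35% figure, assuming it applies to clocks.
base_clock_mhz = 1600                   # Vega FE boost clock
projected_mhz = base_clock_mhz * 1.35   # the claimed 35% uplift
print(projected_mhz)                    # 2160.0 -- inside the 2100-2200MHz range
```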


2 minutes ago, GMWolf said:

To all who say Vega was a disappointment. 

What?

Seriously, what?

Vega 56 performs better than a GTX 1070.

And Vega 64 slightly better than a 1080.

It was disappointing because it still has issues turning the potential performance into actual performance. There are a few titles that excelled at extracting that performance, but by and large most of it goes underutilized. AMD knows how long API transitions take, so counting on DX12 was a mistake; better to focus on DX11 for Vega and DX12 for Navi, which may even still be too soon.

 

AMD won't break into the server and compute market until every man and his dog is talking about their GPUs in a good way for gaming. If there is a general perception of good-performing graphics cards, then people will want to use them and look into them themselves, without AMD trying to push their product.

 

I've never once had a researcher even mention AMD at all. It's never come up in any discussions about server hardware and there is no point even bringing it up; they want Nvidia, and you give them what they want because that also happens to be what they need.

 

I do not care that much about power draw or heat generated in my gaming system; it's easily worked around, and most AIBs do it for us. But few truly prefer AMD graphics cards when buying them, because they just do not deliver the performance they should. At least when you buy Nvidia you don't have to worry about any games not liking your GPU and performing badly. That's why it doesn't matter that the RX 580 is often a better buy than a GTX 1060: the RX 580 still has those potential problems looming over it.


6 minutes ago, GMWolf said:

To all who say Vega was a disappointment. 

What?

Seriously, what?

Vega 56 performs better than a GTX 1070.

And Vega 64 slightly better than a 1080.

And always within about the same MSRP (let's ignore miners for a bit)

 

To me that's quite a good GPU! Sure, the 1080 ti is faster... But how many of you are rocking that?!?

 

Vega 56 beats the GTX 1070. It would have been good if it had been available at MSRP. It's the best Vega product.

Vega 64 trades blows with the GTX 1080 on performance, yes.

But the Vega 64 was pushed to the edge of the architecture's limit on this node. People were rightfully disappointed because when you launch so much later than the competition you are expected to leapfrog them. But Vega came late and couldn't soundly beat the 1080, let alone the 1080 Ti.


On 6/7/2018 at 10:37 PM, CTR640 said:

Don't be like that. AMD and Nvidia simply could have blocked mining by modifying drivers, but they refuse.

It's not that easy; it's like saying Ford could stop people from driving the car offroad by installing some software.

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


2 minutes ago, Space Reptile said:

It's not that easy; it's like saying Ford could stop people from driving the car offroad by installing some software.

Just put an offroad mode button in the car that actually does nothing other than put a warranty void flag in the ECU xD


6 minutes ago, leadeater said:

At least when you buy Nvidia you don't have to worry about any games not liking your GPU and performing badly, that's why it doesn't matter that the RX 580 is often a better buy than a GTX 1060 because the RX 580 still has those potential problems looming over it.

I agree with the rest of your post, but I don't think this is a problem. It tends to get overstated. If there is an issue, it generally gets quickly squashed with a driver. AMD tends to dominate in some games and Nvidia in others. But AMD is very reliable.


5 minutes ago, sof006 said:

The heat and voltage difference between them doesn't justify getting one. 

That's a good point.

 

2 minutes ago, leadeater said:

It was disappointing because it still has issues turning the potential performance into actual performance. There are a few titles that excelled at extracting that performance, but by and large most of it goes underutilized. AMD knows how long API transitions take, so counting on DX12 was a mistake; better to focus on DX11 for Vega and DX12 for Navi, which may even still be too soon.

 

AMD won't break into the server and compute market until every man and his dog is talking about their GPUs in a good way for gaming. If there is a general perception of good-performing graphics cards, then people will want to use them and look into them themselves, without AMD trying to push their product.

 

I've never once had a researcher even mention AMD at all. It's never come up in any discussions about server hardware and there is no point even bringing it up; they want Nvidia, and you give them what they want because that also happens to be what they need.

 

I do not care that much about power draw or heat generated in my gaming system; it's easily worked around, and most AIBs do it for us. But few truly prefer AMD graphics cards when buying them, because they just do not deliver the performance they should. At least when you buy Nvidia you don't have to worry about any games not liking your GPU and performing badly. That's why it doesn't matter that the RX 580 is often a better buy than a GTX 1060: the RX 580 still has those potential problems looming over it.

Gaming benchmarks show the Vega 56 beating out the 1070. Battlefield 1, for instance. As always, some titles will behave differently.

 

I've found a few papers, but mostly older ones, from when AMD still had a lead on things like geometry shaders, etc.

Since then, AMD has struggled to even get close to Nvidia. Which is why Vega was exciting to me.

Another reason for research to reference Nvidia cards is Nvidia's greater market share.

 

Again, look at gaming benchmarks, or user benchmarks. Vega holds its own.

 

I think there is still a lot of myth around AMD's poor performance...

 

Why would server stuff care about gaming performance? What is important is GPGPU performance.

You could remove the rasterization units from a GPU and the gaming performance would suck, but its GPGPU performance would stay the same.

 

Given the direction AMD is heading in, I don't think it's inconceivable that they will make major waves, not in the server market, but in the workstation market. Their push for large amounts of fast memory and good double-precision floating-point performance has already made AMD GPUs preferable for many tasks (unfortunately, that also includes mining).


2 minutes ago, Humbug said:

I agree with the rest of your post, but I don't think this is a problem. It tends to get overstated. If there is an issue, it generally gets quickly squashed with a driver. AMD tends to dominate in some games and Nvidia in others. But AMD is very reliable.

I agree, but the sentiment is there, so that's exactly what happens, and the only way to change that is to not have products with large drawbacks like the Vega 64. If it's going to hurt more than help, then don't make it.


8 minutes ago, Space Reptile said:

It's not that easy; it's like saying Ford could stop people from driving the car offroad by installing some software.

More importantly, why would AMD want to? Their GPUs have generally been good at compute. Now a lot of people around the world have adopted a compute application and started buying up the GPUs in droves. AMD would be stupid to say no.


2 minutes ago, leadeater said:

I agree, but the sentiment is there, so that's exactly what happens, and the only way to change that is to not have products with large drawbacks like the Vega 64. If it's going to hurt more than help, then don't make it.

Large drawbacks being?


24 minutes ago, GMWolf said:

Why would server stuff care about gaming performance? What is important is GPGPU performance.

Because of word of mouth and industry reputation. It's a real thing, and it's actually the field I work in, and literally zero people I work with know anything about AMD GPUs, even for their personal computers at home. Fix that before trying to sell it to them in their day job, or you won't get a look in.

 

24 minutes ago, GMWolf said:

Gaming benchmarks show the Vega 56 beating out the 1070. Battlefield 1, for instance. As always, some titles will behave differently.

And some have very large differences, and that's the problem; the close ones don't matter, it's the existence of the ones with issues.

 

24 minutes ago, GMWolf said:

not in the server market, but in the workstation market.

They are directly linked and one and the same; you won't get one without the other.

 

Also, you don't have to convince me. I'm still running my dual 290Xs, and before that a 6970. Right now this second I would buy an Nvidia GPU, but I would give the Vega 56 a serious look since I would be running it under water.


22 minutes ago, GMWolf said:

and good double-precision floating-point performance has already made AMD GPUs preferable for many tasks (unfortunately, that also includes mining).

Double precision isn't used in mining, and AMD actually cut the full DP units from Vega; performance there sucks.

 

Vega:

[image: Vega FP64 rate chart]

 

Hawaii was the last architecture with full DP units, and AMD still cut it from the gaming GPUs.

[image: Hawaii FP64 rate chart]


33 minutes ago, leadeater said:

Double precision isn't used in mining, and AMD actually cut the full DP units from Vega; performance there sucks.

 

Vega:

[image: Vega FP64 rate chart]

 

Hawaii was the last architecture with full DP units, and AMD still cut it from the gaming GPUs.

[image: Hawaii FP64 rate chart]

Oh, that is interesting! I didn't know that! I thought they stuck with full DP.


11 minutes ago, GMWolf said:

Oh, that is interesting! I didn't know that! I thought they stuck with full DP.

The race into AI and deep/machine learning meant full DP wasn't required, as those workloads don't actually use it. The Tensor cores in Nvidia's Volta actually run at half precision. It's a race for FP16 performance now.
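To see why half precision is usually enough for those workloads, here's a rough numpy sketch (not how tensor cores are actually programmed, just the idea: FP16 inputs with a wider accumulator stay very close to a full FP64 result):

```python
import numpy as np

# Tensor-core-style mixed precision: FP16 inputs, wider accumulation.
a = np.random.rand(64, 64).astype(np.float16)
b = np.random.rand(64, 64).astype(np.float16)

# Multiply-accumulate in FP32 -- roughly what Volta's tensor cores do.
c_mixed = a.astype(np.float32) @ b.astype(np.float32)

# Full FP64 reference on the same (already FP16-quantized) inputs.
c_ref = a.astype(np.float64) @ b.astype(np.float64)

# Worst-case element error is tiny relative to values around ~16,
# which is why ML training tolerates half precision so well.
max_err = np.max(np.abs(c_mixed - c_ref.astype(np.float32)))
print(max_err)
```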


I'm so glad that's a GPU graph; I panicked thinking this was the AMD Threadripper thread and we were getting a repeat of the "half core" thing. ;)

In a GPU a mix of precision is no worry, as games don't need it, and those using it for compute can adjust their code accordingly.


7 minutes ago, TechyBen said:

I'm so glad that's a GPU graph; I panicked thinking this was the AMD Threadripper thread and we were getting a repeat of the "half core" thing. ;)

Well two of the dies don't have directly attached memory so they aren't "real cores".... I'll see myself out.

 

Also it worries me that this type of comment could actually catch on, hmm regret saying it? Nah.....


Just now, leadeater said:

Well two of the dies don't have directly attached memory so they aren't "real cores".... I'll see myself out.

 

Also it worries me that this type of comment could actually catch on, hmm regret saying it? Nah.....

How about CUDA cores? They are 100% not cores! They are vector execution lanes, yet no one ever mentions that!

 

But that is hardly the point.
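A loose numpy analogy for the lanes-vs-cores distinction (not real GPU code, just the mental model):

```python
import numpy as np

# A 32-thread warp behaves less like 32 independent cores and more like one
# 32-lane vector unit: a single instruction stream applied to every lane.
lanes = np.arange(32)        # the 32 "CUDA cores" of one SIMT unit
result = lanes * 2 + 1       # one vector instruction updates all 32 lanes

# Divergent branches run both sides with lanes masked off -- exactly what
# genuinely independent cores would NOT have to do.
mask = lanes % 2 == 0
result = np.where(mask, result, -result)
print(result[:4])            # [ 1 -3  5 -7]
```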


7 minutes ago, leadeater said:

Well two of the dies don't have directly attached memory so they aren't "real cores".... I'll see myself out.

 

Also it worries me that this type of comment could actually catch on, hmm regret saying it? Nah.....

Oh, I was only worried about the performance aspect. There are lots of things that can affect performance.

Which is why we see Intel/Nvidia having such big leads at times. It might just be one little thing they can do better. Then AMD does one other thing better all of a sudden (multicore sellotaping together that actually works?) and they steam ahead.

 

Also, didn't Linus do a video where a Canadian server farm used a ton of Vegas? So some advertising/corporate marketing must be being done somewhere?


14 minutes ago, TechyBen said:

Also, didn't Linus do a video where a Canadian server farm used a ton of Vegas? So some advertising/corporate marketing must be being done somewhere?

Nah, those were Nvidia GPUs; that was the video covering a university over there putting in a new HPC cluster.


8 minutes ago, TechyBen said:

Whelp... there was THIS one 

 

I guess it was AMD's own one then?

Holy crap, I swear they were using Teslas. EPYC epic fail.

 

Edit:

Oh ffs wrong video >.<, double fail lol.


Yeah, sorry my bad, the Canadian one was just Xeons... were there any GPUs? 

[edit]

Out of the 3 or 4 different server racks, they used one with Teslas in it. That makes sense, as you said, for the "AI" cores.


5 minutes ago, TechyBen said:

Yeah, sorry my bad, the Canadian one was just Xeons... were there any GPUs? 

Honestly, I wasn't really paying much attention; totally my mistake.


So it seems AMD has been aiming for bang per buck, and Nvidia just for the bang.

So we will have to see what the pricing is in 2019. Will AMD come in where the 1050 Ti is currently, cheap with decent performance? Or at the 1070? With Nvidia just having a 1080 Ti v2 at even more cost, but even more performance?

