
AMD Polaris Refresh? AMD "North Star" this year?

In the end, it's not as easy as it seems.

 

And this Polaris refresh makes sense because of the low cost of adopting the other manufacturing process while getting higher clock rates or lower power consumption...

"Hell is full of good meanings, but Heaven is full of good works"


3 hours ago, valdyrgramr said:

Did somebody say North Star? Also, as to how powerful this card is likely to be...


I really do hope they go with the North Star name. The memes will be great.


On 7/6/2018 at 10:08 PM, leadeater said:

The MSRP of those cards wouldn't have been much different with GDDR5X or HBM. HBM costs more, but that's only one part of the total equation. AMD also went with much lower margins on Vega than they normally would; making the card cheaper would likely have meant increased margins, not a lower MSRP. We'd be lucky to see a $30-$50 difference in MSRP with GDDR5X, plus we all know how meaningless MSRP is anyway.

Gamers Nexus did a cost comparison of HBM and GDDR5X. HBM was around a $100 premium compared to GDDR5X. I know MSRP is mostly meaningless; it's more about optics for the consumer. Still, if a Vega 64 or 56 could have been brought down in price by $50-$100 by using GDDR5X, they would have been almost game-changer products for AMD, like Ryzen is. They still wouldn't have been competing in the ultra-high-end 1080 Ti-and-above space, but jeez, faster than a 1070 and 3/4 of the way to a 1080 for $399-$449. That would have been a wow moment.

 

On 7/6/2018 at 11:26 PM, Stefan Payne said:

Wait and see...
The front-end design they have right now doesn't scale that well with more shaders and so on.

But there are examples where VEGA 64 beats the 1080 Ti.

 

And even a Fury X beats a 1070 Ti in that game...

So it seems the problem isn't with the AMD GPUs in general but with the software. That's also what mining showed.

 

 

There's one example, it's Forza 7, and it's an article from WCCFTECH.

 

I'm sorry, but you are bat-$%#% crazy if you think Vega 64 is even in the same neighborhood, let alone planet, as a 1080 Ti; that's just crazy talk.

CPU | Intel i9-10850K | GPU | EVGA 3080ti FTW3 HYBRID  | CASE | Phanteks Enthoo Evolv ATX | PSU | Corsair HX850i | RAM | 2x8GB G.skill Trident RGB 3000MHz | MOTHERBOARD | Asus Z490E Strix | STORAGE | Adata XPG 256GB NVME + Adata XPG 1T + WD Blue 1TB + Adata 480GB SSD | COOLING | Evga CLC280 | MONITOR | Acer Predator XB271HU | OS | Windows 10 |



AMD has its focus on next-gen consoles, so don't expect much.

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


8 hours ago, jasonc_01 said:

Gamers Nexus did a cost comparison of HBM and GDDR5X. HBM was around a $100 premium compared to GDDR5X. I know MSRP is mostly meaningless; it's more about optics for the consumer. Still, if a Vega 64 or 56 could have been brought down in price by $50-$100 by using GDDR5X, they would have been almost game-changer products for AMD, like Ryzen is. They still wouldn't have been competing in the ultra-high-end 1080 Ti-and-above space, but jeez, faster than a 1070 and 3/4 of the way to a 1080 for $399-$449. That would have been a wow moment.

Yes, I saw that; the point was that making the cards cheaper to build may do nothing other than allow AMD to take a higher margin on them. Reducing build cost by $100 won't equate to a $100 lower MSRP, particularly when your margin on the product is lower than you would like.
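To make the margin point concrete, here's a quick sketch with made-up numbers (only the $499 launch MSRP of Vega 64 is a real figure; both build costs are hypothetical):

```python
# Illustration: a cheaper BOM can raise margin instead of lowering MSRP.
# All build costs here are made up; only the $499 launch MSRP is real.
def gross_margin(msrp, build_cost):
    """Gross margin as a fraction of the selling price."""
    return (msrp - build_cost) / msrp

msrp = 499        # Vega 64 launch MSRP (actual)
bom_hbm = 400     # hypothetical build cost with HBM2
bom_gddr = 300    # hypothetical build cost with GDDR5X (~$100 cheaper)

# Same sticker price, cheaper memory: the saving becomes margin.
print(round(gross_margin(msrp, bom_hbm), 2))   # ~0.2
print(round(gross_margin(msrp, bom_gddr), 2))  # ~0.4
```

With a low starting margin, the rational move for AMD is to keep the MSRP and pocket the difference, which is exactly leadeater's point.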


8 hours ago, jasonc_01 said:

There's one example, it's Forza 7, and it's an article from WCCFTECH.

 

I'm sorry, but you are bat-$%#% crazy if you think Vega 64 is even in the same neighborhood, let alone planet, as a 1080 Ti; that's just crazy talk.

What do you say about this then?

https://gamegpu.com/rpg/ролевые/ni-no-kuni-ii-revenant-kingdom-test-gpu-cpu

 

And looking at the raw power:

https://www.techpowerup.com/gpudb/2871/radeon-rx-vega-64

https://www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti

 

There isn't so much difference.

In some areas the 1080 Ti is a bit better; in others the VEGA is far better. So the crazy thought is to think that one has to be far superior to the other and that there is no way for both to be on the same level or for the VEGA to be faster.

"Hell is full of good meanings, but Heaven is full of good works"


9 minutes ago, Stefan Payne said:

What do you say about this then?

https://gamegpu.com/rpg/ролевые/ni-no-kuni-ii-revenant-kingdom-test-gpu-cpu

 

And looking at the raw power:

https://www.techpowerup.com/gpudb/2871/radeon-rx-vega-64

https://www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti

 

There isn't so much difference.

In some areas the 1080 Ti is a bit better; in others the VEGA is far better. So the crazy thought is to think that one has to be far superior to the other and that there is no way for both to be on the same level or for the VEGA to be faster.

When you look at raw compute power they are very similar, why this doesn’t extend to most games though is beyond my realm of knowledge.
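The "raw compute" similarity is easy to check from the spec sheets: peak FP32 throughput is just shader count × boost clock × 2 ops per clock (FMA). A quick sketch using the commonly quoted boost clocks (real sustained clocks differ, so treat this as theoretical only):

```python
# Peak FP32 throughput in TFLOPs: shaders × boost clock (GHz) × 2 (FMA).
# Shader counts and clocks are spec-sheet values, not sustained clocks.
def tflops(shaders, boost_ghz):
    return shaders * boost_ghz * 2 / 1000

vega64 = tflops(4096, 1.546)      # RX Vega 64
gtx1080ti = tflops(3584, 1.582)   # GTX 1080 Ti

print(round(vega64, 1), round(gtx1080ti, 1))  # ~12.7 vs ~11.3
```

On paper Vega 64 is actually slightly ahead, which is why the gap in most games has to come from somewhere other than raw ALU throughput.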


27 minutes ago, schwellmo92 said:

When you look at raw compute power they are very similar, why this doesn’t extend to most games though is beyond my realm of knowledge.

It's kinda typical AMD, tbh: "look at all this raw power (that doesn't translate to games)".

 

36 minutes ago, Stefan Payne said:

OK, so now that's two websites that claim Vega 64 victory over the 1080 Ti, let alone a 1080.

 

Wccftech is about as credible as Donald Trump, and then there's the sketchy one all in Russian. Seriously? This website has a Ryzen 7 1800X and a Ryzen 5 1600X at the top of the stack in gaming performance with a 5960X, beating out a 7600K, 6850K, 6700, 4770K, 6600, then an R5 1400, 4670K, 2600K, and then an R3 1300X.

 

How can you take that seriously? They're also using a title that no one uses as a benchmark, and I would say few people know of or play. Also, if you dig into their results deeper, you'll find the 1080 Ti beating the Vega 64 under every circumstance EXCEPT where the CPU has more than 4 cores/8 threads. In the higher core-count tests the 1080 Ti scores do not change, yet the Vega 64 results do change.

 

Hmmmmm, fishy. WAIT, the Vega 56 results do not change either. So we can basically throw the Vega 64 results out the window. Play with the drop-down menus and you'll see.

 

 1080ti>1080>1070ti/Vega64>1070>Vega56. 



6 minutes ago, schwellmo92 said:

When you look at raw compute power they are very similar, why this doesn’t extend to most games though is beyond my realm of knowledge.

Yes, exactly.
That's why it's asinine to claim that they are universes apart - they are not.

And there are some instances where the VEGA can be faster than the 1080 Ti; one of those is compute, another is certain games.

It's not a general thing; it depends on how it's used.

"Hell is full of good meanings, but Heaven is full of good works"


8 minutes ago, jasonc_01 said:

It's kinda typical AMD, tbh: "look at all this raw power (that doesn't translate to games)".

Yes, because that is what the card is able to do, theoretically.

That means that, if nothing goes wrong, everything is as it should be, and nothing hinders the performance, both should be in the same ballpark.

 

8 minutes ago, jasonc_01 said:

OK, so now that's two websites that claim Vega 64 victory over the 1080 Ti, let alone a 1080.

There are a couple of games where VEGA is above the 1080 or at least very close.

 

For example:

https://gamegpu.com/rts-/-стратегии/jurassic-world-evolution-test-gpu-cpu

https://gamegpu.com/rpg/ролевые/unravel-two-test-gpu-cpu

https://gamegpu.com/action-/-fps-/-tps/far-cry-5-hours-of-darkness-test-gpu-cpu

https://gamegpu.com/action-/-fps-/-tps/desolation-of-mordor-test-gpu-cpu

 

8 minutes ago, jasonc_01 said:

Wccftech is about as credible as Donald Trump, and then theres the sketchy one all in Russian. Seriously? This website has a Ryzen 7 1800x and a Ryzen 5 1600x at the top of the stack in gaming performance with a 5960x and beating out a  7600k, 6850k, 6700, 4770k, 6600 then a R5 1400, 4670k, 2600k and then an R3 1300x.

For God's sake, keep politics out of this thread!

That has nothing to do with this.

 

And the Russians are usually the ones who do the grinding work. If you see a non-Russian site that does the same stuff GameGPU does, please post it. But I haven't found one.

 

And with the CPUs, they are all equally fast; there is no one beating out anything.

i7-5960X, i3-6100, i5-6600, Ryzen 7 1800X, and a couple of others are equally fast. That means there is some kind of frame limit in the game. And the min FPS might be caused by the graphics card. Nothing wrong with that; that is to be expected in such situations.

 

8 minutes ago, jasonc_01 said:

How can you take that seriously?

How can you attack their reputation when you have NO facts to support your case?

Facts don't care about your feelings!

And what I've posted were facts.

If you have proof that they were wrong, post it!

 

8 minutes ago, jasonc_01 said:

They're also using a title that no one uses as a benchmark, and I would say few people know of or play.

1. Exactly!
That is what they do! 

They take a new game, benchmark it in their way, and write an article about it, comparing what the detail settings do.

 

And how many people play it or not is irrelevant. It's a console port, primarily developed for PlayStation 4. And it seems very well optimized for that platform - as was its predecessor!

 

 

And that it wasn't played by people is just something you have no proof of!

 

According to steam it is quite the opposite:

https://store.steampowered.com/app/589360/Ni_no_Kuni_II_Revenant_Kingdom/

 

~1,400 reviews and very high ratings!

https://en.wikipedia.org/wiki/Ni_no_Kuni_II:_Revenant_Kingdom

 

8 minutes ago, jasonc_01 said:

 1080ti>1080>1070ti/Vega64>1070>Vega56. 

Now that's just a lie and total bullshit based on your beliefs.

In reality it doesn't look that way; the VEGA 64 regularly beats the 1080 and comes close to the 1080 Ti. So it's not as bad as you claim it to be.

Well, if the developer of the game isn't too incompetent or using that GameWorks shit...

"Hell is full of good meanings, but Heaven is full of good works"


Are we really discussing whether Vega 64 or 1080 Ti is the better gaming card? Seriously?

 

1080 Ti is the better card for gaming. Unanimously. 

 

The only area where they trade blows is in some compute workloads.

 

Theoretical performance is a poor metric especially since it hasn't been realized after a year. Tflops != Gaming performance. 


On 7/6/2018 at 10:26 PM, Stefan Payne said:

Wait and see...
The front-end design they have right now doesn't scale that well with more shaders and so on.

But there are examples where VEGA 64 beats the 1080 Ti.

 

And even a Fury X beats a 1070 Ti in that game...

So it seems the problem isn't with the AMD GPUs in general but with the software. That's also what mining showed.

Most of those perform better on AMD.

Yeah, if nobody buys the AMD products, it's not worth investing in real refreshes and you have to reuse what you already have. Especially since that is what the OEMs expect.

Can't blame AMD for the Market, can you?

 

AMD's actual flagship already trades blows with the 1080 Ti.
So your prediction is already wrong.

There are already more games out there that prove that, where VEGA 64 is very close to the 1080 Ti.

Makes no sense...

You don't know how much AMD pays for this refresh, what their plan is, and what they can do with it.

 

Without that information, we can't assume anything.

BUT: we know that the changes made to Ryzen+ are pretty minor, and the biggest one is the switch to 12nm instead of 14nm. And we got much higher clock rates.

 

There are some Pictures out there that show the advantages of the new 12nm vs. 14nm.

And you can use the same library from 14nm with 12nm as well. That means you can use the same design without much work on the newer process.

 

With a good deal from GF (which is pretty likely), it makes sense to move to this process and keep Polaris in stock.

 

Right now we have two possibilities:
a) use the process for performance enhancements, possibly using the best GDDR5 SDRAM there is

b) use the process for lower power consumption and use slower GDDR5 SDRAM for maximum efficiency.

 

Or both, one SKU for higher efficiency, one for better performance.

 

Then you have to buy more AMD Products, so that they have money for R&D and are able to develop new things.

Even with the Budget AMD has, they are somewhat able to be competitive...

 

But when AMD was actually better, like in 2012, people still didn't buy them!

And if you did, you didn't need to replace the GPU until 2016 or later.

 

...with another 50-60W of power consumption and loss of the HBM caching thing. Um, no.

 

HBM is a good thing; the only problem is that the cost needs to come down.

But that's always the case with new technology. And AMD co-developed this new technology and has been marketing it for years.

 

 

NO, not really.

Because VEGA 64 has +50% memory bandwidth.

 

And going for another 512-bit memory interface chip doesn't seem to make much sense to me either, and that is what HBM is designed to replace.

At a smaller footprint + lower power consumption.
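For reference, the +50% figure is against the GTX 1080 and falls straight out of bus width × effective data rate; a back-of-envelope sketch using spec-sheet numbers (real-world effective bandwidth differs):

```python
# Peak memory bandwidth in GB/s = bus width (bits) × data rate (Gbps) / 8.
# Spec-sheet numbers; real-world effective bandwidth differs.
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits * gbps / 8

vega64  = bandwidth_gbs(2048, 1.89)   # HBM2: 2048-bit @ 1.89 Gbps -> ~484 GB/s
gtx1080 = bandwidth_gbs(256, 10.0)    # GDDR5X: 256-bit @ 10 Gbps  -> 320 GB/s

print(round(vega64 / gtx1080, 2))  # ~1.51, i.e. roughly +50%
```

Against the 1080 Ti (352-bit GDDR5X at 11 Gbps, ~484 GB/s) the two cards are actually about even.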

Vega 64 isn't trading blows with the 1080 Ti and NEVER will. And why would anyone buy a product with a higher price tag when the competition offers a lower price for roughly the same performance? The answer is FANBOYISM.

 

And please stop using a shady Russian site that no one has ever heard of. There are tons of credible and known sources out there, and yet you chose that site. What's that smell? Ahh, the smell of desperation.


21 minutes ago, Aldoarip said:

Vega 64 isn't trading blows with the 1080 Ti and NEVER will.

Yet I've shown some benchmarks where exactly that is the case...

 

Quote

And please stop using a shady Russian site that no one has ever heard of. There are tons of credible and known sources out there, and yet you chose that site.

As I said:
If you have a better alternative, please feel free to post it!
But discrediting it just because it's Russian... well, do I need to say the R-word?!

But they seem to be the only ones who bench everything they can. Nobody else does that.

 

And to be blunt:
If you've ever played Ni no Kuni: Wrath of the White Witch on PlayStation 3, you know how good the work Level-5 did on optimization.

Because it looks beautiful (for the PS3) and it runs really, really well. Something you don't see every day, especially on the trainwreck that is the PlayStation 3 hardware.

 

And if you know that, you can assume that Bandai (excuse me: Level-5!) did the same for Ni no Kuni II on PlayStation 4 and optimized the hell out of it for the hardware in front of them.

And then some time later, Bandai thought: hey, we can bring it to PC, it's like a week's work, let's do that!

 

And the result is this heavily AMD-optimized game that runs on PC hardware the way it does.

 

So these are the facts. What do YOU have?

"Hell is full of good meanings, but Heaven is full of good works"


12 hours ago, Stefan Payne said:

Yet I've shown some benchmarks where exactly that is the case...

 

As I said:
If you have a better alternative, please feel free to post it!
But discrediting it just because it's Russian... well, do I need to say the R-word?!

But they seem to be the only ones who bench everything they can. Nobody else does that.

 

And to be blunt:
If you've ever played Ni no Kuni: Wrath of the White Witch on PlayStation 3, you know how good the work Level-5 did on optimization.

Because it looks beautiful (for the PS3) and it runs really, really well. Something you don't see every day, especially on the trainwreck that is the PlayStation 3 hardware.

 

And if you know that, you can assume that Bandai (excuse me: Level-5!) did the same for Ni no Kuni II on PlayStation 4 and optimized the hell out of it for the hardware in front of them.

And then some time later, Bandai thought: hey, we can bring it to PC, it's like a week's work, let's do that!

 

And the result is this heavily AMD-optimized game that runs on PC hardware the way it does.

 

So these are the facts. What do YOU have?

You literally cherry-picked this one title that has 1,400 reviews on Steam (I know, amazing, right?) and used it as the basis of your argument. In statistics, that's called an outlier. You can't use an outlier as your argument. No one cares if it's optimized on consoles. And your other links even showed that Vega 64 isn't beating the 1080 Ti and NEVER will. There are hundreds of titles out there to benchmark, yet you use this one.


1 hour ago, Aldoarip said:

And your other links even showed that Vega 64 isn't beating the 1080 Ti and NEVER will.

Yes, like Tahiti will never beat the GTX 780 - oh wait, that actually did happen...

Or a Radeon HD 7870 beats a GTX 680 - oh wait, that also happened...

Or a Radeon R9 280X (6G) beats a GTX 980...

 

And yet I've already shown you that exactly that happened in Ni no Kuni (which, as I always mentioned, shows what is possible with the right optimization)...

 

And saying NEVER WILL!!!11 is just a lie, as Ni no Kuni II has already shown you. How can you say "VEGA will never beat the 1080 Ti" when exactly that is what happened in that game? That it doesn't matter, as even far weaker cards are more than fast enough for that game, is also something you seem to ignore for whatever reason...

If you have an RX 470 4G or a 1060 6G, you can play that game rather well; hell, even with an R9 380X or a GeForce GTX 780 it runs pretty well.

 

It might be improbable, but to say never is very disingenuous, as the past has already shown that exactly that happened.

 

You are just wrong. How can you say that a VEGA will never beat a 1080 Ti when you were proven wrong before even saying it?! That makes NO SENSE.

 

Or who would have thought that a Radeon HD 7970 GHz would fight with the GTX 780 in more modern games and leave the GTX 680 behind, so that the 7870 can fight with it...
Especially if we talk about higher resolutions like 1440p or even 2160p...

 

"Hell is full of good meanings, but Heaven is full of good works"


9 hours ago, Stefan Payne said:

Yes, like Tahiti will never beat the GTX 780 - oh wait, that actually did happen...

Or a Radeon HD 7870 beats a GTX 680 - oh wait, that also happened...

Or a Radeon R9 280X (6G) beats a GTX 980...

 

And yet I've already shown you that exactly that happened in Ni no Kuni (which, as I always mentioned, shows what is possible with the right optimization)...

 

And saying NEVER WILL!!!11 is just a lie, as Ni no Kuni II has already shown you. How can you say "VEGA will never beat the 1080 Ti" when exactly that is what happened in that game? That it doesn't matter, as even far weaker cards are more than fast enough for that game, is also something you seem to ignore for whatever reason...

If you have an RX 470 4G or a 1060 6G, you can play that game rather well; hell, even with an R9 380X or a GeForce GTX 780 it runs pretty well.

 

It might be improbable, but to say never is very disingenuous, as the past has already shown that exactly that happened.

 

You are just wrong. How can you say that a VEGA will never beat a 1080 Ti when you were proven wrong before even saying it?! That makes NO SENSE.

 

Or who would have thought that a Radeon HD 7970 GHz would fight with the GTX 780 in more modern games and leave the GTX 680 behind, so that the 7870 can fight with it...
Especially if we talk about higher resolutions like 1440p or even 2160p...

 

OMFG. So the hundreds of titles that showed the 1080 Ti beating Vega 64 into the dust are wrong and unoptimized, heh? So this one game, in which Vega 64 beat the 1080 Ti even though every other title stated otherwise, is optimized and perfect according to you.


20 hours ago, Stefan Payne said:

And with the CPUs, they are all equally fast; there is no one beating out anything.

i7-5960X, i3-6100, i5-6600, Ryzen 7 1800X, and a couple of others are equally fast. That means there is some kind of frame limit in the game. And the min FPS might be caused by the graphics card. Nothing wrong with that; that is to be expected in such situations.

No, seriously: the graphs have drop-down menus and you can select different CPUs and see the corresponding graphics card benchmark scores. If you select an i3 6100, the 1080 Ti scores 148fps, the 1080 scores 114fps, the Vega 64 scores 114fps, and the Vega 56 scores 114fps.

 

Now if you select an i7 6700, the 1080 Ti still scores 148fps and the 1080 still scores 114fps, yet the Vega 56 now scores 132fps and the Vega 64 now scores 143fps.

 

Let's try an i5 7600K: 1080 Ti 148fps, 1080 114fps, Vega 56 132fps, and Vega 64 150fps? That makes absolutely no sense; you're gonna tell me the 1080 Ti did not benefit from moving from the i3 bottleneck to an i7? Not to mention the i5 outperforming the i7 and matching the 5960X?

 

Nothing about their testing adds up to any other conclusion than to disregard their testing altogether, or that it's skewed.

 

7 hours ago, Stefan Payne said:

And yet I've already shown you that exactly that happened in Ni no Kuni (which, as I always mentioned, shows what is possible with the right optimization)...

I'm sorry, what optimization do you speak of? All they did was change a CPU, and suddenly the Vega parts gained while the Nvidia parts stayed static (along with the Vega 56).

 

21 hours ago, Stefan Payne said:

 

Quote

1080ti>1080>1070ti/Vega64>1070>Vega56. 

Now that's just a lie and total bullshit based on your beliefs.

In reality it doesn't look that way; the VEGA 64 regularly beats the 1080 and comes close to the 1080 Ti. So it's not as bad as you claim it to be.

Well, if the developer of the game isn't too incompetent or using that GameWorks shit...

The beliefs of reputable reviews, reviewers, and benchmarks. I don't care if Vega can compute the shit out of Ethereum; it's a 1070 killer, period. Too bad it was overpriced for its capabilities.

 

14 hours ago, Trixanity said:

Are we really discussing whether Vega 64 or 1080 Ti is the better gaming card? Seriously?

 

1080 Ti is the better card for gaming. Unanimously. 

 

The only area where they trade blows is in some compute workloads.

 

Theoretical performance is a poor metric especially since it hasn't been realized after a year. Tflops != Gaming performance. 

Sanity. THANK YOU



4 hours ago, Aldoarip said:

So the hundreds of titles that showed the 1080 Ti beating Vega 64 into the dust are wrong and unoptimized, heh?

Yes, there is a whole bunch of shitty or not-at-all-optimized games out there.

 

And it also seems that you aren't able to refute my claims, especially in the R9 280X (6G) vs. GTX 980 case. You didn't even mention that.

 

And at this point you should google "High Bandwidth Cache". Because the VEGA 64 has something in there that allows easy access to main memory without the big performance impact you got with more classical cards. That means VRAM on VEGA isn't an issue; even when it runs out, performance doesn't drop as dramatically as it does with other cards.

 

With that said, that is another case where your claim that VEGA will never beat the 1080 Ti is not true. And there are already some other cases as well.

 

To say that XXX can never beat YYY only shows that you have no idea what you are talking about, especially since there are already cases where exactly that happened.

 

Edit:
Async compute is also another thing that is supported by AMD while it is not by Nvidia...

"Hell is full of good meanings, but Heaven is full of good works"


45 minutes ago, valdyrgramr said:

AMD, unlike Nvidia, does use drivers to improve the performance of their GPUs in the long run. Pretty sure they did this with the 390, the non-X Fury, and a few other cards. Those are just the two I remember them doing that with. Not sure if it originally did or not; just saying, with better drivers over time I could see it happening.

On the performance front, I don't think it's that AMD cards kept getting magically faster and faster over the years; only a bit of that. I think it's mostly that, even for earlier GCN generations, AMD never stopped doing the testing and application-specific driver profiling.

 

So if you purchased an HD 7970 six years ago, you are still benefiting from testing and driver optimizations/profiling geared towards modern 2018 AAA PC games. Whereas I highly doubt Nvidia is profiling the newest games on a GTX 680. As a result it loses to its old competitor in new titles, but when you compare them using older games from their time, they still trade blows.


43 minutes ago, Humbug said:

On the performance front, I don't think it's that AMD cards kept getting magically faster and faster over the years; only a bit of that. I think it's mostly that, even for earlier GCN generations, AMD never stopped doing the testing and application-specific driver profiling.

 

So if you purchased an HD 7970 six years ago, you are still benefiting from testing and driver optimizations/profiling geared towards modern 2018 AAA PC games. Whereas I highly doubt Nvidia is profiling the newest games on a GTX 680. As a result it loses to its old competitor in new titles, but when you compare them using older games from their time, they still trade blows.

I think it's also partly due to the fact that, as games get more demanding in certain ways, the GPU idleness tendency in GCN diminishes, so there is less performance loss over time; fully utilizing GCN has always been rather bad/hard. The problem is that by then it's a bit irrelevant, since the new products on the market are just that much better and games are that much more demanding, so your performance isn't that good anyway. My 290Xs are a good example of that: aged well, but still crap compared to current options.


38 minutes ago, Humbug said:

On the performance front, I don't think it's that AMD cards kept getting magically faster and faster over the years; only a bit of that. I think it's mostly that, even for earlier GCN generations, AMD never stopped doing the testing and application-specific driver profiling.

 

So if you purchased an HD 7970 six years ago, you are still benefiting from testing and driver optimizations/profiling geared towards modern 2018 AAA PC games. Whereas I highly doubt Nvidia is profiling the newest games on a GTX 680. As a result it loses to its old competitor in new titles, but when you compare them using older games from their time, they still trade blows.

GCN also seems to have scaled better with the direction Intel CPUs went over the last few generations. Intel put most of its focus into areas that help GPU-based gaming performance (because the workload looks like database work, in many ways). I'd speculate it has something to do with the lower clock speeds GCN runs at, so the latency improvements in memory & I/O help GCN more than the comparable Nvidia architectures.


4 minutes ago, leadeater said:

I think it's also partly due to the fact that, as games get more demanding in certain ways, the GPU idleness tendency in GCN diminishes, so there is less performance loss over time; fully utilizing GCN has always been rather bad/hard. The problem is that by then it's a bit irrelevant, since the new products on the market are just that much better and games are that much more demanding, so your performance isn't that good anyway. My 290Xs are a good example of that: aged well, but still crap compared to current options.

Somewhere in the ~2008 range, AMD's design departments made a bunch of decisions that created internal bottlenecks they're still trying to work around. I caught a discussion one time that the first completed Orochi design was put down in 2007, so some of the issues stretch back well over a decade at this point. And AMD won't be free of the mistakes until 2022. That's a pretty brutal set of mistakes. (It's also why the rumor that AMD is going back to VLIW designs makes a lot of sense.)

 

Still, as Jensen said, it's going to be a long time before we see any new architectures.


5 hours ago, Taf the Ghost said:

(It's also why the rumor that AMD is going back to VLIW designs makes a lot of sense.)

Yes, because it simplifies the scheduler and front end, makes it less complicated, and lets you scale a bit further.

The disadvantage, like with the old TeraScale, is that it is pretty inefficient and it's hard to utilize all the units you have - but they can't do that right now either. So not much is lost there...

Maybe you (or someone else) misunderstood it, and they implement two-stage scheduling: a primary big scheduler for the whole chip and then another smaller one per unit cluster.

So you put 64 CUs together and cluster them; you have one big scheduler that dispatches work to the smaller schedulers, which then schedule it to each individual CU.

 

That might sound odd at first, but it also reduces the complexity of the scheduler.

 

And AMD was always a fan of clustering things and making the next stage independent of the first - that was one of the things that made R300 so awesome back in 2002...
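The two-stage idea could be sketched roughly like this (purely hypothetical - this is not a description of any shipped AMD scheduler, just the clustering concept in toy form):

```python
# Hypothetical sketch of two-stage scheduling: a chip-level scheduler
# deals wavefronts out to clusters; each cluster scheduler then deals
# its share out to its own CUs round-robin. Illustrative only.
def schedule(wavefronts, n_clusters, cus_per_cluster):
    # Stage 1: chip-level scheduler splits work across clusters.
    clusters = [wavefronts[i::n_clusters] for i in range(n_clusters)]
    # Stage 2: each cluster scheduler assigns its work to its CUs.
    assignment = {}
    for c, work in enumerate(clusters):
        for j, wf in enumerate(work):
            cu = (c, j % cus_per_cluster)   # (cluster id, CU id)
            assignment.setdefault(cu, []).append(wf)
    return assignment

# 8 wavefronts over 2 clusters of 2 CUs each:
print(schedule(list(range(8)), 2, 2))
```

The point of the hierarchy is that neither stage ever has to reason about the whole chip at once, which is where the complexity reduction comes from.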

"Hell is full of good meanings, but Heaven is full of good works"


7 hours ago, valdyrgramr said:

I'm not going to deny the likelihood of the Vega 64 competing with the 1080 Ti in benches. Before people give me shit about that, just think about it for a second. AMD, unlike Nvidia, does use drivers to improve the performance of their GPUs in the long run. Pretty sure they did this with the 390, the non-X Fury, and a few other cards. Those are just the two I remember them doing that with. Not sure if it originally did or not; just saying, with better drivers over time I could see it happening.

It's multiple things happening that are the cause of all this.

 

  1. a change in games development in a direction that runs better on AMD's GCN
  2. constant driver improvements for all architectures on AMD's side
  3. implementing techniques that run well on AMD and let them utilize their performance better (= async compute)
  4. Nvidia adopting some/many of the designs of AMD's GCN architecture
  5. Nvidia no longer investing as much as they did when the GPU was still being sold

And all that leads the old GK104 in many games to compete with the GCN counterpart it matches on paper. The GTX 680 competes with the HD 7870 and other Pitcairn iterations; the GTX 780 competes with Tahiti.

 

 

The worst example of this might be the initial Witcher 3 benchmarks, where a GTX 960(!) beat a GTX 780.

"Hell is full of good meanings, but Heaven is full of good works"

