
AMD is confident there won't be availability issues with the RX 6000 series when it launches

Delicieuxz
3 hours ago, RejZoR said:

@leadeater

Though, if we're honest, AMD did bring the RX 5700 XT literally out of nowhere. It wasn't a crown-stealing product, but AMD had been struggling for years, and every time we thought they had a killer card, like the R9 Fury with its radical design or RX Vega with its hype, they were always underwhelming. But then came the RX 5700 XT and surprised everyone. We hadn't really heard anything about it, nothing about RDNA; it just dropped out of the blue, and it's a well-performing card. Given how they just dropped this one and made it a real success, my hopes and expectations are now much higher than during the Fury or Vega era. It means they now have some clue how to make a good GPU. It was similar with Ryzen: it wasn't a killer, but it was a disruptor. I suspect Big Navi will be that too. And that's fine.

Couldn't agree more, manage expectations and hope for the best.


22 minutes ago, leadeater said:

We've "known" about specs before and been able to guess before, and every time AMD has missed the mark, each and every time. You know I love my 290X's; they were great value, still not the best GPU on the market at the time, but the best AMD could do.

If you want some fun reading then type in "AMD leak" into the search box, and then filter by time so you get no thread younger than 3 years in the search results.

You'll get a ton of gems like threads where people say Vega will beat the 1080 by like 20% because they saw a post on WCCFTech about it.

 

We also get gems like this thread where a (now banned) forum member talks about how bad Zen would be and how it couldn't match Intel... Just to then get proven horribly wrong not too long after.


34 minutes ago, leadeater said:

But things don't scale linearly, and historical record and the ability to execute absolutely do count. Who cares what node Intel is on when they still have the performance required for the market, something AMD GPUs do not have in anything other than the mid/upper-mid to low tier. How much do you care about the Intel 10600K? Not much compared to the 10900K and Ryzen 3900X/3950X, right? At least with AMD CPUs you get basically the same gaming performance up and down their CPU stack, so something like a 3600 is actually interesting.

 

We've "known" about specs before and been able to guess before, and every time AMD has missed the mark, each and every time. You know I love my 290X's; they were great value, still not the best GPU on the market at the time, but the best AMD could do.

Things do scale pretty well when it comes to CU count vs performance. A 5500 XT has 22 CUs vs the 40 on a 5700 XT, they have a ~100 MHz clock speed difference, and the 5700 XT is ~1.9x faster than the 5500 XT.
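As a rough sanity check on that scaling claim, here's a back-of-the-envelope estimate. The CU counts are the actual RDNA specs; the exact clock figures are assumed round numbers reflecting the ~100 MHz gap mentioned above:

```python
# Back-of-the-envelope GPU scaling: assume performance tracks
# CU count x clock speed. CU counts are the real RDNA specs;
# the clocks are assumed round numbers for the ~100 MHz gap.
cu_5500xt, cu_5700xt = 22, 40
clk_5500xt, clk_5700xt = 1800, 1900  # MHz (assumed)

ratio = (cu_5700xt * clk_5700xt) / (cu_5500xt * clk_5500xt)
print(f"estimated 5700 XT vs 5500 XT: {ratio:.2f}x")  # ~1.92x
```

That lands right on the ~1.9x figure, which is consistent with Navi scaling close to linearly with CU count x clock between these two parts.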

Ah yes, Intel has the performance required as they raise power consumption on a 6-year-old node and design. I don't care at all about Intel's lineup at this point because I'm not after a 200-300 W heater that wins by 5%.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


7 minutes ago, valdyrgramr said:

Well, to be fair, cards like the Vegas were faster than even the 1080 Ti and 2080 Ti in some regards. But it's better to wait and see what happens than to speculate based on history.

The Radeon VII was nearly equal to a 2080 Ti in compute workloads, just nowhere near in gaming.



5 hours ago, leadeater said:

The main issue is really the assumption or expectation that AMD not only can but also will release a card as fast as or faster than the RTX 3080. When is the last time AMD had the fastest GPU of a current generation? What historical record are people going off to set this expectation? It seems to be purely hopes and dreams to me.

 

I've basically always owned ATI/AMD GPUs, with a few Nvidia cards, but that does not influence me in any way when making critical assessments of the situation.

Where are you getting this assumption from in what I said?

I have no data that they do. I am saying that if they want a major position in the market, I think they'll need one, but that in no way says I think they actually have one. I do not think it is likely that they do.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


On 9/26/2020 at 6:01 PM, porina said:

We've only been given a date for the announcement, but we don't have a date for when people can buy. I'd argue they could still vary that if they see demand, and build up more stock to match if needed.

 

Bonus points if anyone can work out what the typical time gap is between previous AMD product announcements and when you can actually buy it. Let's see if this follows that pattern, or if it is longer.

exactly what I was thinking xD 

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


5 hours ago, LAwLz said:

Well, there is always massive diminishing returns on price-to-performance when you get into the high end. It happens with pretty much all products from all companies.

Cards like the 3090 aren't made for the price-conscious. They never have been.

 

 

The graph you posted puts it at 16% higher on average, with the Asus card being 19% higher on average.

86 to 100 = 16.3% increase.

84 to 100 = 19% increase.

 

So I'd say it's more accurate to say the 3090 is 15-20% faster than the 3080 according to TechPowerUp.
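The arithmetic here is just inverting the normalized scores: if the 3080 scores 86 when the 3090 is 100, the 3090 is 100/86 of the 3080's performance, i.e. about 16% faster. A minimal sketch of that conversion (the function name is just for illustration):

```python
# Convert a TechPowerUp-style normalized score (slower card as a
# percentage of the fastest card at 100) into a relative speedup.
def speedup_pct(score: float, baseline: float = 100.0) -> float:
    """How much faster the baseline card is than `score`, in percent."""
    return (baseline / score - 1.0) * 100.0

print(round(speedup_pct(86), 1))  # 3090 FE vs 3080: 16.3
print(round(speedup_pct(84), 1))  # Asus 3090 vs 3080: 19.0
```

Note this is why "the 3080 scores 14% lower" and "the 3090 is 16% faster" are both correct: the percentage depends on which card you treat as the baseline.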

 

 

 

 

I don't understand what you mean. What does this have to do with AMD needing a high end card to be competitive with Nvidia? The person I responded to said they would think AMD were competitive on the high end if their top of the line card was 10% slower than Nvidia's 3080. I said I would only consider AMD competitive on the high end if their top of the line card was actually matching Nvidia's high end cards.

 

 

Are you sure about that? I'd say people are buying Nvidia cards not because "they are Nvidia fanboys" but rather because Nvidia simply have the best products on the market.

I don't think a lot of people are buying the 30-series GPUs just because they want an Nvidia card. I think they are buying them because they are simply way better than what AMD is offering, and I think you'll find it hard to prove otherwise.

re: 3090, sure.
There is little evidence, though, that the cost of producing a card has a whole lot to do with its asking price. I don't know what a 3090 actually costs to produce. I think it's possible that if Nvidia really had to, they could sell them for much, much less; I don't know. The way it works is "whatever the market will bear", though. Nvidia thinks the market will bear $1,500, so that is what they are sold for.

 

re: I'm not sure.

It has to do with market entry. The GPU market is owned by Nvidia and has been for years. Just because there appears to be space between a 3070 and a 3080 doesn't mean there actually IS space. There is ample evidence that Nvidia can and has held the maximum speed of their cards down so that, if they wish to later, they can remove that speed impediment.
 

 

Re: are you sure?

I read this as “I do not believe Nvidia fanbois exist and I don’t think you can prove they do”

 

I think statistically they more or less have to exist, simply because there are AMD fanboys. I do agree that they are harder to pick out than AMD fanboys. How do you tell a fair-weather football fan from a die-hard fan when the team is winning?



19 minutes ago, Bombastinator said:

Where are you getting this assumption from what I said?

I have no data that they do.  I am saying that if they want a major position in the market I think that’ll need one but that in no way says I think they actually have one.   I do not think it is likely that they do. 

It's not yours, but it's from the conversation chain you joined and are commenting on; what I mentioned is basically the real point of contention. These very optimistic statements about how good Big Navi is going to be are based on very little and ignore a long history of abject failure when it comes to putting out a performance-leading product. There is something to be said for a proven track record.


7 minutes ago, Bombastinator said:

This seems to me to be an argument over what “effect development” means.

I don't think so; there is zero evidence consoles have helped PC development in any way other than financially for the hardware companies involved, and the profit margins are extremely low on console hardware supply contracts. It is only recently that development tools have become more common between console and PC, so this will change, I don't doubt that. But if we are talking historically, then no, I completely disagree that AMD having hardware in consoles has helped them get better optimization in PC games, which was the original statement made. It has not. It is a major reason why AMD GPUs have not been able to deliver the in-game performance they likely could, because it's not happening. We've seen a very select few titles that AMD has directly sponsored and helped with optimization on the PC, and the performance uplift is huge; the problem is that this is a rarity.


19 minutes ago, leadeater said:

It's not yours, but it's from the conversation chain you joined and are commenting on; what I mentioned is basically the real point of contention. These very optimistic statements about how good Big Navi is going to be are based on very little and ignore a long history of abject failure when it comes to putting out a performance-leading product. There is something to be said for a proven track record.

My statement so far has mostly been that I think the differentiation between the 3080 and 3090 is an artificial one created by Nvidia. I think that if AMD comes out with a card that is not so fast that Nvidia cannot beat it, a "3080-class" card such as a 3080 Ti WILL appear that is fractionally faster, even if it differs from a 3090 in nothing but name. Its price will likely be vastly lower, of course. If AMD has a GPU that is only as fast as a 3080, they have not beaten GA102. I believe GA104 could likely be made to go as fast as a 3080. So if they have a card that at its top end is as fast as a 3080, they don't have a GA102 competitor, they have a GA104 competitor, because GA104 and GA102 are both gimped by Nvidia when placed in 3070s and 3080s. A 3080 is GA102; it's got the same GPU as a 3090.



22 minutes ago, Bombastinator said:

My statement so far has mostly been that I think the differentiation between the 3080 and 3090 is an artificial one created by Nvidia

Well yes, but that isn't what was being discussed at the time. Basically all GPU models are artificial differentiations, because Nvidia only makes 6 dies (Gx100, Gx102, Gx104, Gx106, Gx107 and Gx108). All the revisions of these die codes are either replacements of former ones or just an end configuration based on binning, not a change to the fabrication of the die.

 

For example, every TU102 die has the potential to be a Quadro RTX 8000, but that doesn't mean it will actually be put into that product even if it passes all the binning tests for that product; it all depends on demand for the other products and having to meet those as well.

 

But this isn't just an Nvidia thing, ATI/AMD does it too. This is just how it works.

 

22 minutes ago, Bombastinator said:

I think that if AMD comes out with a card that is not so fast that Nvidia cannot beat it, a "3080-class" card such as a 3080 Ti WILL appear that is fractionally faster, even if it differs from a 3090 in nothing but name. Its price will likely be vastly lower, of course.

The full GA102 die has 84 SMs (10,752 CUDA cores); the RTX 3090 uses only 82 of those, and the RTX 3080 only 68. There will most likely be an RTX 3080 Ti or equivalent at some point, basically regardless of what AMD does, because I expect Nvidia will go back to what they normally do and release such a product mid-generation, like they used to, once manufacturing has been optimized enough to support it and there is appetite from customers to upgrade to something newer and faster, with the benefit of not having to spend a lot on architecture development.

 

If I were to guess, an "RTX 3080 Ti" will have between 72 and 76 active SMs and higher clocks than the RTX 3090, and will perform slightly better in games. However, most of the performance limitation right now is actually power, so I'm going to be very interested to watch over time how AIB cards change things and whether they are allowed to start releasing models with significantly higher power caps.
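The SM figures above line up because each Ampere GA10x SM carries 128 FP32 CUDA cores; a quick check (the 72-76 SM "3080 Ti" range is the post's guess, not a spec):

```python
# Ampere GA10x SMs each carry 128 FP32 CUDA cores, which is how
# SM counts map onto the familiar core counts.
CORES_PER_SM = 128

def cuda_cores(sms: int) -> int:
    """CUDA-core count for a GA10x configuration with `sms` active SMs."""
    return sms * CORES_PER_SM

print(cuda_cores(84))  # full GA102 -> 10752
print(cuda_cores(82))  # RTX 3090  -> 10496
print(cuda_cores(68))  # RTX 3080  -> 8704
# The hypothetical 72-76 SM "3080 Ti" guessed above would land at:
print(cuda_cores(72), cuda_cores(76))  # 9216 9728
```

The 82-SM figure reproduces the RTX 3090's advertised 10,496 cores and the 68-SM figure the 3080's 8,704, so the SM-count framing is consistent with the retail specs.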


29 minutes ago, Bombastinator said:

My statement so far has mostly been that I think the differentiation between the 3080 and 3090 is an artificial one created by Nvidia. I think that if AMD comes out with a card that is not so fast that Nvidia cannot beat it, a "3080-class" card such as a 3080 Ti WILL appear that is fractionally faster, even if it differs from a 3090 in nothing but name. Its price will likely be vastly lower, of course. If AMD has a GPU that is only as fast as a 3080, they have not beaten GA102. I believe GA104 could likely be made to go as fast as a 3080. So if they have a card that at its top end is as fast as a 3080, they don't have a GA102 competitor, they have a GA104 competitor, because GA104 and GA102 are both gimped by Nvidia.

If AMD can scale their Navi card to be 2x faster than a 5700 XT in benchmarks, it would be close enough to a 3080. I think AMD might price their card slightly lower than a 3080, but I understand AMD doesn't want to be seen as the second-best, low-quality option; with the 5700 XT the prices were close to the competing 2070 Super.

And the Nvidia 30-series launch has been a mess: high power consumption, and it seems the AIBs didn't get enough testing time, or some didn't have enough margin to include more high-end components. AMD also needs to scale up their card while consuming less power; then I think they would have something that could actually compete, even if it were 10% slower than a 3080.

51 minutes ago, Bombastinator said:

Re: are you sure?

I read this as “I do not believe Nvidia fanbois exist and I don’t think you can prove they do”

 

I think statistically they more or less have to simply because there are AMD fanboys.   I do agree that they are harder to pick out than AMD fanboys. How do you pick a fair weather football  fan from a die hard fan when a team is winning?

I'm not sure if it's just people getting onto the hype train, but there are always people falling for the Nvidia marketing, like the "3080 is 2x faster" claims, or some cherry-picked benchmarks, without any consideration of how accurate those claims are in other games. But there definitely is the bias that only AMD drivers have bugs.


38 minutes ago, leadeater said:

I don't think so; there is zero evidence consoles have helped PC development in any way other than financially for the hardware companies involved, and the profit margins are extremely low on console hardware supply contracts. It is only recently that development tools have become more common between console and PC, so this will change, I don't doubt that. But if we are talking historically, then no, I completely disagree that AMD having hardware in consoles has helped them get better optimization in PC games, which was the original statement made. It has not. It is a major reason why AMD GPUs have not been able to deliver the in-game performance they likely could, because it's not happening. We've seen a very select few titles that AMD has directly sponsored and helped with optimization on the PC, and the performance uplift is huge; the problem is that this is a rarity.

That’s a repetition of your previous statement, not a critique of mine. You are merely repeating your definition. I’m not disagreeing with you. I’m saying that the two views are not based on a difference in what is occurring, but on a difference in what given words mean.



16 minutes ago, Bombastinator said:

That’s a repetition of your previous statement, not a critique of mine. You are merely repeating your definition. I’m not disagreeing with you. I’m saying that the two views are not based on a difference in what is occurring, but on a difference in what given words mean.

 

Well here is the original statement that was made

Quote

AMD have the advantage of developers optimizing for GCN and RDNA hardware

 

and my reply to it

 

Quote

Not on the PC they do not. AMD has been the console hardware provider for a long time, and this has done exactly zero to get game developers to give equivalent or preferential optimization treatment to AMD. Nvidia on PC is king by a very long way and thus gets treated in the way their position commands; almost all games and game engines are optimized for Nvidia architectures. That's just how it is, and it will not change without multiple consecutive generations of AMD outselling Nvidia. You don't reverse over a decade of market dominance in a single generation, i.e. Ryzen vs Intel.

 

There is no word-difference issue at all; it all comes back to this statement, which is not true. Having AMD hardware in consoles, as there has been for decades now, has not resulted in better optimization on the PC as was stated.

 

Edit:

I have to keep repeating myself because, for whatever reason, people want to pick at words and not the claim itself; the claim is what matters.

 


10 minutes ago, leadeater said:

 

Well here is the original statement that was made

 

There is no word-difference issue at all; it all comes back to this statement, which is not true. Having AMD hardware in consoles, as there has been for decades now, has not resulted in better optimization on the PC as was stated.

 

Re: the original statement

So you’re not actually replying to my statement at all even though you quoted it and said my statement was false.

 

re: word difference issue

I wasn’t even in that one. That was you and another poster. This new statement changes the field, though, and is different from previous statements.

“AMD hardware for decades now”: are you saying that the PS3, PS4, Xbox and Xbox One all ran AMD CPUs/GPUs? Or are you differentiating GPU from “hardware”? If so, what “hardware” are you talking about?



It will be interesting to see how AMD plays this one out after the launch of the RTX 3080/3090.

 

CPU Cooler Tier List  || Motherboard VRMs Tier List || Motherboard Beep & POST Codes || Graphics Card Tier List || PSU Tier List 

 

Main System Specifications: 

 

CPU: AMD Ryzen 9 5950X ||  CPU Cooler: Noctua NH-D15 Air Cooler ||  RAM: Corsair Vengeance LPX 32GB(4x8GB) DDR4-3600 CL18  ||  Mobo: ASUS ROG Crosshair VIII Dark Hero X570  ||  SSD: Samsung 970 EVO 1TB M.2-2280 Boot Drive/Some Games)  ||  HDD: 2X Western Digital Caviar Blue 1TB(Game Drive)  ||  GPU: ASUS TUF Gaming RX 6900XT  ||  PSU: EVGA P2 1600W  ||  Case: Corsair 5000D Airflow  ||  Mouse: Logitech G502 Hero SE RGB  ||  Keyboard: Logitech G513 Carbon RGB with GX Blue Clicky Switches  ||  Mouse Pad: MAINGEAR ASSIST XL ||  Monitor: ASUS TUF Gaming VG34VQL1B 34" 

 


Just now, Bombastinator said:

Re: the original statement

So you’re not actually replying to my statement at all even though you quoted it and said my statement was false.

Well I can't see anywhere where I said you said anything false?

 


26 minutes ago, Bombastinator said:

I wasn’t even in that one. That was you and another poster. This new statement changes the field, though, and is different from previous statements.

“AMD hardware for decades now”: are you saying that the PS3, PS4, Xbox and Xbox One all ran AMD CPUs/GPUs? Or are you differentiating GPU from “hardware”? If so, what “hardware” are you talking about?

That other poster was still commenting on the conversation stemming from that original statement I addressed. It also doesn't make much difference whether it was CPU or GPU, but since older-generation consoles used AMD GPUs and not AMD CPUs, that should answer that.

 

The reason why having hardware in those consoles didn't improve optimization on the PC side was that the entire development process was completely different and none of it was transferable; and when it comes to PC, Nvidia has that market locked in and has very strong industry partnerships that AMD just doesn't have in that market.

 

The original statement assumes that having AMD hardware in the console, with game developers making and optimizing games for that console hardware, also helps PC optimization because the hardware architecture is the same, but unfortunately it doesn't work that way. It would be nice if it did, and as I mentioned, I think it will in the PS5/Xbox Series X era and beyond, but it's not a thing just yet.


On 9/27/2020 at 9:13 AM, GDRRiley said:

NVIDIA has had major launch driver issues; no one remembers because only the top-end cards were out for the first 2-3 months, and by the time the masses got them it was fixed.

AMD has had one rough driver launch with their first new GPU in almost a decade.

They are taking the time to get this right. Hell, the same IP has already been made by the tens of millions for the PS5/Xbox Series X; they could have launched this GPU 2-4 months earlier if they wanted to rush.

AMD drivers have ongoing issues. Yes, they fixed some stuff (recently), but the overall experience is still rather buggy, and saying otherwise doesn't really help anyone. Their drivers are really weird too: if they actually worked as advertised and were stable, they would be really, really good, but as is, it seems they're trying too much and aren't as low-level integrated as NV and Intel drivers are. It all has a pretty amateurish flavor to it, which again is weird for someone who's been in the business as long as AMD has.

 

Also, to beat the market leader you need to be the first one showing, not months later when most people have already bought and are invested in the "other side". It very much feels like AMD isn't really trying and is indeed content to be the chip provider for the next-gen consoles...

 

Yes, NV definitely botched this launch; maybe AMD can take advantage, though they'd need to convince people that this time the performance is there and the drivers are actually stable and up to snuff, which remains a steep uphill battle for them. Reviews alone won't cut it; you can only be disappointed by a product not performing as expected so many times (which at this point is literally AMD's biggest issue, imo).

 

 

@leadeater I agree AMD specifically hasn't done a whole lot in regard to consoles > PC, but console and PC development are often very much connected on the developers' side, I think. It has been quite fascinating to watch this with Monster Hunter World, for example, where the PC version took over a year to catch up with the console version and has seen several improvements such as a "high resolution texture pack".

 

That said, about the point I was making above: what do you think AMD can do so that PC games are better optimized for their hardware (GPU side)? Because that's also a thing I keep hearing when I or others complain about lower-than-expected (advertised) performance on AMD cards... "well, *this* game [specifically] is not optimized well for AMD, no surprise it runs better on NVIDIA..." (the problem, of course, is that this seems to apply to pretty much *all* games)

 

I mean, this is the problem AMD really needs to tackle: most people will just keep choosing Nvidia, not because they're "fanboys" but because it "just works", and usually even *better* than advertised...

 

It's probably a driver issue as much as it's an issue of developers targeting Nvidia hardware more, and I don't know how AMD can solve this. A chicken-and-egg problem, quite literally.

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


1 minute ago, leadeater said:

That other poster was still commenting on the conversation stemming from that original statement I addressed. It also doesn't make much difference whether it was CPU or GPU, but since older-generation consoles used AMD GPUs and not AMD CPUs, that should answer that.

 

The reason why having hardware in those consoles didn't improve optimization on the PC side was that the entire development process was completely different and none of it was transferable; and when it comes to PC, Nvidia has that market locked in and has very strong industry partnerships that AMD just doesn't have in that market.

 

The original statement assumes that having AMD hardware in the console, with game developers making and optimizing games for that console hardware, also helps PC optimization because the hardware architecture is the same, but unfortunately it doesn't work that way. It would be nice if it did, and as I mentioned, I think it will in the PS5 and post-PS5 era, but it's not a thing just yet.

Re: Nvidia has that market locked.  
This appears to not be the case for this generation, This generation it seems that it is AMD that has that market locked.  I don’t know if that will make a difference or not though.  You say historically it hasn’t.  I am not so sure.  It may have changed how well Nvidia GPUs ran things and may have affected their advantage in the PC market.  We shall see.  This generation PC and Console hardware are much much closer in design Than they were before. I don’t know if that will change the metric or not. We will have more information that could affect this very soon.



4 minutes ago, Mark Kaine said:

That said, about the point I was making above: what do you think AMD can do so that PC games are better optimized for their hardware (GPU side)? Because that's also a thing I keep hearing when I or others complain about lower-than-expected (advertised) performance on AMD cards... "well, *this* game [specifically] is not optimized well for AMD, no surprise it runs better on NVIDIA..." (the problem, of course, is that this seems to apply to pretty much *all* games)

 

I mean, this is the problem AMD really needs to tackle: most people will just keep choosing Nvidia, not because they're "fanboys" but because it "just works", and usually even *better* than advertised...

 

It's probably a driver issue as much as it's an issue of developers targeting Nvidia hardware more, and I don't know how AMD can solve this. A chicken-and-egg problem, quite literally.

Honestly, I think AMD just needs more resources (money, basically) to be able to invest in industry partnerships and work with game engine developers to get the necessary optimizations into the engines, and then also better support game developers. Nvidia does this very well, but they have the money and people to do so.

 

AMD just hasn't been in a financial situation to do this; having to support both CPU and GPU development is very hard, and for a long while neither was doing all that well in the market. AMD has a real opportunity to leverage their recent successes and branch out from pure hardware development into software support.

 

I know GameWorks got a very bad reputation, but that is essentially the type of thing AMD needs to be able to do: create high-quality, dedicated game-development tools that accelerate the process in a way that is inherently already optimized for their hardware.

3 minutes ago, leadeater said:

Honestly, I think AMD just needs more resources (money, basically) [...]

That's what I've been thinking, but I wasn't sure if it's that "easy". It makes sense, though, also about the tools for game development!

 

We'll see if AMD can do this; it seems to be a good opportunity, with their recent success on the CPU side and Nvidia having a less-than-ideal start to this generation...

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 

1 minute ago, Mark Kaine said:

That's what I've been thinking, but I wasn't sure if it's that "easy" but it makes sense

I'm not sure it's that simple either, but it's one of the differences that I can easily point to. We all have such a high-level view of the situation, and lack so much information, that I can really only say what I think, but it's also very easy for me to be completely wrong 🤷‍♂️

1 hour ago, leadeater said:

The full GA102 die has 84 SMs (10,752 CUDA cores); the RTX 3090 uses only 82 of those, and the RTX 3080 only 68. There will most likely be an RTX 3080 Ti or equivalent at some point, basically regardless of what AMD does, because I expect Nvidia will go back to what they normally do and release this product mid-generation, like they used to, once manufacturing optimization is there to support it and there is appetite from customers to upgrade to something newer and faster, with the benefit of not having to spend a lot on architecture development.

 

If I were to guess, an "RTX 3080 Ti" will have between 72 and 76 active SMs and higher clocks than the RTX 3090, and will perform slightly better in games. However, most of the performance limitation right now is actually power, so I'm going to be very interested to watch how AIB cards change things over time and whether they are allowed to start releasing models with significantly higher power caps.

I'm having trouble justifying an RTX 3080 Ti in my head. If the 3090 is already pulling large amounts of power, I can't see AIBs or Nvidia wanting even more power-hungry cards in a higher-clocked 3080 Ti. Looking at performance, there's quite a small gap between the 3080 and 3090 for a potential 3080 Ti to slot into, and the VRAM picture doesn't look much better: there's going to be a 20 GB 3080 at some point, so I find it hard for Nvidia to justify a 3080 Ti with between 20 and 24 GB of VRAM (both are plenty). If Nvidia does release it, I feel like it'll be another case of extreme segmentation within their product line, which, tbh, I don't really see a large market for.
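As a quick sanity check on the SM figures quoted above, here's a minimal sketch; it assumes Ampere's 128 FP32 CUDA cores per SM, and the 72–76 SM "3080 Ti" entries are pure speculation from the post above, not announced products:

```python
# Assumed for GA102 (Ampere): 128 FP32 CUDA cores per SM.
CORES_PER_SM = 128

configs = {
    "Full GA102": 84,            # 10,752 CUDA cores, matching the post above
    "RTX 3090": 82,
    "RTX 3080": 68,
    "Speculative 3080 Ti (low)": 72,   # hypothetical, from the guess above
    "Speculative 3080 Ti (high)": 76,  # hypothetical, from the guess above
}

for name, sms in configs.items():
    # Total shader count is just SMs times cores per SM.
    print(f"{name}: {sms} SMs -> {sms * CORES_PER_SM} CUDA cores")
```

That works out to 10,496 cores for the 3090 and 8,704 for the 3080, so a 72–76 SM part (9,216–9,728 cores) would indeed land in a fairly narrow gap between them.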

20 minutes ago, leadeater said:

I'm not sure it's that simple either, but it's one of the differences that I can easily point to. We all have such a high-level view of the situation, and lack so much information, that I can really only say what I think, but it's also very easy for me to be completely wrong 🤷‍♂️

Yeah, no, I don't think that's wrong at all; it's what everyone would think looking at the situation. I was just asking because I thought maybe there was something obvious I hadn't thought of. Like you said, there's a lot we simply don't know, but the direction is clear: they (AMD) need to invest a lot. Good and theoretically competitive products are surely a good start, but there's a lot more to it to reach the same level of success as Nvidia. It will be really interesting to see how it unfolds in the near future...

 

I was also secretly hoping Intel would get into the consumer GPU market by now (this time for real, lol), but they don't seem to be in a rush to do that either! 😂

