
RX 7600 and RTX 4060Ti specs and release dates leaked | 8GB and 16GB cards coming for both

18 minutes ago, AluminiumTech said:

If you play at low settings sure.

 

But High settings and higher won't be attainable with an 8GB card like the 4060Ti 8GB.

Let me check if any games I'm playing are on low... hmm, nope, none, absolutely zero. It's almost like I already said what settings I was playing on.

Quote

 Most of them with settings maxed out (obviously not maxed out settings with the 760) 

18 minutes ago, AluminiumTech said:

Nvidia told Hardware Unboxed that the 4060Ti isn't meant for 1080p Ultra, it's meant for 1080p High.

 

Nvidia themselves are saying this.

That's just a CYA statement because of the 2 games it can't play on ultra and the games that come out in 3 years, and you know it. Fact is, it can play 99% of games at 1440p ultra, and 85% of games coming out today at 1440p ultra. That the 0.1% of AAA games coming out tomorrow won't be able to run 100fps at 1080p ultra is just so awful.

AKA, why using a resolution in this way as a marketing point is dumb. 


2 minutes ago, starsmine said:

Let me check if any games I'm playing are on low... hmm, nope, none, absolutely zero. It's almost like I already said what settings I was playing on.

That's just a CYA statement because of the 2 games it can't play on ultra and the games that come out in 3 years, and you know it. Fact is, it can play 99% of games at 1440p ultra, and 85% of games coming out today at 1440p ultra. That the 0.1% of AAA games coming out tomorrow won't be able to run 100fps at 1080p ultra is just so awful.

Citation needed.

 

HUB tested 27 games released this year or last year, and most of those were impacted by 8GB: they either had worse performance or worse texture quality. Despite high settings being enabled, a number of games loaded low or medium settings to compensate for the VRAM deficiency.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


10 minutes ago, AluminiumTech said:

Citation needed.

 

HUB tested 27 games released this year or last year, and most of those were impacted by 8GB: they either had worse performance or worse texture quality. Despite high settings being enabled, a number of games loaded low or medium settings to compensate for the VRAM deficiency.

No, no citation needed; you keep quoting the HUB test and then not reading the HUB test, even though I've linked it to you.

https://www.techspot.com/article/2661-vram-8gb-vs-16gb/

Why don't you look at it for a change, instead of quoting it like it shows a difference?
As said, the TLOU bench is invalid as it's an older build of the game.

Shocker, the faster card wins. Regardless of VRAM.
What games on this list can you not play 1440p ultra on with a 3060?
[attached: benchmark chart screenshots]


AMD has confirmed the RX 7600 specs.

 

It looks like only the 8GB model is coming in May as expected, though the chip will be Navi 33 XL instead of Navi 33 XT and the VRAM speed is 18Gbps. The price will be unveiled later.
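For context on that memory spec, peak bandwidth follows directly from the per-pin data rate and the bus width. A minimal sketch of the arithmetic, assuming the commonly reported 128-bit bus for Navi 33 (the bus width isn't stated in the leak itself):

```python
# Peak GDDR6 bandwidth from bus width and per-pin data rate.
# Assumption: 128-bit bus for the RX 7600 (Navi 33); only the 18Gbps
# data rate is mentioned in the post above.

def gddr_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr_bandwidth_gb_s(128, 18.0))  # 288.0 GB/s on the assumed 128-bit bus
print(gddr_bandwidth_gb_s(256, 18.0))  # 576.0 GB/s for a hypothetical 256-bit card at the same speed
```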

 

MLID has said distributors are sure AMD could sell it for $279 if they wanted to, but AMD sources say AMD is trying to play hardball and is insisting they'll charge $299.

 

MLID also confirmed the RX 7600 will perform like an RTX 4060 and not the 4060Ti.

 

It seems like the RX 7700 will be AMD's answer to the 4060Ti coming later.

 

Though the 4060Ti continues to face competition from the 6700XT selling for $350 or less at retailers.

 

Why AMD aren't announcing the RX 7700 right now boggles the mind.

 

https://videocardz.com/newz/amd-radeon-rx-7600-gpu-specs-confirmed-navi-33-xl-with-2048-stream-processors-and-8gb-vram

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


3 hours ago, porina said:

Got some examples of those dev comments? I'd like to read up on it. I have no objection to them using more than 8GB of VRAM on the higher end. The question is what quality you get at 8GB. Most releases are not going to drop support for 8GB and even lower for a very long time, unless they want to make a tech showcase for the highest end only.

Spoiler

Looking at the stats from current Steam Hardware Survey:

20% below 4GB

14% at 4GB

19% at 6GB

28% at 8GB

18% above 8GB, mostly 12GB at 11%. Presumably this is driven by the 3060.

Say they make 12GB the minimum. They would only potentially have 13% of the Steam user market as it stands. This goes up to 46% at 8GB, 65% at 6GB, and 82% at 4GB. They will be asking themselves where to draw the line.
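For illustration, those coverage figures are just running sums over the quoted bucket shares; a minimal sketch of the arithmetic (the 4GB total comes out at 79% rather than the quoted 82%, presumably due to rounding of the individual buckets):

```python
# Share of Steam users who would meet a given VRAM minimum, using the
# bucket percentages quoted above (approximate values).
buckets = {
    "below 4GB": 20,
    "4GB": 14,
    "6GB": 19,
    "8GB": 28,
    "above 8GB": 18,  # mostly 12GB, at 11%
}

def coverage(minimum: str) -> int:
    """Cumulative percentage of users at or above the given bucket."""
    keys = list(buckets)
    return sum(buckets[k] for k in keys[keys.index(minimum):])

for vram_min in ("8GB", "6GB", "4GB"):
    print(f"{vram_min} minimum: ~{coverage(vram_min)}% of surveyed users")
```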

Considering how one might like to target consoles for AAA, which hold more VRAM, lower-end cards can start to struggle if all games get pushed into the same pipeline. Then again, indie titles are still smashing it.


Well, at least AMD has the same specs for their two differently sized memory SKUs. It should at least not cause as much of a controversy as Nvidia fucking people over with the same-numbered card but vastly inferior performance for the lower-VRAM model.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


3 hours ago, starsmine said:

What games on this list can you not play 1440p ultra on with a 3060?
[attached: benchmark chart screenshots]

Is there a single game on that list using a current-gen engine?

 

The whole point is new hardware is meant to enable new software, not force developers to continue to limit their vision because one aspect of the hardware is still holding them back.

 

It's not about "most software" or "current games", it's about easing the hardware bottlenecks so future games can do more. VRAM capacity is absolutely a bottleneck to making even more realistic-looking games, even (if not especially) for smaller developers.

I said from the first day PS5 specs were officially announced that there would be a rude awakening for PC gamers. They've gotten complacent with just how far games can scale, but sooner or later that baseline has to increase. There comes a point where even if it DOES run, it doesn't feel like the same game because the cutbacks are so severe. I experienced this first-hand back on PS Vita, where the few games that were available both on it and on home consoles didn't play or look the same at all. Heck, even Burnout 3 didn't feel the same on PS2 as it did on Xbox; it was much laggier, to the point it got me to pay for Xbox Live.

I've been a PC gamer since before GPUs existed, and the problem is things have stagnated for so long that some people haven't even experienced a big hardware transition before. But it HAS to happen if we want games to continue to look more realistic and developers to be able to afford to actually leverage that power.

It's all very well blaming developers for not optimising their games enough, but are you willing to pay the financial cost of doing that? As top-end hardware gets better, you need to set the baseline higher, or you're just dramatically increasing how much the game needs to scale, and we'll continue to get games that run like garbage for months after launch because of it.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


9 minutes ago, Alex Atkin UK said:

Is there a single game on that list using a current-gen engine?

Define current gen?

 

FIFA 23 uses a relatively recent version of Frostbite (similar in version to Battlefield 2042's Frostbite, or possibly even newer), which is hilariously overpowered/under-utilised for a FIFA game.

 

I think that, of the top 58 games, COD: Modern Warfare 2 / Warzone 2.0 is the only one using an engine of a remotely similar calibre to UE5.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


9 hours ago, AluminiumTech said:

AMD is selling as many $999 7900XTX units as they can make. They can't keep up with demand, there's just too many people who want it.

7900 XTX has been consistently in stock since launch (at least in UK). They're selling, but they're not instantly disappearing.

 

9 hours ago, AluminiumTech said:

The 7900XT Pricing at $899 was too much for gamers and so AIBs and distributors cut the prices to $799 as a street price. Compared to the 4070Ti, the 7900XT is a no brainer at $799.

Lowering prices happens when you want to try to stimulate demand and/or adjust to competition (both seller vs seller and brand vs brand).

 

7 hours ago, AluminiumTech said:

Why AMD aren't announcing the RX 7700 right now boggles the mind.

They don't know what they are doing as they don't have an expert like yourself.

 

While not ideal, I went to the only numerical source of sales data I know: mindfactory. Unfortunately the data I managed to dig up is rather limited in timescale, and extracted from TechEpiphany tweets. Does anyone know where to find the actual source of the data? I can't find it directly from Mindfactory. I'd caution this is one seller in one country, and may not reflect the wider market.

 

Units sold per week, by post date: 13-May (week 19), 15-Apr (week 15), 27-Mar (week 12)
4090 130 210 180
4080 170 150 255
4070 Ti 220 290 200
4070 335 535 0
3080 Ti 10 0 0
3080 10 40 50
3070 Ti 40 60 130
3070 0 70 80
3060 Ti 90 100 130
3060 230 255 320
3050 20 20 25
       
7900 XTX 160 270 170
7900 XT 170 350 370
6950 XT 200 330 110
6900 XT 40 10 40
6800 XT 150 40 110
6800 40 310 240
6750 XT 90 100 90
6700 XT 230 480 330
6700 20 0 0
6650 XT 120 40 120
6600 XT 10 20 20
6600 220 530 240
6500 XT 10 160 50
6400 0 20 40

 

4070 released just before the middle date above.

 

Ada is moving. On Ampere, 3070 and above look like they're selling out, or just not selling, which is appropriate since they're transitioning to 40 series parts already. 3060 parts may not be far off as we move towards 4060 series.

 

On AMD side, 7900 are moving. There's a fair bit of RDNA2 volume through the range. Did they over-produce that much?
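As a quick sanity check on that read of the table, the per-vendor weekly totals can be summed straight from the numbers above; a minimal sketch using only the 13-May column (values copied from the table, one retailer only):

```python
# Per-vendor unit totals for the 13-May column of the Mindfactory table above.
nvidia = {
    "4090": 130, "4080": 170, "4070 Ti": 220, "4070": 335,
    "3080 Ti": 10, "3080": 10, "3070 Ti": 40, "3070": 0,
    "3060 Ti": 90, "3060": 230, "3050": 20,
}
amd = {
    "7900 XTX": 160, "7900 XT": 170, "6950 XT": 200, "6900 XT": 40,
    "6800 XT": 150, "6800": 40, "6750 XT": 90, "6700 XT": 230,
    "6700": 20, "6650 XT": 120, "6600 XT": 10, "6600": 220,
    "6500 XT": 10, "6400": 0,
}

rdna2 = {k: v for k, v in amd.items() if not k.startswith("79")}
print("Nvidia total:", sum(nvidia.values()))                      # 1255
print("AMD total:   ", sum(amd.values()))                         # 1460
print("RDNA2 share of AMD volume:",
      round(100 * sum(rdna2.values()) / sum(amd.values())), "%")  # ~77%
```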

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


4 hours ago, porina said:

7900 XTX has been consistently in stock since launch (at least in UK). They're selling, but they're not instantly disappearing.

 

Lowering prices happens when you want to try to stimulate demand and/or adjust to competition (both seller vs seller and brand vs brand).

 

They don't know what they are doing as they don't have an expert like yourself.

 

While not ideal, I went to the only numerical source of sales data I know: mindfactory. Unfortunately the data I managed to dig up is rather limited in timescale, and extracted from TechEpiphany tweets. Does anyone know where to find the actual source of the data? I can't find it directly from Mindfactory. I'd caution this is one seller in one country, and may not reflect the wider market.

 

Units sold per week, by post date: 13-May (week 19), 15-Apr (week 15), 27-Mar (week 12)
4090 130 210 180
4080 170 150 255
4070 Ti 220 290 200
4070 335 535 0
3080 Ti 10 0 0
3080 10 40 50
3070 Ti 40 60 130
3070 0 70 80
3060 Ti 90 100 130
3060 230 255 320
3050 20 20 25
       
7900 XTX 160 270 170
7900 XT 170 350 370
6950 XT 200 330 110
6900 XT 40 10 40
6800 XT 150 40 110
6800 40 310 240
6750 XT 90 100 90
6700 XT 230 480 330
6700 20 0 0
6650 XT 120 40 120
6600 XT 10 20 20
6600 220 530 240
6500 XT 10 160 50
6400 0 20 40

In Europe, Nvidia is more popular than it is in the US.

Conversely, AMD does sell a bit better in the US than it does in Europe.

4 hours ago, porina said:

4070 released just before the middle date above.

 

Ada is moving. On Ampere, 3070 and above look like they're selling out, or just not selling, which is appropriate since they're transitioning to 40 series parts already. 3060 parts may not be far off as we move towards 4060 series.

 

On AMD side, 7900 are moving. There's a fair bit of RDNA2 volume through the range. Did they over-produce that much?

Yes, both companies overproduced by a lot.

 

AMD have been getting rid of massive amounts of stock through price cuts and discounts. MLID says Nvidia's oversupply situation is worse though: they have more cards to get rid of, but aren't as willing to discount prices as AMD.

 

I would assume most (if not all) of the RDNA2 price cuts are being supported by money AMD is giving AIBs.

 

The cards that I think are running low in stock from RDNA2 are the 6800 and 6800XT.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


9 hours ago, AluminiumTech said:

Define current gen?

 

FIFA 23 uses a relatively recent version of Frostbite (similar in version to Battlefield 2042's Frostbite, or possibly even newer), which is hilariously overpowered/under-utilised for a FIFA game.

 

I think that, of the top 58 games, COD: Modern Warfare 2 / Warzone 2.0 is the only one using an engine of a remotely similar calibre to UE5.

You're right, it's hard to define what a current-gen engine is, and it's not just the engine that counts, it's what baseline hardware the developer was targeting when writing the game itself. I suppose I'd have to go by engine developers' definitions, major version number changes like UE5. Unfortunately they don't always tell us, and different developers may even change the major number based on different criteria.

 

That's rather the problem: an engine is only really going to be fully optimised for modern features based on what scaling the engine developer is intending. What hardware were they aiming for (as a minimum spec) during the rewrite? You can't just say "everything" and get lots of new functionality; there has to be a target, and if that target is an 8GB VRAM minimum then it's likely it would scale a lot higher than if it was a 4GB minimum.

 

But that's kind of the point: even in most games using things like ray tracing, it's a feature glued on top of an existing engine, not developed from the ground up for modern hardware. I think there are only really a few PS5 games (Demon's Souls and Ratchet & Clank: Rift Apart spring to mind) that were written from the ground up specifically to target that hardware alone and so would qualify as truly next-gen optimised.

 

It's further complicated when we have Fortnite on UE5, but that game is specifically optimised to scale extremely widely and its game logic is inherently ported from UE4, so while the engine is current-gen, the game itself clearly is not. Any competitive FPS needs to be more widely scalable than what we can expect from other game genres, as the primary focus is frame rate and input latency rather than graphical quality. Fortnite is a test bed for the engine developers to show off how a last-gen game can be ported to a current-gen engine, but not necessarily one taking advantage of all its features. Plus it's a rare case where the developer had money to burn on this, as it's meant to promote use of the engine.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


17 hours ago, Alex Atkin UK said:

That's rather the problem: an engine is only really going to be fully optimised for modern features based on what scaling the engine developer is intending. What hardware were they aiming for (as a minimum spec) during the rewrite? You can't just say "everything" and get lots of new functionality; there has to be a target, and if that target is an 8GB VRAM minimum then it's likely it would scale a lot higher than if it was a 4GB minimum.

That's really not how game engines and game development work. The engine has vastly less to do with the complaints you are talking about. You could right now use UE5 to create a game with 1995 game assets and code designed to target that old, low-performance hardware, and the resources used would not be much different from back then; it could go either way, i.e. slightly more, or even less due to modern efficiencies.

 

Much of what newer builds and major versions of game engines do is bring in capabilities that used to be external plugins, like lighting engines. What you can do in UE5 was done in UE4 with additional software; UE5 has brought a lot of that internal and inbuilt. Simply pointing to the usage of a "not UE5" engine doesn't at all mean it's not using current, highly advanced game design tools and techniques.

 

Massive resource increases (GPU memory) almost entirely come from game design choices like model/geometry quality, textures, lighting techniques, particle effects, the number of elements in the "world" and viewport, etc. The game engine has much more to do with how difficult something is to achieve than with what is possible to achieve, hence why it has little to do with "games need more than 8GB now".
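To put the textures part of that in concrete terms, the footprint of a mipmapped texture is roughly width x height x bytes-per-texel x ~1.33, so moving from 2K to 4K assets alone quadruples that cost. A minimal back-of-the-envelope sketch with illustrative numbers of my own (not from the post):

```python
# Rough VRAM cost of texture assets; a mip chain adds about a third on top
# of the base level. Figures are illustrative, not from any specific game.
MIP_OVERHEAD = 4 / 3

def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    """Approximate VRAM footprint of one mipmapped texture, in MiB."""
    return width * height * bytes_per_texel * MIP_OVERHEAD / 2**20

# Uncompressed RGBA8 (4 bytes/texel) vs BC7 block compression (1 byte/texel).
for res in (2048, 4096):
    print(f"{res}x{res}: RGBA8 ~{texture_mib(res, res, 4):.0f} MiB, "
          f"BC7 ~{texture_mib(res, res, 1):.0f} MiB")

# A material usually needs several maps (albedo, normal, roughness, ...), so
# a few hundred unique 4K materials can plausibly eat several GB on their own,
# before geometry, render targets and other buffers are even counted.
```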

 

The two main drivers behind resource increases are the desire to do more and the ability to do so more attainably, with lower effort and thus lower cost. What holds this back is "what can reasonably run [this]".

 

You can make something look amazing, truly the most detailed visuals you have ever seen, but with a dead and empty world, because you can have so little in the game environment, and that would be a dull, boring and useless game.

 

Anyway, I would say 8GB hardly limits what is possible, factoring in what is reasonable. I would however say game design pressures do a poor job of managing best use and reasonable assets in the right situations. If you spent a lot of time optimizing "detail levels" for the explicit purpose of reducing VRAM usage, with as little difference as possible in viewable visual quality, there should be plenty to gain; this used to be done way back when consoles and PCs had so little, and yet there are games from that era that still look visually appealing even today. I personally am very wary of more VRAM simply being used to do a technically worse, lower-effort job rather than something new and better that was only made possible by more VRAM.

 

UE5 does have revolutionary features and capabilities, but so far there's next to no usage of them, and they themselves don't require massive increases in VRAM usage either. The above reasons have more to do with it, and what UE5 offers can help lower VRAM usage rather than increase it, thereby allowing more to be done, which increases VRAM usage... etc etc.


12 hours ago, AluminiumTech said:

Yes, both companies overproduced by a lot.

 

AMD have been getting rid of massive amounts of stock through price cuts and discounts. MLID says Nvidia's oversupply situation is worse though: they have more cards to get rid of, but aren't as willing to discount prices as AMD.

I don't think that is very good analysis, nor does it take into account AMD's historic business practices. I guarantee that AMD is still producing RDNA2 GPUs and they did not overproduce. AMD likes, historically, to continue to sell existing products well through more than one generation, even alongside a refresh of the same thing, for a decent time, e.g. the RX 480 vs RX 580, which were both being made and sold at the same time for a not-short period of time.

 

There are a lot of current sales of RX 6000 graphics cards simply because they are still being made. Why create an RX 7700 when there is an existing product that can be sold now with effectively the same capabilities and performance, without incurring the development and logistics costs of a new product?


44 minutes ago, leadeater said:

That's really not how game engines and game development work. The engine has vastly less to do with the complaints you are talking about. You could right now use UE5 to create a game with 1995 game assets and code designed to target that old, low-performance hardware, and the resources used would not be much different from back then; it could go either way, i.e. slightly more, or even less due to modern efficiencies.

I think it's a bit of both, as engines are so modular these days, and it's how all those modules fit together that can cause huge inefficiencies.

 

44 minutes ago, leadeater said:

Much of what newer builds and major versions of game engines do is bring in capabilities that used to be external plugins, like lighting engines. What you can do in UE5 was done in UE4 with additional software; UE5 has brought a lot of that internal and inbuilt. Simply pointing to the usage of a "not UE5" engine doesn't at all mean it's not using current, highly advanced game design tools and techniques.

 

Indeed, later UE4 games seem to be hitting odd bottlenecks that are neither CPU nor GPU; it's something in that pipeline that is not scaling to what those games are trying to do. When you just keep adding lots of third-party building blocks to the engine, which aren't necessarily optimised to work well with each other, this is what we end up with, I guess. UE5 bringing those things into the engine natively ensures they are optimised to work together more efficiently. Not to mention the tools on the development side too, reducing the need to manually code features, saving time and reducing bugs.

 

44 minutes ago, leadeater said:

I would however say game design pressures do a poor job of managing best use and reasonable assets in the right situations. If you spent a lot of time optimizing "detail levels" for the explicit purpose of reducing VRAM usage, with as little difference as possible in viewable visual quality, there should be plenty to gain; this used to be done way back when consoles and PCs had so little, and yet there are games from that era that still look visually appealing even today.

The problem is that that is financially expensive, and the wider the range of hardware you are supporting, the harder it is to do.

 

I don't pretend to be an expert, but ever-increasing memory requirements have always been a thing, and seeing us stick with 8GB for so long, it's only logical that it has to be putting a limitation on what can be done. Sooner or later 8GB WILL become the new low settings; the problem is nobody seems prepared to accept that, as it's been "good enough" for so long.

 

I'm sceptical of the idea of using cinema-quality assets and letting the engine convert them in real time myself. Not that it can't do it, I'm sure it can, but those assets are going to be huge, and nobody is ready for all AAA games to be hundreds of gigabytes in size, especially if your PC is on the lower end so you're not getting any benefit out of them. All the demos I've seen of this technology have been on top-end hardware, though I guess that's the point: to show what it can do in the future rather than what we might have today.

Thing is though, I do want to see at least some games do that, and in order for it to be worth it they need people to have enough VRAM to actually see the benefits, so we come back to the point.

 

I tend to look at it as the reason we think 8GB is enough is because they HAVE been optimising for that, so we haven't seen what they could REALLY do if more was readily available.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


20 minutes ago, Alex Atkin UK said:

I don't pretend to be an expert, but ever-increasing memory requirements have always been a thing, and seeing us stick with 8GB for so long, it's only logical that it has to be putting a limitation on what can be done. Sooner or later 8GB WILL become the new low settings; the problem is nobody seems prepared to accept that, as it's been "good enough" for so long.

GDDR capacity per module has been a bit stuck for a while now; density isn't really increasing like it used to, and then you have companies like Nvidia choosing to implement a narrower memory bus, which offsets any density gains that have been made.

 

It's also the reason we had oddball situations like the RTX 3060 vs RTX 3060 Ti. The only reason the RTX 3060 got 12GB was the choice to implement a narrow bus and then use high-density memory to avoid being stuck with 6GB. The RTX 3060 could quite easily have been an 8GB card, but that would have impacted board cost and TGP/TBP, along with not aligning with Nvidia's "vision" of how their products should be.
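The arithmetic behind that is simple: each GDDR6/6X module is 32 bits wide, so the bus width fixes the module count, and the per-module density available at the time (1GB or 2GB) fixes the capacity options. A minimal sketch of that constraint:

```python
# VRAM capacity options implied by bus width, with one 32-bit GDDR6 module
# per channel and the 1GB / 2GB per-module densities shipping at the time.
MODULE_BUS_BITS = 32
DENSITIES_GB = (1, 2)

def capacity_options(bus_width_bits: int) -> list[int]:
    modules = bus_width_bits // MODULE_BUS_BITS
    return [modules * d for d in DENSITIES_GB]

print("192-bit (RTX 3060):   ", capacity_options(192))  # [6, 12] -> 6GB or 12GB
print("256-bit (RTX 3060 Ti):", capacity_options(256))  # [8, 16] -> 8GB or 16GB
```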

 

Nvidia is more focused on the performance tiers of their products than on things like VRAM capacity; capacity is secondary to performance and to product segmentation based on it. As illogical as more VRAM on a lesser card seems, the simple fact is VRAM capacity doesn't and never will sell cards, and they know it. They also have the market power to keep it that way.


On 5/19/2023 at 7:06 PM, Alex Atkin UK said:

I've been a PC gamer since before GPUs existed, and the problem is things have stagnated for so long that some people haven't even experienced a big hardware transition before. But it HAS to happen if we want games to continue to look more realistic and developers to be able to afford to actually leverage that power.
 

GPUs have always existed; it's just that the modern incarnation looks nothing like the old ones.

 

There was the EGA -> VGA transition (16 colors, 640x480)

The ISA VGA to "accelerated" transition (video adapters that support 640x480 32-bit color)

ISA -> VL-Bus and PCI (video adapters up to 8MB)

PCI -> AGP, 1995: the advent of "3D accelerators" (S3 Virge, ATI Rage, 3DFX, Riva TNT, etc.)

AGP -> PCIe, 2003 (SLI)

PCIe 1.0x -> 3.0 (SLI still viable)

PCIe 3.0 -> 4.0/5.0 (3-slot GPUs)

 

The leapfrogs in performance came from the ISA to VL-Bus and the PCI to AGP changes. After the AGP to PCIe migration (where SLI became a thing), the improvements stopped being leapfrogs because bandwidth managed to get ahead of game performance. Now it's more like performance is being artificially held back by poor power management. There is no reason why a GPU should take 2 slots, let alone 3 or 4, and also pull 600W.

 

What I expect is that there will be a reckoning in the GPU market as "green" policies start discouraging the use of high-TDP GPU parts due to what happened with crypto, with this being pushed onto game consoles as well.

 

I'm sure there is still performance and "graphics realism" to squeeze out of high-end GPUs, but we've long since passed the point of needing a high-end GPU and CPU to run 1080p or 1440p, though we're still not quite at 4K everywhere. Which means we're probably on a trajectory where, once 4Kp144 is attainable, the need for higher-capability GPUs hits a wall and game consoles stop having any reason to sell a new model.

 

1080p is already done though. If you aren't 4K gaming, then you largely have no reason to use a top-model GPU, but the lack of VRAM also results in games having issues even at 1080p. So perhaps the next series of GPUs should double the VRAM across the board.


4 hours ago, leadeater said:

AMD likes, historically, to continue to sell existing products well through more than one generation, even alongside a refresh of the same thing, for a decent time, e.g. the RX 480 vs RX 580, which were both being made and sold at the same time for a not-short period of time.

The 480 and 580 were essentially the same thing though? It makes sense to sell older gen product at the same time if it doesn't overlap with newer ones, but otherwise it feels like it mainly happens because there is inventory. There's also the case where they may be contracted to supply the old product to some customers even when the new one is available, but that is usually planned ahead.

 

4 hours ago, leadeater said:

Why create an RX 7700 when there is an existing product that can be sold now with effectively the same capabilities and performance, without incurring the development and logistics costs of a new product?

They don't really save on it if the product will exist at some point in the future. At best, they're delaying it a little. The design is probably already done and it is more a manufacturing and market decision when to release.

 

4 hours ago, Alex Atkin UK said:

I tend to look at it as the reason we think 8GB is enough is because they HAVE been optimising for that, so we haven't seen what they could REALLY do if more was readily available.

There's a difference between 8GB being supported, and 8GB being the max used. They can easily scale upwards if they want to. A low minimum spec may limit scaling to higher quality, but I don't feel VRAM is the bigger limit. I feel moving CPU minimums to a decent 6 core (Coffee Lake/Zen 2) and 16GB of system ram would be a great start and be less challenging than moving minimum VRAM above 8GB.

 

1 hour ago, Kisai said:

GPUs have always existed; it's just that the modern incarnation looks nothing like the old ones.

 

PCI -> AGP, 1995: the advent of "3D accelerators" (S3 Virge, ATI Rage, 3DFX, Riva TNT, etc.)

I might be picking at words, but PC GPUs existed from around the early nvidia era. Before that, they were video cards/graphics cards.

 

I think there was only one major transition in PC gaming. The move from 2D to 3D era which 3dfx managed to push. So called "3D" chips existed before then but everyone had their own standard and none really took off. Games only started using 3D hardware support widely around then.

 

Other transitions were less impactful and more incremental. Ray Tracing is about the only other major shift of note.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


38 minutes ago, porina said:

The 480 and 580 were essentially the same thing though?

Yes, that's what I was saying. AMD can be weird like that. Even the RX 590 is just a refresh of the RX 580, which is a refresh of the RX 480, heh.

 

38 minutes ago, porina said:

They don't really save on it if the product will exist at some point in the future. At best, they're delaying it a little. The design is probably already done and it is more a manufacturing and market decision when to release.

They save on it by doing it at a time when fabrication costs are lower, and they can move the costs into different financial quarters as well. Quite often, when the cost is incurred becomes rather important, not just how much it is.

 

When you have the supply chain, product SKUs and distribution already in place, then introducing something new, as well as phasing something out, has a cost.


46 minutes ago, porina said:

 

 

I might be picking at words, but PC GPUs existed from around the early nvidia era. Before that, they were video cards/graphics cards.

 

I think there was only one major transition in PC gaming. The move from 2D to 3D era which 3dfx managed to push. So called "3D" chips existed before then but everyone had their own standard and none really took off. Games only started using 3D hardware support widely around then.

 

Other transitions were less impactful and more incremental. Ray Tracing is about the only other major shift of note.

Just to frame this correctly: "2D accelerators" were a thing before 3D was; I could probably find old magazines touting this. It's just that, at the time, no clear distinction was made between a "3D accelerator" and the regular graphics chip.

We only started calling them GPUs with the PlayStation 2. Nvidia and ATI/AMD then basically swiped that label.

 

Like, look at the marketing material for the ATI Mach64 and S3 Trio64. This was because of the need for the MPEG motion estimation features of the then-new DVD players.

3DFX's part was a standalone 3D card.

 

The S3 Virge and Savage 3D products, and likewise ATI's Rage/Rage Pro products, all existed in a time when the bandwidth of the ISA bus was insufficient for pushing DVD, video capture, 3D graphics or anything meaningful over it. Hence we didn't get "3D" before VL-Bus. Yes, we did get a few MPEG cards though.

 

At any rate, in some alternate timeline 3D didn't push forward AGP or PCIe, and instead something else did, like storage, before 3D arrived, and we'd still be a decade behind today.


1 hour ago, porina said:

I might be picking at words, but PC GPUs existed from around the early nvidia era. Before that, they were video cards/graphics cards.

 

I think there was only one major transition in PC gaming. The move from 2D to 3D era which 3dfx managed to push. So called "3D" chips existed before then but everyone had their own standard and none really took off. Games only started using 3D hardware support widely around then.

 

Other transitions were less impactful and more incremental. Ray Tracing is about the only other major shift of note.

Honestly, giving Nvidia credit for the GPU gives them too much credit and lets them get away with rewriting history.

It was really them going "it's not a VGA, it's a GPU".
Kinda like what they did with RTX:

"it's not a GTX, it's an RTX, WE INVENTED REAL-TIME RAY TRACING"

Nah, it's the same thing, just a rebranding of the VGA.

CGA cards were GPUs; they processed the graphics to be displayed on the screen, even if it was 2D, and SGI were doing 3D graphics accelerators in the 80s.

SGI stuff looks intimidating 
http://www.sgistuff.net/hardware/systems/images/iris-1500-gf2.jpg

 

  

6 minutes ago, Kisai said:

Like, look at the marketing material for the ATI Mach64 and S3 Trio64. This was because of the need for the MPEG motion estimation features of the then-new DVD players.

3DFX's part was a standalone 3D card.

 

The S3 Virge and Savage 3D products, and likewise ATI's Rage/Rage Pro products, all existed in a time when the bandwidth of the ISA bus was insufficient for pushing DVD, video capture, 3D graphics or anything meaningful over it. Hence we didn't get "3D" before VL-Bus. Yes, we did get a few MPEG cards though.

Man, I wish I still had my old Pentium 1 PC. We turned it into our first DVD player for my family with one of those cards, but I have no idea which one. Once we got our Pentium 4 in spring 2001, that PC was gotten rid of, and I was too young at the time to know what was in it.


1 hour ago, starsmine said:

CGA cards were GPUs; they processed the graphics to be displayed on the screen, even if it was 2D, and SGI were doing 3D graphics accelerators in the 80s.

SGI stuff looks intimidating 
http://www.sgistuff.net/hardware/systems/images/iris-1500-gf2.jpg

My father used to service SGI-powered A-4 Skyhawk simulators. Also, Nvidia simply wasn't relevant at the birth of PC 3D graphics; that goes to S3 & 3DFX and nobody else, strictly in the context of PCs.

 

1 hour ago, starsmine said:

Honestly, giving Nvidia credit for the GPU gives them too much credit and lets them get away with rewriting history.

It was really them going "it's not a VGA, it's a GPU".

Wasn't even Nvidia at all

 

Quote

The term "GPU" was coined by Sony in reference to the 32-bit Sony GPU (designed by Toshiba) in the PlayStation video game console, released in 1994

 


16 minutes ago, leadeater said:

Wasn't even Nvidia at all

To clarify, I didn't say nvidia invented it or the term, but they popularised its usage in the PC space.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


7 hours ago, leadeater said:

I don't think that is very good analysis, nor does it take into account AMD's historic business practices. I guarantee that AMD is still producing RDNA2 GPUs and they did not overproduce. AMD likes, historically, to continue to sell existing products well through more than one generation, even alongside a refresh of the same thing, for a decent time, e.g. the RX 480 vs RX 580, which were both being made and sold at the same time for a not-short period of time.

 

There are a lot of current sales of RX 6000 graphics cards simply because they are still being made. Why create an RX 7700 when there is an existing product that can be sold now with effectively the same capabilities and performance, without incurring the development and logistics costs of a new product?

Then why are the 6800 and 6800XT unobtainium? And why would AMD keep selling cards that undermine the 7600, such as the 6700 10GB for $280?

 

Also, RX 7700 will be cheaper to produce than a Navi 21 die. AMD has no incentive to keep making large and expensive Navi 21 dies.

 

Polaris was sold for as long as it was partially because of oversupply. Same with 5700 and 5700XT.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


1 hour ago, AluminiumTech said:

Then why are the 6800 and 6800XT unobtainium?

There's an absolute ton of them on Newegg from many different AIBs, with multiple different models per AIB 🤷‍♂️

 

Also I happen to own a PowerColor Liquid Devil 6800 XT so....

 

1 hour ago, AluminiumTech said:

And why would AMD keep selling cards that undermine the 7600, such as the 6700 10GB for $280?

You're mistaken if you think it's undermining a product that does not exist. Such cards do not exist because AMD is already selling cards well at price points people have been complaining about not existing; that is the feedback they have been getting. You can get an RX 6700 for $280; you won't get an RX 7600 for that low today.

 

It doesn't matter if both existing might undermine each other; what matters is value compared to the market, and a more expensive GPU performing the same as a cheaper one is bad, and that's not even looking at Nvidia options. Since when is it a good idea to release a new product with worse $/FPS?

 

Computex is literally coming soon; lower-model RX 7000 GPUs will be announced then if they are going to come before Q4, otherwise they will be Q4.

 

And anyway, an RX 7600 is likely going to be faster than an RX 6700, which means both can be sold at the same time perfectly fine. Some RX 6000 models will be phased out of production; many will exist for a long time. Plus, RX 7800 and RX 7700 variants seem more likely to me than lower ones as the next off the rank to be released.

 

Navi 22 and especially Navi 23 have a lot of value left in them, so I don't see why AMD would want to stop making those chips and transition to a vastly more expensive node; even if the die size shrinks, I just don't see it being cheaper. Navi 21 is big, like actually big for a die, so that's going first. However, if it has stopped being produced, that would have been very, very recent.

 

I'd also like to point out that multiple RX 500 GPUs are still being made. The RX 570, for instance, has very similar performance to the RX 6500 XT, with each being faster than the other in certain games, while undercutting the RX 6500 XT's price.

 

1 hour ago, AluminiumTech said:

Polaris was sold for as long as it was partially because of oversupply. Same with 5700 and 5700XT.

Nope, because it's still being manufactured now. There's a lot of "it was overproduced" with basically nothing backing that up. AMD is a serial offender for not making enough GPUs, NOT too many.


2 hours ago, leadeater said:

Nope, because it's still being manufactured now. There's a lot of "it was overproduced" with basically nothing backing that up. AMD is a serial offender for not making enough GPUs, NOT too many.

We can flip that around, what evidence is there they are still being produced? To make sure we're arguing over the same thing: is AMD still placing new orders for silicon of RDNA2 and older? I doubt it.

 

What doesn't count:

AMD have existing stocks of silicon which AIBs turn into cards, as it is worth more than writing it off.

Fabs are following through on contracts placed a long time ago, perhaps even going back into crypto boom era. It isn't necessarily a choice they would make now, but one they had to in the past with the information they had at the time.

 

What would count:

Significant ongoing customer orders for older product, like large OEMS maybe? They would likely market older generation with some overlap into new generation, plus need some for warranty or similar.

Non-consumer facing use cases (embedded?)

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

