RX 7600 and RTX 4060Ti specs and release dates leaked | 8GB and 16GB cards coming for both

AlTech

Edit:

Correction on expected pricing for the 4060Ti: the expected price for the 8GB model is actually $399, not $349, and the expected price for the 16GB model is $499, not $449.

 

Sorry for the error.

 

Edit 2: Added info confirmed by Nvidia and AMD, and added MLID's expected performance figures.

 

Summary

 

The specs of the upcoming RX 7600 (previously believed to be called the RX 7600XT) and RTX 4060Ti have been leaked.

 

The result is an interesting mirroring of strategies, with both companies preparing an 8GB card to launch in May and a 16GB card to launch in July.

 

The tables have been compiled by yours truly, combining info from both the sources and MLID.

 

The 3 GPUs in this table are all expected to perform like an RX 6700.

[Image: spec comparison table]

 

The 4060Ti is expected to perform like a 3070.

[Image: RTX 4060Ti spec table]

 

 

The 4060Ti 8GB is expected to launch on May 24th whilst the RX 7600 is expected to launch on May 25th.

 

Quotes

Quote

The configuration of NVIDIA’s upcoming mid-range GPU called GeForce RTX 4060 Ti has now been confirmed through a Geekbench test. It is clear that board partners and NVIDIA have already sent out the samples to the media and influencers and the following results are most likely a first sign that the card is already being tested.

 

Quote

Nevertheless, the RTX 4060 Ti with 16GB VRAM will have a different SKU than 8GB model (363 vs. 361), but the board design will remain the same (PG190). More importantly, NVIDIA found it necessary to use a different GPU variant for this card specifically, but it is a minor bump from AD106-350 to 351. This likely means that it has the same CUDA core count, but some physical changes to the design were still necessary.

 

Quote

The Radeon RX 7600 is based on Navi 33 GPU with 32 Compute Units (2048 Stream Processors). The packaging confirms this configuration of the GPU as well as its memory capacity of 8GB. What the box does not say are the GPU and memory clocks, as well as the memory bus. The bus is what might limit the card’s capability because it is reportedly just 128-bit wide. Furthermore, AMD Navi 33 only supports up to 8 lanes in PCIe Express 4.0 standard.
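
A quick aside on that last quote, to put the bus width and PCIe figures in rough perspective. This is back-of-the-envelope math only; the 18 Gbps GDDR6 speed is an assumption for illustration, since the leak doesn't confirm the memory clock.

# Rough bandwidth sketch for a 128-bit card (18 Gbps GDDR6 is assumed, not confirmed)
bus_width_bits = 128
gddr6_gbps_per_pin = 18                                    # assumed effective data rate per pin
vram_bandwidth = bus_width_bits / 8 * gddr6_gbps_per_pin   # GB/s
print(f"VRAM bandwidth: {vram_bandwidth:.0f} GB/s")        # ~288 GB/s

pcie4_gbs_per_lane = 2                                     # ~2 GB/s per PCIe 4.0 lane, each way
print(f"PCIe 4.0 x8 link: ~{8 * pcie4_gbs_per_lane} GB/s") # ~16 GB/s

So anything that spills out of the 8GB of VRAM has to come over a link well over an order of magnitude slower than local memory, which is why the narrow bus and x8 link get flagged as potential limits.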

 

 

My thoughts

It's kinda interesting to see AMD ditch the XT branding for the 7600, but honestly, releasing these 8GB cards (from both teams) is an insult and should not be rewarded by gamers. These cards should only have come in one model: the 16GB version. 8GB of VRAM isn't enough for 1440p and soon may not be enough for 1080p.

 

I sincerely hope people give Nvidia the middle finger if their pricing turns out as expected. We don't need $350 1080p cards launching in 2023. 2013 called, it wants its GPUs back.


MLID has speculated that this is expected to be AMD's last hurrah in the mid-range dGPU segment, and that AMD will in future steer buyers towards Ryzen APUs with better performance than the RX 7600. Much as I am disappointed that this will happen, it is inevitable and resistance is futile. The end is never the end is never, etc.

 

Sources

https://videocardz.com/newz/nvidia-geforce-rtx-4060-ti-shows-up-on-geekbench-with-4352-cuda-cores-and-8gb-vram

https://videocardz.com/newz/sapphire-radeon-rx-7600-pulse-graphics-card-pictured-features-32-rdna3-cus-and-8gb-vram

https://videocardz.com/newz/nvidia-geforce-rtx-4060-ti-16gb-to-feature-ad106-351-gpu-and-165w-tdp

 

 

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


Finally, a card that I'll buy used in 3 years.

Message me on discord (bread8669) for more help 

 

Current parts list

CPU: R5 5600 CPU Cooler: Stock

Mobo: Asrock B550M-ITX/ac

RAM: Vengeance LPX 2x8GB 3200mhz Cl16

SSD: P5 Plus 500GB Secondary SSD: Kingston A400 960GB

GPU: MSI RTX 3060 Gaming X

Fans: 1x Noctua NF-P12 Redux, 1x Arctic P12, 1x Corsair LL120

PSU: NZXT SP-650M SFX-L PSU from H1

Monitor: Samsung WQHD 34 inch and 43 inch TV

Mouse: Logitech G203

Keyboard: Rii membrane keyboard

Damn this space can fit a 4090 (just kidding)


It's mental that we will have lower-end cards with more VRAM than higher-end ones. But of course NVIDIA feel safe to do this as it won't impact their workstation card market that way.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


7 minutes ago, Alex Atkin UK said:

It's mental that we will have lower-end cards with more VRAM than higher-end ones. But of course NVIDIA feel safe to do this as it won't impact their workstation card market that way.

I think it's gonna make the 4070 and 4070Ti users upset (as they rightfully should be) 😛.



Sadly, a theoretically priced $280 7600 8GB would likely sell well simply due to how inflated all GPU prices are. "I'll just buy this now and upgrade later" mentality for sub $1k PC builds. 

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25


2 minutes ago, TVwazhere said:

Sadly, a theoretically priced $280 7600 8GB would likely sell well simply due to how inflated all GPU prices are. "I'll just buy this now and upgrade later" mentality for sub $1k PC builds. 

AMD apparently settled on $279 cos they've seen/heard that they'll get flak if they charged the $350 they originally intended to.

 

Also, a lot of people don't seem terribly upset by the idea of a sub-$300 8GB card, even though imo they should be.



9 minutes ago, AluminiumTech said:

The specs of the upcoming RX 7600

Live Claire Reaction, RX6700XT out of the frame: 

 

 

Press quote to get a response from someone! | Check people's edited posts! | Be specific! | Trans Rights

I am human. I'm scared of the dark, and I get toothaches. My name is Frill. Don't pretend not to see me. I was born from the two of you.


Just now, SorryClaire said:

snip

But you prob paid less than $350 for your 6700XT.

 

Also, you have that card rn whereas the 16GB cards aren't coming out for 2 months.



4 minutes ago, AluminiumTech said:

AMD apparently settled on $279 cos they've seen/heard that they'll get flak if they charged the $350 they originally intended to.

 

Also, a lot of people don't seem terribly upset by the idea of a sub-$300 8GB card, even though imo they should be.

What do you mean by "sub $300 8GB card" - shouldn't 8GB be seen as entry-level these days, so it being below $300 would then be a good thing?


Just now, YoungBlade said:

What do you mean by "sub $300 8GB card" - shouldn't 8GB be seen as entry-level these days, so it being below $300 would then be a good thing?

I think that for $200, 8GB is fine, but for $300 or close to it we should be getting 12GB or more.



2 minutes ago, AluminiumTech said:

I think that for $200, 8GB is fine, but for $300 or close to it we should be getting 12GB or more.

Okay, because "sub $300" just means "below $300", so I was kind of confused about what you meant by that.

 

I agree that 8GB should be reserved for cards in the $100-200 price range; it's just that $200 is also "sub $300."


As someone who has yet to run into a VRAM issue with a 2060 6GB mobile GPU, I will say that 8GB of VRAM for 1080p should be more than fine. The issue with more VRAM being added to cards is that it seems to increase the price by a substantially larger margin than what you would expect the extra RAM to cost. Which means they are factoring in either an increase in cost based on how much better the card could perform with the larger memory, or they are just plain saying we feel you owe us for giving you this option.

 

Personally I feel that game creators need to learn how to cut down on the VRAM usage, because it is getting out of hand if someone really needs 16GB or more to play a game. There are a lot of PCs on the market right now that only have 16GB of system RAM and now we are expected to have that much or more on our GPU too. Even when someone is playing at higher resolutions, this really is over the top from where I stand.

 

If you are going to be using either of these cards, then given the lower price range I would assume you are gaming at 1080p (since the displays are cheaper), therefore 8GB should be enough for your needs. Though from the standpoint of future-proofing and getting a bit more performance, the 16GB could be a better option for you if you can stomach the price. Since you are likely using a 1080p display and are on a lower budget, I would guess that you also don't use Ray Tracing as you would take a performance hit with it. So, in my personal opinion, the RX 7600 16GB would be the best option if you wanted the higher VRAM. 

 

I would never pay $450 for a 4060Ti unless I absolutely had to, so I would go team red in this case, if I was looking for a card in this range.

My Main PC

  • CPU: 13700KF
  • Motherboard: MSI MAG Z790 Tomahawk
  • RAM: 32GB (16GBx2) DDR5-6000MHz TEAMGROUP T-Force Delta
  • GPU: RTX 4070 ASUS Dual
  • Case: RAIDMAX X603
  • Storage: WD SN770 2TB
  • PSU: Corsair RM850X Fully Modular
  • Cooling: DEEPCOOL LS720
  • Display(s): Gigabyte G24F2 & Dell S2318HN/NX
  • Keyboard: Logitech G512 Carbon (GX Blue)
  • Mouse: Logitech G502 Hero
  • Sound: Bose Headphone & Creative SBS260
  • Operating System: Windows 11 Pro

Laptop: Alienware m15 R1

  • OS: Windows 10 Pro
  • CPU: 9750H
  • MB: OEM
  • RAM: 16GB (8GBx2) DDR4 2666Mhz
  • GPU: RTX 2060 (Mobile)

Phone: Galaxy A54

Other: Nintendo Switch


2 hours ago, AluminiumTech said:

I think it's gonna make the 4070 and 4070Ti users upset (as they rightfully should be) 😛.

As a 4070 owner it doesn't faze me at all because I haven't bought into the VRAM FUD. 12GB is plenty for the upper-mid range for at least the next couple of years. 8GB is still fine for a low-mid GPU.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


3 hours ago, TVwazhere said:

Sadly, a theoretically priced $280 7600 8GB would likely sell well simply due to how inflated all GPU prices are. "I'll just buy this now and upgrade later" mentality for sub $1k PC builds. 

I mean, for that price range it's a good 1080p card, and for that 8GB is probably enough for a lot of games, and in the ones where it isn't you can likely just turn down some settings.


46 minutes ago, porina said:

As a 4070 owner it doesn't faze me at all because I haven't bought into the VRAM FUD.

Do you think Hardware Unboxed is lying to people when they show 8GB Nvidia GPUs being physically unable to run the same experiences AMD equivalent cards can because of VRAM?

46 minutes ago, porina said:

12GB is plenty for the upper-mid range for at least the next couple of years.

It is if you enjoy spending $600 for a 1440p capable GPU from Nvidia when you could buy a 4K capable GPU from AMD for $600.

46 minutes ago, porina said:

8GB is still fine for a low-mid GPU.

Because gamers who only play at 1080p deserve to pay $350 or more for GPUs?



3 minutes ago, AluminiumTech said:

Do you think Hardware Unboxed is lying to people when they show 8GB Nvidia GPUs being physically unable to run the same experiences AMD equivalent cards can because of VRAM?

It is if you enjoy spending $600 for a 1440p capable GPU from Nvidia when you could buy a 4K capable GPU from AMD for $600.

Because gamers who only play at 1080p deserve to pay $350 or more for GPUs?

Honestly I don't think a lot of people are into 4K gaming, and if you are, you are probably wanting to buy at least a 4080 anyways. That being said, it is good that you have AMD as an option if you want more VRAM. Personally I play on a 1080p 360Hz monitor exclusively, so VRAM has never really been much of an issue, especially because all of the cards I have bought had at least 11GB of RAM anyways, which is more than enough for 1080p. I think a huge issue is trying to make sure your GPU is both affordable and has enough RAM. I mean, yes, having a bunch of VRAM is nice, but if it requires a big increase in the price of the card then you might run into issues, and it might be worth having less VRAM to keep pricing down. I would imagine that the 4070Ti would need 24GB of RAM if they were to increase the RAM amount without a change in bus width and other chip features, which would likely cost a whole lot more if they went this direction. I guess the easiest way around that would be having the chip designed to have 16GB in the first place, but I would imagine that would make the 4080 look even worse than it already is.


The 16GB version being 92-bit, and a 4060. 🙂 The price increase is not great. While 8GB might be fine on the lower cards, I do wonder how VRAM will be impacted at the lower end, and whether PCs will struggle more with that than built-in systems like consoles.

 

Gotta hope the 7600 is a good lower-end GPU choice.


1 hour ago, Azurael said:

As someone who has yet to run into a VRAM issue with a 2060 6GB mobile GPU, I will say that 8GB of VRAM for 1080p should be more than fine. The issue with more VRAM being added to cards is that it seems to increase the price by a substantially larger margin than what you would expect the extra RAM to cost

It doesn't. Nvidia's just being stingy.

1 hour ago, Azurael said:

Which means they are factoring in either an increase in cost based on how much better the card could perform with the larger memory, or they are just plain saying we feel you owe us for giving you this option.

 

1 hour ago, Azurael said:

Personally I feel that game creators need to learn how to cut down on the VRAM usage,

Not going to happen. Devs for big companies have already said they're making use of that VRAM and that 8GB is the new minimum they're targeting.

1 hour ago, Azurael said:

because it is getting out of hand

The only thing getting out of hand is how much people are forgiving Nvidia for not putting enough VRAM on GPUs. It's not the developers' fault Nvidia doesn't put enough VRAM on GPUs.

1 hour ago, Azurael said:

if someone really needs 16GB or more to play a game. There are a lot of PCs on the market right now that only have 16GB of system RAM and now we are expected to have that much or more on our GPU too.

Yes, because the PS5 and Xbox Series consoles have access to a shared pool of 16GB for VRAM and RAM.

 

AMD sees the problem, and their solution in the next few years is to replace mid-range GPUs with APUs.

 

So if you buy Ryzen, you'll end up buying a Ryzen APU which has the CPU and GPU perf you want, and it'll use DDR5 for both RAM and VRAM.

1 hour ago, Azurael said:

Even when someone is playing at higher resolutions, this really is over the top from where I stand.

What amount of VRAM should a game use at 1440p?

 

8GB isn't enough. 12GB is sufficient for now.

 

What about 4K?

10GB isn't enough. 12GB is barely enough for games that don't use a lot of VRAM and 16GB is fine for 4K.

 

The only problem here is Nvidia is consistently giving gamers not enough VRAM.

1 hour ago, Azurael said:

If you are going to be using either of these cards, then given the lower price range I would assume you are gaming at 1080p (since the displays are cheaper), therefore 8GB should be enough for your needs. Though from the standpoint of future-proofing and getting a bit more performance, the 16GB could be a better option for you if you can stomach the price.

The expected pricing of the 4060Ti is lunacy IMHO. It's about $50 to $100 too expensive.

1 hour ago, Azurael said:

Since you are likely using a 1080p display and are on a lower budget, I would guess that you also don't use Ray Tracing as you would take a performance hit with it.

And Ray Tracing increases VRAM requirements.

1 hour ago, Azurael said:

So, in my personal opinion, the RX 7600 16GB would be the best option if you wanted the higher VRAM. 

 

1 hour ago, Azurael said:

I would never pay $450 for a 4060Ti unless I absolutely had to, so I would go team red in this case, if I was looking for a card in this range.

 

Just now, Quackers101 said:

The 16GB version being 92-bit, and a 4060.

The 4060Ti 16GB will be 128-bit.

Just now, Quackers101 said:

The price increase is not great. While 8GB might be fine on the lower cards, I do wonder how VRAM will be impacted at the lower end, and whether PCs will struggle more with that than built-in systems like consoles.

Undoubtedly they will. The consoles need the amount of RAM they do for a reason. As GPUs are expected to do more things, the requirements for VRAM will keep going up.



19 minutes ago, Brooksie359 said:

Honestly I don't think a lot of people are into 4K gaming, and if you are, you are probably wanting to buy at least a 4080 anyways.

So you think people are buying overpriced 1440p cards from Nvidia then?

19 minutes ago, Brooksie359 said:

That being said, it is good that you have AMD as an option if you want more VRAM.

It's not really a matter of want but a matter of need. New games say you should have 16GB of VRAM for 4K.

19 minutes ago, Brooksie359 said:

Personally I play on a 1080p 360Hz monitor exclusively, so VRAM has never really been much of an issue, especially because all of the cards I have bought had at least 11GB of RAM anyways, which is more than enough for 1080p. I think a huge issue is trying to make sure your GPU is both affordable and has enough RAM.

Not really tbh.

19 minutes ago, Brooksie359 said:

I mean, yes, having a bunch of VRAM is nice, but if it requires a big increase in the price of the card then you might run into issues, and it might be worth having less VRAM to keep pricing down.

It doesn't require a big price increase. Going from 8GB to 16GB VRAM costs around $20 to $30 according to MLID's supply chain sources.

 

Anything above a $30 increase + a tiny bit extra profit for increasing VRAM is milking customers.

 

$100 extra is Nvidia milking customers that they know will buy Nvidia no matter what.

19 minutes ago, Brooksie359 said:

I would imagine that the 4070Ti would need 24GB of RAM if they were to increase the RAM amount without a change in bus width

I suppose they could do 18GB. But this is a problem entirely of Nvidia's own making.
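
For context on why the realistic options are 12GB, 18GB or 24GB rather than something in between: GDDR6/6X capacity is tied to bus width, since each 32-bit channel carries one memory module (or two in a clamshell layout). A rough sketch of that relationship, with module densities assumed for illustration (3GB modules weren't shipping at the time):

# Possible VRAM capacities for a given bus width (module densities are illustrative)
def vram_options(bus_width_bits, module_gb=(1, 2, 3)):
    channels = bus_width_bits // 32              # one GDDR6 module per 32-bit channel
    sizes = set()
    for density in module_gb:
        sizes.add(channels * density)            # single-sided layout
        sizes.add(channels * density * 2)        # clamshell: two modules per channel
    return sorted(sizes)

print(vram_options(192))  # [6, 12, 18, 24, 36] -> a 192-bit 4070Ti-class card: 12, 18 or 24GB
print(vram_options(128))  # [4, 8, 12, 16, 24]  -> the 4060Ti's 8GB / 16GB split

Which is presumably also why the 16GB 4060Ti keeps the same 128-bit PG190 board: it looks like a clamshell variant of the 8GB design rather than a wider chip.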

19 minutes ago, Brooksie359 said:

and other chip features which would likely cost a whole lot more if they went this direction. 

The only thing that would cost more is the VRAM itself; there wouldn't be any significant cost increase otherwise.

19 minutes ago, Brooksie359 said:

I guess the easiest way around that would be having the chip designed to have 16GB in the first place

Yes, by designing and building their GPUs correctly which seems to be too difficult for poor Nvidia with 70% marketshare.

19 minutes ago, Brooksie359 said:

but I would imagine that would make the 4080 look even worse than it already is.

It is not the job of gamers to accept inferior products in order to avoid embarrassing companies who make dumb decisions based on corporate greed.

 

The 4080 is a bad card all around. It's $400 too expensive for a 4K card when AMD has a 4K card in the form of the 7900XT for $899 MSRP which has regularly seen discounts to $799 or lower.

 

Its performance is mildly better than a 3080 but a significant step down from the 4090.



Unless AMD feels like supporting a single socket for a long time, and vendors are on board with long-term BIOS updates, I don’t see APUs dethroning the mid-range dGPU.
 

Hampering future GPU upgrades is a big tradeoff. Even if the platform can no longer run current games (mine being a pretty strong example), a new GPU means running what you have now, at much higher settings, resolution, framerate, etc. 

My eyes see the past…

My camera lens sees the present…


20 minutes ago, AluminiumTech said:

So you think people are buying overpriced 1440p cards from Nvidia then?

It's not really a matter of want but a matter of need. New games say you should have 16GB of VRAM for 4K.

Not really tbh.

It doesn't require a big price increase. Going from 8GB to 16GB VRAM costs around $20 to $30 according to MLID's supply chain sources.

 

Anything above a $30 increase + a tiny bit extra profit for increasing VRAM is milking customers.

 

$100 extra is Nvidia milking customers that they know will buy Nvidia no matter what.

I suppose they could do 18GB. But this is a problem entirely of Nvidia's own making.

The only thing that would cost more is the VRAM itself; there wouldn't be any significant cost increase otherwise.

Yes, by designing and building their GPUs correctly which seems to be too difficult for poor Nvidia with 70% marketshare.

It is not the job of gamers to accept inferior products in order to avoid embarrassing companies who make dumb decisions based on corporate greed.

 

The 4080 is a bad card all around. It's $400 too expensive for a 4K card when AMD has a 4K card in the form of the 7900XT for $899 MSRP which has regularly seen discounts to $799 or lower.

 

Its performance is mildly better than a 3080 but a significant step down from the 4090.

I think the majority of gamers are on 1440p or 1080p and that 4K gaming is niche, and of those 4K gamers a lot of them are buying the 4090 anyways as it is objectively the best 4K gaming card on the market. I wouldn't even be surprised if NVIDIA is intentionally limiting VRAM to funnel 4K gamers to their higher-end offerings. Anyways, I think most of the people buying a 4070Ti or 4070 are not going to be 4K gaming but rather 1440p or 1080p gaming.


41 minutes ago, AluminiumTech said:

Do you think Hardware Unboxed is lying to people when they show 8GB Nvidia GPUs being physically unable to run the same experiences AMD equivalent cards can because of VRAM?

You can pick scenarios to sit either side of a VRAM limit if you want to. Or just find a setting and play the game.

 

41 minutes ago, AluminiumTech said:

It is if you enjoy spending $600 for a 1440p capable GPU from Nvidia when you could buy a 4K capable GPU from AMD for $600.

There's more to life than raster performance. For my uses, the 7900XT is the lowest of their offerings I'd consider, and it's quite simply not worth it at the asking price, as the 4080 makes better sense again if I'm going that far.

 

41 minutes ago, AluminiumTech said:

Because gamers who only play at 1080p deserve to pay $350 or more for GPUs?

It'll be fine at 1440p and even 4k at realistic settings.

 

11 minutes ago, AluminiumTech said:

It doesn't require a big price increase. Going from 8GB to 16GB VRAM costs around $20 to $30 according to MLID's supply chain sources.

There's a big difference between BOM cost and finished goods. Adding say $10 of material to a product will make the finished product go up by a lot more than that. This applies in general, not just limited to PC tech.

 

As a very simplified example:

Say you have a product that is $10 of "stuff". Material only.

You turn that "stuff" into a product. It costs to do that, and you want a profit on top.

You sell that wholesale to distributors. Distys take their profit.

Distys sell to resellers/stores. They take their cut of profit.

At some point the tax man will take their share too.

 

It multiplies up the chain.
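
A toy version of that chain in numbers; the margins below are made up purely to show the multiplication, not a claim about anyone's actual mark-ups.

# Toy example: how a BOM increase snowballs by the time it hits the shelf.
# Every margin here is an illustrative guess, not a real figure from any vendor.
def shelf_price_delta(bom_delta, margins=(0.40, 0.10, 0.12), tax=0.20):
    price = bom_delta
    for m in margins:          # board partner, distributor, retailer mark-ups
        price *= 1 + m
    return price * (1 + tax)   # sales tax / VAT at the till

print(f"${shelf_price_delta(25):.2f}")  # a ~$25 VRAM cost adder ends up around $52 at retail

Whether that multiplied-up number gets anywhere near the actual 8GB-to-16GB price gap is exactly what's being argued over below.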

 

 

I trimmed too much quote somewhere. On VRAM targets in games, at least some broken releases are getting fixes. TLOU in particular has had reduced VRAM usage in a recent patch, and before that they increased the visual quality of low and medium textures. I haven't looked to see if anyone has done retesting or if references are still pointing to the out-of-date launch tests. It is a sad fact that so-called AAA releases on PC are often broken on launch, regardless of how much VRAM you have, and it can take quite some time before they patch it into what it should have been.



1 hour ago, Zodiark1593 said:

Unless AMD feels like supporting a single socket for a long time, and vendors are on board with long-term BIOS updates,

They do though. AM4 was supported for Ryzen 1000 through Ryzen 5000.

 

AM5 will probably be supported for at least the next 3-4 years if not the next 5 years.

1 hour ago, Zodiark1593 said:

I don’t see APUs dethroning the mid-range dGPU.
 

AMD already has a Zen 5 laptop APU planned that performs like a 6700XT (which is what the 7600 is targeting performance-wise) for wayyy less power. There's no reason AMD couldn't also release that on desktop. And it wouldn't have VRAM issues because it would use system RAM.

 

OEMs have privately said they will stop putting Nvidia's 50 and 60 class GPUs in laptops if AMD makes these laptop APU designs easy to integrate.

1 hour ago, Zodiark1593 said:

Hampering future GPU upgrades is a big tradeoff.

How would GPU upgrades be hampered? By tying them to CPU upgrades?

1 hour ago, Zodiark1593 said:

Even if the platform can no longer run current games (mine being a pretty strong example), a new GPU means running what you have now, at much higher settings, resolution, framerate, etc. 

Which would equally benefit from having a better CPU.

 

1 hour ago, Brooksie359 said:

I think the majority of gamers are on 1440p or 1080p and that 4K gaming is niche, and of those 4K gamers a lot of them are buying the 4090 anyways as it is objectively the best 4K gaming card on the market.

You'd be surprised by the number of people buying 4080s for 4K.

 

Buying a 4070Ti for 1440p is a waste of money when a cheap and cheerful GPU for $400 or less can run 1440p.

1 hour ago, Brooksie359 said:

I wouldn't even be surprised if NVIDIA is intentionally limiting VRAM to funnel 4K gamers to their higher-end offerings.

It might be backfiring on Nvidia cos most of the few people who buy 4080s regret it and return it for an RDNA2 card.

1 hour ago, Brooksie359 said:

Anyways, I think most of the people buying a 4070Ti or 4070 are not going to be 4K gaming but rather 1440p or 1080p gaming.

Which would be wasting money. People shouldn't have to spend $600 to play at 1440p. In 2020, a supposedly 4K card from Nvidia was $699 (it wasn't actually a real 4K card cos it didn't have enough VRAM, but that's beside the point).

 

Over time tech is meant to become more affordable as it becomes easier and cheaper to produce. Nvidia is instead charging more every generation for no reason other than corporate greed.

 

1 hour ago, porina said:

You can pick scenarios to sit either side of a VRAM limit if you want to. Or just find a setting and play the game.

If you buy a $500 or more GPU you should not be compromising the visual integrity of the game just to get a badly designed GPU to work.

 

This is acceptable for budget and low end GPUs that cost $200 to $250. Not for GPUs that cost 2-3x as much.

1 hour ago, porina said:

There's more to life than raster performance. For my uses, the 7900XT is the lowest of their offerings I'd consider, and it's quite simply not worth it at the asking price, as the 4080 makes better sense again if I'm going that far.

How? The 7900XT is objectively better than the 4070Ti; it can actually run games with RT better than Nvidia's card.

 

You don't think $799 (a price that you can easily buy a 7900XT for) for almost-4080 perf is better than paying $1200 for 4080 perf?

 

Or even $1000 7900XTX vs $1200 4080?

1 hour ago, porina said:

It'll be fine at 1440p and even 4k at realistic settings.

So you're fine paying a $200-$500 Nvidia tax for 4K gaming?

 

Or a $600-$800 tax for 1440p gaming?

1 hour ago, porina said:

There's a big difference between BOM cost and finished goods. Adding say $10 of material to a product will make the finished product go up by a lot more than that. This applies in general, not just limited to PC tech.

 

As a very simplified example:

Say you have a product that is $10 of "stuff". Material only.

You turn that "stuff" into a product. It costs to do that, and you want a profit on top.

You sell that wholesale to distributors. Distys take their profit.

Distys sell to resellers/stores. They take their cut of profit.

At some point the tax man will take their share too.

 

It multiplies up the chain.

 

The end result still doesn't get you anywhere near the price difference between a 4070Ti and a 4080 for an extra 4GB of VRAM. Nvidia could afford to give 4080 owners 32GB of VRAM for that price and still make a comfortable profit.

1 hour ago, porina said:

On VRAM targets in games, at least some broken releases are getting fixes. TLOU in particular has had reduced VRAM usage in a recent patch, and before that they increased the visual quality of low and medium textures. I haven't looked to see if anyone has done retesting or if references are still pointing to the out-of-date launch tests.

Other games still prove the point. One game later being optimized doesn't negate the pattern shown by many games.

1 hour ago, porina said:

It is a sad fact that so-called AAA releases on PC are often broken on launch, regardless of how much VRAM you have, and it can take quite some time before they patch it into what it should have been.

And many demanding games aren't broken but push the limits of what is possible in game engines. UE5 and Unity are improving to take advantage of what people have now and what they expect gamers to have in the future.

 

If the Infinity Ward dev who was recently on MLID's podcast is anything to go off of, this isn't going away. Call of Duty and other AAA games are gonna need more VRAM. Nvidia can't dictate to developers what they need; it is the job of developers to dictate to Nvidia what they need, and for Nvidia to make that a product that customers can buy for a reasonable price.



3 hours ago, porina said:

As a 4070 owner it doesn't faze me at all because I haven't bought into the VRAM FUD. 12GB is plenty for the upper-mid range for at least the next couple of years. 8GB is still fine for a low-mid GPU.

Same as me with a 4070Ti. I'm only running a 2560x1080 monitor, and due to my bad eyesight I can't see the pixels, so I'm not likely to go higher any time soon.

My Folding Stats - Join the fight against COVID-19 with FOLDING! - If someone has helped you out on the forum don't forget to give them a reaction to say thank you!

 

The only true wisdom is in knowing you know nothing. - Socrates
 

Please put as much effort into your question as you expect me to put into answering it. 

 

  • CPU
    Ryzen 9 5950X
  • Motherboard
    Gigabyte Aorus GA-AX370-GAMING 5
  • RAM
    32GB DDR4 3200
  • GPU
    Inno3D 4070 Ti
  • Case
    Cooler Master - MasterCase H500P
  • Storage
    Western Digital Black 250GB, Seagate BarraCuda 1TB x2
  • PSU
    EVGA Supernova 1000w 
  • Display(s)
    Lenovo L29w-30 29 Inch UltraWide Full HD, BenQ - XL2430(portrait), Dell P2311Hb(portrait)
  • Cooling
    MasterLiquid Lite 240

19 minutes ago, AluminiumTech said:

 

 

AM5 will probably be supported for at least the next 3-4 years if not the next 5 years.

 

Which would equally benefit from having a better CPU.

 

 

And I stuck a GPU in my 8-year-old desktop and saw massive gains in the games I currently play.
 

Though, if future APUs still get PCI-e anyway, I suppose arguing one way or another hardly matters. Get the APU now, run that for 6-7 years or so, then stick a dGPU in there as needed. 
 

One other hurdle is getting enough memory bandwidth to the GPU. iGPUs on PC (using much slower DDR4/5 memory), though quite potent in the shading department, are quite lacking when it comes to bandwidth-heavy effects, such as high quantities of particle effects. We’d have to get GDDR7 sticks, and potentially triple-channel (192-bit) on top, unless you feel like splitting things: allowing standard DDR5 sticks and soldering down the video RAM.

 

Maybe a combination of triple-channel DDR5 and HBM would work well too?
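
To put rough numbers on the bandwidth gap being described here (the transfer rates below are assumed, typical configurations rather than any particular product):

# Peak-bandwidth comparison; transfer rates are assumed typical figures, not product specs
def bandwidth_gbs(bus_width_bits, gtps):
    return bus_width_bits / 8 * gtps

print(bandwidth_gbs(128, 6))     # dual-channel DDR5-6000:          96 GB/s
print(bandwidth_gbs(192, 6))     # hypothetical triple-channel:    144 GB/s
print(bandwidth_gbs(128, 18))    # 128-bit GDDR6 at 18 Gbps:       288 GB/s
print(bandwidth_gbs(1024, 6.4))  # one HBM3 stack at 6.4 Gbps:    ~819 GB/s

Even a fairly wide DDR5 setup leaves an iGPU well short of a modest dGPU, which is why soldered GDDR or HBM keeps coming up as the missing piece.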

