RX 7600 and RTX 4060Ti specs and release dates leaked | 8GB and 16GB cards coming for both

6 minutes ago, AluminiumTech said:

This is acceptable for budget and low end GPUs that cost $200 to $250. Not for GPUs that cost 2-3x as much.

If you want the best you have to get the 4090. If you get anything below that, it is a balance of multiple factors, which does include cost and performance in part.

 

6 minutes ago, AluminiumTech said:

How? The 7900XT is objectively better than the 4070Ti; it can actually run games with RT better than Nvidia's card.

 

You don't think $799 (a price that you can easily buy a 7900XT for) for almost 4080 perf is better than paying $1200 for 4080 perf?

I don't know how many times I have to repeat this before you notice. Raster performance is not the only measure! I picked the 4070 as the best balance of my needs. AMD's offering would have cost more. If I'm going to exceed my needs, why stop there? I did half debate getting a 4090 "because I can" not "because I should". I didn't have a real need for it.

 

6 minutes ago, AluminiumTech said:

The end result still doesn't give you the difference between a 4070Ti and a 4080 for an extra 4GB of VRAM. Nvidia could afford to give 4080 owners 32GB of VRAM for that price and still make a comfortable profit.

There's a lot more difference between those two GPUs than 4GB of VRAM.

 

6 minutes ago, AluminiumTech said:

And many demanding games aren't broken; they push the limit of what is possible in game engines. UE5 and Unity are improving to take advantage of what people have and what they expect gamers to have in the future.

IMO CP2077 RT Overdrive is the best looking gaming graphics ever. It runs fine at 8GB. 

 

6 minutes ago, AluminiumTech said:

If the Infinity Ward dev on MLID's podcast recently is anything to go off of, this isn't going away. Call of Duty and other AAA games are gonna need more VRAM. Nvidia can't dictate to developers what they need; it is the job of developers to dictate to Nvidia what they need, and for Nvidia to make that a product that customers can buy for a reasonable price.

If you're saying that games will trend towards needing more VRAM over time, I'd agree on that part. I want gaming to be the best it can be, not limited to PC space. But within PC space we have reasonable expectations of scaling. If a dev wants to make an insane texture pack that needs 48GB of VRAM, they're welcome to. But outside of tech demos, most gamers are still on 8GB GPUs, and that baseline isn't going to drop any time soon. What quality you get for how much VRAM above that is an unknown, but it isn't going to stretch that far.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


7 minutes ago, Zodiark1593 said:

And I stuck a GPU in my 8 year old desktop, and saw massive gains in the games I currently play. 
 

Though, if future APUs still get PCI-e anyway, I suppose arguing one way or another hardly matters. Get the APU now, run that for 6-7 years or so, then stick a dGPU in there as needed. 
 

One other hurdle is getting enough memory bandwidth to the GPU. iGPUs on PC (using much slower DDR4/5 memory), though quite potent in the shading department, are lacking when it comes to bandwidth-heavy effects, such as high quantities of particle effects. We'd have to get GDDR7 sticks, and potentially triple channel (192-bit) on top, unless you feel like splitting things: keeping standard DDR5 sticks and soldering down the video RAM.

 

Maybe a combination of triple-channel DDR5 and HBM would work well too?

Another option is just throwing a stack of HBM on the package when it comes to high-power APUs.
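For a rough sense of that gap, here's a back-of-the-envelope bandwidth comparison. The configurations and data rates are illustrative assumptions for this sketch, not figures from anyone in the thread:

```python
# Peak theoretical bandwidth (GB/s) = bus width (bits) / 8 * data rate (GT/s).
# All configurations and rates below are assumed, round-number examples.

def bandwidth_gbs(bus_bits: int, gtps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * gtps

configs = {
    "Dual-channel DDR5-6000 (128-bit, iGPU)": bandwidth_gbs(128, 6.0),   # ~96 GB/s
    "Triple-channel DDR5-6000 (192-bit)":     bandwidth_gbs(192, 6.0),   # ~144 GB/s
    "128-bit GDDR6 @ 18 GT/s (budget dGPU)":  bandwidth_gbs(128, 18.0),  # ~288 GB/s
    "One HBM2e stack (1024-bit @ 3.2 GT/s)":  bandwidth_gbs(1024, 3.2),  # ~410 GB/s
}

for name, bw in configs.items():
    print(f"{name}: {bw:.0f} GB/s")
```

Even triple-channel DDR5 lands well short of a small GDDR6 bus, which is why the on-package HBM idea keeps coming up for high-power APUs.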

My Folding Stats - Join the fight against COVID-19 with FOLDING! - If someone has helped you out on the forum don't forget to give them a reaction to say thank you!

 

The only true wisdom is in knowing you know nothing. - Socrates
 

Please put as much effort into your question as you expect me to put into answering it. 

 

  • CPU
    Ryzen 9 5950X
  • Motherboard
    Gigabyte Aorus GA-AX370-GAMING 5
  • RAM
    32GB DDR4 3200
  • GPU
    Inno3D 4070 Ti
  • Case
    Cooler Master - MasterCase H500P
  • Storage
    Western Digital Black 250GB, Seagate BarraCuda 1TB x2
  • PSU
    EVGA Supernova 1000w 
  • Display(s)
    Lenovo L29w-30 29 Inch UltraWide Full HD, BenQ - XL2430(portrait), Dell P2311Hb(portrait)
  • Cooling
    MasterLiquid Lite 240

16 minutes ago, porina said:

If you want the best you have to get the 4090. If you get anything below that, it is a balance of multiple factors, which does include cost and performance in part.

 

I don't know how many times I have to repeat this before you notice. Raster performance is not the only measure!

It's not, but it's the most important measure.

 

In RT performance, below a 4080 or 4090 AMD does quite well.

16 minutes ago, porina said:

I picked the 4070 as the best balance of my needs. AMD's offering would have cost more.

AMD's offering (the RX 7700) doesn't exist yet. It's expected to cost less than Nvidia's 4070 offering.

16 minutes ago, porina said:

If I'm going to exceed my needs, why stop there? I did half debate getting a 4090 "because I can" not "because I should". I didn't have a real need for it.

 

There's a lot more difference between those two GPUs than 4GB of VRAM.

Not really imo.

16 minutes ago, porina said:

IMO CP2077 RT Overdrive is the best looking gaming graphics ever. It runs fine at 8GB. 

 

If you're saying that games will trend towards needing more VRAM over time, I'd agree on that part. I want gaming to be the best it can be, not limited to PC space. But within PC space we have reasonable expectations of scaling. If a dev wants to make an insane texture pack that needs 48GB of VRAM, they're welcome to. But outside of tech demos, most gamers are still on 8GB GPUs, and that baseline isn't going to drop any time soon.

Yeah but that baseline is for 1080p and may eventually be for 720p. Not for 1440p or 4K. The 1440p 8GB train left the station in 2019. We're in the 1080p 8GB, 1440p 12GB, and 4K 16GB era.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023)


5 minutes ago, AluminiumTech said:

In RT performance, below a 4080 or 4090 AMD does quite well.

Which is why I didn't consider any RDNA2 GPU at all. The 7900 is the only offering that is at all acceptable, and if you're going to pick edge cases, there are some where AMD's RT implementation doesn't work well too.

 

5 minutes ago, AluminiumTech said:

AMD's offering (the RX 7700) doesn't exist yet. It's expected to cost less than Nvidia's 4070 offering.

How do I consider a product that doesn't exist?

 

5 minutes ago, AluminiumTech said:

The 1440p 8GB train left the station in 2019. We're in the 1080p 8GB, 1440p 12GB, and 4K 16GB era.

Not even close. My main GPU was a 3070 since 2019 and there wasn't a single game that I couldn't run at high+ 1440p 60+. Maybe there were one or two out there, but statistically insignificant. I even used it at 4K for quite a time, and there it was GPU limited, not VRAM limited. We are only just starting to see a handful of games need more than 8GB for higher settings.


40 minutes ago, porina said:

Which is why I didn't consider any RDNA2 GPU at all. The 7900 is the only offering that is at all acceptable, and if you're going to pick edge cases, there are some where AMD's RT implementation doesn't work well too.

 

How do I consider a product that doesn't exist?

By considering the relative performance compared to other cards by AMD or Nvidia.

40 minutes ago, porina said:

Not even close. My main GPU was a 3070 since 2019 and there wasn't a single game that I couldn't run at high+ 1440p 60+.

Right now it can't without using Low or Medium quality textures. That's a fact.

40 minutes ago, porina said:

Maybe there were one or two out there, but statistically insignificant. I even used it at 4K for quite a time, and there it was GPU limited, not VRAM limited.

If you remove the GPU limitation the VRAM limitation remains (See 3080 4K performance).

40 minutes ago, porina said:

We are only just starting to see a handful of games need more than 8GB for higher settings. 

I wouldn't call over 20 games a handful.


3 hours ago, AluminiumTech said:

It is if you enjoy spending $600 for a 1440p capable GPU from Nvidia when you could buy a 4K capable GPU from AMD for $600.

That is if you feel like wasting money on a 4K display. 1440p is really the highest resolution that seems logical right now, because frame rates at 4K and higher are appallingly low unless you're using a 4090. 4K is really only for people who want a stunning image and don't care how crisp the gameplay is, have a load of cash to afford premium tech, have bought into the corporate marketing that 4K is the only resolution gamers should care about, or any combination thereof. Oh, and nearly all the 4K and higher monitors I see these days are curved. Curved monitors are the dumbest display gimmick in the world; they drive my eyes nuts.

My Main PC

  • CPU: 13700KF
  • Motherboard: MSI MAG Z790 Tomahawk
  • RAM: 32GB (16GBx2) DDR5-6000MHz TEAMGROUP T-Force Delta
  • GPU: RTX 4070 ASUS Dual
  • Case: RAIDMAX X603
  • Storage: WD SN770 2TB
  • PSU: Corsair RM850X Fully Modular
  • Cooling: DEEPCOOL LS720
  • Display(s): Gigabyte G24F2 & Dell S2318HN/NX
  • Keyboard: Logitech G512 Carbon (GX Blue)
  • Mouse: Logitech G502 Hero
  • Sound: Bose Headphone & Creative SBS260
  • Operating System: Windows 11 Pro

Laptop: Alienware m15 R1

  • OS: Windows 10 Pro
  • CPU: 9750H
  • MB: OEM
  • RAM: 16GB (8GBx2) DDR4 2666Mhz
  • GPU: RTX 2060 (Mobile)

Phone: Galaxy A54

Other: Nintendo Switch


17 minutes ago, AluminiumTech said:

Right now it can't without using Low or Medium quality textures. That's a fact.

In a handful of games released this year, at launch, going to medium may be suggested.

 

17 minutes ago, AluminiumTech said:

I wouldn't call over 20 games a handful.

Got a list? I don't follow every single release but certainly several AAA releases this year have been hit. And is it based on launch quality or does it account for later updates?


1 minute ago, porina said:

In a handful of games released this year, at launch, going to medium may be suggested.

Yeah, it might be "suggested", if you don't want the game to look like it came out in 2003 with low textures.

1 minute ago, porina said:

And is it based on launch quality or does it account for later updates?

Launch. People care about launch. Not many people care to follow up after the game has received numerous updates.

 

Many game reviewers say GTA is an easy-to-run game from 2013. That's not true anymore. GTA Online has significantly higher requirements than GTA did in 2013.

 

And yet not many reviewers create follow-up content because they believe not enough people will watch it.


1 minute ago, AluminiumTech said:

Yeah, it might be "suggested", if you don't want the game to look like it came out in 2003 with low textures.

Using TLOU as an example again, its medium textures did look awful at launch, far worse than other games at similar settings. Again, they apparently updated that in a previous patch, but I've not seen any comparisons of how it looks now. Also, they further reduced VRAM usage, so it may not even be necessary to go down to medium in some cases.

 

1 minute ago, AluminiumTech said:

Launch. People care about launch. Not many people care to followup after the game has received numerous updates.

So, no one should look at CP2077 now because it sucked hard at launch? With the recent RT Overdrive update, it is probably the best looking graphics in a PC game, ever, give or take some subjective interpretation as to what games should look like. It was a rather heavy update, so it probably got more coverage than a routine patch would. That's ok.


8 hours ago, AluminiumTech said:

AMD apparently settled on $279 cos they've seen/heard that they'll get flak if they charged $350 as they originally wanted and intended to do.

 

Also, a lot of people don't seem terribly upset by the idea of a sub $300 8GB card; even though imo they should be.

Why should people be? It's still 10 dollars a gig (not even GDDR6X, just normal GDDR6); a sub-$300 card can't really go around spending over 100 dollars of the BOM on RAM and RAM alone.

The 16GB versions of those cards in the link are waiting a couple of months in the hope that prices will go down some more. (No one wants to make another mistake like the 3060.)
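As a rough sanity check on that argument, here's the arithmetic using the ~$10/GB figure quoted above. Treat the price as an assumption based on retail spot listings, not what AIBs actually pay on contract, and the card prices as hypothetical:

```python
# VRAM cost as a share of a budget card's price, at the thread's ~$10/GB GDDR6 spot price.
# Contract pricing for AMD/Nvidia/AIBs is unknown here, so this is an upper-bound sketch.

PRICE_PER_GB = 10  # USD per GB of GDDR6, assumed from retail spot listings

for capacity_gb, card_price in [(8, 279), (16, 329)]:  # hypothetical budget-card MSRPs
    vram_cost = capacity_gb * PRICE_PER_GB
    share = vram_cost / card_price * 100
    print(f"{capacity_gb}GB card at ${card_price}: ${vram_cost} of VRAM (~{share:.0f}% of MSRP)")
```

At those numbers, 16GB alone would eat roughly half of a sub-$330 card's MSRP before the GPU die, board, or cooler are paid for.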

 

38 minutes ago, porina said:

In a handful of games released this year, at launch, going to medium may be suggested.

 

Got a list? I don't follow every single release but certainly several AAA releases this year have been hit. And is it based on launch quality or does it account for later updates?

They just made up a list to get to 20, and even those 20 can run fine at 8GB, no matter how much they whine about not running at ultra. If a game is made right, ultra at launch shouldn't be runnable on flagships anyway. Ultra is beyond reason: a 5% increase in visual fidelity cuts frames in half.


2 hours ago, AluminiumTech said:

They do though. AM4 was supported for Ryzen 1000 through Ryzen 5000.

 

AM5 will probably be supported for at least the next 3-4 years if not the next 5 years.

AMD already has a Zen 5 laptop APU planned that performs like a 6700XT (which is what the 7600 is targeting performance-wise) for wayyy less power. There's no reason AMD couldn't also release that on desktop. And it wouldn't have VRAM issues because it would use system RAM.

 

OEMs have privately said they will stop putting Nvidia's 50 and 60 class GPUs in laptops if AMD makes these laptop APU designs easy to integrate.

How would GPU upgrades be hampered? By tying them to CPU upgrades?

Which would equally benefit from having a better CPU.

 

You'd be surprised with the number of people buying 4080s for 4K.

 

Buying a 4070Ti for 1440p is a waste of money when a cheap and cheerful GPU for $400 or less can run 1440p.

It might be backfiring on Nvidia cos most of the few people who buy 4080s regret it and return it for an RDNA2 card.

Which would be wasting money. People shouldn't have to spend $600 to play 1440p. In 2020 a supposedly 4K card from Nvidia was $699 (it wasn't actually a real 4K card cos it didn't have enough VRAM, but that's beside the point).

 

Over time tech is meant to become more affordable as it becomes easier and cheaper to produce. Nvidia is instead charging more every generation for no reason other than corporate greed.

 

If you buy a $500 or more GPU you should not be compromising the visual integrity of the game just to get a badly designed GPU to work.

 

This is acceptable for budget and low end GPUs that cost $200 to $250. Not for GPUs that cost 2-3x as much.

How? The 7900XT is objectively better than the 4070Ti; it can actually run games with RT better than Nvidia's card.

 

You don't think $799 (a price that you can easily buy a 7900XT for) for almost 4080 perf is better than paying $1200 for 4080 perf?

 

Or even $1000 7900XTX vs $1200 4080?

So you're fine paying a $200-$500 Nvidia tax for 4K gaming?

 

Or a $600-$800 tax for 1440p gaming?

The end result still doesn't give you the difference between a 4070Ti and a 4080 for an extra 4GB of VRAM. Nvidia could afford to give 4080 owners 32GB of VRAM for that price and still make a comfortable profit.

Other games still prove the point. One game later being optimized doesn't negate the pattern shown by many games.

And many demanding games aren't broken; they push the limit of what is possible in game engines. UE5 and Unity are improving to take advantage of what people have and what they expect gamers to have in the future.

 

If the Infinity Ward dev on MLID's podcast recently is anything to go off of, this isn't going away. Call of Duty and other AAA games are gonna need more VRAM. Nvidia can't dictate to developers what they need; it is the job of developers to dictate to Nvidia what they need, and for Nvidia to make that a product that customers can buy for a reasonable price.

I am not sure what you want me to say. Most people play at 1080p, some play at 1440p, and very few play at 4K. Clearly not many people are interested in playing at 4K and would rather have higher refresh rate gaming at 1440p as a good middle ground. It has little to do with whether their GPU could play 4K games. If it is between 4K 60 and 1440p 144Hz, then most would just play at 1440p because it's a better experience. The only GPU that can realistically do 4K high refresh rate is the 4090, which is why I was saying that most 4K gamers are probably going to go with the 4090. That being said, the 4080 has enough VRAM for 4K anyway, so if they go with that instead it's not really a VRAM issue but a price issue. Sell the 4080 for $1K and it would be more popular. Anyway, it's not like getting a 4070Ti is bad for 1440p. You can get really high fps with that card at 1440p, so not sure why it would be a waste.


10 minutes ago, Brooksie359 said:

I am not sure what you want me to say. Most people play at 1080p, some play at 1440p, and very few play at 4K.

Not anymore. Most new monitors bought for gaming are 1440p

10 minutes ago, Brooksie359 said:

Clearly not many people are interested in playing at 4K and would rather have higher refresh rate gaming at 1440p as a good middle ground. It has little to do with whether their GPU could play 4K games. If it is between 4K 60

4K 144Hz does exist.

10 minutes ago, Brooksie359 said:

and 1440p 144Hz, then most would just play at 1440p because it's a better experience. The only GPU that can realistically do 4K high refresh rate is the 4090, which is why I was saying that most 4K gamers are probably going to go with the 4090. That being said, the 4080 has enough VRAM for 4K anyway, so if they go with that instead it's not really a VRAM issue but a price issue. Sell the 4080 for $1K and it would be more popular. Anyway, it's not like getting a 4070Ti is bad for 1440p.

No, just bad for one's wallet.

10 minutes ago, Brooksie359 said:

You can get really high fps with that card at 1440p, so not sure why it would be a waste.

Cos there are very good 1440p cards for much cheaper.

14 minutes ago, starsmine said:

Why should people be? It's still 10 dollars a gig (not even GDDR6X, just normal GDDR6); a sub-$300 card can't really go around spending over 100 dollars of the BOM on RAM and RAM alone.

 

Source?

 

Citation needed.

14 minutes ago, starsmine said:


The 16GB versions of those cards in the link are waiting a couple of months in the hope that prices will go down some more. (No one wants to make another mistake like the 3060.)

The 3060 had 12GB because Nvidia needed people to overlook the fact that it performed like a 2060 Super.

14 minutes ago, starsmine said:

They just made up a list to get to 20, and even those 20 can run fine at 8GB

Actually I believe Hardware Unboxed tested 27 games.

14 minutes ago, starsmine said:

no matter how much they whine about not running at ultra. If a game is made right, ultra at launch shouldn't be runnable on flagships anyway. Ultra is beyond reason: a 5% increase in visual fidelity cuts frames in half.

Nobody complained about not running at Ultra. People are upset they can't run at High 1440p without the game swapping the textures for low quality on an 8GB $500 card from 2020.


29 minutes ago, AluminiumTech said:

Not anymore. Most new monitors bought for gaming are 1440p

4K 144Hz does exist.

No, just bad for one's wallet.

Cos there are very good 1440p cards for much cheaper.

 

29 minutes ago, AluminiumTech said:

Source?

 

Citation needed.

Why is a citation needed?
https://www.digikey.com/en/products/filter/memory/memory/774?s=N4IgTCBcDaIOIBEECUBsIC6BfIA
It's been linked on these forums dozens if not hundreds of times.
 

29 minutes ago, AluminiumTech said:

The 3060 had 12GB because Nvidia needed people to overlook the fact that it performed like a 2060 Super.

The 3060 had 12GB because Nvidia did not want to use 6GB, or nuke bandwidth and do 8GB making it perform like poo, and bet on prices dropping... and then prices didn't. Prices more than doubled, and we are only now almost back to pre-pandemic pricing.

29 minutes ago, AluminiumTech said:

Actually I believe Hardware Unboxed tested 27 games.

Nobody complained about not running at Ultra. People are upset they can't run at High 1440p without the game swapping the textures for low quality on an 8GB $500 card from 2020.

Did they?
https://www.techspot.com/article/2661-vram-8gb-vs-16gb/
Oh wow, the 6800, which is a faster card in general, beats it consistently, and TLOU got updated with performance patches RIGHT when they put out the data (so I don't fault them for that, but the data isn't valid; games are always being patched, and it's not their job to know when patches will drop). And frankly a 3070 should not be playing games that came out after it did on ultra anyway, because if a 3070 can, the devs left visual fidelity on the table for zero reason.

So the TLOU data is thrown out of their tests, and you only have two games running ultra with ray tracing that are unplayable, A Plague Tale and Callisto Protocol... so you know what you do? You don't run those settings.

(RE4R does not count: high is 0.5GB of textures, their RT test was already using the ultra quality textures at 1GB, and the maxed-out slider is literally 8GB of textures, nothing else, aka an uber ultra that makes the game look 5% better for 8x the resources spent. Not giving you the full quality textures is just something devs do to baby gamers, and then they release those textures as an HD pack 2 years later or as an "HD remaster" or whatever.)


42 minutes ago, starsmine said:

 

Why is a citation needed?
https://www.digikey.com/en/products/filter/memory/memory/774?s=N4IgTCBcDaIOIBEECUBsIC6BfIA
It's been linked on these forums dozens if not hundreds of times.
 

That's not the price AIBs or AMD or Nvidia pay for GDDR6.

42 minutes ago, starsmine said:

 

The 3060 had 12GB because Nvidia did not want to use 6GB, or nuke bandwidth and do 8GB making it perform like poo, and bet on prices dropping... and then prices didn't. Prices more than doubled, and we are only now almost back to pre-pandemic pricing.

Did they?
https://www.techspot.com/article/2661-vram-8gb-vs-16gb/
Oh wow, the 6800, which is a faster card in general, beats it consistently

Not helped by the VRAM issues which make the 3070 slower.

42 minutes ago, starsmine said:

So the TLOU data is thrown out of their tests, and you only have two games running ultra with ray tracing that are unplayable, A Plague Tale and Callisto Protocol... so you know what you do? You don't run those settings.

Or buy a GPU that doesn't require you to compromise?

42 minutes ago, starsmine said:

(RE4R does not count: high is 0.5GB of textures, their RT test was already using the ultra quality textures at 1GB, and the maxed-out slider is literally 8GB of textures, nothing else, aka an uber ultra that makes the game look 5% better for 8x the resources spent. Not giving you the full quality textures is just something devs do to baby gamers, and then they release those textures as an HD pack 2 years later or as an "HD remaster" or whatever.)

If that's not the definition of Ultra Settings then frankly I don't know what is.


9 hours ago, Azurael said:

Personally I feel that game creators need to learn how to cut down on VRAM usage, because it is getting out of hand if someone really needs 16GB or more to play a game. There are a lot of PCs on the market right now that only have 16GB of system RAM, and now we are expected to have that much or more on our GPU too. Even when someone is playing at higher resolutions, this really is over the top from where I stand.

I strongly disagree; you have to draw the line somewhere or games get hobbled by old technology. By your logic, where do we draw the line: 8GB, 4GB, 2GB? The way modern engines work (e.g. UE5) means loading stupidly high resolution assets and converting them down in real-time, and that's simply not going to scale to tiny amounts of VRAM.
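To put rough numbers on why those high-resolution assets balloon VRAM use, here's a sketch of per-texture memory with a full mip chain. The formats and resolutions are illustrative assumptions, not measurements from UE5 or any of the games discussed:

```python
# Approximate GPU memory for one 2D texture, including the ~1/3 overhead of a full mip chain.
# RGBA8 stores 4 bytes per texel; BC7 block compression stores 1 byte per texel.

MIP_CHAIN_FACTOR = 4 / 3  # a full mip pyramid adds roughly one third on top of the base level

def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    """Estimated texture memory in MiB, mip chain included."""
    return width * height * bytes_per_texel * MIP_CHAIN_FACTOR / 2**20

for res in (1024, 2048, 4096, 8192):
    rgba8 = texture_mib(res, res, 4.0)  # uncompressed
    bc7 = texture_mib(res, res, 1.0)    # block-compressed
    print(f"{res}x{res}: RGBA8 ~{rgba8:.0f} MiB, BC7 ~{bc7:.0f} MiB")
```

Each doubling of resolution quadruples the memory per texture, so a scene with a few hundred 4K-source materials has to stream or downscale aggressively to stay inside an 8GB budget.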

 

It's only in the last couple of console generations that they got away without increasing system RAM and VRAM, because the consoles lagged so far behind PC that they could get away with only marginal asset improvements on PC. This generation, consoles have made a huge jump, largely due to storage IO finally becoming decent, so streaming huge assets in and out of memory became possible, leaving mid-range PCs lagging behind. Even DirectStorage is not as efficient as how consoles are doing it, so we need to make up that gap with more VRAM and way more CPU power.

 

The blame here lies squarely on NVIDIA for not increasing VRAM enough on high-end cards, because it would have cut into their huge profit margins on cards used for business work, which benefit more from tons of VRAM than from raw processing power. So they are reluctantly adding more VRAM to lower-end cards, where it wouldn't benefit production work as they are too slow for that (at the professional level, where time is money), and increasing the price of the top-end gaming cards so they don't take as much of a hit from people buying those instead of their higher priced cards.

 

From a business perspective, they didn't really have any choice, as gamers are a tiny percentage of their income, versus AMD where it's a majority.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


4 hours ago, Azurael said:

Oh, and nearly all the 4K and higher monitors I see these days are curved. Curved monitors are the dumbest display gimmick in the world; they drive my eyes nuts.

At least we agree on something, except good 4K TVs are getting smaller which kinda makes larger 4K monitors redundant anyway.

 

Not everyone is into competitive gaming, so it's a 55" OLED 4K 120Hz with a Dolby Atmos surround system for me. Beats playing on a 28" monitor with headphones any day.

 

It boggles my mind that people still play with headphones or crappy stereo sound; it's not nearly as immersive, unless you're way more sweaty about it than I am. Maybe it's my unique ears, but no 3D audio system comes close to the sound stage of my surround system. I can literally tell when something is right behind me; with headphones it's more like somewhere around the top of my head, not behind. It lacks precision.


1 hour ago, Alex Atkin UK said:

I strongly disagree; you have to draw the line somewhere or games get hobbled by old technology. By your logic, where do we draw the line: 8GB, 4GB, 2GB? The way modern engines work (e.g. UE5) means loading stupidly high resolution assets and converting them down in real-time, and that's simply not going to scale to tiny amounts of VRAM.

I would draw the line at 8GB for 1080p, 12GB for 1440p, and 20GB for 4K. I really don't understand why certain games need SO MUCH VRAM, especially 8GB at 1080p. You can't just look at it with the idea that it is limiting creativity if you are restricted from programming something to require large amounts of VRAM. The creators need to remember that PC components cost a lot of money, and the more high-end memory you are forced to buy, the greater the hit to the consumer's wallet.

I am not saying that they have gone off the deep end requiring more VRAM for high resolutions, but they need to keep it reasonable for the average consumer playing at a lower resolution. Especially when Nvidia's pricing scheme is really out of touch with consumers already.


8 hours ago, AluminiumTech said:

Not anymore. Most new monitors bought for gaming are 1440p

4K 144Hz does exist.

No, just bad for one's wallet.

Cos there are very good 1440p cards for much cheaper.

Source?

 

Citation needed.

The 3060 had 12GB because Nvidia needed people to overlook the fact that it performed like a 2060 Super.

Actually I believe Hardware Unboxed tested 27 games.

Nobody complained about not running at Ultra. People are upset they can't run at High 1440p without the game swapping the textures for low quality on an 8GB $500 card from 2020.

Yes, I am very much aware that 4K 144Hz monitors exist, but my point was if you can't get high fps at 4K with decent settings then you might as well go 1440p. Also, like I said, most people use 1080p currently; I never said that 1440p isn't becoming more popular. I mean, if you are going to play at 4K high refresh rate you should get a card capable of doing that, and the 4070Ti is certainly not that. At that point I don't see why Nvidia would market the 4070Ti for 4K gaming when again it's more of a 1440p card. Yes, I know you are going to say there are other GPUs that can run 1440p high refresh rate for cheaper, but the 4070Ti is going to do so better. I mean, if you are trying to hit 240 fps at 1440p in most games because you bought a nice 1440p 240Hz monitor, then it seems like the 4070Ti isn't bad at 800 dollars if that is your budget. Granted, this is assuming you are going to use Nvidia-related features; otherwise AMD has better deals tbh.


3 hours ago, Azurael said:

I would draw the line at 8GB for 1080p, 12GB for 1440p, and 20GB for 4K. I really don't understand why certain games need SO MUCH VRAM, especially 8GB at 1080p. You can't just look at it with the idea that it is limiting creativity if you are restricted from programming something to require large amounts of VRAM. The creators need to remember that PC components cost a lot of money, and the more high-end memory you are forced to buy, the greater the hit to the consumer's wallet.

I am not saying that they have gone off the deep end requiring more VRAM for high resolutions, but they need to keep it reasonable for the average consumer playing at a lower resolution. Especially when Nvidia's pricing scheme is really out of touch with consumers already.

Honestly, they are often more bound by console specs than PC specs. I doubt how much VRAM Nvidia puts in its new GPUs affects game development in any significant manner. Consoles, on the other hand, have fixed hardware, and developers will make the game with those in mind.


5 hours ago, Alex Atkin UK said:

From a business perspective, they didn't really have any choice, as gamers are a tiny percentage of their income, versus AMD where it's a majority.

Nvidia's recent financials:

[Chart: Nvidia revenue by market segment, Q4 FY2023]

https://www.jonpeddie.com/news/nvidia-reports-fourth-quarter-fiscal-year-2023-numbers/

 

Gaming is currently roughly 1/3 of their total business. Gaming was their biggest segment until about a year ago, when DC took over.

 

 

I've been unable to get a clear picture of how AMD does in comparison recently, since their product mix and reporting gets more complicated. Still, DC is bigger than client, which would include Ryzen CPUs we use. They have a gaming category but I think that is basically console silicon, as it is far too big for their tiny GPU share.


1 hour ago, porina said:

Nvidia's recent financials:

[Chart: Nvidia revenue by market segment, Q4 FY2023]

https://www.jonpeddie.com/news/nvidia-reports-fourth-quarter-fiscal-year-2023-numbers/

 

Gaming is currently roughly 1/3 of their total business. Gaming was their biggest segment until about a year ago, when DC took over.

 

 

I've been unable to get a clear picture of how AMD does in comparison recently, since their product mix and reporting gets more complicated. Still, DC is bigger than client, which would include Ryzen CPUs we use. They have a gaming category but I think that is basically console silicon, as it is far too big for their tiny GPU share.

I don't understand how that chart adds up given they say "For fiscal 2023, revenue was $26.97 billion".

 

Although it's clear NVIDIA has a problem: they made record revenue due to the crypto boom, and it's completely unrealistic to expect to get back to those values. Yet legally they have to try, to avoid the wrath of shareholders wanting to know why they didn't.

 

This is unfortunately why it's down to gamers to vote with their wallets. If the evidence is that the market won't bear how much they are charging, they have to reduce prices, but the evidence has to be there first to show the shareholders. That's also assuming they aren't happy to just ditch the gaming market and double down on datacenter instead. We can at least hope that if they do indeed still make 1/3 from gaming, it wouldn't make much sense to give up on.


34 minutes ago, Alex Atkin UK said:

I don't understand how that chart adds up given they say "For fiscal 2023, revenue was $26.97 billion".

I believe Nvidia's financial reporting is labelled one quarter/year ahead, so you essentially take one off (it's named for the current period rather than the period just been). I'm pretty sure it's Nvidia anyway; one of the big tech companies does it super odd compared to the rest.

 

What is shown is current quarter, previous quarter and current quarter last year. They won't and don't add up since it's not showing 4 quarters of one year.


8 hours ago, Alex Atkin UK said:

a business perspective, they didn't really have any choice, as gamers are a tiny percentage of their income, versus AMD where it's a majority.

No. It's a small minority when it comes to AMD.

 

Although AMD makes it hard to tell how much gaming revenue comes from GPUs (since most comes from Consoles), it is not a lot.

 

Most of AMD's money now comes from Epyc, CDNA, and laptop APUs.

 

Ryzen desktop is a pretty small contributor to their revenue as well.

 

AdoredTV did a rough calculation and worked out how much gaming GPUs make for AMD. The key takeaway, besides the financial aspect, is that it's AMD's passion project/side hustle. They're not making a lot of money through it, and nobody but AMD will know if desktop GPU sales are profitable for them.

 

RDNA architecture development is effectively being subsidized both by A) Sony and Microsoft and B) laptop APU (including Steam Deck) customers at this point.

 

AMD needs to have a GPU division to satisfy their console and laptop APU customers, and we, PC gamers, get to reap the benefits of that.


3 minutes ago, AluminiumTech said:

No. It's a small minority when it comes to AMD.

 

Although AMD makes it hard to tell how much gaming revenue comes from GPUs (since most comes from Consoles), it is not a lot.

When I said gaming I was including consoles and other gaming SoCs in general (e.g. Steam Deck). This is a market NVIDIA largely doesn't bother with, the exception being the Nintendo Switch, which, given it's the same SoC used in the Shield TV, I'm not sure even counts.


1 hour ago, Alex Atkin UK said:

I don't understand how that chart adds up given they say "For fiscal 2023, revenue was $26.97 billion".

Last quarter was ballpark $5B in total; x4 quarters is ~$20B. A little shy of $27B, but there are natural variations between quarters, plus the macroeconomic environment means people aren't spending in general recently. That's not just a problem for Nvidia, but also AMD, Intel, Apple, and others outside the tech space. I've seen claims the lull is expected to be worst at the start of this year, so there may be some recovery as we move forwards.

 

16 minutes ago, Alex Atkin UK said:

When I said gaming I was including consoles and other gaming SoCs in general (eg Steam Deck).  This is a market NVIDIA largely doesn't bother with, the exception being the Nintendo Switch as it was effectively a mobile SoC they already developed.

I did a bit more digging and found this:

[Chart: AMD revenue by segment, Q1 2023]

https://www.anandtech.com/show/18845/amd-reports-q1-2023-earnings-back-into-the-red-as-client-sales-crumble

 

Gaming made up roughly 1/3 of AMD's revenue that quarter. It primarily refers to console silicon, but looking at their slides it also includes gaming dGPUs. I don't know if something like the Steam Deck APU would be counted as Client or Gaming, likely depending on whether it is a standard product or customised for Valve, but it is probably small beans anyway.
