Nvidia VRAM - Deserved vs Given for the Last 3 Generations

YoungBlade

This analysis is based on the performance and VRAM capacity of the GTX 1060 6GB. I think most people would agree that the 6GB on the 1060 was sufficient. The card has aged very well, and in games where the 6GB buffer isn't enough, the GPU generally isn't fast enough to keep up anyway. It also isn't considered a card that had too much VRAM - it seemed to be about the right amount, to the point that folks rarely thought about it. And I think that's where every card should be: VRAM should be enough to be a non-issue, but obviously it's wasteful to give a card so much that it can never use it.

 

With that in mind, I wondered whether Nvidia really has given consumers that much less VRAM per performance level over the last few generations. So I decided to take a look at that.

 

If you're wondering where this data comes from, the numbers here are based on the TechPowerUp relative performance chart. My method was that I selected the GTX 1060 6GB and multiplied the relative performance by its 6GB of VRAM, then I rounded to the nearest GB. Then, to color the chart, I used green where the Deserved/Given ratio was under 1.2 (within 20%), yellow where it was under 1.5 (within 50%), orange where it was under 2 (within 100%), and red where it was 2 or higher.
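To make the method concrete, here's a minimal sketch of the calculation in Python (illustrative only - the relative-performance values are rough placeholders implied by the table below, not exact TechPowerUp figures):

# Minimal sketch of the "deserved VRAM" method described above.
BASELINE_VRAM_GB = 6  # GTX 1060 6GB is the baseline (relative performance = 1.0)

# (card, approximate relative performance vs GTX 1060 6GB, VRAM given in GB)
CARDS = [
    ("GTX 1080", 1.67, 8),
    ("RTX 3070", 2.83, 8),
    ("RTX 4090", 6.67, 24),
]

def deserved_vram_gb(rel_perf):
    # Scale the 1060's 6GB by relative performance and round to the nearest GB
    return round(rel_perf * BASELINE_VRAM_GB)

def color(deserved, given):
    # Bucket each card by its Deserved/Given ratio, matching the chart's coloring
    ratio = deserved / given
    if ratio < 1.2:
        return "green"   # within 20%
    if ratio < 1.5:
        return "yellow"  # within 50%
    if ratio < 2.0:
        return "orange"  # within 100%
    return "red"         # 2x or more

for name, rel_perf, given in CARDS:
    deserved = deserved_vram_gb(rel_perf)
    print(f"{name}: deserved {deserved} GB, given {given} GB -> {color(deserved, given)}")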

 

GTX 10 series       Deserved   Given
GTX 1050            3 GB       2 GB
GTX 1050 Ti         4 GB       4 GB
GTX 1060 3GB        5 GB       3 GB
GTX 1060 6GB        6 GB       6 GB
GTX 1070            8 GB       8 GB
GTX 1070 Ti         9 GB       8 GB
GTX 1080            10 GB      8 GB
GTX 1080 Ti         12 GB      11 GB

GTX 16 series       Deserved   Given
GTX 1650            5 GB       4 GB
GTX 1650 Super      6 GB       4 GB
GTX 1660            7 GB       6 GB
GTX 1660 Ti/Super   8 GB       6 GB

RTX 20 series       Deserved   Given
RTX 2060            10 GB      6 GB
RTX 2060 Super      11 GB      8 GB
RTX 2070            11 GB      8 GB
RTX 2070 Super      12 GB      8 GB
RTX 2080            13 GB      8 GB
RTX 2080 Super      14 GB      8 GB
RTX 2080 Ti         16 GB      11 GB

RTX 30 series       Deserved   Given
RTX 3050            8 GB       8 GB
RTX 3060 12GB       11 GB      12 GB
RTX 3060 Ti         14 GB      8 GB
RTX 3070            17 GB      8 GB
RTX 3070 Ti         18 GB      8 GB
RTX 3080 10GB       22 GB      10 GB
RTX 3080 12GB       22 GB      12 GB
RTX 3080 Ti         24 GB      12 GB
RTX 3090            25 GB      24 GB
RTX 3090 Ti         28 GB      24 GB

RTX 40 series       Deserved   Given
RTX 4070            22 GB      12 GB
RTX 4070 Ti         25 GB      12 GB
RTX 4080            32 GB      16 GB
RTX 4090            40 GB      24 GB

 

Looking at this, you can see that the only cards really lacking in VRAM for the 10 series were the GTX 1050 and the 1060 3GB. This should come as a surprise to no one, and I took that as a sign that this metric wasn't totally off-base. I know that 1060 3GB owners are hurting today, and that any owner of a 2GB card is having a rough time in 2023. I would be curious if any GTX 1080 owners feel that their card is lacking in VRAM - I would guess not, as I have never heard that was a problem.

 

Then you'll see that the 16 series is also fairly reasonable, though not great, which seemed to be in line with the sentiment at the time - the cards had just enough to get by. I'm curious if 16 series owners feel this way today.

 

Now for the problems: you can see that not a single card in the 20 series was given as much VRAM relative to its performance as the 1060 6GB received. The closest cards were the RTX 2060 Super and RTX 2070, which have both aged pretty well. I own an RTX 2060 Super, and I have yet to find a game where the card should be able to run it but a lack of VRAM is the reason it can't. So from my anecdotal evidence, being in the yellow seems alright. However, the RTX 2060 and the RTX 2080 cards seem to be lacking in VRAM. I haven't heard much about this - I'd be curious if any owners of these cards are having problems today. If so, that would help to validate this method. If not, it could show that it doesn't scale out to newer generations.

 

For the 30 series, VRAM seems to be a complete mess. The 3050 and 3060 have enough at the bottom of the stack, and the 3090s have enough at the top. The RTX 3060 12GB is also the only instance where a card was given more VRAM per performance than the GTX 1060 - it raised eyebrows at launch for its high amount of VRAM. Everything in between, though, seems lacking. This is in line with the recent uproar about the 3070 cards not having enough VRAM, and it implies that RTX 3080 and 3080 Ti owners are not far off from having problems as well. It will be interesting to see how these cards age.

 

Finally, for the 40 series, it looks to be a complete disaster. All of the cards look like they are severely lacking in VRAM relative to their performance tiers. Perhaps this method just doesn't work out when you are scaling to nearly a 7x performance increase, but it implies that the RTX 4090 is the best card in terms of VRAM amount, and that even it is off by nearly 70%.

 

From this, it looks like Nvidia needs to seriously rework the amount of VRAM their cards come with for the next generation. The 5000 series should give 24GB+ to every card more powerful than an RTX 4070, and whatever card replaces the RTX 4090's performance level should be given 40GB - or at least 32GB, which would match the GTX 1080's deserved-to-given ratio (10 GB deserved versus 8 GB given). Otherwise, we're likely to see cards age very poorly relative to the previous gens. The 10 series has shown that cards with enough VRAM can have great longevity - plenty of people are still using those cards happily today, and I think the VRAM amounts are a big factor there.

 

But what do you think? Are my methods here totally off-base? Is my conclusion right or wrong? And if you have any of the cards marked in orange or red, are you having any problems today turning up settings due to VRAM when your card would otherwise be fine?

 

EDIT: To clarify something here, I'm talking about how much VRAM a card needs so that the VRAM capacity is not a bottleneck. I fully understand that these cards are able to get their relative performances in spite of their current VRAM capacities, but what I believe is that the ones in orange and red are not able to push themselves to their GPUs' limit before VRAM capacity becomes the limiting factor in some situations.


1 minute ago, YoungBlade said:

But what do you think? Are my methods here totally off-base? Is my conclusion right or wrong?

I don't think they're off base personally, but I think giving 24 gigs of VRAM to each Nvidia card is a bit overkill (probably not by that much, but a boost in VRAM should be expected)

I think Nvidia has been actively making all their products worse by comparison, since they have to in order to push people toward buying the 4090

3 minutes ago, YoungBlade said:

And if you have any of the cards marked in orange or red, are you having any problems today turning up settings due to VRAM when your card would otherwise be fine?

I've got a 3060 and I love it! (the 12 gig version btw, thank god not the 8 gig) It's a great pair for my 5600 and I can easily play competitive games like Apex Legends at 1440p on max, and VRAM is not an issue whatsoever


5 minutes ago, YoungBlade said:

But what do you think? Are my methods here totally off-base? Is my conclusion right or wrong? And if you have any of the cards marked in orange or red, are you having any problems today turning up settings due to VRAM when your card would otherwise be fine?

 

Way off base....

 

30 series:
Deserved: 3050 - 8GB
Deserved: 3060 - 8GB
Deserved: 3060 Ti - 12GB
Deserved: 3070 - 12GB
Deserved: 3070 Ti - 12GB
Deserved: 3080 - 16GB
Deserved: 3080 - 16GB
Deserved: 3080 Ti - 20GB
Deserved: 3090 - 24GB

40 series:
Deserved: 4090 - 24GB
Deserved: 4080 - 20GB
Deserved: 4070 Ti - 16GB
Deserved: 4070 - 16GB

 

 


Just now, ShawtyT30beTHICCC said:

 

Way off base....

 

30 series:
Deserved: 3050 - 8GB
Deserved: 3060 - 8GB
Deserved: 3060 Ti - 12GB
Deserved: 3070 - 12GB
Deserved: 3070 Ti - 12GB
Deserved: 3080 - 16GB
Deserved: 3080 - 16GB
Deserved: 3080 Ti - 20GB
Deserved: 3090 - 24GB

40 series:
Deserved: 4090 - 24GB
Deserved: 4080 - 20GB
Deserved: 4070 Ti - 16GB
Deserved: 4070 - 16GB

 

 

Where are you getting those numbers from? Or is this just your gut feeling?


Just now, YoungBlade said:

Where are you getting those numbers from? Or is this just your gut feeling?

Just my opinion based on team red and game requirements in 2023


Random thoughts:

 

What would happen if you gave AMD the same treatment? Polaris (RX480 etc.) released about same time as 1060, so team red GPUs since then would include Polaris, Vega, and RDNA generations. Would you scale them to the 1060 6GB still? Or is another reference applicable?

 

I don't think VRAM size is a linearly scaling factor. You have enough or not enough. Excess is wasteful. We should be looking at games. What do they really need for a given resolution/setting? Monitoring displayed usage is not necessarily the correct number to look at.

 

I just realised at some point I owned every nvidia 70 tier GPU since Maxwell. 970, 1070, 2070, 3070, 4070. I'll focus on the 3070 for now, since it seems to be the most commonly called out GPU when VRAM discussions kick off. I got it soon after release, and have had over 2 years of use and only just got replaced by the 4070. Most of that time was connected to a 4k TV. I did not run into VRAM limitations in any game during that time. It needed a bit more grunt in general for 4k, but worked fine otherwise. The only piece of software that didn't run well on it was Nvidia's RTX Marbles demo, which seemed to be tuned for more VRAM as it was fine on my 2080 Ti. Still too early to talk about 4070. Ask again in a couple years.


14 minutes ago, YoungBlade said:

But what do you think? Are my methods here totally off-base? Is my conclusion right or wrong? And if you have any of the cards marked in orange or red, are you having any problems today turning up settings due to VRAM when your card would otherwise be fine?

You forgot that one of the main segmentation points between a GeForce product and a professional one is exactly the amount of VRAM. The 3090 already displaced many of the entry-level workstation/server GPUs, and giving a 4090 40GB of VRAM would put it way too close to an RTX 6000 Ada with its 48GB of GDDR6 (non-X!) memory. 48GB is also the maximum amount of VRAM you can get on a non-HBM Nvidia GPU at the moment.

 

Also, GDDR6X is both expensive and generates more heat, so cramming in many of those chips would be quite a challenge by itself.


At least the GTX 970 is not being looked at here as it was given 3.5GB of VRAM while it deserved 4GB. 

 

 

/s


8 minutes ago, porina said:

Random thoughts:

 

What would happen if you gave AMD the same treatment? Polaris (RX480 etc.) released about same time as 1060, so team red GPUs since then would include Polaris, Vega, and RDNA generations. Would you scale them to the 1060 6GB still? Or is another reference applicable?

 

I don't think VRAM size is a linearly scaling factor. You have enough or not enough. Excess is wasteful. We should be looking at games. What do they really need for a given resolution/setting? Monitoring displayed usage is not necessarily the correct number to look at.

 

I just realised at some point I owned every nvidia 70 tier GPU since Maxwell. 970, 1070, 2070, 3070, 4070. I'll focus on the 3070 for now, since it seems to be the most commonly called out GPU when VRAM discussions kick off. I got it soon after release, and have had over 2 years of use and only just got replaced by the 4070. Most of that time was connected to a 4k TV. I did not run into VRAM limitations in any game during that time. It needed a bit more grunt in general for 4k, but worked fine otherwise. The only piece of software that didn't run well on it was Nvidia's RTX Marbles demo, which seemed to be tuned for more VRAM as it was fine on my 2080 Ti. Still too early to talk about 4070. Ask again in a couple years.

OoOOOOO I'm not the only one on the forum who has a FE 4070. How do you like it @porina?


29 minutes ago, YoungBlade said:

My method was that I selected the GTX 1060 6GB and multiplied the relative performance by its 6GB of VRAM, then I rounded to the nearest GB

 

29 minutes ago, YoungBlade said:

But what do you think? Are my methods here totally off-base?

You went "this card performs twice as well so it should have twice as much VRAM". Your method is flawed because it doesn't acknowledge that those cards performed relatively better with the VRAM amount they had. If they were able to get twice the relative  performance with only 60% more VRAM then if anything that proves more VRAM wasn't necessary.

 

You're looking at the cards that are performing well despite having less VRAM relative to their performance and saying they don't have enough. Logic is flawed.

 

 

Having more VRAM than necessary isn't going to improve performance. The better performing cards also use much faster memory than the 1060. Capacity doesn't tell the full story. 

Adding a bunch of unnecessary VRAM to a card is just going to increase costs. Graphics cards are already expensive enough.


Just now, ShawtyT30beTHICCC said:

OoOOOOO I'm not the only one on the forum who has a FE 4070. How do you like it @porina?

So far so good. The main reason I wanted it was to get the 3080 I couldn't buy 2 years ago. It's cheaper than the 3080 was at launch and than it sells for new now. Haven't put heavier games on it yet, but I gave it a quick spin on Witcher 3 RT and Portal RTX, and both run much better than the outgoing 3070. It's the difference between "it's amazing that it runs at all" and "it's amazing!".


1 minute ago, porina said:

So far so good. The main reason I wanted it was to get the 3080 I couldn't buy 2 years ago. It's cheaper than the 3080 was at launch and than it sells for new now. Haven't put heavier games on it yet, but I gave it a quick spin on Witcher 3 RT and Portal RTX, and both run much better than the outgoing 3070. It's the difference between "it's amazing that it runs at all" and "it's amazing!".

I just like the form factor, build quality, looks, the additional VRAM vs. the outgoing 3060 Ti, the temps under load, etc. I just wish it wasn't so freaking gimped. It should be like 20% faster in theory if Nvidia hadn't shifted everything down a full tier.


7 minutes ago, Spotty said:

 

You went "this card performs twice as well so it should have twice as much VRAM". Your method is flawed because it doesn't acknowledge that those cards performed relatively better with the VRAM amount they had. If they were able to get twice the relative  performance with only 60% more VRAM then if anything that proves more VRAM wasn't necessary.

 

You're looking at the cards that are performing well despite having less VRAM relative to their performance and saying they don't have enough. Logic is flawed.

 

 

Having more VRAM than necessary isn't going to improve performance. The better performing cards also use much faster memory than the 1060. 

Adding a bunch of unnecessary VRAM to a card is just going to increase costs. Graphics cards are already expensive enough.

I don't think that's right either, because the GTX 1060 3GB clearly did not have enough VRAM, and yet its relative performance was only 12% off the GTX 1060 6GB - and that's only because they cut down the core. By that logic, we could have given the GTX 1060 6GB just 3GB of VRAM without cutting down the GPU itself and it would have been fine - but that's clearly not the case. And the RX 580 4GB performs identically to the RX 580 8GB, because the GPU used is identical. By this logic, the 8GB version is a waste, but that's also not true.

 

I'm not talking about how much VRAM a card needs to perform well in general, but how much VRAM a card needs to be able to push settings as much as possible before the GPU itself starts to become the limiting factor. In other words, how much VRAM you need to avoid the VRAM capacity being a bottleneck.

 

Whether or not a card can perform well with a given amount of VRAM is a different discussion.


List makes 0 sense. 8GB should be standard entry, 12-16GB for midrange, 20-24GB for high end cards, no real need for more VRAM.

 

Basically what Shawty said:

22 minutes ago, ShawtyT30beTHICCC said:

30 series:
Deserved: 3050 - 8GB
Deserved: 3060 - 8GB
Deserved: 3060 Ti - 12GB
Deserved: 3070 - 12GB
Deserved: 3070 Ti - 12GB
Deserved: 3080 - 16GB
Deserved: 3080 - 16GB
Deserved: 3080 Ti - 20GB
Deserved: 3090 - 24GB

40 series:
Deserved: 4090 - 24GB
Deserved: 4080 - 20GB
Deserved: 4070 Ti - 16GB
Deserved: 4070 - 16GB

 


21 minutes ago, porina said:

Random thoughts:

 

What would happen if you gave AMD the same treatment? Polaris (RX480 etc.) released about same time as 1060, so team red GPUs since then would include Polaris, Vega, and RDNA generations. Would you scale them to the 1060 6GB still? Or is another reference applicable?

I suppose I could use the GTX 1060 6GB for an AMD version. I feel like that's not quite right, because I know that AMD cards scale differently than Nvidia ones, but I'm also not sure what AMD card I would use. The RX 480 had both 4GB and 8GB versions. At the time of release, the 8GB was seen as wasteful, but today it is seen as the much better option. And in head-to-head videos between the RX 580 and GTX 1060, I generally see reviewers writing off the extra 2GB of VRAM on the RX 580 as unnecessary, whereas I don't think I've ever heard anyone say that the 6GB on the GTX 1060 was excessive. That's why I picked it.

21 minutes ago, porina said:

I don't think VRAM size is a linearly scaling factor. You have enough or not enough. Excess is wasteful. We should be looking at games. What do they really need for a given resolution/setting? Monitoring displayed usage is not necessarily the correct number to look at.

I'm talking about the phenomenon of VRAM capacity becoming the bottleneck. In my opinion, that should basically never happen. The RTX 3070 Ti is more than capable of running games like Resident Evil 4 Remake maxed out at 1080p - all the settings cranked to the max, including RT - and yet the game crashes due to lack of VRAM. That, to me, is a fail. You shouldn't be able to get a playable framerate with a GPU, only to have that stopped by a lack of VRAM.

 

That is wasteful, too, just in a different direction. It means that the GPU attached to the VRAM is too powerful and could have been cut down more for that particular game.

25 minutes ago, porina said:

I just realised at some point I owned every nvidia 70 tier GPU since Maxwell. 970, 1070, 2070, 3070, 4070. I'll focus on the 3070 for now, since it seems to be the most commonly called out GPU when VRAM discussions kick off. I got it soon after release, and have had over 2 years of use and only just got replaced by the 4070. Most of that time was connected to a 4k TV. I did not run into VRAM limitations in any game during that time. It needed a bit more grunt in general for 4k, but worked fine otherwise. The only piece of software that didn't run well on it was Nvidia's RTX Marbles demo, which seemed to be tuned for more VRAM as it was fine on my 2080 Ti. Still too early to talk about 4070. Ask again in a couple years.

So you did have a situation where, as a result of the 8GB VRAM buffer, the card failed to do something that it could do otherwise if it just had more VRAM.

 

That's the problem I'm talking about.


3 minutes ago, Zando_ said:

List makes 0 sense. 8GB should be standard entry, 12-16GB for midrange, 20-24GB for high end cards, no real need for more VRAM.

What are you basing that on? Where are you getting those numbers?


44 minutes ago, YoungBlade said:

Looking at this, you can see that the only cards really lacking in VRAM for the 10 series were the GTX 1050 and the 1060 3GB. This should come as a surprise to no one, and I took that as a sign that this metric wasn't totally off-base. I know that 1060 3GB owners are hurting today, and that any owner of a 2GB card is having a rough time in 2023.

I'm actually not having that rough a time on the 1050, tbh, but I'm also not playing the very latest AAA titles either.

 

The card is still perfectly sufficient for running esports titles at high fps and high quality, and older AAA games - or "relatively" new ones - are still runnable if they're well optimized.

 

Another note: back in the day, the 1050 was a low-end GPU, being a 50-class card, though it was still Pascal, so it packed a decent punch regardless. And while more memory would have been better - especially considering there was a decent AMD alternative at the same price point - it may or may not have cost more, which would also have meant some people wouldn't buy it.

 

My 1050 is ~6+ years old and still kicking almost like it's brand new.


16 minutes ago, YoungBlade said:

What are you basing that on? Where are you getting those numbers?

Gaming. You don't need much more VRAM than that for the resolutions cards in that range can run. I play at 4K, 10-12GB VRAM or a little more is what I commonly see used. But I don't know that it's needed as my 8GB VRAM card handles the same titles fine. Even with DLSS which IIRC increases VRAM usage. Regardless, some of the latest titles do actually need the 10-12GB or more, and likely more will in the future, thus midrange cards should have 12 on the lower end and 16 on the higher end. High end cards and halo products also lean into prosumer workloads so more VRAM makes sense there. But as others noted, not so much that it chokes out actual workstation/enterprise products as companies are not eager to cannibalize their own sales. 

 

For reference on what GPUs I have used + what VRAM they had (and in no particular order) I've had/have:

  • 660 2GB
  • 1050 2GB
  • 1050 Ti 4GB
  • RX 570 4GB
  • RX 480 8GB
  • 980 Ti 6GB
  • 1660 Ti 6GB
  • 1080 8GB
  • 1080 Ti 11GB
  • Vega Frontier Edition 16GB
  • Vega 56 8GB
  • Radeon VII 16GB
  • 2060 Super 8GB
  • ARC A770 LE 16GB

and some assorted older cards, but I haven't really run games on those. Some of these cards I had multiple of, and I think I may have forgotten one or two.... oh yeah, I currently have a set of GTX 780 3GB cards as well.

 

Share on other sites

3 minutes ago, Zando_ said:

Gaming. You don't need much more VRAM than that for the resolutions cards in that range can run. I play at 4K, 10-12GB VRAM or a little more is what I commonly see used. But I don't know that it's needed as my 8GB VRAM card handles the same titles fine. Even with DLSS which IIRC increases VRAM usage. Regardless, some of the latest titles do actually need the 10-12GB or more, and likely more will in the future, thus midrange cards should have 12 on the lower end and 16 on the higher end. High end cards and halo products also lean into prosumer workloads so more VRAM makes sense there. But as others noted, not so much that it chokes out actual workstation/enterprise products as companies are not eager to cannibalize their own sales. 

 

For reference on what GPUs I have used + what VRAM they had (and in no particular order) I've had/have:

  • 660 2GB
  • 1050 2GB
  • 1050 Ti 4GB
  • RX 570 4GB
  • RX 480 8GB
  • 980 Ti 6GB
  • 1660 Ti 6GB
  • 1080 8GB
  • 1080 Ti 11GB
  • Vega Frontier Edition 16GB
  • Vega 56 8GB
  • Radeon VII 16GB
  • 2060 Super 8GB
  • ARC A770 LE 16GB

and some assorted older cards, but I haven't really run games on those. Some of these cards I had multiple of, and I think I may have forgotten one or two.... oh yeah, I currently have a set of GTX 780 3GB cards as well.

 

Looking at those cards, the only ones in the orange range that I can find would be the GTX 660 2GB and GTX 780 3GB, as they should have had 3GB and 5GB respectively by this metric.

 

Did you ever find situations where the GTX 660 or GTX 780 seemed like it should be able to run something with higher settings, but it failed to do so due to its VRAM capacity?


24 minutes ago, YoungBlade said:

What are you basing that on? Where are you getting those numbers?

Common sense and how memory works is what I'd say. Adjusting settings is pretty much all that's needed for most of these games. VRAM is less of an issue than core performance. I've said this for years, and there are only a few cards that don't follow the pattern: the actual performance of the card becomes an issue MUCH sooner than the VRAM total. The four cards I could see this not being true for are the 3070, 3070 Ti, 3080 10GB, and maybe the 2080? Every other card performs just fine in its category; VRAM isn't the issue. Most of the older cards were simply using what VRAM was available, and memory chips are generally 15-30% of the cost of the card depending on the type and amount used. The 3090's 24GB alone was around $300-350 in memory chips at launch - who are they gonna put that price on? It's always going to be the consumer; Nvidia isn't eating that cost. So sure, they could have done it, but you would have easily paid MUCH more for it, sadly.

 

 

Older cards have issues running newer titles just a few years down the line - yeah, that's normal. Generally, again, it's not a VRAM issue, even if your 780 had 6GB or 8GB, for the VAST majority of games from when it was made plus 2-3 years. Did they skimp sometimes? Of course - Nvidia's favorite thing to do is the bare minimum with memory amounts until it becomes an issue, then a decent jump in performance with a LOT of VRAM thrown at it, and then a big upcharge.

 

 

Playing at max settings on ANY GPU is pretty fucking stupid. Yes, I understand the "I WANT EYECANDY MAX" people, but honestly it does not add much. Simply adjusting some settings does help.

 

If I was able to make a GPU, yes, I would 100% make the baseline minimum card 12GB at this point given the performance uplifts each generation gets, since that would only be 6x 2GB chips, which should be around $90. But that also comes down to memory layout, so the amounts can only fit in certain totals - although they can make it whatever they want, as they've shown flexibility there as long as it's designed for well beforehand.
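As a rough sketch of that chip-count math (the 2GB-per-chip density and per-chip prices here are ballpark assumptions from this post, not confirmed BOM figures):

# Rough sketch of the memory-cost arithmetic above; all prices are assumptions.
def memory_cost(total_gb, gb_per_chip, price_per_chip):
    # Number of chips needed for the capacity, and their combined cost
    chips = total_gb // gb_per_chip
    return chips, chips * price_per_chip

# 24GB board at ~$27.50 per assumed 2GB chip lands in the quoted $300-350 range
print(memory_cost(24, 2, 27.50))  # -> (12, 330.0)

# Hypothetical 12GB baseline card: 6x 2GB chips at ~$15 each is roughly $90
print(memory_cost(12, 2, 15.00))  # -> (6, 90.0)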


12 minutes ago, YoungBlade said:

You shouldn't be able to get a playable framerate with a GPU, only to have that stopped by a lack of VRAM.

There's this thing called settings. You can turn them down a bit if needed.

 

This is and will remain a complicated matter going forward. The transfer of assets in and out of VRAM is only going to increase as DirectStorage support arrives.

 

12 minutes ago, YoungBlade said:

So you did have a situation where, as a result of the 8GB VRAM buffer, the card failed to do something that it could do otherwise if it just had more VRAM.

It's a tech demo and not set up the way a game would be - there were no settings at all. In a similar way, I tried running the Blender benchmark on an Arc A380, which has 6GB of VRAM. Problem was, the 2nd subtest requires 8GB and it just crashes.

 

At the end of the day it comes down to how well games are set up. The latest RT update to Cyberpunk 2077 even runs on a 3070, and it has possibly the best-looking in-game visuals to date. It makes the likes of TLOU and Forspoken a joke.


15 minutes ago, porina said:

There's this thing called settings. You can turn them down a bit if needed.

 

 

Settings are dependent on GPU performance, memory bandwidth, and VRAM capacity.

 

With VRAM, the settings you need to turn down to help are generally textures, view distance, and level of detail. In my opinion, those are the most important settings in a game, and I want them as high as possible. The only GPU-heavy settings I think are truly important are shadows, reflections, and anti-aliasing, but by and large the quality level needed on those is not as high for a good experience.

 

I generally want to always have the former settings higher than the latter settings.

 

If I can max out textures, draw distance, and LOD in a game, I'm thrilled. I'm willing to turn down shadows, reflections, and AA to medium or low to allow the GPU to keep up, though I obviously prefer higher if I can.

 

In the games I play, I can always do that. I've yet to find one where my 2060S has to drop textures or draw distance, but I can still crank shadows and reflections to the max at the same time - because I'm not out of VRAM.

 

If I find a game where I have to turn textures way down, but I can run Ultra or even RT shadows with no issues, I'm gonna be upset about that.

 

Maybe the settings I prefer are way outside the norm, but given that texture mods are one of the most common things modded into a game, I don't think so. And VRAM is the single most important thing for running higher textures.


3 minutes ago, YoungBlade said:

Maybe the settings I prefer are way outside the norm, but given that texture mods are one of the most common things modded into a game, I don't think so. And VRAM is the single most important thing for running higher textures.

I'm lazy. I'd rarely tune settings beyond built in presets other than maybe turning off motion blur. While I do mod, all bets are off once you start doing so. It is easy to consume a lot more resource than original once you start going that route.

 

Before this year, the settings juggling needed might be between high and ultra. In most cases you don't notice the difference between them unless you're comparing screenshots side by side, which is irrelevant in actual gameplay. Only this year, for an 8GB GPU, may we be moving into medium territory. I recall the Forspoken devs said post-launch that 12GB was recommended for high. I can't say I noticed much difference in Forspoken at medium or high on a 3070, for better or worse. TLOU can only run on 8GB with medium textures, which in their implementation look awful. On the flip side, Cyberpunk 2077's latest RT update will give 1440p30+ with DLSS on a 3070 - VRAM is not the limiting factor there. CP2077 was a mess at launch too, and at the time it was hard to imagine it becoming what we have now. The TLOU launch is speculated to have been rushed to tie in with the TV series. It's a sad fact that with PC games, many devs/publishers are unable to deliver a great experience on day 1. It is uncommon for them to say it isn't ready and more time is needed. It does happen, it can happen - maybe just not enough. At least that is more realistic than a major game being ready on time.


9 minutes ago, porina said:

I'm lazy. I'd rarely tune settings beyond built in presets other than maybe turning off motion blur. While I do mod, all bets are off once you start doing so. It is easy to consume a lot more resource than original once you start going that route.

 

Before this year, the settings juggling needed might be between high and ultra. In most cases you don't notice the difference between them unless you're comparing screenshots side by side, which is irrelevant in actual gameplay. Only this year, for an 8GB GPU, may we be moving into medium territory. I recall the Forspoken devs said post-launch that 12GB was recommended for high. I can't say I noticed much difference in Forspoken at medium or high on a 3070, for better or worse. TLOU can only run on 8GB with medium textures, which in their implementation look awful. On the flip side, Cyberpunk 2077's latest RT update will give 1440p30+ with DLSS on a 3070 - VRAM is not the limiting factor there. CP2077 was a mess at launch too, and at the time it was hard to imagine it becoming what we have now. The TLOU launch is speculated to have been rushed to tie in with the TV series. It's a sad fact that with PC games, many devs/publishers are unable to deliver a great experience on day 1. It is uncommon for them to say it isn't ready and more time is needed. It does happen, it can happen - maybe just not enough. At least that is more realistic than a major game being ready on time.

I don't actually have any mods for games installed myself, I just know that texture mods are popular, so that means that a lot of people like turning up textures as much as possible, to the point that the ones that come with the game evidently aren't sufficient for them.

 

I suppose a very simple test is this: if your GPU can run the game at High or Ultra when you turn textures down to Low, but can't when you turn textures back up to High or Ultra, then it doesn't have enough VRAM in my opinion. So the 3070 has failed, because it can't run some newer games unless you turn down textures - which, to me, is arguably the single most important setting, but one almost entirely dependent on VRAM capacity.
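A tiny sketch of that test as a yes/no check (the inputs are hypothetical observations you'd make by experiment; nothing here actually measures the GPU):

# Sketch of the "simple test" above; purely illustrative.
def vram_is_the_bottleneck(runs_well_with_low_textures, runs_well_with_high_textures):
    # If everything else is fine at High/Ultra and only raising textures breaks it,
    # the limit is VRAM capacity rather than the GPU core.
    return runs_well_with_low_textures and not runs_well_with_high_textures

# e.g. the 3070 scenario described above in some 2023 releases:
print(vram_is_the_bottleneck(True, False))  # -> True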


2 minutes ago, YoungBlade said:

I don't actually have any mods for games installed myself, I just know that texture mods are popular, so that means that a lot of people like turning up textures as much as possible, to the point that the ones that come with the game evidently aren't sufficient for them.

I'm vaguely aware that can exist, but it may be more for older games developed before higher resolution displays were more common.

 

The mods I use add more items into games so things can get scary fast.

 

2 minutes ago, YoungBlade said:

I suppose a very simple test is this: if your GPU can run the game at High or Ultra when you turn textures down to Low, but can't when you turn textures back up to High or Ultra, then it doesn't have enough VRAM in my opinion. So the 3070 has failed, because it can't run some newer games unless you turn down textures - which, to me, is arguably the single most important setting, but one almost entirely dependent on VRAM capacity.

I get what you're saying, but it isn't so all or nothing. If you're a "max settings" type person, you're probably wanting something higher than 70 tier regardless. Once you're on a GPU that isn't guaranteeing top-tier performance, you're making a tradeoff between image quality and performance. And again, the 3070 is over 2 years old at this point. Only now are some games really pushing the 8GB limit, and they're generally not the best technical games. I'm not playing the "badly optimised" card here, and I agree that at some point top-tier gaming will need more than 8GB. Done well, 8GB shouldn't mean a bad gaming experience, especially as it is the biggest chunk in the Steam Hardware Survey at 32% overall. GPUs above 8GB only make up around 20% together.

