
Ideal GPU VRAM Hierarchy (Tier List)

I'm sure everyone here knows about the complaints from RTX 3070 owners that they don't have enough VRAM, and that in general, Nvidia needs to offer more VRAM on its cards, especially the mid-range cards.

 

With that in mind, I figured I would compile a short list of what I believe is the proper amount of VRAM for each Nvidia Card and see what you guys think.

 

Perhaps one day, Nvidia themselves might come across this list and use it for future reference. I know the chances of that are basically none, but you never know...

 

So, onto the list. It's simple: each card is listed with what I would consider a "reasonable" amount of VRAM, enough to give the owner a little overhead without wasting capacity on cards that don't need it.

 

RTX 4050: 8GB

RTX 4050-Ti: 8GB

RTX 4060: 10GB

RTX 4060-Ti: 12GB

RTX 4070: 12GB

RTX 4070-Ti: 16GB

RTX 4080: 16GB

RTX 4080-Ti: 20GB

RTX 4090: 24GB

RTX 4090-Ti: 24GB

 

In general, the following resolutions and settings demand roughly the following amounts of VRAM in more demanding modern games:

 

1080p Medium: 4GB (8GB Card Recommended)

 

1080p High/Ultra: 6-7GB (8GB+ Card Recommended)

 

1440p Medium: 6-7GB (8GB+ Card Recommended)

 

1440p High/Ultra: 8-10GB (10-12GB Card Recommended)

 

4K Medium: 8-10GB (12GB+ Card Recommended)

 

4K High: 11-12GB (16GB Card Recommended)

 

4K Ultra: Up to 14GB (16GB+ Card Recommended)
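
If it helps, here's that guide as a quick Python lookup (using the upper bound of each range; obviously these numbers are just my estimates, not official figures):

```python
# The resolution/settings guide above as a lookup table.
# Values are (typical demand in GB, recommended card capacity in GB),
# taking the upper bound of each range from the list.
VRAM_GUIDE = {
    ("1080p", "medium"):     (4, 8),
    ("1080p", "high/ultra"): (7, 8),
    ("1440p", "medium"):     (7, 8),
    ("1440p", "high/ultra"): (10, 12),
    ("4k", "medium"):        (10, 12),
    ("4k", "high"):          (12, 16),
    ("4k", "ultra"):         (14, 16),
}

def recommend(resolution: str, preset: str) -> str:
    demand, card = VRAM_GUIDE[(resolution.lower(), preset.lower())]
    return f"~{demand}GB typical demand; {card}GB+ card recommended"

print(recommend("1440p", "High/Ultra"))  # ~10GB typical demand; 12GB+ card recommended
```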

 

Reasoning:

 

These days, the RTX 4070-Ti is actually capable of decent 4K gaming and the standard 4070 is capable of 1440p Ultra.

 

This means the RTX 4060 and 4060-Ti will CERTAINLY be capable of some level of 1440p gaming. With 1080p Ultra now using most of the VRAM in a standard 8GB card, I feel the 4060 and Ti deserve a small boost. They don't need much, just a little more.

 

You will also notice that some amounts match what Nvidia already offers. Nvidia might screw up a lot, but not ALL the time. If you ask me, 12GB of VRAM on the non-Ti 4070 is plenty, as it's not really a 4K-capable card the way the 4070-Ti is, because of the massive performance difference between the two (yes, I'm aware the 4070-Ti was originally going to be called the 4080 12GB).

 

I think these levels of VRAM are plenty at each performance level and should last the user many years to come. So what do you think?

 

1440p Ultra gaming definitely needs more than 8GB of VRAM. Even if recent games are poorly optimized, it doesn't change the fact that 8GB of VRAM was introduced as the standard for mid-tier cards almost a decade ago with the GTX 1070. It is certainly time to move on from that number as the standard.

 

The good news is, I'm fairly certain that 12GB will be plenty for most cards and most owners for many years to come. It will take quite a lot more demand than today's to saturate 12GB. It could easily be another decade, maybe two, before we ever need more than 12GB of VRAM.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


25 minutes ago, WallacEngineering said:

I'm sure everyone here knows about the complaints from RTX 3070 owners that they don't have enough VRAM, and that in general, Nvidia needs to offer more VRAM on its cards, especially the mid-range cards.

 

With that in mind, I figured I would compile a short list of what I believe is the proper amount of VRAM for each Nvidia Card and see what you guys think.

 

Perhaps one day, Nvidia themselves might come across this list and use it for future reference. I know the chances of that are basically none, but you never know...

 

So, onto the list. It's simple: each card is listed with what I would consider a "reasonable" amount of VRAM, enough to give the owner a little overhead without wasting capacity on cards that don't need it.

 

RTX 4050: 8GB

RTX 4050-Ti: 8GB

RTX 4060: 10GB

RTX 4060-Ti: 12GB

RTX 4070: 12GB

RTX 4070-Ti: 16GB

RTX 4080: 16GB

RTX 4080-Ti: 20GB

RTX 4090: 24GB

RTX 4090-Ti: 24GB

 

Reasoning:

 

These days, the RTX 4070-Ti is actually capable of decent 4K gaming and the standard 4070 is capable of 1440p Ultra.

 

This means the RTX 4060 and 4060-Ti will CERTAINLY be capable of some level of 1440p gaming. With 1080p Ultra now using most of the VRAM in a standard 8GB card, I feel the 4060 and Ti deserve a small boost. They don't need much, just a little more.

 

You will also notice that some amounts match what Nvidia already offers. Nvidia might screw up a lot, but not ALL the time. If you ask me, 12GB of VRAM on the non-Ti 4070 is plenty, as it's not really a 4K-capable card the way the 4070-Ti is, because of the massive performance difference between the two (yes, I'm aware the 4070-Ti was originally going to be called the 4080 12GB).

 

I think these levels of VRAM are plenty at each performance level and should last the user many years to come. So what do you think?

 

1440p Ultra gaming definitely needs more than 8GB of VRAM. Even if recent games are poorly optimized, it doesn't change the fact that 8GB of VRAM was introduced as the standard for mid-tier cards almost a decade ago with the GTX 1070. It is certainly time to move on from that number as the standard.

 

The good news is, I'm fairly certain that 12GB will be plenty for most cards and most owners for many years to come. It will take quite a lot more demand than today's to saturate 12GB. It could easily be another decade, maybe two, before we ever need more than 12GB of VRAM.

For games that are somewhat competitive, 1440p would be the max that anyone would want. Anything bigger and you'd need a larger screen to appreciate it, but with a bigger screen you'd have a hard time staying aware of what's going on near the edges.

For more scenery-driven games, a 4K screen around 40 inches would be the max.

The other thing is, VR is nowhere near close to maxed out. I could easily see people wanting 4K per eye, both at a high refresh rate. That's gonna need a lot of GPU.

That aside, there's also one more thing that needs to be considered.

When average people get more VRAM, game developers will spend less time optimizing textures, so games will bloat, even at the same graphical fidelity.
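
To put rough numbers on the texture point (back-of-envelope only; BC7 block compression is 8 bits per pixel, and a full mip chain adds roughly a third on top):

```python
# Rough texture memory math: why texture optimization (compression) matters.
def texture_mb(width, height, bits_per_pixel, mipmaps=True):
    bytes_total = width * height * bits_per_pixel / 8
    if mipmaps:
        bytes_total *= 4 / 3  # a full mip chain adds roughly one third
    return bytes_total / 2**20

# A single 4096x4096 texture:
uncompressed = texture_mb(4096, 4096, 32)  # RGBA8, 32 bits/pixel -> ~85 MB
bc7 = texture_mb(4096, 4096, 8)            # BC7, 8 bits/pixel    -> ~21 MB
print(f"{uncompressed:.0f} MB vs {bc7:.0f} MB")
```

A few hundred uncompressed 4K textures is already multiple gigabytes, which is exactly the kind of bloat that skipping optimization causes.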


12GB on the 4070 is not enough, and the 4080 should also get more than 16GB.

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


23 minutes ago, Neftex said:

12GB on the 4070 is not enough, and the 4080 should also get more than 16GB.

 

How is it not enough?

 

I recently upgraded to the Radeon RX 7900-XTX.

 

I play games at 3440x1440 (2K UltraWide) with Ultra settings. From what I have seen while gaming, most games typically load anywhere from 9-12GB of VRAM on my card. So basically, the 24GB of VRAM I have is completely overkill, and would be even at 4K Ultra. 16GB can handle 4K Ultra, which only the RTX 4090 is really designed to do at good frame rates. The RTX 4080 is really made for 4K High, like this XTX is.

 

So I just cannot see how 16GB is not enough for the 4080. It's more than enough; you will BARELY ever push the demand to a full 12-12.5GB.

 

And of course the 4070 is NOWHERE NEAR the 4080; they are two completely separate classes of card. So why would the 4070 need 16GB just for 1440p? It's not a card that is capable of 4K; it's just not designed for that resolution.

 

1440p barely overwhelms 8GB; only just now are RTX 3070 owners beginning to experience issues at 1440p. So how is a full 50% bump in VRAM capacity somehow not enough? The most you will ever consume at 1440p Ultra is about 10GB.

 

 

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


15 minutes ago, WallacEngineering said:

 

How is it not enough?

 

I recently upgraded to the Radeon RX 7900-XTX.

 

I play games at 3440x1440 (2K UltraWide) with Ultra settings. From what I have seen while gaming, most games typically load anywhere from 9-12GB of VRAM on my card. So basically, the 24GB of VRAM I have is completely overkill, and would be even at 4K Ultra. 16GB can handle 4K Ultra, which only the RTX 4090 is really designed to do at good frame rates. The RTX 4080 is really made for 4K High, like this XTX is.

 

So I just cannot see how 16GB is not enough for the 4080. It's more than enough; you will BARELY ever push the demand to a full 12-12.5GB.

 

And of course the 4070 is NOWHERE NEAR the 4080; they are two completely separate classes of card. So why would the 4070 need 16GB just for 1440p? It's not a card that is capable of 4K; it's just not designed for that resolution.

 

1440p barely overwhelms 8GB; only just now are RTX 3070 owners beginning to experience issues at 1440p. So how is a full 50% bump in VRAM capacity somehow not enough? The most you will ever consume at 1440p Ultra is about 10GB.

 

 

Do you live under a rock, maybe? New games are getting to 12GB of VRAM at 1080p.

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


I'd view the 4070 as a (today) entry-level 4K GPU, in a similar way to how the 3070 was an entry-level 4K GPU when it was released. The 4070 is basically a 3080; are people really saying the 3080 isn't a 4K GPU? The nod to entry level is that over the last couple of years or so, game requirements have been pushing harder. The 4070 is not going to be a 4K high+ GPU for the most demanding games, although it can do it well in older ones, in better optimised titles, or at reduced settings.

 

As for VRAM I've let my money do the speaking. I got a 4070. I expect it to be fine for at least the next couple of years for mixed 1440p and 4k gaming, depending on if I connect it to my gaming monitor or TV. The vast majority of systems out there are still on 8GB or lower. Games will run on that. 

 

Specifically for one title that made the news recently, TLOU's latest patch says they have reduced VRAM usage. The patch before that said they improved textures for low/medium settings. I don't know exactly where that leaves us, but that game might not look like an Xbox 360 on 8GB GPUs any more.

 

IMO I'd simplify the VRAM tiers as follows:

8GB: 1080p high+, 1440p medium+

12GB: 1440p high+, 4k medium+

16GB: you're good

>16GB: excessive for pure gaming

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 minute ago, Neftex said:

Do you live under a rock, maybe? New games are getting to 12GB of VRAM at 1080p.

 

No, they aren't. If you are basing this information solely on what you see from monitoring software during gaming, there's something you should know: those numbers are inaccurate for a very specific reason.

 

You see, when you load up a game, your GPU allocates VRAM for the game, and it gives it headroom. So even if a game reports 12GB of usage, it's not actually using 12GB; if your card has extra space, it will allocate that extra VRAM as overhead, just in case, even though it simply doesn't need it.
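
To illustrate what I mean, here's a toy model (the numbers are made up for illustration; the real headroom policy is up to the game engine and driver, and I don't know the exact fraction):

```python
# Toy model: what monitoring software reports (allocation) vs. what the game
# actually needs (working set). The headroom fraction is a made-up illustration.
def reported_vram_gb(working_set_gb, card_capacity_gb, headroom=0.25):
    wanted = working_set_gb * (1 + headroom)  # working set plus headroom
    ceiling = card_capacity_gb * 0.95         # leave a sliver for driver/OS
    return round(min(wanted, ceiling), 2)

# Same game, same settings, two different cards:
print(reported_vram_gb(7.2, 12))  # 9.0 -> "9GB used" on a 12GB card
print(reported_vram_gb(7.2, 8))   # 7.6 -> "7.6GB used" on an 8GB card
```

Same ~7.2GB working set, but the tool shows 9GB on one card and 7.6GB on the other, which is the confusing behavior I'm describing.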

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


3 minutes ago, porina said:

I'd view the 4070 as a (today) entry-level 4K GPU... ...the 4070 is not going to be a 4K high+ GPU for the most demanding games

This is exactly why I do not consider the 4070 (non-Ti) to be a 4K-capable card. It may be technically capable at lower settings, but why would you bump up the resolution only to lower the settings? That's an oxymoron. 1440p High looks better than 4K Medium, and especially 4K Low.

 

To qualify as "designed for a particular resolution" a graphics card must be capable of 100 FPS average at high settings to keep 1% lows at or above 75 FPS to ensure a 100% smooth and responsive experience with high fidelity. Otherwise, you are better off lowering your resolution.

 

8 minutes ago, porina said:

IMO I'd simplify the VRAM tiers as follows:

8GB: 1080p high+, 1440p medium+

12GB: 1440p high+, 4k medium+

16GB: you're good

>16GB: excessive for pure gaming

 

Agreed 100%, will add to OP. As I said the 24GB on my card is definitely overkill.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


18 minutes ago, WallacEngineering said:

 

No, they aren't. If you are basing this information solely on what you see from monitoring software during gaming, there's something you should know: those numbers are inaccurate for a very specific reason.

 

You see, when you load up a game, your GPU allocates VRAM for the game, and it gives it headroom. So even if a game reports 12GB of usage, it's not actually using 12GB; if your card has extra space, it will allocate that extra VRAM as overhead, just in case, even though it simply doesn't need it.

This video shows allocation and utilization separately. Now, if what you said was true, why wouldn't it allocate 20+ gigs on the XTX, since that would allow for more headroom and performance?

https://youtu.be/V2AcoBZplBs?t=760

https://youtu.be/alguJBl-R3I?t=414

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


1 hour ago, ThousandBlade said:

When average people get more VRAM, game developers will spend less time optimizing textures, so games will bloat, even at the same graphical fidelity.

Games are generally optimized based around the current consoles. The consoles offer up to 10GB of VRAM on the Xbox Series X and while the PS5 can technically offer all 16GB, in practice, games aren't going to use the whole thing, so 12-14GB seems more realistic. So basically all games will be optimized within that limit.

 

However, that will be for playing at console settings. And if you're spending more money on just your GPU than you are on an entire game console, you'd probably like to play at higher settings than that, which will almost certainly require more VRAM, but the issue of bloat mostly falls on consoles. The fact that the 7900XTX and 4090 have 24GB of VRAM is irrelevant to most game developers.

 

12 minutes ago, WallacEngineering said:

No, they aren't. If you are basing this information solely on what you see from monitoring software during gaming, there's something you should know: those numbers are inaccurate for a very specific reason.

 

You see, when you load up a game, your GPU allocates VRAM for the game, and it gives it headroom. So even if a game reports 12GB of usage, it's not actually using 12GB; if your card has extra space, it will allocate that extra VRAM as overhead, just in case, even though it simply doesn't need it.

Yes, it's just allocation, but that's still worrying. The problem is that we're already seeing situations today where the card wants more than 12GB of VRAM, and the card just launched.

 

Remember, the 4070 is a $600 card. That used to be top-tier GPU money not that long ago. There's no reason for the 4070 to have to compromise on settings and be limited to 12GB apart from Nvidia's greed and a desire to segment their professional cards. The GPU is plenty powerful enough to take advantage of a 16GB buffer.


19 minutes ago, Neftex said:

This video shows allocation and utilization separately. Now, if what you said was true, why wouldn't it allocate 20+ gigs on the XTX, since that would allow for more headroom and performance?

https://youtu.be/V2AcoBZplBs?t=760

 

@YoungBlade

 

It only allocates a certain amount of headroom. Why exactly, I do not know, but JayzTwoCents and Steve from Gamers Nexus have already demonstrated it by running the same settings on a card with more VRAM and then on a card with less, with the lower-VRAM card running just fine.

 

If it has enough to run the game without stuttering, then the game will run about the same even though the cards show wildly different usage. On one card, a game might show it has allocated 9GB of VRAM for 1080p Ultra because it's a 12GB card and has the room to do so.

 

But swap to a similar-performance 8GB card and it will still run 1080p Ultra and deliver similar frame rates, yet for some reason monitoring software will only report 7.5GB of VRAM usage. This is the allocation at work; I admit it's extremely confusing and misleading.

 

There is a reason RTX 3070 owners are complaining about 1440p and not 1080p Ultra. 1080p Ultra is running fine for them; it's 1440p High/Ultra that is actually overloading their VRAM and forcing data to spill into system RAM, which causes stuttering so severe it makes games unplayable.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


15 minutes ago, WallacEngineering said:

 

It only allocates a certain amount of headroom. Why exactly, I do not know, but JayzTwoCents and Steve from Gamers Nexus have already demonstrated it by running the same settings on a card with more VRAM and then on a card with less, with the lower-VRAM card running just fine.

 

If it has enough to run the game without stuttering, then the game will run about the same even though the cards show wildly different usage. On one card, a game might show it has allocated 9GB of VRAM for 1080p Ultra because it's a 12GB card and has the room to do so.

 

But swap to a similar-performance 8GB card and it will still run 1080p Ultra and deliver similar frame rates, yet for some reason monitoring software will only report 7.5GB of VRAM usage. This is the allocation at work; I admit it's extremely confusing and misleading.

What you don't see in those numbers is the fuckery the game has to do to keep running with less VRAM. It might affect texture quality despite the same settings being selected: the game runs smooth but doesn't look as it should.

 

Here's one of the more extreme examples, 8GB vs 16GB: https://youtu.be/Rh7kFgHe21k?t=547

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


22 minutes ago, WallacEngineering said:

This is exactly why I do not consider the 4070 (non-Ti) to be a 4K-capable card. It may be technically capable at lower settings, but why would you bump up the resolution only to lower the settings? That's an oxymoron. 1440p High looks better than 4K Medium, and especially 4K Low.

We have upscalers now. I maybe should have added that that was for native rendering. For most games, using DLSS2/FSR2/XeSS offers some level of performance boost in exchange for some image quality.

 

22 minutes ago, WallacEngineering said:

To qualify as "designed for a particular resolution" a graphics card must be capable of 100 FPS average at high settings to keep 1% lows at or above 75 FPS to ensure a 100% smooth and responsive experience with high fidelity. Otherwise, you are better off lowering your resolution.

Am I old fashioned? I'd still take 60+ fps average as "good for a resolution". Anything much above that may be nice to have in some but not all games.

 

11 minutes ago, YoungBlade said:

Remember, the 4070 is a $600 card. That used to be top-tier GPU money not that long ago.

980Ti is 8 years old now and that was $650 at launch. How much further back do we need to go? 1080 not-Ti was $600 at launch 7 years ago.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


13 minutes ago, porina said:

Am I old fashioned? I'd still take 60+ fps average as "good for a resolution". Anything much above that may be nice to have in some but not all games.

 

60 FPS is just fine for visual smoothness but not for input lag/responsiveness.

 

If you play fast-paced first-person shooters, you'll quickly realize that 60 FPS isn't quite enough to get the responsiveness you need to react quickly.

 

75 FPS is pretty good but isn't quite great yet. 90 FPS is where responsiveness is truly amazing to the point of diminishing returns.

 

The other problem is with 60 FPS as an AVERAGE: it means your 1% lows will dip well into the 40s. Even in slow-paced games like Dead Space, when those frames drop into the 40s you will definitely notice, and it will not feel OR look good at all.

 

So the only reason 100 FPS average is even necessary is to keep those 1% lows above 75 FPS, so that no matter the situation and demand from the game, your experience will remain smooth and responsive, even at the worst of times.
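
If you want to check the math on averages vs. 1% lows yourself, here's a quick sketch (using the common "average of the worst 1% of frame times" definition; benchmarking tools differ on the exact method):

```python
# 1% low FPS from a frame-time trace: the FPS implied by the worst 1% of frames.
def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # worst 1% (at least one frame)
    return 1000.0 / (sum(worst[:n]) / n)

# 99 smooth frames at 10 ms (100 FPS) plus a single 22 ms hitch:
trace = [10.0] * 99 + [22.0]
print(round(one_percent_low_fps(trace), 1))  # 45.5
```

So even a run that averages close to 100 FPS can have 1% lows in the 40s if there are occasional hitches, which is exactly why the average alone doesn't tell the whole story.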

 

Boot up a First Person shooter and use frame rate limiters to experience it for yourself.

 

Wiggle your mouse around at 45 FPS. Try to lock targets quickly, swinging your head quickly back and forth to induce motion blur. You will notice the lag immediately at 45 FPS. At 60 FPS you should still be able to notice it. At 75 FPS it should be hardly noticeable, and at 90 FPS it should be 100% smooth and responsive no matter what you do.

 

 

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


22 minutes ago, Neftex said:

Here's one of the more extreme examples, 8GB vs 16GB: https://youtu.be/Rh7kFgHe21k?t=547

 

Doesn't seem too terrible, but ya, you can clearly see a bit of a difference. However, that is the difference between an RTX 3070 and an RTX 4080 in terms of VRAM.

 

I'm sure 12GB would be just fine in that situation, and you did say yourself that this was an extreme situation that had to be forced.

 

So it's not really representative of standard gameplay.

 

Also, Hogwarts Legacy doesn't count in general. That game runs like complete ass on every card. It's pretty much Star Citizen, just not as extreme.

 

So I disregard ALL Hogwarts Legacy benchmarks. It's a game that was clearly handled by a dumpster fire of an optimization team; the game might as well not even exist when it can push even a damn RTX 4090 to its breaking point.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


7 minutes ago, WallacEngineering said:

 

Doesn't seem too terrible, but ya, you can clearly see a bit of a difference. However, that is the difference between an RTX 3070 and an RTX 4080 in terms of VRAM.

 

I'm sure 12GB would be just fine in that situation, and you did say yourself that this was an extreme situation that had to be forced.

 

So it's not really representative of standard gameplay.

It IS representative. You just claimed that the RTX 3070 is fine at 1080p Ultra in a post above, and this video clearly shows IT'S NOT FINE. Idk how much of the video I linked you watched, but the fucking textures clearly disappear and reappear; that's insane to call fine. It's an extreme example, but this or stuttering is what's going to happen when you're out of VRAM. You might or might not notice it now with 12GB, but who the hell knows if 12GB is enough for upcoming games.

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


4 minutes ago, Neftex said:

It IS representative. You just claimed that the RTX 3070 is fine at 1080p Ultra in a post above, and this video clearly shows IT'S NOT FINE. Idk how much of the video I linked you watched, but the fucking textures clearly disappear and reappear; that's insane to call fine. It's an extreme example, but this or stuttering is what's going to happen when you're out of VRAM. You might or might not notice it now with 12GB, but who the hell knows if 12GB is enough for upcoming games.

Hogwarts Legacy doesn't count in general. That game runs like complete ass on every card. It's pretty much Star Citizen, just not as extreme.

 

I disregard ALL Hogwarts Legacy benchmarks. It's a game that was clearly handled by a dumpster fire of an optimization team; the game might as well not even exist when it can push even a damn RTX 4090 to its breaking point.

 

It's not like when I play Star Citizen I go, "Oh damn, only 45 FPS in Orison, this card sucks, this card has issues." Of course not; we are all well aware it's the game that is the problem, not the card. Treat Hogwarts Legacy the same: it doesn't count or matter.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


14 minutes ago, WallacEngineering said:

Hogwarts Legacy doesn't count in general. That game runs like complete ass on every card. It's pretty much Star Citizen, just not as extreme.

 

I disregard ALL Hogwarts Legacy benchmarks. It's a game that was clearly handled by a dumpster fire of an optimization team; the game might as well not even exist when it can push even a damn RTX 4090 to its breaking point.

 

It's not like when I play Star Citizen I go, "Oh damn, only 45 FPS in Orison, this card sucks, this card has issues." Of course not; we are all well aware it's the game that is the problem, not the card. Treat Hogwarts Legacy the same: it doesn't count or matter.

Sure, if you want to ignore reality and build your tier list on baseless opinions, go ahead. I gave you trusted sources and you chose to ignore them.

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


3 minutes ago, Neftex said:

Sure, if you want to ignore reality and build your tier list on baseless opinions, go ahead. I gave you trusted sources and you chose to ignore them.

 

Well, it's about as representative as Star Citizen, and that is the FACT of the matter.

 

The only difference is that none of the major reviewers benchmark Star Citizen, because they already know it's pointless.

 

I didn't ignore your source; I saw the video a while back myself. It's just that Hogwarts runs so terribly that you can NOT put it in the same category as what the typical gamer plays.

 

When Cyberpunk 2077 runs twice as well as the game in question, you KNOW there is a problem lol 🤣

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


1 hour ago, porina said:

980Ti is 8 years old now and that was $650 at launch. How much further back do we need to go? 1080 not-Ti was $600 at launch 7 years ago.

I was thinking of the 1080. It was the flagship when it launched.

 

7 years may seem like a long time, but it was only 3 launches ago.


1 hour ago, WallacEngineering said:

.

Putting aside whether the game is a dumpster fire, iirc here are the chart-toppers from recent games:

 

RE4 / Diablo 4 / Hogwarts at 4K: 14GB, can easily trigger crashes on a 12GB card

TLOU1 at 4K: 15GB, can also crash on a 12GB card

Jedi Survivor: 18GB at 1440p, 22GB at 4K, as reported on Reddit

 

You can only ignore so many games. I wouldn't buy anything new with less than 16GB atm, and 20-24GB might be preferred soon.

It's not just allocation; 12GB isn't enough on a new card.

 

RE4 and Jedi Survivor usage can be turned down, but Diablo 4's can't. It's... weird seeing 90% (6950 XT) or 99% (3080 Ti) VRAM usage in a long session.

 

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


2 hours ago, WallacEngineering said:

 

@YoungBlade

 

It only allocates a certain amount of headroom. Why exactly, I do not know, but JayzTwoCents and Steve from Gamers Nexus have already demonstrated it by running the same settings on a card with more VRAM and then on a card with less, with the lower-VRAM card running just fine.

 

If it has enough to run the game without stuttering, then the game will run about the same even though the cards show wildly different usage. On one card, a game might show it has allocated 9GB of VRAM for 1080p Ultra because it's a 12GB card and has the room to do so.

 

But swap to a similar-performance 8GB card and it will still run 1080p Ultra and deliver similar frame rates, yet for some reason monitoring software will only report 7.5GB of VRAM usage. This is the allocation at work; I admit it's extremely confusing and misleading.

 

There is a reason RTX 3070 owners are complaining about 1440p and not 1080p Ultra. 1080p Ultra is running fine for them; it's 1440p High/Ultra that is actually overloading their VRAM and forcing data to spill into system RAM, which causes stuttering so severe it makes games unplayable.

I don't think you understand the extent of the problem that 3070 owners are having. It does also affect performance at 1080p Ultra when you turn on raytracing in many modern titles.

 

You can see for yourself in this video from Hardware Unboxed. Note that the RTX A4000 should always lose to the RTX 3070 in gaming. Its core is slower and its drivers are not as optimized for games. And yet, thanks to its 16GB buffer, it makes otherwise unplayable messes perfectly playable.

 

 


3 hours ago, xg32 said:

RE4 / Diablo 4 / Hogwarts at 4K: 14GB, can easily trigger crashes on a 12GB card

TLOU1 at 4K: 15GB, can also crash on a 12GB card

Jedi Survivor: 18GB at 1440p, 22GB at 4K, as reported on Reddit

 

Again, the 12GB 4070 is NOT a 4K card; it's a 1440p card.

 

Although the RTX 4070-Ti IS a 4K-capable card. That card, I believe, should have 16GB on it.

 

I mean, have you guys not seen the benchmarks? The 4070-Ti is literally a full 30% faster than the non-Ti.

 

It's not even really related to the 4070; it's more like a 4075, which is why calling it a 4070-Ti is so strange. Because Nvidia originally wanted to name it the 4080 12GB, the gap between the non-Ti and the Ti is now absolutely MASSIVE.

 

So basically, the 4070-Ti is NOT just a Ti version of the 4070; it's actually an in-between 4075. It's strange to think about, but that's the way it is.

As for Jedi Survivor, yes, that one specifically is another I personally ignore. 18GB of usage at 1440p is NOT NORMAL. THERE IS SOMETHING WRONG WITH THE GAME.

I'm not sure how many times I have to repeat myself before you guys get it. Just think of Star Citizen. Would you include Star Citizen in a benchmark comparison? No, because it's a BROKEN game. That's literally all you have to keep in mind, and then it makes sense. These ports are effectively Alpha/Beta builds on PC; they are literally BROKEN.

Now, 14GB at 4K is somewhat normal. It's not great, but it's tolerable.

It's not anyone else's fault that so many console ports are broken; it's on the developers. Consoles don't even have 22GB of VRAM available, so the game wouldn't even run properly at those settings on the very console it was designed for. Obviously, something is not normal.

I don't know how to explain it to you guys any more clearly than this. If you want to start buying cards with massive stockpiles of completely unnecessary VRAM just because a few console ports went wrong recently, then be my guest, I guess.

The only reason I ended up with an XTX is that I got the Red Devil for $900. I had been aiming for the XT after its price drop to $800 MSRP, and after tax the $849.99 AIB price would have been basically identical.

I got Jedi Survivor free with my Ryzen 7800X3D, so I guess I'll give it a shot in a few days and report back.

It's not like every new game coming out is going to use close to 16GB at 1440p Ultra. That's just not gonna happen. You can't let a few broken console ports get to your head.

Top-Tier Air-Cooled Gaming PC



8 hours ago, WallacEngineering said:

If you play fast-paced First-Person Shooters, you'll quickly realize that 60 FPS isn't quite enough to get the responsiveness you need to react quickly.

I don't play first-person shooters at all. It's an important genre, but there is a LOT of gaming outside it. That's why I said 60+ FPS is fine in most games; I didn't say all games. That is the one area where more could help, especially in competitive shooters. A 60+ FPS average with G-Sync is still generally great input-latency-wise; fixed 60 FPS with V-Sync a little less so.

Many games are console ports, and on console they're often targeted at 60 FPS, although some might push toward 120 at lower visual settings. That's fine for most people, most of the time. I wouldn't say a consistent frame rate far above 60 is mandatory; that's a personal choice.

 

6 hours ago, xg32 said:

TLOU Part 1 at 4K: 15GB, also can crash on a 12GB card

I haven't looked for retesting, but the latest patch, released a couple of days ago, is said to reduce VRAM usage, and the one before it improved texture quality at Low/Medium.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


AMD did the right thing: everything has 16GB besides the lowest end, though honestly they should have made 12GB the minimum. As for Nvidia:

xx50/Ti/xx60/Ti should have 12GB bare minimum
xx70/Ti/xx80 should have 16GB bare minimum
xx80 Ti/xx90/xx90 Ti should have 20GB+ bare minimum

PC:
Motherboard: MSI B450 Gaming Pro Carbon AC      |   GPU: ASRock Radeon RX 6950 XT Phantom Gaming D 16G
CPU: Ryzen 7 5800X3D                            |   Monitor: LG 32GK650F, 2560x1440 144Hz
CPU cooler: Arctic Liquid Freezer II 240 A-RGB  |   PSU: Seasonic Focus Plus Gold 850W
Case: Cooler Master MasterBox MB511 RGB         |   Memory: Kingston Fury Beast 32GB (2x16GB) DDR4 @ 3600MHz
Keyboard: Corsair K95 RGB Platinum              |   Mouse: Razer Viper Ultimate

