3080: Enough VRAM for Next Gen?

I am impressed by the 3080 specs and benchmark videos. I want to get one, but one thing is gnawing at the back of my mind.

 

Is 10GB of VRAM enough?

 

With current-gen games like Marvel's Avengers using 10GB on max settings, doesn't that mean the more graphically intensive next-gen games are sure to need even more?

 

Am I missing something? 

 

 


For the most part, at least now, 10 gigs is more than enough for any current title.

Wārudobītā

CPU: Ryzen 3 3100. Board: B550m TUF plus. Memory: 16 (2x8) 3200 cl16 Hyper x Fury RGB. GPU: Gigabyte Windforce 1650 super.  Boot Drive: MX500. Case: Antec NX500 (Quite crappy. :p) PSU: CX450. 

 

Hehe, Meshify c is a kewl case. 

 

 

3 hours ago, TofuHaroto said:

For the most part, at least now, 10 gigs is more than enough for any current title.

I would say 11GB would be enough for ANY current title. I am concerned about next-gen titles, which are sure to have more geometry, higher-resolution textures, and heavy ray tracing. Hell, I am even a bit concerned about Cyberpunk.

 

I was playing Marvel's Avengers and my VRAM usage hit 10.3GB.

 

Edit: 

Played just now on full max settings. Hit 11.07GB.
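For anyone wanting to double-check readings like these: nvidia-smi can report per-GPU memory usage in CSV form, and a tiny script can log it while you play. This is a rough sketch; the sample string fed to the parser below is illustrative, not a real capture from my machine.

```python
import subprocess

def vram_used_mib(sample_output=None):
    """Return VRAM used (in MiB) for each GPU, parsed from nvidia-smi CSV output."""
    if sample_output is None:
        # Live query; requires an NVIDIA GPU and driver to be installed.
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    # With noheader,nounits the output is one bare MiB number per line, one line per GPU.
    return [int(line.strip()) for line in sample_output.strip().splitlines()]

# Illustrative sample: a single GPU reporting ~10.3GB (10547 MiB) in use.
print(vram_used_mib("10547\n"))  # → [10547]
```

Run it in a loop while the game is going and you can see whether the number actually moves scene to scene or just sits at one value.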


Probably enough; if it's not enough, then you'll probably have to turn down graphics settings to get some FPS anyway.

 

manufacturers pair VRAM with processing capabilities quite well

Things I need help with:

Spoiler

none atm

 

I hate Intel's pricing, Ryzen's weird quirks, Nvidia's pricing, and Radeon GPUs in general

 

Spoiler

 

Products I like:

Spoiler

Sony Xperia Z1 / Z2 / 10 ii, Asus Strix 970 / 1070, Samsung SSD, WD HDD, Corsair PSUs (AX, RM, CX(grey)), GeForce GPU, NZXT N450/S340, be quiet! Coolers, G.Skill Trident RAM, Logitech M525, Logitech G440, Razer Deathadder Elite

 

Products I hate:

Spoiler

Xperia Z3, XiaoMi 5c, Radeon GPUs, Razer Audio Products, any bloatware

 

Companies I absolutely adore: (and hope it stays that way)

Spoiler

be quiet! - sent me AM4 mounting for my DRP3 even though it's way past the timeframe stated, no questions asked

Corsair - very good RMA experience, absolutely recommend

 

Companies I hate:

Spoiler

Nvidia, Intel, Apple, TMT (Thundermatch, a retailer)

 

Personal Blacklisted Companies:

Spoiler

Acer: shit tier quality products, shit tier customer service thus far; they "tried" to solve my issue but they aren't really doing anything but delaying and delaying. (ongoing case since July)

Gigabyte: horrible customer service (Gigabyte had literally zero customer service; they asked me to go to the retailer with NO WAY to email them about a question), but at least they fixed my shit in ONE MONTH (it would probably take me 1 hour to fix if they let me email them)

XiaoMi Phones: built like a tank but the software is buggy as all hell

Seagate HDD: had too many dead seagate drives

Kingston SSD: 300V controller swap thingy

Razer (except their mouse)

 

Remember, just because I had good/bad experiences with these companies/products doesn't mean you will have similar experiences too. I would still recommend these products if they make sense for your needs, but I'll add a disclaimer about my experience if it's relevant. Feel free to DM me asking why they are where they are.

 

 

4 minutes ago, SNerd7 said:

I would say 11GB would be enough for ANY current title. I am concerned about next-gen titles, which are sure to have more geometry, higher-resolution textures, and heavy ray tracing. Hell, I am even a bit concerned about Cyberpunk.

 

I was playing Marvel's Avengers and my VRAM usage hit 10.3GB.

the whole time?!??!!!?!?

2 minutes ago, Moonzy said:

Probably enough; if it's not enough, then you'll probably have to turn down graphics settings to get some FPS anyway.

 

manufacturers pair VRAM with processing capabilities quite well

I am just thinking about the future here. I got burned with my 2080 Ti (my own fault), so I want to avoid needing another upgrade for some time. I am worried that 2 years in, the 3080 won't be able to run AAA titles at 4K max settings.

Just now, SuburbanBourbon said:

the whole time?!??!!!?!?

On 4K/mostly max settings, yeah, it fluctuated from 9.9 to 10.3GB the entire time.

Just now, SNerd7 said:

On 4K/mostly max settings, yeah, it fluctuated from 9.9 to 10.3GB the entire time.

That game is probably just not optimized.

1 minute ago, SNerd7 said:

Got burned with my 2080 Ti

it's still a good card

1 minute ago, SNerd7 said:

I am worried that 2 years in, the 3080 won't be able to run AAA titles at 4K max settings

It probably will, since it'll be one of the top-tier consumer cards for the next 2 years.

1 minute ago, Moonzy said:

it's still a good card

It probably will, since it'll be one of the top-tier consumer cards for the next 2 years.

And that at $700!!!!!!!! howwwwwwwwww

1 minute ago, SuburbanBourbon said:

That game is probably just not optimized.

From what I gather in my research, PC versions of AAA games are almost never well optimized. I want to be sure that my next video card can handle them for years to come.

2 minutes ago, SuburbanBourbon said:

@SNerd7 what are your specs?

 

2080 Ti

i9-9900KF

 

Just now, SNerd7 said:

From what I gather in my research, PC versions of AAA games are almost never well optimized. I want to be sure that my next video card can handle them for years to come.

True, but Marvel's Avengers is not a good game either, all things considered.

1 minute ago, SuburbanBourbon said:

True, but Marvel's Avengers is not a good game either, all things considered.

5 hours in and I am enjoying it a lot, especially visually on PC.


VRAM is realistically scaled to the performance of the card. 2GB is still enough for a GTX 750 Ti. For a 980 Ti, a 1060, or a 1660 Super, 6GB is enough. Nothing is going to need 10GB. People exaggerate how much VRAM is necessary so much that it's almost as bad as with power supplies.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440

Just now, Sakkura said:

Real usage or just preallocation?

I am new to the tech world, but it looked like real usage to me because it would change scene by scene.

16 minutes ago, SNerd7 said:

and heavy ray tracing.

That actually reduces VRAM usage significantly, since you don't need lighting or shadow maps (which each usually use about half as much VRAM as textures).
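As a sanity check on that claim, here's the napkin math in script form. The 4GB texture budget is a made-up illustrative number; the "half as much apiece" ratio is the estimate from the post above, not a measured figure.

```python
# Illustrative texture budget (assumed, not measured).
textures_gb = 4.0

# Per the estimate above: lighting and shadow maps each take ~half the texture VRAM.
lighting_gb = textures_gb / 2
shadows_gb = textures_gb / 2

rasterized_total = textures_gb + lighting_gb + shadows_gb  # maps included
raytraced_total = textures_gb                              # maps dropped entirely

savings = 1 - raytraced_total / rasterized_total
print(rasterized_total, raytraced_total, savings)  # → 8.0 4.0 0.5
```

So under this estimate, dropping both maps halves that portion of the VRAM budget, whatever the actual texture figure is.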

CPU: Ryzen 5 3600 Cooler: Arctic Liquid Freezer II 120mm AIO RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-36 Mobo: ASRock X570M Pro4

Graphics Card: ASRock Reference RX 5700 XT Case: Antec P5 PSU: Rosewill Capstone 750M Monitor: MSI Optix MAG241C Case Fans: 2x Arctic P12 PWM

Storage: HP EX950 1TB NVMe, HP EX900 1TB NVMe, dual Constellation ES 2GB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms

2 minutes ago, JoostinOnline said:

VRAM is realistically scaled to the performance of the card. 2GB is still enough for a GTX 750 Ti. For a 980 Ti, a 1060, or a 1660 Super, 6GB is enough. Nothing is going to need 10GB. People exaggerate how much VRAM is necessary so much that it's almost as bad as with power supplies.

This is interesting. However, the scaling also comes at a cost to resolution, no? I have a 4K TV I intend to make use of a lot going forward, and I would hate to find out 12 or so months from now that the VRAM on my 3080 is no longer enough for 4K ultra settings.

1 minute ago, BTGbullseye said:

That actually reduces VRAM usage significantly, since you don't need lighting or shadow maps (which each usually use about half as much VRAM as textures).

Really? If true that is really cool.

1 minute ago, SNerd7 said:

Really? If true that is really cool.

Raytracing only needs to know the color, direction, position, and size of the light, as well as the reflectivity of the textures in the scene. This results in a minimal increase in texture info and a complete removal of the lighting and shadow requirements. I would be shocked if it needed any significant amount of VRAM. It's designed to be computed on the fly, not stored as a projected texture (which is what lighting and shadows are right now).

5 minutes ago, SNerd7 said:

This is interesting. However the scaling also comes at a cost to resolution no? I have a 4k tv intend to make use of a lot going forward and I would hate to find out 12 or so months from now that the Vram on my 3080 is no longer enough for 4k ultra settings. 

Higher render resolution does increase VRAM usage significantly. You usually add about 10% usage going from 1080p to 1440p, and another 20% going to 4K. Remove the lighting and shadow VRAM requirement with raytracing, and it's back down to around 1080p VRAM usage.
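Put as a quick script, that scaling estimate looks like this. The 8GB base figure at 1080p is an assumed illustrative number, and the +10%/+20% multipliers are the rough estimates from the post (reading the extra 20% as applying on top of the 1440p figure):

```python
# Assumed base VRAM usage at 1080p (illustrative, not a measurement).
base_1080p_gb = 8.0

at_1440p_gb = base_1080p_gb * 1.10   # ~+10% going from 1080p to 1440p
at_4k_gb = at_1440p_gb * 1.20        # ~+20% more going from 1440p to 4K

print(round(at_1440p_gb, 2), round(at_4k_gb, 2))  # → 8.8 10.56
```

Interesting that even this napkin math lands right around the 3080's 10GB at 4K, which is roughly where the Marvel's Avengers readings earlier in the thread sat.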


1 minute ago, BTGbullseye said:

Raytracing only needs to know the color, direction, position, and size of the light, as well as the reflectivity of the textures in the scene. This results in a minimal increase in texture info and a complete removal of the lighting and shadow requirements. I would be shocked if it needed any significant amount of VRAM. It's designed to be computed on the fly, not stored as a projected texture (which is what lighting and shadows are right now).

Higher render resolution does increase VRAM usage significantly. You usually add about 10% usage going from 1080p to 1440p, and another 20% going to 4K. Remove the lighting and shadow VRAM requirement with raytracing, and it's back down to around 1080p VRAM usage.

Wow, thank you for educating me on that. If what you're saying is true and having RTX on will lead to 1080p levels of VRAM usage at 4K, then oh boy am I excited!

1 minute ago, SNerd7 said:

Wow, thank you for educating me on that. If what you're saying is true and having RTX on will lead to 1080p levels of VRAM usage at 4K, then oh boy am I excited!

That's my napkin math on it, based on what I've seen of the tech involved. It should be interesting to see how close to reality it comes.


