
3080: Enough Vram for Next Gen?

SNerd7

 I am impressed by the 3080 specs and benchmark videos. I want to get one but one thing is gnawing at me in the back of my mind. 

 

Is 10GB of VRAM enough?

 

With current-gen games like Marvel's Avengers using 10GB on max settings, doesn't that mean the more graphically intensive next-gen games are sure to need even more?

 

Am I missing something? 

 

 


For the most part, at least now, 10 gigs is more than enough for any current title.

PC: Motherboard: ASUS B550M TUF-Plus, CPU: Ryzen 3 3100, CPU Cooler: Arctic Freezer 34, GPU: GIGABYTE WindForce GTX1650S, RAM: HyperX Fury RGB 2x8GB 3200 CL16, Case: CoolerMaster MB311L ARGB, Boot Drive: 250GB MX500, Game Drive: WD Blue 1TB 7200RPM HDD.

 

Peripherals: GK61 (Optical Gateron Red) with Mistel White/Orange keycaps, Logitech G102 (Purple), BitWit Ensemble Grey Deskpad. 

 

Audio: Logitech G432, Moondrop Starfield, Mic: Razer Siren Mini (White).

 

Phone: Pixel 3a (Purple-ish).

 

Build Log: 


3 hours ago, TofuHaroto said:

For the most part, at least now, 10 gigs is more than enough for any current title.

I would say 11GB would be enough for ANY current title. I am concerned about next-gen titles, which are sure to have more geometry, higher-resolution textures, and heavy ray tracing. Hell, I am even a bit concerned about Cyberpunk.

 

I was playing Marvel's Avengers and my VRAM usage hit 10.3GB.

 

Edit: 

Played just now on full max settings. Hit 11.07GB.


Probably enough. If it's not enough, you'll probably have to turn down graphics settings to get some FPS anyway.

 

manufacturers pair VRAM with processing capabilities quite well

-sigh- feeling like I'm being too negative lately


4 minutes ago, SNerd7 said:

I would say 11GB would be enough for ANY current title. I am concerned about next-gen titles, which are sure to have more geometry, higher-resolution textures, and heavy ray tracing. Hell, I am even a bit concerned about Cyberpunk.

 

I was playing Marvel's Avengers and my VRAM usage hit 10.3GB.

the whole time?!??!!!?!?


2 minutes ago, Moonzy said:

Probably enough. If it's not enough, you'll probably have to turn down graphics settings to get some FPS anyway.

 

manufacturers pair VRAM with processing capabilities quite well

I am just thinking about the future here. I got burned with my 2080 Ti (my own fault), so I want to avoid needing another upgrade for some time. I am worried that two years in, the 3080 won't be able to run AAA titles at 4K max settings.


Just now, SuburbanBourbon said:

the whole time?!??!!!?!?

On 4K/mostly max settings, yeah, it fluctuated from 9.9 to 10.3GB the entire time.


Just now, SNerd7 said:

On 4K/mostly max settings, yeah, it fluctuated from 9.9 to 10.3GB the entire time.

That is probably just not optimized.


1 minute ago, SNerd7 said:

Got burned with my 2080 Ti

it's still a good card

1 minute ago, SNerd7 said:

I am worried that two years in, the 3080 won't be able to run AAA titles at 4K max settings

It probably will, since it's one of the top-tier consumer cards for the next two years.



1 minute ago, Moonzy said:

it's still a good card

It probably will, since it's one of the top-tier consumer cards for the next two years.

And all that at $700!!! Howwwww


1 minute ago, SuburbanBourbon said:

That is probably just not optimized.

From what I gather in my research, PC versions of AAA games are rarely well optimized. I want to be sure that my next video card can handle them for years to come.


2 minutes ago, SuburbanBourbon said:

@SNerd7 what are your specs?

 

2080 Ti

i9-9900KF

 


Just now, SNerd7 said:

From what I gather in my research, PC versions of AAA games are rarely well optimized. I want to be sure that my next video card can handle them for years to come.

True, but Marvel's Avengers is not a good game either, all things considered.


1 minute ago, SuburbanBourbon said:

True, but Marvel's Avengers is not a good game either, all things considered.

5 hours in and I am enjoying it a lot. Especially visually on PC


VRAM use realistically scales with the performance of the card. 2GB is still enough for a GTX 750 Ti. For a 980 Ti, a 1060, or a 1660 Super, 6GB is enough. Nothing is going to need 10GB. People exaggerate how much VRAM is necessary so badly that it's almost as bad as with power supplies.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


Just now, Sakkura said:

Real usage or just preallocation?

I am new to the tech world, but it looked like real usage to me, because it would change scene by scene.


16 minutes ago, SNerd7 said:

and heavy ray tracing.

That actually reduces VRAM usage significantly, since you don't need lighting or shadow maps (which usually use about half as much as textures, apiece).
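As a quick sanity check on that rule of thumb, here is the napkin math in code. All the numbers are rough estimates from this thread, not measurements, and the 4GB texture footprint is a made-up example:

```python
# If light maps and shadow maps each take about half as much VRAM as
# the textures (the poster's estimate, not measured data), dropping
# both for ray tracing frees roughly one full "texture budget".

textures_gb = 4.0                   # hypothetical texture footprint
light_maps_gb = 0.5 * textures_gb   # ~half of textures
shadow_maps_gb = 0.5 * textures_gb  # ~half of textures

raster_total = textures_gb + light_maps_gb + shadow_maps_gb
rt_total = textures_gb              # ray tracing drops both map sets

print(raster_total, rt_total)  # 8.0 4.0
```

Under those assumptions, the rasterized scene needs twice the VRAM of the ray-traced one, which is where the "significant reduction" claim comes from.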

CPU: Ryzen 7 5800X  Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans  RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4  GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO  Case: Antec P5  PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC  Case Fans: 2x Arctic P12 PWM  Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


2 minutes ago, JoostinOnline said:

VRAM use realistically scales with the performance of the card. 2GB is still enough for a GTX 750 Ti. For a 980 Ti, a 1060, or a 1660 Super, 6GB is enough. Nothing is going to need 10GB. People exaggerate how much VRAM is necessary so badly that it's almost as bad as with power supplies.

This is interesting. However, the scaling also comes at a cost to resolution, no? I have a 4K TV I intend to make use of a lot going forward, and I would hate to find out 12 or so months from now that the VRAM on my 3080 is no longer enough for 4K ultra settings.


1 minute ago, BTGbullseye said:

That actually reduces VRAM usage significantly, since you don't need lighting or shadow maps (which usually use about half as much as textures, apiece).

Really? If true that is really cool.


1 minute ago, SNerd7 said:

Really? If true that is really cool.

Raytracing only needs to know the color, direction, position, and size of the light, as well as the reflectivity of the textures in the scene. This results in a minimal increase in texture info and a complete removal of lighting and shadow requirements. I would be shocked if it needed any significant amount of VRAM. It's designed to be computed on the fly, not stored as a projected texture (which is what lighting and shadows are right now).

5 minutes ago, SNerd7 said:

This is interesting. However, the scaling also comes at a cost to resolution, no? I have a 4K TV I intend to make use of a lot going forward, and I would hate to find out 12 or so months from now that the VRAM on my 3080 is no longer enough for 4K ultra settings.

Higher render resolution does increase VRAM usage significantly: you usually add about 10% going from 1080p to 1440p, and another 20% going to 4K. Remove the lighting and shadow VRAM requirement with raytracing, and it's back down to around 1080p VRAM usage.
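Those resolution multipliers can be put into a tiny helper to see what they imply. The percentages are the poster's napkin estimates, not benchmarks, and the 8GB baseline is a made-up example:

```python
def scaled_vram_gb(usage_1080p_gb, target="4k"):
    """Estimate VRAM use at a target resolution from a 1080p baseline.

    Multipliers are rough figures quoted in the thread: +10% going
    1080p -> 1440p, and a further +20% on top of that going to 4K.
    """
    multipliers = {"1080p": 1.0, "1440p": 1.1, "4k": 1.1 * 1.2}
    return round(usage_1080p_gb * multipliers[target.lower()], 2)

print(scaled_vram_gb(8.0, "1440p"))  # 8.8
print(scaled_vram_gb(8.0, "4k"))     # 10.56
```

So a game using 8GB at 1080p would land around 10.6GB at 4K by this estimate, which is exactly the territory where a 10GB card starts to feel tight.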



1 minute ago, BTGbullseye said:

Raytracing only needs to know the color, direction, position, and size of the light, as well as the reflectivity of the textures in the scene. This results in a minimal increase in texture info and a complete removal of lighting and shadow requirements. I would be shocked if it needed any significant amount of VRAM. It's designed to be computed on the fly, not stored as a projected texture (which is what lighting and shadows are right now).

Higher render resolution does increase VRAM usage significantly: you usually add about 10% going from 1080p to 1440p, and another 20% going to 4K. Remove the lighting and shadow VRAM requirement with raytracing, and it's back down to around 1080p VRAM usage.

Wow, thank you for educating me on that. If what you're saying is true and having RTX on will lead to 1080p levels of VRAM usage at 4K, then oh boy, I am excited!


1 minute ago, SNerd7 said:

Wow, thank you for educating me on that. If what you're saying is true and having RTX on will lead to 1080p levels of VRAM usage at 4K, then oh boy, I am excited!

That's my napkin math on it, based on what I've seen of the tech involved. Should be interesting to see how close to real it comes.


