10GB or 20GB 3080?

BunnyHunter67

I currently have a 3440x1440 160Hz ultrawide and am going to be upgrading to a 3080. A 3080 Super, or whatever it will be called, leaked with 20GB of VRAM instead of 10. I know some games at 4K can use more than 10GB of VRAM, but I don't know how that scales down to my resolution. Is 10GB enough for my resolution, or would I be better off waiting for a 20GB 3080? Thanks!


If you need it now, then get the 3080 now.

Any 20GB, Ti, or Super version is still unknown; sure, there may have been a leak, but nothing really official.

 

Besides, unless AMD really does something interesting in the next few months, we will probably have to wait six months to a year.


I've thought about this as well; 1440p gaming is unlikely to fill 10GB of memory with games at the moment.

At 4K with high-resolution textures it's possible to fill the memory, and PCIe 4.0 might provide some relief later on when memory is full.

 

I think I'll hold onto my X34 monitor for another 3-4 years depending on how long it lasts; by then not the next gen but the gen after that will be out, and it will probably be time to upgrade to 4K.

I'm not sure if turning on DLSS lowers texture memory usage; it might be a way to stave off filling memory in games two years from now.

But 20GB of memory costs more, so the card would be more expensive as well.

 

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX 3080 | PSU | Corsair RM850i | RAM | 2x16GB X5 6000MHz CL32 | MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | STORAGE | 2x Samsung 970 EVO 256GB NVMe | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity | MONITOR | Samsung G9 Neo


2 hours ago, BunnyHunter67 said:

I currently have a 3440x1440 160Hz ultrawide and am going to be upgrading to a 3080. A 3080 Super, or whatever it will be called, leaked with 20GB of VRAM instead of 10. I know some games at 4K can use more than 10GB of VRAM, but I don't know how that scales down to my resolution. Is 10GB enough for my resolution, or would I be better off waiting for a 20GB 3080? Thanks!

I've used more than 10GB of VRAM with a 2080 Ti at the same resolution as yours. My thought is that you either buy the 3090 now or wait for the models with more VRAM. Consoles are about to launch, and the "graphical baseline" will rise considerably in no time. I'm personally waiting for a 3080 Ti, which will 100% come at some point.


35 minutes ago, HumdrumPenguin said:

I've used more than 10GB of VRAM with a 2080 Ti at the same resolution as yours. My thought is that you either buy the 3090 now or wait for the models with more VRAM. Consoles are about to launch, and the "graphical baseline" will rise considerably in no time. I'm personally waiting for a 3080 Ti, which will 100% come at some point.

When you see memory usage in Afterburner, that's just the game requesting an allocation, and it can allocate a heap more than it actually uses.

There was some testing done with recent open-world games, which consume a heap of memory. Microsoft Flight Simulator is a good one.

 

It looks like a big issue at 4K, not so much at 1440p.

 

[Attached charts: VRAM usage at 1440p and 4K]



9 minutes ago, Yoshi Moshi said:

just get a 3090 so you don't have to worry

Easier said than done; it's almost twice the price. But yes.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


Just now, Fatih19 said:

Easier said than done, it's almost twice the price.

Yeah, I don't see the value in future-proofing memory usage for the next two years when you could get a 4080 for probably the same price as a 3080, spend the same amount of money, and probably get more performance. :)

3090s are for people doing high-FPS 4K, entering 8K, doing workstation work, or with so much money they just need to buy the highest-end stuff now...

 

But waiting for a 3080 with 20GB could be a good move. How long is that wait? Six months? A year? Who knows.



4 minutes ago, Maticks said:

Yeah, I don't see the value in future-proofing memory usage for the next two years when you could get a 4080 for probably the same price as a 3080, spend the same amount of money, and probably get more performance. :)

Probably only $300 more, since you could sell the 3080 to subsidize the 4080 purchase.



27 minutes ago, Maticks said:

When you see memory usage in Afterburner, that's just the game requesting an allocation, and it can allocate a heap more than it actually uses.

There was some testing done with recent open-world games, which consume a heap of memory. Microsoft Flight Simulator is a good one.

It looks like a big issue at 4K, not so much at 1440p.

Sometimes it's just not well optimized, and it takes months for the developer to address the VRAM "problem". Resident Evil 2 stuttered quite a lot on 1080s and 2080s solely because of VRAM. So yeah, I do get your point, but I'll stick with what I've learned by experiencing these things first-hand. That said, coming from a 2080 Ti, I'd only go to a 3080 Ti or a 3090 from here, nothing less, so I'd recommend the same to others if they have the financial means.

Edited by HumdrumPenguin

For 1440p I didn't worry about the 10GB of VRAM when I picked my 3080. IMO, by the time VRAM becomes a problem at 1440p, the card might not have enough compute power or bandwidth to run the game at max settings anyway. Also, the new texture compression tech should help with VRAM usage, and future VRAM-intensive games would likely take advantage of that feature, making the value of future-proofing even more uncertain.

 

In short, I wouldn't worry about it. I think the 3080 20GB's buffer is meant for 2160p, multi-screen, VR, etc.


14 minutes ago, 05032-Mendicant-Bias said:

For 1440p I didn't worry about the 10GB of VRAM when I picked my 3080. IMO, by the time VRAM becomes a problem at 1440p, the card might not have enough compute power or bandwidth to run the game at max settings anyway. Also, the new texture compression tech should help with VRAM usage, and future VRAM-intensive games would likely take advantage of that feature, making the value of future-proofing even more uncertain.

 

In short, I wouldn't worry about it. I think the 3080 20GB's buffer is meant for 2160p, multi-screen, VR, etc.

3440x1440 is 1440p ultrawide. That's about a third more pixels than standard 16:9 1440p and roughly 60% of 4K, so it's far more taxing on the GPU.
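For anyone curious where ultrawide actually sits, a quick back-of-the-envelope pixel-count check:

```python
# Pixel counts for the resolutions discussed in this thread.
resolutions = {
    "2560x1440 (16:9 1440p)": 2560 * 1440,
    "3440x1440 (ultrawide) ": 3440 * 1440,
    "3840x2160 (4K)        ": 3840 * 2160,
}
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} MP")

ultrawide = 3440 * 1440
qhd = 2560 * 1440
uhd = 3840 * 2160
print(f"ultrawide vs 1440p: {ultrawide / qhd:.2f}x")  # 1.34x
print(f"ultrawide vs 4K:    {ultrawide / uhd:.2f}x")  # 0.60x
```

So the GPU pushes about 34% more pixels than regular 1440p, but still well short of 4K.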


2 hours ago, HumdrumPenguin said:

Sometimes it's just not well optimized, and it takes months for the developer to address the VRAM "problem". Resident Evil 2 stuttered quite a lot on 1080s and 2080s solely because of VRAM. So yeah, I do get your point, but I'll stick with what I've learned by experiencing these things first-hand. That said, coming from a 2080 Ti, I'd only go to a 3080 Ti or a 3090 from here, nothing less, so I'd recommend the same to others if they have the financial means.

I still can't crank up textures in RE2 as much as I'd like, or I get pretty bad "VRAM stutters" at 1080p. Not that I'd expect to run the highest settings with a cheap 1060; I'm just saying some games easily max out VRAM, and it's not just "optimization", because I can clearly see how much better it looks with higher settings (and the game, correctly, even warns you about too-high VRAM usage).

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used: Corsair Link (Anime Edition), MSI Afterburner, OpenRGB, Lively Wallpaper, OBS Studio, Shutter Encoder, Avidemux, FSResizer, Audacity, VLC, WMP, GIMP, HWiNFO64, Paint, 3D Paint, GitHub Desktop, Superposition, Prime95, Aida64, GPU-Z, CPU-Z, Generic Logviewer


All these leaks are just rumors at this point. What I mean is, assuming they are true (they probably are), you can bet that Nvidia is waiting for AMD's graphics cards and consumers' reactions to them.

 

Depending on AMD's offering(s), if anything needs to be done, they'll prepare an updated card for release. It might be a 20GB variant, it might be 16GB, it might be a Ti; it all depends on AMD. If AMD has nothing that Nvidia should be worried about, they might decide not to release anything, or to do so six months or even a year from now (if they see that the following-gen card needs more time to work on).


I don't think a 3080 with 20GB of VRAM is going to be priced that well; I think it will sit just under the 3090 in price, and as Yoshi said, at that point you might as well buy a 3090 for an extra $200.

Either way, the price difference between the 10GB and 20GB 3080 is likely going to be a big one. A number of sites have said the GDDR6X memory is expensive.

Nvidia will also want to cash in if they know people are holding out for a 20GB model.



11 hours ago, Maticks said:

When you see memory usage in Afterburner, that's just the game requesting an allocation, and it can allocate a heap more than it actually uses.

There was some testing done with recent open-world games, which consume a heap of memory. Microsoft Flight Simulator is a good one.

It looks like a big issue at 4K, not so much at 1440p.

Are the graphs you posted confirmed to show used rather than merely allocated memory, and how would the tests tell the difference between them?

You own the software that you purchase - Understanding software licenses and EULAs

 

"We'll know our disinformation program is complete when everything the American public believes is false" - William Casey, CIA Director 1981-1987


If you really needed 20GB, you wouldn't even be looking at the 3080. Memory allocation might hit 10GB, but it's rarely, if ever, actually used until you start 4K gaming, which is still very niche. And if you can afford a decent 4K panel along with the rest of the hardware, you'd still more than likely arrive at the 3090.

Level 2 Tech Support for a Corporation servicing over 12,000 users and devices, AMA

Desktop - CPU: Ryzen 5800x3D | GPU: Sapphire 6900 XT Nitro+ SE | Mobo: Asus x570 TUF | RAM: 32GB CL15 3600 | PSU: EVGA 850 GA | Case: Corsair 450D | Storage: Several | Cooling: Brown | Thermal Paste: Yes

 

Laptop - Dell G15 | i7-11800H | RTX 3060 | 16GB CL22 3200

 

Plex - Lenovo M93p Tiny | Ubuntu | Intel 4570T | 8GB RAM | 2x 8TB WD RED Plus 


2 hours ago, Delicieuxz said:

Are the graphs you posted confirmed to show used rather than merely allocated memory, and how would the tests tell the difference between them?

They were pulled from two sites that stated it was used memory, not allocated memory; games can request more allocated memory than they use.

You can see that in HWiNFO as well, where allocated and used DirectX memory show up as two different sets of numbers.
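If you just want a quick number without Afterburner or HWiNFO, `nvidia-smi` can report per-GPU memory. A minimal sketch, assuming `nvidia-smi` is on your PATH; note that its `memory.used` field is still memory allocated to processes, not what's actively touched, so the allocation-vs-usage caveat applies here too. The `parse_memory_csv` helper is just for illustration:

```python
import subprocess

def parse_memory_csv(text: str) -> list[int]:
    """Parse 'memory.used' values in MiB from nvidia-smi CSV output."""
    lines = [ln.strip() for ln in text.strip().splitlines()]
    # Skip the header row; keep the numeric part of e.g. "4321 MiB".
    return [int(ln.split()[0]) for ln in lines[1:]]

def gpu_memory_used() -> list[int]:
    """Query allocated framebuffer memory per GPU (needs an NVIDIA driver)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_memory_csv(out)

# Shape of the CSV that nvidia-smi emits:
sample = "memory.used [MiB]\n4321 MiB\n"
print(parse_memory_csv(sample))  # [4321]
```

It won't separate "requested" from "touched" any better than Afterburner does, but it's handy for logging over a play session.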


