
For future AAA titles in ultrawide, 3080 or...

Virtualnate

Hi all, I am debating whether to get the 3080 (hypothetically speaking lol) or wait for a 3080 Super or something similar. Not the 3080 Ti, because 20GB is probably too much; I'd like something with 16GB GDDR6X. I know 10GB GDDR6X is overkill for most games today, but should I wait and future-proof my card for titles that come out over the next couple of years? I might even get a 3840x1600 ultrawide monitor. Could someone please explain why 10GB GDDR6X would be enough, and the dilemma of FASTER VRAM vs. MORE VRAM?


IMO 10GB is going to be enough at the high end for a couple of years.

  1. Direct texture streaming from SSD to GPU and GPU-side texture decompression may offset the increase in texture quality to an extent
  2. Lowering graphics options like texture quality and tessellation to reduce VRAM use is always an option
  3. By the time a game needs 16GB of VRAM, a 6800 XT or a 3080-class card won't be able to run it at ultra anyway

You naturally need to lower graphics options as games get more visually intensive over the years. With 10GB of VRAM you'll lower textures and tessellation. With 16GB of VRAM you'll lower AA, shadows, LOD and other options to reduce compute load.


6 minutes ago, Cvet76 said:

Games almost never fill the full amount of VRAM on your GPU, because a game only reserves the amount it "thinks" it will need. And secondly, data sits in VRAM only for a brief period before it's processed and passed on to your display. Things are more nuanced than this, but this is the gist.

Would you know of any game that almost reaches the 10GB VRAM limit? I have no idea what I'm asking either lol

 

Did Nvidia screw up by giving both the 1080 Ti and 2080 Ti 11GB of VRAM and then reducing the 3080 to 10GB?


10 minutes ago, 05032-Mendicant-Bias said:

IMO 10GB is going to be enough at the high end for a couple of years.

  1. Direct texture streaming from SSD to GPU and GPU-side texture decompression may offset the increase in texture quality to an extent
  2. Lowering graphics options like texture quality and tessellation to reduce VRAM use is always an option
  3. By the time a game needs 16GB of VRAM, a 6800 XT or a 3080-class card won't be able to run it at ultra anyway

You naturally need to lower graphics options as games get more visually intensive over the years. With 10GB of VRAM you'll lower textures and tessellation. With 16GB of VRAM you'll lower AA, shadows, LOD and other options to reduce compute load.

Sorry, I'm trying to understand all this, but basically, from what I gather, you're saying I should stick to 3440x1440 and HIGH settings?


12 minutes ago, Virtualnate said:

Sorry, I'm trying to understand all this, but basically, from what I gather, you're saying I should stick to 3440x1440 and HIGH settings?

My 3080 has no problem running 4K with its 10GB of VRAM. I've seen games like Final Fantasy 15 or my modded Skyrim install get close to allocating 10GB of VRAM, but the actual usage is what really matters, and we have pretty much no way of telling the exact VRAM usage.

 

The VRAM usage shown by overlays, for example, is really just the allocation. Games always "reserve" more VRAM than they actually need.
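As a quick illustration of that allocation figure, here is a minimal sketch (assuming the pynvml bindings are installed, e.g. `pip install nvidia-ml-py`) that reads the same reserved number NVML exposes to overlays; it tracks allocation, not the game's true working set:

```python
# Minimal sketch: query the VRAM figure NVML exposes (assumes `pip install nvidia-ml-py`).
# Note: this is the *allocated/reserved* memory, not how much the game actively touches.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)     # total / used / free, in bytes

gib = 1024 ** 3
print(f"Total VRAM: {mem.total / gib:.1f} GiB")
print(f"Allocated:  {mem.used / gib:.1f} GiB (roughly what overlays show)")
print(f"Free:       {mem.free / gib:.1f} GiB")

pynvml.nvmlShutdown()
```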

 

There is one way I know of to check whether you're maxed out on VRAM: look at your FPS, lower ONLY the texture quality, then look at your FPS again. If the FPS increased by a noticeable amount, you were maxed out on VRAM before. As long as you have enough VRAM, lowering texture quality shouldn't really impact FPS.

 

That being said, I have yet to see a game that maxes out my 3080, even at 4K. Funnily enough, the 3080 is actually better at 4K than the 6800 XT, but I'm not technical enough to know whether that's down to the VRAM, the core or the general GPU architecture.

 

We also have no way to tell how new tech such as RTX IO (the GPU directly accessing storage for models and textures) will impact VRAM usage and the relevance of capacity compared to speed.



1 minute ago, Stahlmann said:

My 3080 has no problem running 4K with its 10GB of VRAM. I've seen games like Final Fantasy 15 or my modded Skyrim install get close to allocating 10GB of VRAM, but the actual usage is what really matters, and we have pretty much no way of telling the exact VRAM usage.

The VRAM usage shown by overlays, for example, is really just the allocation. Games always "reserve" more VRAM than they actually need.

There is one way I know of to check whether you're maxed out on VRAM: look at your FPS, lower ONLY the texture quality, then look at your FPS again. If the FPS increased by a noticeable amount, you were maxed out on VRAM before. As long as you have enough VRAM, lowering texture quality shouldn't really impact FPS.

That being said, I have yet to see a game that maxes out my 3080, even at 4K. Funnily enough, the 3080 is actually better at 4K than the 6800 XT, but I'm not technical enough to know whether that's down to the VRAM, the core or the general GPU architecture.

Thanks for the explanation! That was very detailed. Sadly no 3080s on earth. I HATE SCALPERS!!!


18 minutes ago, Virtualnate said:

Sorry, I'm trying to understand all this, but basically, from what I gather, you're saying I should stick to 3440x1440 and HIGH settings?

Games that reserve at least 10-11GB of VRAM: Horizon Zero Dawn, BF5. Games that I can actually crash on demand due to lack of VRAM: GTA 5 with mods, BL3, RE3 remake. I haven't played much of Ubisoft's new games yet. Overall, 10GB of VRAM shouldn't be a deal breaker.

 

I recommend waiting for the rumored 3080 Ti, not for the VRAM, but for the extra horsepower you can always use for UW. Either way, you can't screw up too badly this gen.

 

Nvidia didn't screw up; the 3080 was deliberately cut down so it could sell for 700 USD and meet massive demand while maintaining margins, and to leave room above it for the 3080 Ti/3090. It uses the same die as the 3080 Ti and 3090 and literally has two empty memory pad locations that would allow 12GB of VRAM, but it was purposely gimped.
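To make the "empty slots" point concrete, here is a rough sketch of how VRAM capacity follows from the number of populated 32-bit memory channels (assuming the 1GB GDDR6X chips these cards shipped with); it also shows where the older cards' 11GB came from:

```python
# Rough sketch: VRAM capacity = (bus width / 32 bits per chip) * chip density.
# Assumes 8Gb (1GB) memory chips, which is what these cards shipped with.
def vram_gb(bus_width_bits: int, chip_density_gb: int = 1) -> int:
    chips = bus_width_bits // 32            # one chip per 32-bit channel
    return chips * chip_density_gb

configs = [
    ("1080 Ti / 2080 Ti", 352),             # 11 chips -> 11GB
    ("3080 (10GB)", 320),                   # 10 chips -> 10GB, 2 pads left empty
    ("full GA102 bus (3080 Ti / 3090)", 384),  # 12 chips -> 12GB per PCB side
]

for name, bus in configs:
    print(f"{name}: {bus}-bit bus -> {vram_gb(bus)}GB with 1GB chips")
```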


1 minute ago, xg32 said:

Games that reserve at least 10-11GB of VRAM: Horizon Zero Dawn, BF5. Games that I can actually crash on demand due to lack of VRAM: GTA 5 with mods, BL3, RE3 remake. I haven't played much of Ubisoft's new games yet.

 

I recommend waiting for the rumored 3080 Ti. Nvidia didn't screw up; the 3080 was deliberately cut down so it could sell for 700 USD and meet massive demand while maintaining margins, and to leave room above it for the 3080 Ti/3090. It uses the same die as the 3080 Ti and 3090 and literally has two empty memory pad locations that would allow 12GB of VRAM, but it was purposely gimped.

Maybe Nvidia was targeting the 3080 at ultrawide users on ULTRA settings, planning to release the 3080 Ti for both ultrawide and 4K users?


2 minutes ago, Virtualnate said:

Maybe Nvidia was targeting the 3080 at ultrawide users on ULTRA settings, planning to release the 3080 Ti for both ultrawide and 4K users?

The math is very simple if you treat 1440p UW as roughly 4K (it's close, maybe +10% FPS compared to 4K): 3070 = 4K 60 FPS, 3080 = ~80 FPS, 3090 = ~90 FPS on ultra. There are exceptions, mainly Ubisoft games, but those are the numbers I'm using for now. I'm waiting for 6900 XT benchmarks to decide on a card while we wait for stock; the 3080 Ti or 6900 XT are the likely choices unless I find something at MSRP from EVGA for the warranty.
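For reference, here are the raw pixel counts behind treating 1440p UW as roughly 4K; simple arithmetic, no assumptions beyond the resolutions discussed in this thread:

```python
# Quick arithmetic: pixel counts for the resolutions discussed in this thread,
# expressed relative to 4K so the "1440p UW ~ 4K" rule of thumb can be judged.
resolutions = {
    "3440x1440 (1440p ultrawide)": (3440, 1440),
    "3840x1600 (1600p ultrawide)": (3840, 1600),
    "3840x2160 (4K)":              (3840, 2160),
}

four_k_pixels = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / four_k_pixels:.0%} of 4K)")
```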


11 minutes ago, xg32 said:

I recommend waiting for the rumored 3080 Ti, not for the VRAM, but for the extra horsepower you can always use for UW.

I wouldn't expect a Super/Ti to perform noticeably better, because even a small increase in performance combined with 20GB of VRAM would instantly kill every argument for the 3090. My guess is that the core performance will be at most 5% better, maybe with some extra memory bandwidth from having more VRAM chips on board.
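For the bandwidth side of that guess, the usual back-of-envelope formula is bus width in bytes times the effective data rate; a quick sketch using the published 3080 and 3090 memory specs:

```python
# Back-of-envelope GDDR6X bandwidth: (bus width in bits / 8) * effective data rate (Gbps).
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps   # result in GB/s

print(f"3080 (320-bit @ 19 Gbps):   {bandwidth_gb_s(320, 19.0):.0f} GB/s")   # ~760 GB/s
print(f"3090 (384-bit @ 19.5 Gbps): {bandwidth_gb_s(384, 19.5):.0f} GB/s")   # ~936 GB/s
```

So extra chips only help bandwidth if they sit on additional 32-bit channels; doubling capacity on the same 320-bit bus leaves the data rate unchanged.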

 

I'm one of those people who recommend just buying a card when you need more performance. In this market there is always something new around the corner, so you could be waiting for the next best thing indefinitely. (Although you can't buy anything right now, so there's that...)

 

The 3080's 10GB is enough for 4K games these days, and by the time memory capacity becomes a problem, the card likely won't be fast enough for the latest games at 4K 60Hz anyway. Basically, right now, I wouldn't worry about it.


2 minutes ago, xg32 said:

The math is very simple if you treat 1440p UW as roughly 4K (it's close, maybe +10% FPS compared to 4K): 3070 = 4K 60 FPS, 3080 = ~80 FPS, 3090 = ~90 FPS on ultra. There are exceptions, mainly Ubisoft games, but those are the numbers I'm using for now. I'm waiting for 6900 XT benchmarks to decide on a card while we wait for stock; the 3080 Ti or 6900 XT are the likely choices unless I find something at MSRP from EVGA for the warranty.

Now that sheds some light on my confusion. Thanks. I guess I'll get a 3080 if/when I see one.


Just now, Virtualnate said:

Now that sheds some light on my confusion. Thanks. I guess I'll get a 3080 if/when I see one.

Yup, like @Stahlmann said, just grab whatever's in stock if it fits your budget these days.


2 minutes ago, Stahlmann said:

I wouldn't expect a Super/Ti to perform noticeably better, because even a small increase in performance combined with 20GB of VRAM would instantly kill every argument for the 3090. My guess is that the core performance will be at most 5% better, maybe with some extra memory bandwidth from having more VRAM chips on board.

I'm one of those people who recommend just buying a card when you need more performance. In this market there is always something new around the corner, so you could be waiting for the next best thing indefinitely.

The 3080's 10GB is enough for 4K games these days, and by the time memory capacity becomes a problem, the card likely won't be fast enough for the latest games at 4K 60Hz anyway. Basically, I wouldn't worry about it.

Your comments have been very informative. I think I'll wait around until I get my hands on a 3080. Also, would there be a noticeable difference between the FE cards at stock clock speeds and something like the AORUS XTREME with OC clock speeds up to 1900MHz?


1 minute ago, Virtualnate said:

Your comments have been very informative. I think I'll wait around until I get my hands on a 3080. Also, would there be a noticeable difference between the FE cards at stock clock speeds and something like the AORUS XTREME with OC clock speeds up to 1900MHz?

There is basically a 2-3% difference between all the cards when you're not manually overclocking. Because of GPU Boost, they will all boost higher than their advertised clock speeds.

 

When you get into manual overclocking, the higher-end models pull noticeably ahead of the MSRP ones like the FE.


4 minutes ago, Virtualnate said:

Your comments have been very informative. I think I'll wait around until I get my hands on a 3080. Also, would there be a noticeable difference between the FE cards at stock clock speeds and something like the AORUS XTREME with OC clock speeds up to 1900MHz?

For example, my PNY 3080 has an advertised boost clock of 1710MHz, but it has been boosting to ~1900MHz without me doing anything in terms of OCing. Of course I have it OCed now, but that was the stock behavior.
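If you want to check that stock boost behavior yourself, here is a minimal sketch (again assuming the pynvml bindings, e.g. `pip install nvidia-ml-py`) that samples the live graphics clock while a game is running, so you can compare it against the advertised boost clock:

```python
# Minimal sketch: sample the live graphics clock to see how far GPU Boost goes
# past the advertised boost clock (assumes `pip install nvidia-ml-py`).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU in the system

for _ in range(10):                              # ten one-second samples
    mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"Current graphics clock: {mhz} MHz")
    time.sleep(1)

pynvml.nvmlShutdown()
```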


1 minute ago, Stahlmann said:

For example, my PNY 3080 has an advertised boost clock of 1710MHz, but it has been boosting to ~1900MHz without me doing anything in terms of OCing.

Wow, I've overclocked my 8700K a few times but then reverted because of insufficient cooling. Overclocking is a little more intimidating for me. I'll keep the card the way it is. 

I just hope 18AWG can handle the power draw, as I've heard 16AWG is better (even Seasonic went out of their way to make 16AWG cables for the 30 series), and I was even thinking of getting sleeved cables in that gauge from either Mod-One or Ensourced.
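As a rough back-of-envelope check on the cable concern (a sketch assuming the standard PCIe limits of 150W per 8-pin connector and 75W through the slot, and the three 12V supply wires an 8-pin harness uses):

```python
# Back-of-envelope check: current per 12V conductor on a PCIe 8-pin cable.
# Assumes the PCIe spec limits of 150W per 8-pin connector and 75W via the slot,
# and the three 12V supply wires in an 8-pin harness.
connector_watts = 150.0
supply_wires = 3
volts = 12.0

amps_per_wire = connector_watts / volts / supply_wires
print(f"~{amps_per_wire:.1f} A per wire at the 150W connector limit")  # ~4.2 A

# A ~320W card fed by two 8-pin connectors plus the slot:
card_watts = 320.0
slot_watts = 75.0
cable_watts = card_watts - slot_watts
print(f"~{cable_watts / volts / (2 * supply_wires):.1f} A per wire if "
      f"{cable_watts:.0f}W comes over two 8-pin cables")  # ~3.4 A
```

On those rough numbers the per-wire current stays in the low single digits of amps, which typical 18AWG PSU harnesses are built for; 16AWG mainly adds margin and runs a bit cooler.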


5 minutes ago, Virtualnate said:

Wow, I've overclocked my 8700K a few times but then reverted because of insufficient cooling. Overclocking is a little more intimidating for me. I'll keep the card the way it is.

GPU overclocking is much easier than CPU overclocking. It doesn't involve the BIOS and is basically just moving a few sliders. But if you don't plan on overclocking, there isn't much sense in buying a high-end model. Just go with a good MSRP one like the Asus TUF.


1 minute ago, Stahlmann said:

GPU overclocking is much easier than CPU overclocking. It doesn't involve the BIOS and is basically just moving a few sliders. But if you don't plan on overclocking, there isn't much sense in buying a high-end model. Just go with a good MSRP one like the Asus TUF.

You've helped me a lot. Thanks for letting me know the best choice!


The 3080 should be fine for UW.

You can wait for the 3080 Ti, though I don't think you'll manage to get a 3080 before the Ti releases.

 

Depending on the MSRP and actual retail price it might be a better deal, or it might not be.

I'm gaming on UW and I have an order for the 3080 pending, though if the 3080 Ti turns out to be "affordable" I might just cancel my order.

 

It all depends on how long you can wait and what your budget is. 10GB should be fine for UW for a while; worst case, you have to turn down texture packs.

In DLSS titles this shouldn't be a problem at all, but there aren't many out there yet.

