farmfowls

20 GB VRAM 3080s?

Recommended Posts

Posted · Original Poster (OP)

So I know there have been rumors (confirmations?) that a 20 GB VRAM SKU of the 3080 exists, but I'm trying to understand something. As far as I'm aware (I could be wrong), games aren't using even 10 GB of VRAM (isn't it around 6?), so why would NVIDIA make a 20 GB version? Is it just for the "big numbers" to lure people in? Is there an actual reason to wait for a 20 GB card vs. trying to get one's hands on the 10 GB cards that are almost impossible to get now anyway?

 

I didn't think that having more VRAM (past what you need) improved performance or offered anything extra for gaming. Has this changed? Or is it just for people who hold onto cards for a few years, in which time there may be a need for more than 10 GB? I, for one, am looking to upgrade before I head back to school and won't be able to upgrade again for a while after this. In that case, is waiting for a 20 GB 3080 the smarter choice vs. trying to get a 10 GB one now?

 

I seem to remember a Gigabyte leak (I think it was them) suggesting that the 20 GB VRAM cards might be clocked higher? I'm not sure, so correct me if I'm wrong. I would assume that if a 20 GB card does come, it'd be about a year from now, so is it worth waiting?

 

Thoughts?  

10 minutes ago, farmfowls said:

Thoughts? 

Many people say that games today don't use more than 10GB, and that may be true, but who knows what's coming in the next 3-5 years.

Even if it does become a problem, it depends on your intended resolution and your willingness to step down texture settings anyway.

 

I can't tell you the future, but I can tell you the past: people were buying the 4GB RX 480 instead of the 8GB one because it was cheaper, and now the 4GB cards can't run some games well at high settings compared to the 8GB ones.

But again, they can always just turn down the settings and continue gaming anyway, so I don't see it as a big deal.

 

But having more options is good nonetheless.
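
To put rough numbers on the texture point, here's a quick Python back-of-envelope (the format sizes are illustrative assumptions, not figures from any specific game):

# rough VRAM cost of a single 4096x4096 texture
# (assumed formats: uncompressed RGBA8 at 4 bytes/texel vs BC7 at 1 byte/texel)
def texture_bytes(width, height, bytes_per_texel, with_mips=True):
    size = width * height * bytes_per_texel
    if with_mips:  # a full mip chain adds roughly one third on top
        size = size * 4 // 3
    return size

MIB = 2 ** 20
print(f"RGBA8: {texture_bytes(4096, 4096, 4) / MIB:.0f} MiB")  # ~85 MiB
print(f"BC7:   {texture_bytes(4096, 4096, 1) / MIB:.0f} MiB")  # ~21 MiB
# a few hundred textures resident at once is how high texture settings
# eat multiple GB, and why stepping textures down frees so much VRAM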



TBH the earliest I can see this happening is at least 6-9 months from now, otherwise they're gonna piss off anyone who bought any base model GPU from them. Yes, most games don't use that much VRAM, and people are conflating "VRAM usage" with "VRAM allocation", with no easy way to separate the two in any monitoring software. Unfortunately, most of us will never be able to see how much VRAM any given title actually needs.
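
For what it's worth, the driver-level counter that monitoring tools read only ever reports allocation. A minimal Python sketch, assuming the nvidia-ml-py (pynvml) bindings and an NVIDIA GPU:

# what overlays and monitoring tools actually read via NVML
# (assumes `pip install nvidia-ml-py`; `used` is *allocated* memory,
# not the working set the game actively touches each frame)
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = nvmlDeviceGetMemoryInfo(handle)
print(f"total:     {mem.total / 2**30:.1f} GiB")
print(f"allocated: {mem.used / 2**30:.1f} GiB")
nvmlShutdown()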

 

That being said, with the 3080 being a 4K monster, it feels a bit weird limiting it to less than the top-end amount of VRAM from the last gen. As Steve from Hardware Unboxed mentions in their latest video, people who spend $700-800 on a GPU are unlikely to be the kind of people who would want to turn down settings for anything, so having to turn down texture quality in future games is not really what the market for these GPUs wants. Yes, you can turn XYZ settings down, but if you're spending this kind of money, should you?


Posted · Original Poster (OP)
Just now, Moonzy said:

Many people say that games today don't use more than 10GB, and that may be true, but who knows what's coming in the next 3-5 years.

Even if it does become a problem, it depends on your intended resolution and your willingness to step down texture settings anyway.

 

I can't tell you the future, but I can tell you the past: people were buying the 4GB RX 480 instead of the 8GB one because it was cheaper, and now the 4GB cards can't run some games well at high settings compared to the 8GB ones.

But again, they can always just turn down the settings and continue gaming anyway, so I don't see it as a big deal.

 

But having more options is good nonetheless.

I run 1440p (PG279Q). Back when I had a 780 Ti, before upgrading, I absolutely hated that my VRAM would run out in the titles that came years later. I guess waiting for a 20 GB version may be the best idea, then? Especially considering I'd be holding onto the card for a good few years.

7 minutes ago, farmfowls said:

I guess waiting for a 20 GB version may be the best idea, then?

If it exists:

1) depends on price

2) depends on how willing you are to turn down graphical settings, especially textures

3) what the other differences between the two are, if any

 

7 minutes ago, farmfowls said:

I absolutely hated that my VRAM would run out in the titles that came years later.

High VRAM usage doesn't mean it's using every single bit of it, btw.

this post explains it well

 

When you run out of VRAM, you'll know: you'll start stuttering, or gaming at 3 fps.


Posted · Original Poster (OP)
3 minutes ago, Samfisher said:

TBH the earliest I can see this happening is at least 6-9 months from now, otherwise they're gonna piss off anyone who bought any base model GPU from them. Yes, most games don't use that much VRAM, and people are conflating "VRAM usage" with "VRAM allocation", with no easy way to separate the two in any monitoring software. Unfortunately, most of us will never be able to see how much VRAM any given title actually needs.

 

That being said, with the 3080 being a 4K monster, it feels a bit weird limiting it to less than the top-end amount of VRAM from the last gen. As Steve from Hardware Unboxed mentions in their latest video, people who spend $700-800 on a GPU are unlikely to be the kind of people who would want to turn down settings for anything, so having to turn down texture quality in future games is not really what the market for these GPUs wants. Yes, you can turn XYZ settings down, but if you're spending this kind of money, should you?

Yeah, I really don't like turning settings down myself. I'm going to have to push through that when I'm in vet school, but in the meantime, if I'm going to get something, I want it to perform and last me for a while. And considering I'll be keeping the card for a good bit, the 10 GB worries me. It's fine now, but will it be in two years? Three?

1 minute ago, farmfowls said:

Yeah, I really don't like turning settings down myself. I'm going to have to push through that when I'm in vet school, but in the meantime, if I'm going to get something, I want it to perform and last me for a while. And considering I'll be keeping the card for a good bit, the 10 GB worries me. It's fine now, but will it be in two years? Three?

If you have no immediate need to upgrade, I would say wait for what AMD has to offer, and if they release anything even remotely close to the 3080 in terms of performance and/or price, NVIDIA will have to respond. That's when the real rumour mills and leaks will start on the potential release time frame for the 20GB variants. I doubt they would be cheap, since doubling VRAM, especially GDDR6X, is no cheap matter. Wait till the end of the year if you can, and then wait a little longer. If not, just get the 3080 now (available stock notwithstanding) and be happy with it. Sometime in the future, sell it to someone who wants a cheaper card and has no use for 20GB of VRAM, and upgrade to the 20GB variant.



Competition! If AMD cannot compete in raw compute power, the only way to make their cards competitive is to have more RAM than Nvidia. It's a cheap fix compared to going back to the drawing board. AMD goes 16GB, so Nvidia goes 20GB.

9 minutes ago, Samfisher said:

If not, just get the 3080 now (available stock notwithstanding) and be happy with it. Sometime in the future, sell it to someone who wants a cheaper card and has no use for 20GB of VRAM, and upgrade to the 20GB variant.

Logical!


Posted · Original Poster (OP)
15 minutes ago, Samfisher said:

If you have no immediate need to upgrade, I would say wait for what AMD has to offer, and if they release anything even remotely close to the 3080 in terms of performance and/or price, NVIDIA will have to respond. That's when the real rumour mills and leaks will start on the potential release time frame for the 20GB variants. I doubt they would be cheap, since doubling VRAM, especially GDDR6X, is no cheap matter. Wait till the end of the year if you can, and then wait a little longer. If not, just get the 3080 now (available stock notwithstanding) and be happy with it. Sometime in the future, sell it to someone who wants a cheaper card and has no use for 20GB of VRAM, and upgrade to the 20GB variant.

I'm running a 1080 Ti and it's fine and everything, but I've found myself turning down settings (I really dislike this) and/or suffering FPS losses. RDR2 is what really made it hurt, although it is RDR2. I am hoping to play Cyberpunk, but if my performance is off, I will set it aside until I do upgrade.

 

But that is a good point about Big Navi. If Big Navi did come with enough heat (a reach of a pun) to force a 20 GB variant (I'm actually hoping AMD does this now), would they really release such a variant so close to the 10 GB launch? Would the competitive value outweigh the upset customers?

 

Also, if you had to guess, how much of a price increase would come with a 20 GB variant? I know this is all guesswork, but are we talking about shifting to the $999 USD price range, in between the current $699 USD 10 GB 3080 and the $1499 USD 3090? As I am Canadian, $699 USD equates to about $1000 CAD and $1499 USD to a cool $2000 CAD, so the 3090 is way past my limits. The 3080 is already encroaching a bit, but if the 20 GB is worth it, the 20 GB is worth it.
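
(Quick math, assuming a rate of about 1.33 CAD per USD, before tax:)

# rough USD to CAD conversion; the ~1.33 rate is an assumption, pre-tax
RATE = 1.33
for usd in (699, 999, 1499):
    print(f"${usd} USD is about ${usd * RATE:,.0f} CAD")
# 699 -> ~$930, 999 -> ~$1,329, 1499 -> ~$1,994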

9 minutes ago, farmfowls said:

Also, if you had to guess, how much of a price increase would come with a 20 GB variant? I know this is all guesswork, but are we talking about shifting to the $999 USD price range, in between the current $699 USD 10 GB 3080 and the $1499 USD 3090? As I am Canadian, $699 USD equates to about $1000 CAD and $1499 USD to a cool $2000 CAD, so the 3090 is way past my limits. The 3080 is already encroaching a bit, but if the 20 GB is worth it, the 20 GB is worth it.

TBH I don't see them pricing it anywhere below +$150 over the base 3080 10GB model. Or, if Nvidia is feeling particularly generous, they would do what they did with the Super variants: knock down the old base model price and replace it with a Super variant at the same starting price. We can only hope. As it is, even the cheapest 3080 cards in my country are selling for $800+ USD, and there's literally zero stock; most big stores only got 1 unit each. That's the extent of the supply here.


1 hour ago, farmfowls said:

I'm running a 1080 Ti and it's fine and everything, but I've found myself turning down settings (I really dislike this) and/or suffering FPS losses. RDR2 is what really made it hurt, although it is RDR2.

And it's definitely not because of VRAM limitations. My 5700 XT only has 8GB, but I can manually max each individual setting and still get playable framerates at 1080p.


2 minutes ago, BTGbullseye said:

And it's definitely not because of VRAM limitations. My 5700 XT only has 8GB, but I can manually max each individual setting and still get playable framerates at 1080p.

But that's at 1080p...


1 hour ago, Samfisher said:

TBH the earliest I can see this happening is at least 6-9 months from now, otherwise they're gonna piss off anyone who bought any base model GPU from them. 

Why the 6-9 month timeline? Don't you think this is NVIDIA's response if Big Navi delivers? Meaning that if Big Navi underperforms, we might not see them at all, and if it does perform, we will see them very soon, as AMD is forcing NVIDIA's hand. I just can't imagine NVIDIA sitting on the sidelines for a few months, waiting to counter AMD's offering (assuming it delivers), just to avoid pissing off buyers who already paid for the card.

4 minutes ago, Haak said:

Why the 6-9 month timeline? Don't you think this is NVIDIA's response if Big Navi delivers? Meaning that if Big Navi underperforms, we might not see them at all, and if it does perform, we will see them very soon, as AMD is forcing NVIDIA's hand. I just can't imagine NVIDIA sitting on the sidelines for a few months, waiting to counter AMD's offering (assuming it delivers), just to avoid pissing off buyers who already paid for the card.

Cos anything shorter than that and they're just screwing over their most loyal customers, the ones who buy early and pay the most.


1 hour ago, Samfisher said:

Cos anything shorter than that and they're just screwing over their most loyal customers, the ones who buy early and pay the most.

Remember, the 780 Ti launched 6 months after the 780 and Titan, and the 780 got a $150 price drop. Anyone who bought a 780 before that got screwed.

 

I can see Nvidia pulling the same trick again.

6 minutes ago, Deli said:

Remember, the 780 Ti launched 6 months after the 780 and Titan, and the 780 got a $150 price drop. Anyone who bought a 780 before that got screwed.

 

I can see Nvidia pulling the same trick again.

That's why I mentioned 6-9 months at the earliest lol.


4 minutes ago, Samfisher said:

That's why I mentioned 6-9 months at the earliest lol.

Next year March..... :)

17 minutes ago, Deli said:

Remember, the 780 Ti launched 6 months after the 780 and Titan, and the 780 got a $150 price drop. Anyone who bought a 780 before that got screwed.

 

I can see Nvidia pulling the same trick again.

Isn't that the early adopter tax?


Name me a single game that uses over 8GB of VRAM, even at 4K. Note: I ended up putting the 760 w/4GB back into my current rig, due to the 970 having issues. Both dGPUs game at 2560x1600 at ~66 fps with no problems.

 

How many gamers even notice the difference between High, Very High, and Ultra graphics settings anyway?

5 minutes ago, whm1974 said:

Name me a single game that uses over 8GB of VRAM, even at 4K. Note: I ended up putting the 760 w/4GB back into my current rig, due to the 970 having issues. Both dGPUs game at 2560x1600 at ~66 fps with no problems.

 

How many gamers even notice the difference between High, Very High, and Ultra graphics settings anyway?

You're only thinking about today. What's going to happen 3 years from now?

 

Many people buy high-end graphics cards because they don't want to make many compromises on settings. If turning down settings were no big deal, a 2070 or 2080 would already be plenty.

1 minute ago, Deli said:

You're only thinking about today. What's going to happen 3 years from now?

 

Many people buy high-end graphics cards because they don't want to make many compromises on settings. If turning down settings were no big deal, a 2070 or 2080 would already be plenty.

Given that we haven't been seeing the benefits of major advancements in GPU performance like we used to for some time now, I will have to say that three years from now, 8GB will be plenty for most gamers. Of course, this really depends on the type of game, but if I were a betting man, I would make that bet.

 

1080p is still the most common resolution in use. Some, like me, own 2560x1440 and 2560x1600 displays. Very few gamers have 4K ones with dGPUs suitable for 4K gaming.

7 minutes ago, whm1974 said:

Given that we haven't been seeing the benefits of major advancements in GPU performance like we used to for some time now, I will have to say that three years from now, 8GB will be plenty for most gamers. Of course, this really depends on the type of game, but if I were a betting man, I would make that bet.

 

1080p is still the most common resolution in use. Some, like me, own 2560x1440 and 2560x1600 displays. Very few gamers have 4K ones with dGPUs suitable for 4K gaming.

1440p is the sweet spot. I don't expect there will be a VRAM problem at that resolution any time soon. Buying a 3080 for 1080p gaming, in my opinion, is overkill. The main concern is 4K. We'll see what happens there.

 

However, I remember that two years after the GTX 780 was released, quite a few games started to run into VRAM problems at 1080p; by then the GTX 980 Ti had 6GB of VRAM.

3 minutes ago, Deli said:

1440p is the sweet spot. I don't expect there will be a VRAM problem at that resolution any time soon. Buying a 3080 for 1080p gaming, in my opinion, is overkill. The main concern is 4K. We'll see what happens there.

 

However, I remember that two years after the GTX 780 was released, quite a few games started to run into VRAM problems at 1080p; by then the GTX 980 Ti had 6GB of VRAM.

Which games ran into VRAM problems at 1080p with a GTX 780, which I could have gotten back in 2013 when I built the computer I'm still using? The reason I chose the 760 w/4GB instead was the heat output of the higher-end dGPUs.


Call of Duty and Assassin's Creed are two. Trying to remember what happened in 2015...

 

Many won't even recommend GPUs with 4GB of VRAM for budget gaming today, let alone 3GB.

3 hours ago, Samfisher said:

But that's at 1080p...

And it's using the same amount of VRAM as it would at 4K native with the settings I'm using.
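
Rough math on why the resolution-dependent slice of VRAM is smaller than people expect (the six-buffer figure below is an assumption; real engines vary widely):

# resolution-dependent render targets vs the texture pool (order of magnitude)
def render_target_bytes(width, height, bytes_per_pixel, count):
    return width * height * bytes_per_pixel * count

MIB = 2 ** 20
for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    total = render_target_bytes(w, h, 4, 6)  # assume six RGBA8-sized targets
    print(f"{name}: ~{total / MIB:.0f} MiB in render targets")
# 1080p -> ~47 MiB, 4K -> ~190 MiB: a few hundred MiB of difference,
# while texture quality, which is resolution-independent, fills 8-10 GB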


