
RTX 3080 **10GB VRAM ENOUGH?!**

Solved by Exeon:
On 10/30/2020 at 4:30 PM, Leisor said:

(which will definitely be the case with 10GB) than not being able to play poorly optimised games due to a small VRAM pool.

 

Facts are the only thing that really matters.

The fact is most games don't even reach 8GB, let alone 10GB, at 1440p.

 

The only game that currently reaches 10GB is Crysis Remastered; however, since the 3080 can't even hit 60FPS there at max settings, that isn't a realistic scenario.

Upcoming games like Watch Dogs: Legion don't reach 10GB at 4K, let alone 1440p.

 

If you follow gaming trends, we won't see games reach 10GB at 1440p for some years to come, by which time one could consider upgrading again.

DLSS also cuts VRAM usage by quite a bit, though we don't know how many titles will adopt it (AMD is also working on something similar).
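As a rough illustration of why DLSS cuts VRAM use: it renders internally at a lower resolution and upscales, so resolution-dependent buffers shrink roughly with the pixel count. The per-axis scale factors below are the commonly cited DLSS 2.x values, an assumption here, not from this thread:

```python
# Sketch: DLSS internal resolution and rough per-pixel buffer savings.
# Scale factors are the commonly cited DLSS 2.x values (assumption).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width, height, mode):
    """Approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

def pixel_savings(width, height, mode):
    """Fraction of per-pixel buffer memory saved vs. native rendering."""
    s = DLSS_SCALE[mode]
    return 1 - s * s

print(internal_resolution(2560, 1440, "Quality"))        # (1707, 960)
print(round(pixel_savings(2560, 1440, "Quality"), 2))    # 0.56
```

So at 1440p Quality mode, anything sized per-pixel needs only about half the memory; texture pools shrink less, which is why DLSS helps but doesn't eliminate VRAM pressure.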

 

"Not being able to play" is a big overstatement, when I bought the 980TI at release date 5 years ago I had to turn down settings on unoptimized games.

This wasn't because my card couldn't handle it, no card could handle it and even today's card can't, not at max settings.

 

 

At this point people should hold off ordering and wait for AMD's benchmarks; I hope they're as promising as AMD makes them seem.

I'm still waiting to see how they perform with ray tracing turned on.

 

 

Hi guys,

 

I am not an expert with computers, but my friend is telling me the RTX 3080 is a waste of money because of its VRAM. For example, he was telling me that in the new Watch Dogs game, the VRAM is like 99% used. He says when this happens, the game's textures will be all messed up and other issues will happen.

 

I did some research, and some people agree on this and some don't. I was hoping someone with a lot of experience could address this concern for people wondering out there.

 

Is 10GB VRAM enough for next-gen gaming? And will textures be reduced or messed up by using too much VRAM? Please see the photo below; he is using 9.8GB VRAM at only 1440p without RTX enabled.

 

Should I be worried about spending so much money on an RTX 3080 and not even being able to run new games at the settings we deserve?

[attached screenshot: 9.8GB VRAM reported at 1440p]

Ryzen 3900XT OC to 4.4GHz - B550 Aorus Pro - 32GB 3600MHz Corsair - RTX 3080 ASUS TUF OC - O11D Dynamic Tempered Glass - 2x 1TB M.2 SSD - 850W Gold Corsair - Alienware Ultrawide 3440x1440 - MSI MAG Core Liquid 360R RGB - Razer X Chroma - Razer DeathAdder Elite - Windows 10 Pro


In my opinion it's definitely not enough; there are a lot of games right now which need more than 10GB even at 1440p. That's the main reason why I plan to get a 3090 or 6800XT/6900XT.

It is certainly not future-proof, unless you are willing to reduce texture quality or play at 1080p.

14 minutes ago, makadee said:

Please see the photo below; he is using 9.8GB VRAM at only 1440p without RTX enabled.

Games allocate more VRAM than they actually use, so software readings are almost always unreliable. This has led to many misconceptions about how much VRAM is needed.

 

For 4K gaming at really high detail, benchmarks show that 10GB hurts performance in some specific scenarios, but at 1440p you'd be covered across the board.

 

However, with much more VRAM on those new AMD cards, and the little extra performance they gain with 5000-series Ryzen CPUs, they appear to be the better choice (we'll know for sure once people can get their hands on them and run the numbers).

 

As for your friend's comment, a 3080 is by no means a "waste" of money, because it kicks ass in gaming, but there might be better value from team red in about a month's time.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k (won) - EVGA Z370 Classified K - G.Skill Trident Z RGB - Force MP500 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G2 650W - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

Linux Proliant ML150 G6:

Dual Xeon X5560s - 24GB ECC DDR3 - GTX 750 Ti - old Seagate 1.5TB HDD - dark-moded Ubuntu (and Win7, cuz why not)

 

How many watts do I need? Seasonic Focus thread · Userbenchmark (et al.) is trash explained · PSU misconceptions and protections explained · group reg is bad

9 minutes ago, Fasauceome said:

Games allocate more VRAM than they actually use, so software readings are almost always unreliable. This has led to many misconceptions about how much VRAM is needed.

For 4K gaming at really high detail, benchmarks show that 10GB hurts performance in some specific scenarios, but at 1440p you'd be covered across the board.

However, with much more VRAM on those new AMD cards, and the little extra performance they gain with 5000-series Ryzen CPUs, they appear to be the better choice (we'll know for sure once people can get their hands on them and run the numbers).

As for your friend's comment, a 3080 is by no means a "waste" of money, because it kicks ass in gaming, but there might be better value from team red in about a month's time.

I wonder if DLSS could solve this VRAM issue, if there even is one.

2 minutes ago, makadee said:

I wonder if DLSS could solve this VRAM issue, if there even is one.

DLSS isn't in every game. In the games that have it, VRAM usage does seem lower, but unless a ton of developers adopt it, it isn't reliable as a solution.


Keep in mind VRAM is just the top tier of memory for your GPU. If it fills up, textures and the like will still live in system memory and virtual memory (the pagefile) if needed. You'll never truly "run out" of video memory.

Spoiler

Desktop: Ryzen 7 2700x | Aorus X470 Gaming Ultra | EVGA RTX2080 Super | 32GB (4x8GB) Corsair Vengeance RGB Pro 3200Mhz | Corsair H105 AIO, NZXT Sentry 3 | Corsair SP120's | 1TB Crucial P1 NVMe, 4TB WD Black | Phanteks Enthoo Pro | Corsair RM650v2 PSU | LG 32" 32GK850G Monitor | Ducky Shine 3 Keyboard, Logitech G502, MicroLab Solo 7C Speakers, Razer Goliathus Extended, X360 Controller | Windows 10 Pro | SteelSeries Siberia 350 Headphones

 

Spoiler

Server 1: Fractal Design Define R6 | Ryzen 3950x | ASRock X570 Taichi | EVGA GTX1070 FTW | 64GB (4x16GB) Corsair Vengeance LPX 3000Mhz | Corsair RM650v2 PSU | Fractal S36 Triple AIO | 10 x 8TB HGST Ultrastar He10 (WD Whitelabel) | 500GB Aorus Gen4 NVMe | 2 x 1TB Crucial P1 NVMe | LSI 9211-8i HBA

 

Server 2: Corsair 400R | IcyDock MB998SP & MB455SPF | Seasonic Focus Plus 650w PSU | 2 x Xeon X5650's | 48GB DDR3-ECC | Asus Z8NA-D6C Motherboard | AOC-SAS2LP-MV8 | LSI MegaRAID 9271-8i | RES2SV240 SAS Expander | Samsung 840Evo 120GB | 5 x 8TB Seagate Archives | 10 x 3TB WD Red

 

31 minutes ago, makadee said:

the VRAM is like 99% being used.

being "allocated", not used. It's like renting a warehouse to store my stuff, just because I've allocated that much area doesnt mean I'm using all of it. A lot of badly optimized games have way higher allocated memory than used memory, WD Legions is one of them (come on, it's from Ubisoft)

 

31 minutes ago, makadee said:

He says when this happens, the game's textures will be all messed up and other issues will happen.

When usage does go beyond what the card has, performance (i.e. frame rate) will tank, but it will not ruin textures in normal circumstances. I say normal because some game engines simply skip textures in order to maintain frame rates, most notably Rockstar's RAGE engine in, for example, GTA 5.
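A rough sense of why the frame rate tanks when VRAM overflows: spilled assets have to be fetched across the PCIe bus instead of from the card's own memory. The bandwidth figures below are approximate public specs, used here only for the comparison:

```python
# Rough bandwidth comparison: on-card GDDR6X vs. fetching spilled
# assets across PCIe. Figures are approximate public specs.
GDDR6X_3080_GBPS = 760.0   # RTX 3080: 19Gbps x 320-bit / 8
PCIE4_X16_GBPS = 31.5      # PCIe 4.0 x16, one direction

slowdown = GDDR6X_3080_GBPS / PCIE4_X16_GBPS
print(round(slowdown, 1))  # ~24x slower for anything that spills
```

Even a small fraction of a frame's working set living on the wrong side of that bus is enough to produce the stutter and frametime spikes described later in this thread.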


 

31 minutes ago, makadee said:

Is 10GB VRAM enough for next-gen gaming?

If you're not a max-out-everything type, and instead focus on whether a setting makes a visible difference on your monitor while you're moving around (instead of staring at a wall, for example), then so far I haven't really seen a game that makes 10GB insufficient at 1440p, simulators excluded.

 

31 minutes ago, makadee said:

Should I be worried about spending so much money on an RTX 3080 and not even being able to run new games at the settings we deserve?

#HailAMD

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: 1TB HP EX920 PCIe x4 M.2 SSD + 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172), 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


There will be a huge jump in VRAM requirements once games are no longer developed for current consoles.

If 10GB is barely enough right now even at 1440p, I'm pretty sure there will be a massive problem if you don't lower resolution or texture quality, which I wouldn't want after buying an expensive card. It's a bit like the GTX 680 (2GB) and HD 7970 (3GB) back then; the HD 7970 was much better in terms of future-proofing.


I wouldn't worry too much about it, especially at that resolution. The others have already explained why.

 

28 minutes ago, Leisor said:

In my opinion it's definitely not enough; there are a lot of games right now which need more than 10GB even at 1440p. That's the main reason why I plan to get a 3090 or 6800XT/6900XT.

It is certainly not future-proof, unless you are willing to reduce texture quality or play at 1080p.

Not accurate.

2 minutes ago, Leisor said:

There will be a huge jump in VRAM requirements once games are no longer developed for current consoles.

If 10GB is barely enough right now even at 1440p, I'm pretty sure there will be a massive problem if you don't lower resolution or texture quality, which I wouldn't want after buying an expensive card. It's a bit like the GTX 680 (2GB) and HD 7970 (3GB) back then; the HD 7970 was much better in terms of future-proofing.

Maybe, maybe not. If someone's buying a 3080, chances are that in a couple of years, when that might become a problem, they'll simply upgrade anyway.

Current PC:

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 

4 minutes ago, Jurrunio said:

being "allocated", not used. It's like renting a warehouse to store my stuff, just because I've allocated that much area doesnt mean I'm using all of it. A lot of badly optimized games have way higher allocated memory than used memory, WD Legions is one of them (come on, it's from Ubisoft)

 

When usage does go beyond what the card has, performance (i.e. frame rate) will tank but it will not ruin textures in normal circumstances. I say normal, because some game engines simply skip textures in order to maintain frame rates, most notably Rockstar's RAGE engine on for example, GTA 5.

ualklgpnl4541.jpg

 

If you're not the max out everything idiot, instead focus on whether it makes a difference shown on your monitor when you're moving around (instead of staring at a wall for example), then so far I havent really seen a game that makes 10GB irrelevant at 1440p, simulators excluded.

 

#HailAMD

I understand what you are saying, but I hear so many mixed opinions that I'm not sure who's right or wrong here lol!!

 


Right now at 1440p you are OK, but expect to lower resolution or texture quality once games are no longer developed for current-gen consoles.

Remember 2013, when the new consoles came and VRAM requirements exploded; be sure that will happen again this time.

The raw performance is really good, but I would either recommend a 3090 if you have the budget, or just the RX 6800XT, which has similar or higher raw performance but 16GB of VRAM.

5 minutes ago, dizmo said:

I wouldn't worry too much about it, especially at that resolution. The others have already explained why.

 

Not accurate.

Maybe, maybe not. If someone's buying a 3080 chances are in a couple years when that might be come a problem, they'll simply upgrade anyway.

I mean, I did my own test on my RTX 3080 ASUS TUF OC. I tried Warzone, for example, and it was using 9.8GB of VRAM, but I could not tell any difference in textures or anything at all. There were no FPS drops or stuttering.

I am running a 3440x1440 monitor and I even have RTX enabled in Warzone, and I don't see my FPS going below 110.

1 minute ago, Leisor said:

Right now at 1440p you are OK, but expect to lower resolution or texture quality once games are no longer developed for current-gen consoles.

Remember 2013, when the new consoles came and VRAM requirements exploded; be sure that will happen again this time.

Yeah, I am a bit worried, because the RTX 3080 was not a cheap card at the end of the day. Not being able to run games on ultra already would be pretty absurd. That's like Nvidia scamming us...

1 minute ago, makadee said:

Yeah, I am a bit worried, because the RTX 3080 was not a cheap card at the end of the day. Not being able to run games on ultra already would be pretty absurd. That's like Nvidia scamming us...

It is not a scam; you knew the card has 10GB of VRAM and shouldn't only listen to their marketing, and you know that even right now 10GB is very much on the edge, even at 1440p.

Nvidia also offers the RTX 3090 with a massive 24GB VRAM pool for people who always want the highest texture quality. It's only a shame that right now they don't offer anything between 10 and 24GB.

AMD does, with their 16GB cards, which is pretty much perfect, with enough headroom for the future.


Not 4K, but 3440x1440 is about 35% more pixels than 2560x1440, with RTX and everything on Ultra. It is allocating 8.5-9GB on mine. How much it *needs* vs. allocates, who knows.
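For what it's worth, the pixel math behind that ~35% figure checks out:

```python
# Sanity-check the "~35% more pixels than 1440p" figure for ultrawide.
uw = 3440 * 1440    # 3440x1440 ultrawide
qhd = 2560 * 1440   # standard 1440p
uhd = 3840 * 2160   # 4K

print(round(uw / qhd - 1, 2))  # 0.34 -> ~34% more pixels than 1440p
print(round(uhd / uw, 2))      # 1.67 -> 4K is still ~67% more than ultrawide
```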

 

Edit: I don't think the glitches have to do with memory. It's launch day, the game is poorly optimized, and Nvidia hasn't released an updated driver yet (supposedly there is a beta that gives a decent FPS boost). I mean, it gets 58FPS average at 3440x1440 with DLSS on (Quality; +1FPS for Optimized) with RTX on a 3080, and a lot of models and animations still look like poo.

AMD 5900X / Gigabyte X570 Auros Pro / 32GB @ 3600c16 / 1TB Samsung 980 Pro 4.0x4 / 4TB total Inland TLC 3.0x4 / EVGA FTW3 3080 / Corsair RM750x /Thermaltake View71

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator

16 minutes ago, makadee said:

I mean, I did my own test on my RTX 3080 ASUS TUF OC. I tried Warzone, for example, and it was using 9.8GB of VRAM, but I could not tell any difference in textures or anything at all. There were no FPS drops or stuttering.

I am running a 3440x1440 monitor and I even have RTX enabled in Warzone, and I don't see my FPS going below 110.

Depends how you're measuring. Just because memory is allocated doesn't mean it's actually being used, and a lot of games, especially CoD titles, will take all the resources they can get, even if they don't actually need them.

5 minutes ago, ewitte said:

Not 4K, but 3440x1440 is about 35% more pixels than 2560x1440, with RTX and everything on Ultra. It is allocating 8.5-9GB on mine. How much it *needs* vs. allocates, who knows.

Edit: I don't think the glitches have to do with memory. It's launch day, the game is poorly optimized, and Nvidia hasn't released an updated driver yet (supposedly there is a beta that gives a decent FPS boost). I mean, it gets 58FPS average at 3440x1440 with DLSS on (Quality; +1FPS for Optimized) with RTX on a 3080, and a lot of models and animations still look like poo.

Which game are you talking about?

12 minutes ago, Leisor said:

It is not a scam; you knew the card has 10GB of VRAM and shouldn't only listen to their marketing, and you know that even right now 10GB is very much on the edge, even at 1440p.

Nvidia also offers the RTX 3090 with a massive 24GB VRAM pool for people who always want the highest texture quality. It's only a shame that right now they don't offer anything between 10 and 24GB.

AMD does, with their 16GB cards, which is pretty much perfect, with enough headroom for the future.

The only issue is my monitor is quite expensive and it has G-Sync; that's why I'd rather stick with Nvidia. But I'm just worried about all of these mixed opinions lol

12 minutes ago, Leisor said:

It is not a scam; you knew the card has 10GB of VRAM and shouldn't only listen to their marketing, and you know that even right now 10GB is very much on the edge, even at 1440p.

It's not, though. There have been maybe one or two games with in-game settings (not mods) that see a huge performance drop for 8GB cards at 4K, let alone 10GB at 1440p.

1 minute ago, ewitte said:

It's not, though. There have been maybe one or two games with in-game settings (not mods) that see a huge performance drop for 8GB cards at 4K, let alone 10GB at 1440p.

Sorry, are you saying 10GB is or isn't enough? I didn't quite understand.

2 minutes ago, makadee said:

Sorry, are you saying 10GB is or isn't enough? I didn't quite understand.

Yes, 10GB is fine; what I said is there's barely anything that even pushes 8GB. *Need* vs. allocation.

5 minutes ago, ewitte said:

It's not, though. There have been maybe one or two games with in-game settings (not mods) that see a huge performance drop for 8GB cards at 4K, let alone 10GB at 1440p.

Flight Simulator or Horizon Zero Dawn, for example, are very thankful for more VRAM. I experienced frametime spikes at 1440p due to a shortage of VRAM.

And those are games which are already released; future games will have higher-quality textures and require more VRAM.

1 minute ago, ewitte said:

Yes, 10GB is fine; what I said is there's barely anything that even pushes 8GB. *Need* vs. allocation.

Oh right, I didn't know this. So there is "need" and there is "allocation", and that's the big difference. Is there a way to truly tell what is being used?

My friend was telling me that if I ran everything on ultra in a demanding game, my textures might say ultra but really render at high, not ultra, because of limited VRAM, and blah blah, so I'm a bit confused about this.
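As a rough illustration of what one texture-quality step can cost in VRAM (the sizes, the ~4/3 mipmap overhead, and the compression ratio are generic rules of thumb, not any specific game's numbers):

```python
# Back-of-the-envelope VRAM cost of a single texture, including its
# mipmap chain (which adds roughly 1/3 on top of the base size).
def texture_mb(width, height, bytes_per_texel=4, mipmaps=True):
    base = width * height * bytes_per_texel / (1024 * 1024)
    return base * (4 / 3) if mipmaps else base

# Uncompressed RGBA8: a 4K "ultra" texture vs. a 2K "high" texture
print(round(texture_mb(4096, 4096), 1))  # 85.3 MB each
print(round(texture_mb(2048, 2048), 1))  # 21.3 MB each

# With BC7 block compression (~1 byte per texel), the same "ultra" texture:
print(round(texture_mb(4096, 4096, bytes_per_texel=1), 1))  # 21.3 MB
```

So one quality tier can roughly quadruple per-texture cost, which is why a texture-quality slider is usually the first thing to drop when VRAM runs short.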

2 minutes ago, Leisor said:

Flight Simulator or Horizon Zero Dawn, for example, are very thankful for more VRAM. I experienced frametime spikes at 1440p due to a shortage of VRAM.

On my 1440p monitor I ran FS at high settings and was getting around 65FPS; sometimes it would dip to like 45FPS, but even an RTX 3090 has this issue with this game... am I right on that?

6 minutes ago, Leisor said:

Flight Simulator or Horizon Zero Dawn, for example, are very thankful for more VRAM. I experienced frametime spikes at 1440p due to a shortage of VRAM.

And those are games which are already released; future games will have higher-quality textures and require more VRAM.

I can't find it, but GN did an analysis video showing that the only game that really suffered a huge performance hit at 4K on 8GB was Doom Eternal on the Nightmare texture setting, and "Ultra" cleared it right up. I believe Flight Sim was also mentioned, but I'm not sure anything satisfies it.

 

The Horizon Zero Dawn benchmark shows I'm CPU-limited based on the numbers. Hopefully the 5900X fixes that...

