3080: Enough Vram for Next Gen?

1 hour ago, xAcid9 said:

I use ProcessHacker. https://processhacker.sourceforge.io/downloads.php

Or you can use the Special K tool, but it's not really compatible with all games. 

I downloaded Special K and it doesn't work with Marvel's Avengers, so I went ahead and tried downloading ProcessHacker, but my Norton shut it down and removed it. 

 

Not sure how you got it to work

3 hours ago, Yoshi Moshi said:

Just get a 3090 so you don't have to worry

This would be ideal, though it costs almost double the 3080. Fair enough, it has more than double the VRAM and resolution capability, but I don't have that kind of doeee

37 minutes ago, SNerd7 said:

I downloaded Special K and it doesn't work with Marvel's Avengers, so I went ahead and tried downloading ProcessHacker, but my Norton shut it down and removed it. 

 

Not sure how you got it to work

What was the reason for Norton to block it?  🤣

 

Try this Nightly. https://processhacker.sourceforge.io/nightly.php

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

2 minutes ago, xAcid9 said:

What was the reason for Norton to block it?  🤣

Said it was not safe and just removed it without even giving me an option. 

4 minutes ago, SNerd7 said:

Said it was not safe and just removed it without even giving me an option. 

The nightly works fine for me; the one I use at home is an older version from a few years back. 

5 minutes ago, xAcid9 said:

The nightly works fine for me; the one I use at home is an older version from a few years back. 

Just tried the Nightly you linked. Norton not only shut it down but told me to restart my computer. 

 

Oh well. I tried lol 

 

Thank you anyway 

1 hour ago, SNerd7 said:

Norton

Norton itself is a virus. Get rid of it.

 

It's blocking ProcessHacker because it has the ability to inject code into other programs. All antiviruses will detect it as a threat because of this.

CPU: Ryzen 5 3600 Cooler: Arctic Liquid Freezer II 120mm AIO RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-36 Mobo: ASRock X570M Pro4

Graphics Card: ASRock Reference RX 5700 XT Case: Antec P5 PSU: Rosewill Capstone 750M Monitor: MSI Optix MAG241C Case Fans: 2x Arctic P12 PWM

Storage: HP EX950 1TB NVMe, HP EX900 1TB NVMe, dual Constellation ES 2GB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms

5 hours ago, SNerd7 said:

Just tried the Nightly you linked. Norton not only shut it down but told me to restart my computer. 

 

Oh well. I tried lol 

 

Thank you anyway 

That's because Norton completely sucks; get rid of it ASAP... Use Windows Defender or Bitdefender or something...

RYZEN 5 3600 | EVGA GTX 1070 FTW2 | 16GB CORSAIR VENGEANCE LPX 3200 DDR4 | MSI B350M MORTAR | 250GB SAMSUNG EVO 860 | 4TB TOSHIBA X 300 | 1TB TOSHIBA SSHD | 120GB KINGSTON SSD | WINDOWS 10 PRO | INWIN 301| BEQUIET PURE POWER 10 500W 80+ SILVER | ASUS 279H | LOGITECH Z906 | DELL KB216T | LOGITECH M185 | SONY DUALSHOCK 4

 

LENOVO IDEAPAD 510 | i5 7200U | 8GB DDR4 | NVIDIA GEFORCE 940MX | 1TB WD | WINDOWS 10 GO HOME 

5 hours ago, BTGbullseye said:

Norton itself is a virus. Get rid of it.

 

It's blocking ProcessHacker because it has the ability to inject code into other programs. All antiviruses will detect it as a threat because of this.

If it's something like Special K or Afterburner or even trainers, not all AVs detect it. Malwarebytes gives an option for trainers, for example, because they're just a "PUP", and Windows Defender doesn't even care at all 🤣

On 9/7/2020 at 3:16 AM, SNerd7 said:

I would say 11GB would be enough for ANY current title. I am concerned about next gen titles, which are sure to have more geometry, higher resolution textures, and heavy ray tracing. Hell, I am even a bit concerned about Cyberpunk. 

 

I was playing Marvel's Avengers and my VRAM usage hit 10.3GB

 

Edit: 

Played just now on full max settings. Hit 11.07GB

That's VRAM allocation, not actual usage. 

CPU | Intel i7-7700K | GPU | EVGA 1080ti Founders Edition | CASE | Phanteks Enthoo Evolv ATX | PSU | Seasonic X850 80 Plus Gold | RAM | 2x8GB G.skill Trident RGB 3000MHz | MOTHERBOARD | Asus Z270E Strix | STORAGE | Adata XPG 256GB NVME + Kingston 120GB SSD + WD Blue 1TB + Adata 480GB SSD | COOLING | Hard Line Custom Loop + 7 Corsair 120 Air Series QE Fans | MONITOR | Acer Predator XB271HU | OS | Windows 10 |

                                   

                                   

2 hours ago, jasonc_01 said:

That's VRAM allocation, not actual usage. 

Yeah, so what difference does a couple of hundred MB make when you're already super close to hitting the limit at "medium settings" anyway, as demonstrated above?

 

This meme really needs to die already, and unlike EA titles, Fortnite, PUBG, CS:GO... not every game needs to be aimed at running on potatoes.

53 minutes ago, Mark Kaine said:

Yeah, so what difference does a couple of hundred MB make when you're already super close to hitting the limit at "medium settings" anyway, as demonstrated above?

 

This meme really needs to die already, and unlike EA titles, Fortnite, PUBG, CS:GO... not every game needs to be aimed at running on potatoes.

No, you're not getting it. That's not 11.07GB or even 10.3GB of VRAM usage; that is VRAM allocation.

 

Allocation is not actual usage, or need. 

3 minutes ago, jasonc_01 said:

No, you're not getting it. That's not 11.07GB or even 10.3GB of VRAM usage; that is VRAM allocation.

 

Allocation is not actual usage, or need. 

[attached screenshot]

 

Oh yeah?  

13 hours ago, MadPistol said:

Because it can and will. Nvidia knows what they're doing. They would not purposely gimp a GPU like that; it would be a PR nightmare to have a ton of angry consumers due to  their "flagship" GPU not having enough VRAM. 

They do know what they're doing, but that includes knowing how tight they can squeeze the VRAM amounts without losing sales early on. Later on, well that doesn't matter so much.

 

It's not an unreasonable concern if you're running 4K. It'll probably be okay for the most part, but in the long run you may be forced to drop settings and rely on DLSS.

 

Nvidia has a history of cutting it close with VRAM, as well as questionable stuff like the GTX 970 setup.


I had the same question, and Nvidia did a Q&A with Reddit users last week that somewhat answers it. They did test it, and I thought 4K would eat up a lot more VRAM than it actually does. 

 

Quote

Q: Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

 

We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price. In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.

 

https://www.nvidia.com/en-us/geforce/news/rtx-30-series-community-qa/
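Those 4-6GB figures look plausible if you do the back-of-envelope math: the render targets themselves are tiny compared to textures. A rough illustrative sketch (buffer count and bytes-per-pixel are simplifying assumptions; real engines keep many more intermediate buffers and use compressed texture formats):

```python
# Rough estimate of raw render-target memory at a given resolution.
# Illustrative only: real engines keep many intermediate buffers and
# compressed formats, so actual numbers vary widely per game.

def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Memory for `buffers` full-screen targets (e.g. a double-buffered
    swapchain plus a depth buffer) in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

fb_1080p = framebuffer_mib(1920, 1080)   # ~23.7 MiB
fb_4k    = framebuffer_mib(3840, 2160)   # ~94.9 MiB (exactly 4x 1080p)

# Even at 4K, raw render targets stay well under 100 MiB; the bulk of
# the 4-6GB Nvidia quotes is textures and geometry, which mostly do
# not scale with output resolution.
print(f"1080p: {fb_1080p:.1f} MiB, 4K: {fb_4k:.1f} MiB")
```

Which is why going from 1080p to 4K quadruples the resolution-dependent buffers but doesn't come close to quadrupling total VRAM use.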


@SNerd7 Try Process Explorer from Sysinternals. 

https://docs.microsoft.com/en-us/sysinternals/downloads/process-explorer
 

If Norton flags that too... 😅

 


I have reservations on long-term future usage of 8GB or 10GB 30-series cards for sure. If we keep the trend of new generations taking 2 years and you figure most people upgrade every couple of generations (so that would look like holding onto the card for at least 4 years), I do wonder if it will be enough.

 

By the same token, the use of GDDR6X in the 3080 increases memory bandwidth substantially over GDDR6 or older GDDR5/X cards, which may balance out the actual amount of VRAM available.

 

I don't know yet so I will wait and see the reviews. I am also waiting for RDNA2 to drop as I am sure that will bring about higher VRAM variants of the 3070/3080.

CPU: Intel Core i7-4790k || GPU: ASUS GTX 980 Strix || Memory: Corsair Vengeance Pro 16GB DDR3-2400 || Motherboard: ASUS Z97-DELUXE || SSD: Crucial M550 512 GB (OS Drive) || Hard Drives: WD Black 2 x 3TB (Games/Storage Drives) || PSU: Corsair AX750 || Monitor: LG 32GK650F-B  || Cooling: EK Supremacy EVO || D5 X-RES Top 100 w/ D5 Vario || XSPC EX240 x 2 || Corsair SP120 x 5 / AF120 x 1 || Case: Corsair Carbide Air 540 || Keyboard: Corsair K70 || Mouse: Razer Naga Epic || OS: Windows 8.1 Pro 64-bit ||

7 hours ago, Mark Kaine said:

[attached screenshot]

 

Oh yeah?  

Yes. Task Manager does not show actual usage; it shows how much the program is allocating. Only the running program can tell you actual VRAM usage, and in most cases it only shows you allocation. 

 

Warzone, for example, tells me VRAM usage is 10.5GB, which is incorrect; it's allocation.

 

This can be tested next month when the 3070 is available. Similar performance to a 2080 Ti but with 8GB of VRAM should mean, by everyone's misconceptions, that the 3070 will have memory buffer issues and perform worse than a 2080 Ti.

 

I bet it won't, and will allocate 7.5GB of VRAM.
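The allocation-vs-usage gap is easy to picture with a toy model: engines commonly reserve a large pool up front and suballocate assets from it, so an outside tool sees the whole reservation while only the engine knows how much is live. A purely hypothetical sketch, not any real engine's code:

```python
class VramPool:
    """Toy VRAM pool: reserve a fixed budget up front, suballocate from it.

    An external tool (Task Manager, Afterburner) would report `reserved`;
    only the engine itself knows `in_use`. Purely illustrative.
    """
    def __init__(self, budget_mib):
        self.reserved = budget_mib   # what outside tools see
        self.in_use = 0              # what the engine actually touches

    def alloc(self, size_mib):
        # Suballocation fails only when live assets exceed the pool.
        if self.in_use + size_mib > self.reserved:
            raise MemoryError("pool exhausted")
        self.in_use += size_mib

pool = VramPool(budget_mib=10_500)        # game grabs ~10.5 GiB up front
for asset_mib in (2048, 1024, 512, 256):  # but loads ~3.75 GiB of assets
    pool.alloc(asset_mib)

print(pool.reserved, pool.in_use)  # 10500 vs 3840: allocation != usage
```

Under this model a monitoring tool would report the card as nearly full even though most of the pool is empty headroom.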

13 hours ago, jasonc_01 said:

Task Manager does not show actual usage; it shows how much the program is allocating. Only the running program can tell you actual VRAM usage, and in most cases it only shows you allocation. 

 

Warzone, for example, tells me VRAM usage is 10.5GB, which is incorrect; it's allocation.

But do you not see that without proper measurement tools this way of arguing is pointless and futile?

 

you're saying "it's just allocation" but you apparently have no real data about how much it really uses... And neither do I, but as said, I did some tests that heavily imply the "allocation" figure is very close to actual usage in certain scenarios.

 

For example, setting textures higher than they should be in RE2 results in glitches, poor framerates, and unforeseeable behavior, AND the game even warns that exactly that may happen... Other tests I did with other games imply the same thing and totally "incidentally" correlate with what the available tools (Afterburner, GPU-Z, Task Manager, etc.) say: it's never exactly accurate, but it's close enough, and it's very obvious that, depending on the game, it's indeed possible to run out of VRAM, and even Windows' internal "memory swapping feature" won't save you.

 

TL;DR: without proper measurement tools, the discussion about this will remain fruitless

 

 

 

2 hours ago, Mark Kaine said:

But do you not see that without proper measurement tools this way of arguing is pointless and futile?

The only way to measure it at all is inside the program using it, and that's only if the game engine supports it, and only if there isn't a problem in the code that causes a memory leak. (A memory leak is when memory fills up until it overloads, or until the file with the leak is unloaded from memory.)

2 hours ago, Mark Kaine said:

you're saying "it's just allocation" but you apparently have no real data about how much it really uses...

No one does. NO ONE.

2 hours ago, Mark Kaine said:

For example, setting textures higher than they should be in RE2 results in glitches, poor framerates, and unforeseeable behavior, AND the game even warns that exactly that may happen... Other tests I did with other games imply the same thing and totally "incidentally" correlate with what the available tools (Afterburner, GPU-Z, Task Manager, etc.) say: it's never exactly accurate, but it's close enough, and it's very obvious that, depending on the game, it's indeed possible to run out of VRAM, and even Windows' internal.

All examples of badly optimized code/textures, or memory leaks. None of that should happen, and reporting the issue to the developer tends to get those problems fixed very rapidly without any reduced quality.

2 hours ago, BTGbullseye said:

No one does. NO ONE.

I read repeatedly that Special K can do it... Couldn't someone write a standalone program that does just that (read VRAM usage, etc.)? It's kinda hard to believe no one has yet...

 

2 hours ago, BTGbullseye said:

examples of badly optimized code/textures

Well, I doubt that... RE2 even warns you specifically... (and it's very well optimized anyhow)

 

I mean, the conclusion is that game developers apparently have no idea what they're doing (unlikely, though it happens) and that VRAM never runs out (nonsense).

 

So what if the game says it's gonna use '9GB' but actually uses only 7... Well, that means everyone with a 4GB card, etc., for example, will run into issues unless they lower settings... 

 

Again, when we don't have tools to even measure this, it's all just down to personal opinions and experience.

 

3 hours ago, Mark Kaine said:

I read repeatedly that Special K can do it... Couldn't someone write a standalone program that does just that (read VRAM usage, etc.)? It's kinda hard to believe no one has yet...

That injects code into programs. 100% not supported by ANY multiplayer game, and definitely doesn't work with all game engines. You can't know anything about what's actually in the VRAM without being inside the program that put it there. That's how it works. No standalone app is capable of reading anything more than what is allocated.

3 hours ago, Mark Kaine said:

Well, I doubt that... RE2 even warns you specifically... (and it's very well optimized anyhow)

Give me even a single example of a program made in the last 20 years that never had a single bug, or was incapable of being further optimized in any way. Sometimes things get overlooked, or one file has a tiny unforeseen interaction that causes a problem.

3 hours ago, Mark Kaine said:

I mean, the conclusion is that game developers apparently have no idea what they're doing (unlikely, though it happens)

Dunno where you got that from.

3 hours ago, Mark Kaine said:

and that VRAM never runs out (nonsense).

Technically, it doesn't... Until the system crashes.

3 hours ago, Mark Kaine said:

So what if the game says it's gonna use '9GB' but actually uses only 7... Well, that means everyone with a 4GB card, etc., for example, will run into issues unless they lower settings...

Yes, that's what that means, if it actually uses that 7 and doesn't/can't scale the VRAM usage properly.

3 hours ago, Mark Kaine said:

Again, when we don't have tools to even measure this, it's all just down to personal opinions and experience.

No, because we already know this happens, just not the exact amount or every single time it happens. Just because we can't quantify it very well right now doesn't mean it doesn't happen.

5 minutes ago, BTGbullseye said:

 100% not supported by ANY multiplayer game, and definitely doesn't work with all game engines.

Isn't it just a DLL? I have several that do "things" in multiplayer games and they work just fine...

 

Also, doesn't Afterburner do that too? Inject stuff so it can read things like FPS (which, btw, can also interfere with multiplayer games, but not necessarily - I haven't encountered a single one yet; it just works)

 

8 minutes ago, BTGbullseye said:

Yes, that's what that means, if it actually uses that 7 and doesn't/can't scale the VRAM usage properly

That's right. The problem is most games don't scale properly, and as such, more VRAM = higher settings = better graphics...

 

I tested this extensively; the results are all very similar: if the game *tries* to use more VRAM than is available, there *will* be issues.

 

That doesn't mean you can't play a game with a low-VRAM GPU, but you'll have to settle for lower settings, even though the GPU could *otherwise* still keep up. 

 

So more is better; it's really that simple, and some games need more VRAM than others.

 

This isn't necessarily an optimization issue either; no one is forced to use high-resolution texture packs, for example, but they sure make games prettier (and cost more VRAM)

 

 

11 minutes ago, Mark Kaine said:

Isn't it just a DLL? I have several that do "things" in multiplayer games and they work just fine...

DLLs are technically hacking the game. Even if they function, that can still get you permanently banned from multiplayer games.

12 minutes ago, Mark Kaine said:

Also, doesn't Afterburner do that too? Inject stuff so it can read things like FPS (which, btw, can also interfere with multiplayer games, but not necessarily - I haven't encountered a single one yet; it just works)

It does not inject code. Not even a little.

13 minutes ago, Mark Kaine said:

That's right. The problem is most games don't scale properly, and as such, more VRAM = higher settings = better graphics...

It's not quite that simple.

14 minutes ago, Mark Kaine said:

I tested this extensively; the results are all very similar: if the game *tries* to use more VRAM than is available, there *will* be issues.

I have as well, and I've seen several games allocate far more than they say they are going to use, with no issues. I've also seen them allocate far less than they claim to need, also with no issues.

16 minutes ago, Mark Kaine said:

That doesn't mean you can't play a game with a low-VRAM GPU, but you'll have to settle for lower settings, even though the GPU could *otherwise* still keep up. 

Except that in this case, going for a higher setting (enabled raytracing) will actually significantly reduce VRAM requirements.

17 minutes ago, Mark Kaine said:

So more is better; it's really that simple, and some games need more VRAM than others.

More is sometimes better, but there are tradeoffs. (like increased price, and board complexity)

18 minutes ago, Mark Kaine said:

This isn't necessarily an optimization issue either; no one is forced to use high-resolution texture packs, for example, but they sure make games prettier (and cost more VRAM)

99.9% of the time, yes, it is an optimization issue. Even with extreme high-res texture packs.

On 9/8/2020 at 7:15 AM, PianoPlayer88Key said:

Hmm... 3080 is supposed to be good with 4K gaming, and has 10GB VRAM.  3090 can apparently do 8K, and has 24GB.

 

In GTA V, the game wants about 9 GB of VRAM at max settings at 1080p.  Proof, with screenshots of the settings:

 

That INCLUDES cranking everything up in the Advanced Graphics menu. (The "Frame Scaling Mode" made the biggest difference: turning it down to its lowest setting would get VRAM usage down to, I think, 2 GB or somewhere way down there, even when I didn't touch anything else.)

 

 

 

I think that frame scaling mode means you had just supersampled the game to 8K and then resized it to your monitor's resolution. GTA V is one game that can tank any GPU.
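Frame scaling multiplies the internal render resolution in both dimensions, so pixel counts (and the resolution-dependent share of VRAM) grow with the square of the slider value. A quick illustration:

```python
def scaled_pixels(width, height, scale):
    """Pixels rendered internally at a given frame-scaling factor."""
    return int(width * scale) * int(height * scale)

native = scaled_pixels(1920, 1080, 1.0)   # 2,073,600 pixels
x2     = scaled_pixels(1920, 1080, 2.0)   # 8,294,400 == 4K pixel count

print(x2 / native)  # 4.0: double the scale, four times the pixels
```

That squaring is why turning the slider down collapses GTA V's reported VRAM figure so dramatically.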
