3080: Enough VRAM for Next Gen?

18 minutes ago, SNerd7 said:

This is interesting. However, the scaling also comes at a cost to resolution, no? I have a 4K TV I intend to make use of a lot going forward, and I would hate to find out 12 or so months from now that the VRAM on my 3080 is no longer enough for 4K ultra settings.

It scales because you're going to be running them at different settings. While for benchmarks you definitely want to use the same settings across the board, you wouldn't use a low-end card to play games at Ultra settings, or a high-end card to play at Low settings. The GPU itself is going to become an issue at reasonable frame rates before the VRAM does.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List | The Real Reason Delidding Improves Temperatures | "2K" does not mean 2560×1440

6 minutes ago, JoostinOnline said:

It scales because you're going to be running them at different settings. While for benchmarks you definitely want to use the same settings across the board, you wouldn't use a low-end card to play games at Ultra settings, or a high-end card to play at Low settings. The GPU itself is going to become an issue at reasonable frame rates before the VRAM does.

*usually

 

There are almost always exceptions to the rule (like really bad optimizations).

CPU: Ryzen 5 3600 Cooler: Arctic Liquid Freezer II 120mm AIO RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-36 Mobo: ASRock X570M Pro4

Graphics Card: ASRock Reference RX 5700 XT Case: Antec P5 PSU: Rosewill Capstone 750M Monitor: MSI Optix MAG241C Case Fans: 2x Arctic P12 PWM

Storage: HP EX950 1TB NVMe, HP EX900 1TB NVMe, dual Constellation ES 2GB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms

20 minutes ago, BTGbullseye said:

*usually

 

There are almost always exceptions to the rule (like really bad optimizations).

That's not really an exception to the rule; that's just a broken game. If the settings aren't designed properly, it's not a fault of the hardware.


2 hours ago, SNerd7 said:

I would say 11GB would be enough for ANY current title. I am concerned about next-gen titles, which are sure to have more geometry, higher-resolution textures, and heavy ray tracing. Hell, I am even a bit concerned about Cyberpunk.

 

I was playing Marvel's Avengers and my VRAM usage hit 10.3GB

I was playing RE2 at 1440p with max settings and it said I might not have enough VRAM, even with 11GB on my 2080 Ti. 🤷🏻‍♂️

CPU: i7 8700K (5.1 GHz OC). AIO: EVGA CLC 280 280mm. GPU: EVGA XC2 Ultra 2080 Ti. PSU: Corsair RM850x 850W 80+ Gold Fully Modular. MB: MSI MEG Z390 ACE. RAM: 32GB Corsair Dominator Platinum RGB (3600 MHz OC). STORAGE: 1TB Samsung 970 Evo Plus M.2 NVMe, 2TB Samsung 860 EVO, 1TB Samsung 860 Evo, 1TB Samsung 860 QVO, 2TB Firecuda 7200rpm SSHD, 1TB WD Blue. CASE: NZXT H510 Elite. FANS: Corsair LL120 RGB 120mm x4. MONITOR: MSI Optix MAG271CQR 2560x1440 144hz. Headset: Steelseries Arctis 5 Gaming Headset. Keyboard: Razer Cynosa Chroma. Mouse: Razer Basilisk Ultimate (Wireless) Webcam: Logitech C922x Pro Stream Webcam.

Just now, GamerBlake said:

I was playing RE2 at 1440p with max settings and it said I might not have enough VRAM, even with 11GB on my 2080 Ti. 🤷🏻‍♂️

It said the same for my 8GB on a 5700XT... It had no issues.


2 hours ago, SNerd7 said:

Really? If true that is really cool.

And DLSS also reduces VRAM usage, though not all the time (it depends on the scene).

 

You can check the VRAM usage with DLSS off and DLSS on:
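(For context on why DLSS saves memory: it renders internally at a reduced resolution and upscales. A rough sketch below uses the commonly cited per-axis scale factors; the exact factors are an assumption here and vary by title and DLSS version.)

```python
# Approximate DLSS internal render resolutions. The per-axis scale
# factors below are the commonly cited values and are an assumption
# here - they vary by title and DLSS version.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def dlss_internal_resolution(width, height, mode):
    """Approximate internal render resolution for a given DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# 4K output in Performance mode renders internally at 1080p, so the
# resolution-dependent buffers are sized for a quarter of the pixels.
print(dlss_internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(dlss_internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
```

Only the resolution-dependent render targets shrink; textures are still loaded at full quality, which fits the observation that the savings depend on the scene.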

 

 

PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 16GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 49.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2667 v3 | 2x Intel BXSTS200C | 32GB DDR4-2133 ECC Reg | Gigabyte GeForce RTX 2080 SUPER Gaming OC 8G | 6x 120GB SSD SATA RAID0 SanDisk Plus | Seasonic SSR-850TR | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T61p | T9500 | 4GB DDR2 667 | Quadro FX 570m | 120GB SSD OCZ Vertex 2 | 15.4" TFT 1920x1200 | Win10

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 5 other computers (2 classic Apples, 1 WinXP mini PC and 2 WinCE pocket PCs)

1 hour ago, GamerBlake said:

I was playing RE2 at 1440p with max settings and it said I might not have enough VRAM, even with 11GB on my 2080 Ti. 🤷🏻‍♂️

Damn. Just played some Avengers and bumped up everything. It hit 11.07GB with maxed settings. Either this is the worst-optimized game in the history of optimization or it's a sign of things to come.

 

Here's to hoping I win the lottery and can buy a 3090 lol 

4 hours ago, SNerd7 said:

Damn. Just played some Avengers and bumped up everything. It hit 11.07GB with maxed settings. Either this is the worst-optimized game in the history of optimization or it's a sign of things to come.

 

Here's to hoping I win the lottery and can buy a 3090 lol 

MSI Afterburner and GPU-Z only report the VRAM allocation requested by the game, not actual VRAM usage.

 

I tried this yesterday in Titanfall 2 with maxed-out textures, which according to the game need an 8GB frame buffer.

[screenshot]

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


10GB is enough for vanilla games.

 

Gamers Nexus has addressed this (at 26:00).

I only use more in my modded games that use a lot of 8K textures. Those game engines are old (Bethesda) and crash a lot if they have to do VRAM swaps.

RIG#1 CPU: Intel i9 10900k | Motherboard: ASUS ROG Maximus XII Hero | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: EVGA  RTX 2080 ti FTW3 ULTRA | PSU: EVGA 1300 G2 | Case: Cooler Master H500M | Cooler: SilverStone PF360 | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: LG 55" 4k B9 OLED TV


RIG#2 CPU: Intel i7 8086k, 5ghz all cores| Motherboard: ASUS ROG Maximus X Hero | RAM: G.SKILL Ripjaws V Series 16GB DDR4 3200 | GPU: EVGA RTX 2080 ti FTW3 ULTRA | PSU: Corsair RMx1000W | Case: Cooler Master HAF X | Cooler: Noctua NH-D15 | SSD#1: Crucial MX300 2.5" 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#3 CPU: Intel i9 9900k, 5ghz all cores| Motherboard: AORUS Z390 Ultra | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA GTX 1080 ti SC | PSU: EVGA 1000 GQ | Case: Streacom BC1.1S | Cooler: Noctua NH-D15 | SSD: Samsung 870 EVO 2TB | Monitor:Samsung 28" 4k 60hz  

10 hours ago, xAcid9 said:

MSI Afterburner and GPU-Z only report the VRAM allocation requested by the game, not actual VRAM usage.

 

I tried this yesterday in Titanfall 2 with maxed-out textures, which according to the game need an 8GB frame buffer.

[screenshot]

Interesting, I wasn't aware there was a way to check actual usage during gameplay. Would love to know how to do this.


Nope, it won't be enough, at least not if you want better textures than consoles; otherwise it'll be about the same.

But the question is almost moot, as we all know there will be better cards, most likely with more RAM, throughout the upcoming new generation. (And I have a feeling it will be a real "new gen" this time. From a technical point of view, the timing for releasing new tech, i.e. consoles, is perfect right now, as we're reaching the end of what is possible with current tech: the inevitable death of Moore's law.)

RYZEN 5 3600 | EVGA GTX 1070 FTW2 | 16GB CORSAIR VENGEANCE LPX 3200 DDR4 | MSI B350M MORTAR | 250GB SAMSUNG EVO 860 | 4TB TOSHIBA X 300 | 1TB TOSHIBA SSHD | 120GB KINGSTON SSD | WINDOWS 10 PRO | INWIN 301| BEQUIET PURE POWER 10 500W 80+ SILVER | ASUS 279H | LOGITECH Z906 | DELL KB216T | LOGITECH M185 | SONY DUALSHOCK 4

 

LENOVO IDEAPAD 510 | i5 7200U | 8GB DDR4 | NVIDIA GEFORCE 940MX | 1TB WD | WINDOWS 10 GO HOME 

15 hours ago, GamerBlake said:

I was playing RE2 at 1440p with max settings and it said I might not have enough VRAM, even with 11GB on my 2080 Ti. 🤷🏻‍♂️

Also, this is a very good example of how accurate the in-game information can be... As soon as I go over the *actually* available VRAM in RE2, things become muddy, glitchy and prone to crashing... As long as I stay about 500MB under that limit, everything looks and runs fine (which basically means "medium settings").

 

Capcom are one of the few devs who got that down pretty well. Also, the graphical in-game options are exceptionally good.


18 hours ago, TofuHaroto said:

For the most part, at least now, 10 gigs is more than enough for any current title.

Meh... with supersampling, some VR simulators can use 11 gigs, I can tell you that much.

Here's one of my videos:

Now, would the game perform the same with 8GB? Yes, probably.

With 6GB? Probably not.

 

Imagine, say, the next gen of simulators... or flight sims... yeah.

 

10GB is a good amount... depending on what you do.

For 4K it's fine... 12GB would've been highly desirable on a GPU of that caliber, I'll tell you that though.

Nvidia has a history of putting just not quite enough memory on its cards.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI GTX 1080Ti Gaming X Trio 2ghz OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Rift S

15 hours ago, SNerd7 said:

.

The compression is improved on the 3080 compared to Turing, but I've had a few games that crash when textures are maxed out on the 2080 Ti.

 

The yields on Ampere are likely great compared to Turing, and 10GB of VRAM is "just enough" for a majority of use cases.

9900k 1.36v max 1.3v avx 5.2ghz-1avx 4.8 cache 95C 175w 1.05v 4.4ghz 95w 55C R20/blender temps ll D15 ll Z390 taichi ult 1.60 bios ll gskill 2x8gb 16-16-16-34-280-24 ddr3866  bdie 1.42v dram 1.22v io/sa (anything higher needs more voltage on all (dram/io/sa) ll EVGA 2080 ti XC 1995//7600 power limited 79C max, stock voltage (bad ocer) ll 2x samsung 860 evo 500gb raid 0 ll 500gb nvme 970 evo ll Corsair graphite 780T ll EVGA G2 1300w ll Windows 10 Pro ll NEC PA272w (movie, work mon) 2k60 14bit lut ll Predator X27 4k144 hdr (using at 4:4:4 98)

 

old rig 8600k d15 evga 1080 ti hybrid  z370 extreme 4 2x8gb ddr3000 512gb nvme evo+ 860 evo 500gb raid 0 evga p2 1000w 


 

MONSTER HUNTER WORLD (Capcom) PC, High Resolution Textures, a highly modified "last gen" engine (MT Framework)

 

[screenshots]

 

 

We really need better tools to measure this tbh.

But if you compare, it's pretty close anyway, and that's in the "hub"... actually in-game, with several huge monsters on screen, VRAM usage will be much higher. This game can easily max out 8-12GB, even at 1080p, mind you.

 

PS: the GPU usage in Task Manager is obviously wrong, so who knows how accurate this all really is. As I said, we need better tools.

 

(Sorry about the probably terrible pics, I suck at editing lol)



VRAM is just like regular RAM.

 

The more you give the system, the more it'll take, even if it doesn't truly need it, due to allocation.

 

The Gamers Nexus video above explains a bit more...definitely worth a look.

 

10GB of VRAM should be more than enough for years to come. I'm more concerned with what the 3060 will have, as it'll probably be 6GB, and that is going to be the bare minimum in my opinion, which may hurt it in the long run.


Hmm... the 3080 is supposed to be good for 4K gaming and has 10GB of VRAM. The 3090 can apparently do 8K and has 24GB.

 

Umm, yeah, I'm thinking, that sounds a bit low.

 

My desktop and laptop both have 1920x1080 60Hz displays.  Desktop has an i7-4790K and a GTX 1060 3GB, laptop has an i7-6700K and a 6GB GTX 970M.  (The "reversal" of pairings is because I built the desktop in January 2015 and used iGPU for a while, then got the laptop in December 2015 with an i3-6100, then got the i7-6700K for the laptop and the GTX 1060 3GB for the desktop in November 2016.)

 

 

In GTA V, the game wants about 9 GB of VRAM at max settings at 1080p.  Proof, with screenshots of the settings:

 

[settings screenshots]

 

That INCLUDES cranking everything up in the Advanced Graphics menu.  (The "Frame Scaling Mode" had the biggest difference - turning that down to its lowest setting would get VRAM usage down to I think 2 GB or somewhere way down there even when I didn't touch anything else.)

[screenshot]

 

I don't have video of it, but I actually ran some tests a while back with those settings, by running the in-game benchmark sequence.

With the laptop's GTX 970M 6GB, I was getting about 6 fps at the beginning of the benchmark.

With the desktop's GTX 1060 3GB, I was getting 0.3 fps, and aborted the benchmark about a minute or two into it.  (Which wasn't very far - the actual pace of the game was really slow - was taking a full minute JUST to do that initial fly in *TO* the houses in the first scene.)

 

Interesting thing.  I took the 1060 out of the desktop and ran the benchmark on just the integrated graphics.  (It was taking 2 GB from system RAM - that's what I took those settings screenshots with.)

Remember the 1060 got 0.3 fps?  Well, the Intel HD 4600 got 1.8 fps.  That's about 6x faster, even though in FireStrike with default settings, the 1060 scores about 10.5K whereas the HD 4600 scores around 700-750 or so, or about 14-15X slower than the 1060 for "normal" things.

 

 

I wonder why the "slower" HD 4600 with less VRAM would have been faster in a severely VRAM-limited scenario, than the "faster" 1060 with more VRAM?

My hunch is it has something to do with the fact that when you're out of VRAM, you have to swap to system RAM.... when using the dGPU you have to go out of the GPU, through the PCIe slot, across the motherboard, to the CPU (I think), to the system RAM, and back again, and since you're using 3x the VRAM the card has, you have to do that swap 3x per frame.
OTOH, when using the iGPU, its VRAM *IS* the system RAM, so it HAS no latency when it needs more (as long as you have enough - the desktop has 32GB, the laptop at the time had 40GB and now has 64GB), and can do that swap much, much quicker.

Speaking of the laptop, I think its 6GB VRAM (and only needing to swap once per frame instead of twice or 3x like the 3GB 1060) contributed to it getting a significantly higher FPS, even though it's about 1.6X weaker in Fire Strike than the 1060 3GB (the laptop has gotten around a 6.5K score in that benchmark).
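(A back-of-the-envelope sketch of why the swapping hurts so much: PCIe 3.0 x16 moves at most about 15.75 GB/s in each direction, so if several gigabytes of overflow have to cross the bus every frame, that alone caps the frame rate. This is a simplified upper-bound model, not a measurement.)

```python
# Simplified upper-bound model: if a frame needs more data than fits
# in VRAM, the overflow must cross the PCIe bus every frame.
PCIE3_X16_GBPS = 15.75  # theoretical one-way PCIe 3.0 x16 bandwidth, GB/s

def fps_ceiling(overflow_gb, bus_gbps=PCIE3_X16_GBPS):
    """Best-case FPS if overflow_gb must be transferred each frame."""
    transfer_time_s = overflow_gb / bus_gbps  # time spent only moving data
    return 1.0 / transfer_time_s

# A 3GB card asked for ~9GB of assets leaves ~6GB in system RAM.
# Even at full theoretical bandwidth that's under 3 fps before any
# actual rendering happens; with driver overhead and scattered access,
# the observed 0.3 fps is plausible.
print(round(fps_ceiling(6.0), 2))  # 2.62
```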

 

 

 

Anyway...

GTA V came out on PC about 5 years 4 months ago.  I imagine that newer games are even more demanding.


People who say a particular new game runs well at 60+ fps at 4K or whatever on a modern high-end GPU ... I kind of think one of two things is happening.

Either you're not cranking up absolutely EVERY single graphics option in the game to its absolute max (I don't call it max settings unless that's done),

Or the game doesn't HAVE the option to go to such high settings. (As I hinted at earlier, changing frame scaling, render resolution, etc. does count - if that option exists, max settings means that would be maxed out too.)

 

 

 

If I understand right, upping the resolution with the same settings increases the VRAM requirement by an amount equal to the difference in horizontal times vertical resolution.

For example, if 1920 x 1080 (1080p, 2,073,600 pixels) needs 9123 MB (8.9 GiB) of VRAM with those settings,

then 3840 x 2160 (4K, 8,294,400 pixels, 4x 1080p) would need 36,492 MB (35.64 GiB) of VRAM,

and 7680 x 4320 (8K, 33,177,600 pixels, 16x 1080p) would require 145,968 MB (142.5 GiB) of VRAM.
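(The linear-scaling estimate above can be sketched in a few lines of Python. Worth stressing that it's a naive model: in practice only the resolution-dependent buffers grow with pixel count, while textures and geometry stay the same size, so real 4K requirements grow far less than this.)

```python
def naive_vram_scale(base_vram_mb, base_res, target_res):
    """Scale a measured VRAM figure by raw pixel count (naive model)."""
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_vram_mb * target_px / base_px

base = (1920, 1080)  # 2,073,600 px, measured at ~9123 MB in GTA V
print(naive_vram_scale(9123, base, (3840, 2160)))  # 36492.0 MB at 4K
print(naive_vram_scale(9123, base, (7680, 4320)))  # 145968.0 MB at 8K
```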

 

 

 

I don't have access to one of them, but sometime I'd like to see someone benchmark GTA V (or a newer more-demanding game) at max settings (like I described above) on something like a Quadro RTX 8000 with 48 GB VRAM, and compare it to the RTX 3090 with 24GB and RTX 3080 with 10GB when they come out, at 4K, max settings.
 

I want to guess that the Quadro would do 4K just fine, like 60+ fps or more (maybe 120 or 144+?), the RTX 3090 at 4K would be similar to my GTX 970M at 1080p, and the RTX 3080 at 4K might be like my GTX 1060 3GB at 1080p. :) 

 

 

Or is GTA V an outlier and most games don't require 9 GB of VRAM at 1080p when you crank all the settings to the max?

Or am I the only one who defines max settings as cranking absolutely EVERYTHING up?

 

BTW I was googling Fortnite and Overwatch when mentioning the render scaling earlier ... Overwatch I think supports up to 200% in the menu, Fortnite up to 100% in the menu, but, apparently Fortnite, IIRC, has text files so you can edit them to be pretty much almost any setting you want.  I don't have either of those games (well technically I have Overwatch since Oct 2016 but have never had time to install / play it, also never have downloaded Fortnite) so can't check it out for myself right now, but for the above "max settings" discussion, I'll be a little bit merciful on the PC hardware and say it would only need to be max settings in the game, not going so far as to put ridiculously high custom numbers in text configuration file. :P

 

Speaking of super high resolution / settings though ...

 

What currently-existing (or soon-to-be-released) GPU would be required to play modern AAA titles ...

on one of those several-foot-tall video walls consisting of, for example, 32" 8K monitors edge-to-edge across the ENTIRE wall,

at max settings, never having a single frame take longer than 16.66 ms to render?

😝

 

 


Watch this... early next year they'll release a 3070 Super and a 3080 Super, and they will have more memory 😂


16 minutes ago, PianoPlayer88Key said:

GTA V, the game wants about 9 GB of VRAM

Well, do you know for sure there's no dynamic (resolution) scaling going on? Because that's what games usually do nowadays when they see they don't have enough VRAM (to avoid loading into system RAM).

 

That could also explain what you've observed with the Intel GPU, btw.



I am waiting a bit to see how next gen looks. I also kinda want to see how the 3080 Ti stacks up.

Quinnell - PC Gaming Enthusiast / Patriot

br.quinnell.io | Belligerent Renegades (American Gaming Clan)

13 hours ago, SNerd7 said:

Interesting, I wasn't aware there was a way to check actual usage during gameplay. Would love to know how to do this.

I use ProcessHacker. https://processhacker.sourceforge.io/downloads.php

Or you can use the Special K tool, but it's not really compatible with all games.

 

Maybe you guys can digest this post; he only tested one game though.

https://www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/page-2

 

TL;DR
The regular software (MSI AB, GPU-Z, Task Manager) we use to check VRAM usage doesn't show the ACTUAL VRAM the game is using.

 

I actually want to test MHW with the High Res texture pack but don't feel like downloading 80-90GB of game files with my slow internet right now.

 

10 hours ago, i_build_nanosuits said:

Watch this... early next year they'll release a 3070 Super and a 3080 Super, and they will have more memory 😂

 That will happen for sure. 😆



I don't understand why Nvidia would make a high-end GPU and claim it can run 4K 60fps with settings maxed out if 10GB of VRAM isn't enough. Doesn't make sense to me.

1 hour ago, PianoPlayer88Key said:

At 2.5x frame scaling of 1920x1080, the actual render resolution is 4800x2700.

You just blew past 4K in your test... way past.

I understand what you're trying to say, but GTA V doesn't even use close to 10GB VRAM at 4K, and newer titles have already started using DLSS and other AI upscaling techniques, which actually reduce VRAM utilization (they render at a lower resolution).

Nvidia knows what they're doing. I've done extensive testing with actual VRAM hog games like Wolfenstein Youngblood, Control, and Borderlands 3. None of these games go over 8GB VRAM at 4K max settings. Youngblood and Control also have DLSS, which reduces VRAM utilization further if enabled, and because there is no image penalty and a tangible increase in performance, DLSS should be enabled.
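(To put numbers on the frame-scaling point, a quick sketch:)

```python
def scaled_render_resolution(width, height, frame_scale):
    """GTA V-style frame scaling multiplies each axis by the factor."""
    return int(width * frame_scale), int(height * frame_scale)

print(scaled_render_resolution(1920, 1080, 2.5))  # (4800, 2700)

# 4800 x 2700 = 12,960,000 px vs 3840 x 2160 = 8,294,400 px for 4K,
# i.e. 1.5625x the pixels of 4K at "only" 1080p output.
print((4800 * 2700) / (3840 * 2160))  # 1.5625
```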

CPU: Ryzen 7 3700x || GPU: RTX 3080 Founders Edition || Memory: 48GB Corsair 3000mhz DDR4 (22GB PrimoCache) || Motherboard: MSI B450 Tomahawk || SSD1: 500 GB Samsung 850 EVO M.2 (OS drive) || SSD2: 500 GB Samsung 860 EVO SATA (Cache Drive via PrimoCache) || Spinning Disks: 3 x 4TB Western Digital Blue HDD (RAID 0) || Monitor: LG CX 55" OLED TV || Sound: Schiit Stack (Modi 2/Magni 3) - Sennheiser HD 598, HiFiMan HE 400i || Keyboard: Logitech G810 || Mouse: Logitech G502 || PSU: EVGA 750-watt Gold
 

1 hour ago, Gillbatez_sicromoft said:

I don't understand why Nvidia would make a high-end GPU and claim it can run 4K 60fps with settings maxed out if 10GB of VRAM isn't enough. Doesn't make sense to me.

Because it can, and it will. Nvidia knows what they're doing. They would not purposely gimp a GPU like that; it would be a PR nightmare to have a ton of angry consumers due to their "flagship" GPU not having enough VRAM.


