The video RAM information guide

D2ultima
13 minutes ago, LynxThe1st said:

rip

changed text to automatic. Broke all spoilers. This is getting fucking ridiculous right now.

 

Can you at least see it on dark theme?

I have finally moved to a desktop. Also my guides are outdated as hell.

 

THE INFORMATION GUIDES: SLI INFORMATION || vRAM INFORMATION || MOBILE i7 CPU INFORMATION || Maybe more someday


1 minute ago, D2ultima said:

Can you at least see it on dark theme?

Yup, still a bit bright but it's visible :)

Well, now you're reading this, I might as well inspire you with a deep quote

If you fall, I'll be there

             - Floor

so inspiring, so deep, the feels ;-;  THE FEELS ;-;

 

Newer Build which I'm gonna buy when I haz moneyz

CPU: Intel Core i5-4440 3.1GHz Quad-Core Processor  (£0.00)
CPU Cooler: be quiet! PURE ROCK 51.7 CFM Sleeve Bearing CPU Cooler  (£25.98 @ Novatech)
Motherboard: MSI B85-G43 GAMING ATX LGA1150 Motherboard  (£78.10 @ Amazon UK)
Memory: Kingston HyperX FURY 8GB (2 x 4GB) DDR3-1600 Memory  (£0.00)
Memory: Kingston HyperX FURY 8GB (2 x 4GB) DDR3-1600 Memory  (£29.76 @ Ebuyer)
Storage: Seagate Barracuda 2TB 3.5" 7200RPM Internal Hard Drive  (£0.00)
Video Card: MSI Radeon R9 380 4GB Video Card  (£179.99 @ Amazon UK)
Case: Corsair 780T ATX Full Tower Case  (£149.99 @ Amazon UK)
Power Supply: EVGA SuperNOVA G2 550W 80+ Gold Certified Fully-Modular ATX Power Supply  (£70.97 @ Amazon UK)
Optical Drive: Asus DRW-24B1ST/BLK/B/AS DVD/CD Writer  (Purchased For £0.00)
Operating System: Microsoft Windows 8.1 OEM (64-bit)  (£59.00 @ Amazon UK)
Wireless Network Adapter: TP-Link TL-WDN4800 802.11a/b/g/n PCI-Express x1 Wi-Fi Adapter  (£24.99 @ Amazon UK)
Keyboard: Corsair K95 RGB Wired Gaming Keyboard  (£184.99 @ Amazon UK)
Total: £803.77

 

some things say '£0.00' because I already bought them

 

 
probably best remix out there

 

 

If I follow you, consider yourself lucky. I rarely follow people, and when I do it's either because:

1) I look up to you

2) I like your picture

3) Or you have an epic ass rig (40 Titan X's and 9 6700K's clocked at 29GHz, that is what I call an epic ass rig, but no one has that soo...)


Just now, LynxThe1st said:

Yup, still a bit bright but it's visible :)

Ok. Now to fix all the spoilers. AGAIN. Oh my everliving word, I should get compensation for dealing with this.


1 minute ago, D2ultima said:

Ok. Now to fix all the spoilers. AGAIN. Oh my everliving word, I should get compensation for dealing with this.

You should, seriously; these threads you make are informative af :)

rip you ;-;

@Godlygamer23 y u nu fix?

(pls im jokin ;-; pls dun ban)


58 minutes ago, LynxThe1st said:

You should seriously, these threads you make are informative af :)

rip you ;-;

@Godlygamer23 y u nu fix?

(pls im jokin ;-; pls dun ban)

Okay, check it for me. I literally copied/pasted the text and removed the formatting and basically re-added every single spoiler tag and link and colour change.


11 hours ago, D2ultima said:

Okay, check it for me. I literally copied/pasted the text and removed the formatting and basically re-added every single spoiler tag and link and colour change.

sorry about the 3 year late response xp

was asleep

I'll check it now


11 hours ago, D2ultima said:

Okay, check it for me. I literally copied/pasted the text and removed the formatting and basically re-added every single spoiler tag and link and colour change.

Yup!

All good, the text is very visible and the links too


3 hours ago, LynxThe1st said:

Yup!

All good, the text is very visible and the links too

Really? You know I didn't change any of the colours or anything. I simply copied/pasted the entire guide without formatting and then added back in all the colours, links, etc that I had from before. The guide should look EXACTLY the same as when I originally wrote it.


10 minutes ago, D2ultima said:

Really? You know I didn't change any of the colours or anything. I simply copied/pasted the entire guide without formatting and then added back in all the colours, links, etc that I had from before. The guide should look EXACTLY the same as when I originally wrote it.

Huh, that's weird, because at first I couldn't see the text, but now I can see it fine

odd

 


  • 3 months later...

Soon I will be enjoying the extra brilliance of +2.5 GB when I can play with my GTX 980 Tis.

 

 

(Awaiting replacement mobo). 

Linus is my fetish.


  • 1 month later...
On 8/16/2014 at 10:16 AM, D2ultima said:

Ok. I did an SLI guide, and now it's time to do a vRAM/memory bandwidth guide. A lot of people seem to be confused about vRAM in general. Well, here we go.

 

Let's clear up some misconceptions about vRAM! 

 


 

Assumption: You need a powerful video card to make use of a lot of vRAM.

FALSE. vRAM usage is independent of GPU usage. For example, here is a screenshot of Call of Duty: Ghosts using 4GB of vRAM while happily using ~15% of my video cards while sitting at the main menu. Ghosts is a bad example, however, so here is also a screenshot of Titanfall using only 60% of my GPU while happily gobbling up 3.9GB of vRAM. The second screen was in the shot to show RAM usage to someone, so you can ignore that.

 

Assumption: vRAM amount is related to the memory bus width.

Partially true. vRAM amount is only loosely related to the memory bus width. 128/256/512-bit memory buses have RAM sizes like 1GB, 2GB, 4GB, 8GB, etc. 96/192/384-bit memory buses have RAM sizes like 768MB, 1.5GB, 3GB, 6GB, etc. You can have 4GB of vRAM on a 128-bit memory bus (like HERE) and 6GB of vRAM on a 192-bit memory bus (like HERE). On the flip side, HERE is a 384-bit memory bus card with only 3GB of vRAM. This topic is expanded under its own section further down, as there is a lot of information to add.
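To make the bus-width relationship concrete, here's a rough Python sketch (an illustration of typical GDDR-era board layouts, not a statement about any specific card): assuming each memory chip has a 32-bit interface, bus width scales with the chip count, which is why capacities cluster into the power-of-two family and the 768MB/1.5GB/3GB family.

```python
# Rough sketch: why vRAM sizes loosely track memory bus width.
# Assumption (typical of GDDR5-era cards): each memory chip has a
# 32-bit interface, so bus width = 32 * number_of_chips, and
# capacity = number_of_chips * per_chip_capacity.

def card_memory(num_chips, chip_capacity_mb):
    bus_width_bits = 32 * num_chips
    capacity_mb = num_chips * chip_capacity_mb
    return bus_width_bits, capacity_mb

# 8 chips of 256 MB -> 256-bit bus, 2 GB (the power-of-two family)
print(card_memory(8, 256))   # (256, 2048)
# 12 chips of 256 MB -> 384-bit bus, 3 GB (the 768MB/1.5GB/3GB family)
print(card_memory(12, 256))  # (384, 3072)
```

Denser chips (or double-stacking them) are how a 128-bit card can still carry 4GB, which is why the relationship is only loose.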

 

Assumption: You need huge amounts of vRAM if you're going to use multiple monitors or high resolution screens.

Partially true. You do not NEED it. It will help, but not in the way you may think. If you're gaming, especially fullscreened, your vRAM usage depends about 95% on game settings; resolution matters very little. 2GB would work easily for triple-monitor 1080p gaming (with games prior to 2014, at least).

 

Assumption: A lot of vRAM being used must mean the textures are great.

FALSE. There's texture size and texture quality. Texture size is the resolution at which textures are rendered. Texture quality is how well they are drawn/rendered. Just because a game has very large textures does NOT automatically mean they're drawn or rendered well, and thus doesn't automatically mean it looks brilliant. Also, things like shadow maps can use lots of vRAM, using up your vRAM without actually improving texture quality.

 

Assumption: Adding two cards gives me double the vRAM!

FALSE. SLI and CrossFireX copy the contents of their memory across the cards, so your vRAM amount does not increase. Your memory access bandwidth, however, does (nearly) double.
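A minimal sketch of the rule above, assuming the mirrored-memory behaviour described (the function name is mine):

```python
# Hedged sketch of the SLI/CrossFireX memory rule: each card mirrors
# the same data, so usable vRAM stays at one card's amount, while
# aggregate memory bandwidth roughly scales with the card count.

def multi_gpu_memory(num_cards, vram_per_card_gb, bandwidth_per_card_gbs):
    usable_vram_gb = vram_per_card_gb                     # mirrored, NOT summed
    total_bandwidth = num_cards * bandwidth_per_card_gbs  # (nearly) doubles
    return usable_vram_gb, total_bandwidth

# Two 4 GB cards at 224 GB/s each: still 4 GB usable, ~448 GB/s aggregate
print(multi_gpu_memory(2, 4, 224))  # (4, 448)
```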

 

Assumption: GDDR5 is always better than GDDR3.

FALSE. Memory bus width must be taken into account here. GDDR3 doubles the transfer rate of the base memory clock, and GDDR5 doubles it again. A 1000MHz memory clock on a 512-bit bus GDDR3 card is EQUAL in bandwidth TO a 1000MHz memory clock on a 256-bit bus GDDR5 card. Most new cards don't touch GDDR3 anymore, but I'm including this anyway to clear up anything people may be misinformed on.
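The claim above is just arithmetic: bandwidth is the effective clock times the bus width, divided by 8 to go from bits to bytes. A quick Python sketch (transfer rates per clock as described above):

```python
# Bandwidth arithmetic behind the GDDR3-vs-GDDR5 comparison.
# Transfers per base clock: GDDR3 does 2, GDDR5 does 4.
RATE = {"GDDR3": 2, "GDDR5": 4}

def bandwidth_gbs(mem_clock_mhz, bus_width_bits, mem_type):
    effective_mhz = mem_clock_mhz * RATE[mem_type]
    # MHz * bits / 8 = MB/s; divide by 1000 for GB/s
    return effective_mhz * bus_width_bits / 8 / 1000

# The example from the text: these two cards come out equal.
print(bandwidth_gbs(1000, 512, "GDDR3"))  # 128.0 GB/s
print(bandwidth_gbs(1000, 256, "GDDR5"))  # 128.0 GB/s
```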

 

Assumption: I need tons more vRAM to turn my Anti Aliasing up.

Partially true. Normal levels of AA, like 4x MSAA or TXAA, are not going to use a lot of extra memory... 100MB here, 200MB there, for the most part. Anything that re-samples textures will do this. FXAA and MLAA should not increase memory drain at all, as they do not touch textures directly... SMAA is a post-process form of AA, but has a SLIGHT impact on vRAM. High levels of CSAA, however, can use a lot of vRAM. 16xQ CSAA can use 600MB of vRAM, and 32x CSAA even more (for example, my Killing Floor uses 2.1GB of vRAM due to 32x CSAA at 1080p). I DO NOT KNOW how SLI-enabled AA levels count toward vRAM usage (64x CSAA is possible). Because the AA is applied on the sacrificed card itself and SLI is in fact "off", it might use the vRAM buffer of the second card for AA in that way. I'm not bothering to test this unless nVidia's Pascal GPUs can use CSAA again, as Maxwell GPUs cannot. I do not know how MFAA works with the new Maxwell GPUs with respect to vRAM usage, and I will not know unless someone buys and sends me 970M or 980M GPUs or a desktop with Maxwell cards (which is not going to happen lel). I DO know that MFAA does not work with SLI, so forcing high levels of AA is now officially dead. If you want to volunteer for testing this, however, and have the hardware to test with, you can PM me here or send me a tweet.

 

Assumption: There's no way I'll need 4GB or more for 1080p!

FALSE. I explain a lot above and below about how vRAM gets used up, and 4GB can definitely be used up at 1080p. I will also point out that games which lack sufficient compression techniques in their code CAN and WILL use up 4GB or more of your vRAM at 1080p, regardless of whether they have advanced lighting, reflections and high-quality textures (separate from high-resolution textures) or not. As is said elsewhere in this guide, game resolution is NOT the main factor in how much vRAM you use. This is not to say that most games that use 3GB or 4GB at 1080p (like Evolve, or CoD: AW) actually need it... they're unoptimized. No denying that. But it doesn't hurt to have enough vRAM to satisfy their needs. Remember: more vRAM is better than less.

 

What does vRAM size have to do with gaming?

 


 

Video games use textures. Textures have a size. I already explained this a little above, but I'll be a bit more thorough here. Texture size is usually pretty much fixed per game. Most games won't allow you to change the texture sizes much, but some do: Titanfall is one example, and modding Skyrim is another. A larger texture size simply uses more vRAM, and has little other performance impact in most cases. Some games are coded badly, though ^_^. Anyway, texture size affects how crisp things look more than anything else. Just because your textures are crisp does NOT mean they are good; please remember this. You can have huge, crisp, badly drawn and badly rendered textures using 4GB of vRAM or more and looking like a piece of poo. Like Call of Duty: Ghosts. Or try comparing Titanfall to BF4. Worlds apart, but the former uses more vRAM. In other words, vRAM usage is just a per-game thing. Adjusting texture quality doesn't change vRAM use much either; only adjusting the size does. BF4 on lowest settings uses ~2.2GB of vRAM for me, just as it does on ultra.

 

Next, you have something slightly different: vRAM can be dedicated to more than just texture size! One such game which does this is Arma 2. There, one can adjust the amount of video memory the game is allowed to use. Oddly enough, the highest allowance is "default", which caps out at about 2.5GB of used memory. "Very high" is what most people would probably use, but that's a 1.5GB limitation for that game. This applies to the DayZ mod for Arma 2 as well. What this does is reduce popping by keeping more textures in memory, so when you zoom in with a powerful rifle, you don't have to wait as long for the surroundings to catch up. Giving the game more vRAM does not improve framerates at all, but the reduction in texture pop-in is nice.

 

Also, shadow resolution, the number of dynamic light sources, reflections, etc. can all use extra video memory. In this case, a fully optimized game which looks worse than Crysis 3 but uses more vRAM than Crysis 3 can be achieved by bumping the shadow resolutions to high levels and increasing the amount of dynamic lighting and reflections available, especially if the game is more open-world and has many more objects loaded in to be affected by said lights or to cast shadows. Draw distances also take up extra vRAM in open-world games. So don't compare apples-to-apples "texture quality" as an indicator of good vRAM usage, but please DO compare everything else. An open-world game is likely to use a bit more than a corridor shooter, but the corridor shooter will likely have better textures. On the other hand, for a game like Shadow of Mordor, where bumping the textures is the MAIN drain on the vRAM buffer, you can quite clearly see it's just unoptimization. But a game like Skyrim with very high-res texture mods and lighting overhauls eating up a solid 3GB is fine, due to its large open-world nature, even though it looks worse than anything you'd find in Crysis 3.

 

Bonus: What happens if I don't have as much vRAM as a game asks for at certain settings?

 


 

I have to add this section because of all these new games where developers have decided that everybody owns two Titan Blacks with 6GB of vRAM and will just throw everything into vRAM uncompressed, even though the quality isn't all that great (stares at Shadow of Mordor and The Evil Within and any other id Tech 5 engine game ever).

 

Now, first and foremost, some games will lock some options away from you and you won't ever see them unless you hack them in (though this is rare). Wolfenstein: The New Order does this; if your GPU has under 3GB of vRAM, you will never see "ultra" textures in the options menu. Forcing it on usually results in stuttering, crashing or generally undesirable behaviour on an extreme scale. For some proof, you can read this article about what happens when forcing ultra textures in Wolfenstein: TNO on a 2GB 770.

 

Next, and the most common case: the game will allow you to turn on the settings, and you may experience playable framerates, especially in fullscreen with no second monitor attached (cutting down on OS-used vRAM), but minimum framerates may be quite low, some stutter may be apparent (depending on how much extra vRAM you would need), and you might even crash the game. When the vRAM limit is hit, Windows (or the drivers; I'm not sure) starts compressing what's in the vRAM buffer and tossing out things used for caching. Typically, you can go a couple hundred MB above what the game would require on a higher-vRAM card without issues, but there WILL come a point where it can compress no longer; you will end up using virtual memory of some kind, and performance will decline. It may not be considered "playable" by some people (depending on what happens), but others will deal with it or not turn up the settings. So be wary of this.

 

Finally, Windows, which often needs vRAM for its desktop, may bug you with "out of memory" errors and ask if you wish to switch off Aero (Vista and 7; I've never seen this in 8, though I never used 8 with under 4GB of vRAM), and may pull you out of your game to do so. This will come with framedrops from a game which can no longer compress its vRAM usage. This happens because the rest of the texture data must be held in RAM or even on the hard drive/SSD, and accessing these (yes, even the SSD) is much slower than pulling the data straight from the vRAM buffer, which is what usually causes the slowdowns and stutters. To this effect, I ASSUME that extremely fast, low-latency system memory, installing the game on an SSD, and very high GPU memory bandwidth will likely reduce the stuttering/slowdown occurrences, due to faster access to the data not stored in the vRAM buffer; higher GPU memory bandwidth also means the card can empty/refill the vRAM buffer faster, so the game won't need to "wait" as often.

 

So basically, it may work or it may not work, but no matter how it does, if you try to force buffer information built for more vRAM than you own into a card, it will decrease the performance of the game, though the game may very well still be playable.

 

 

And now resolution, Anti Aliasing and operating systems

 


 

Now, onto something more impacting. Resolution. Or in this case, not so impacting at all. The rendered resolution of a game has about a 5% impact on how much vRAM the game uses. The only game I've seen make a large jump in used vRAM from a resolution increase is Watch Dogs, which (according to benchmarks) goes from ~3100MB to ~3800MB of vRAM going from 1080p to 4K. Not a big amount either, considering that's 4x as many pixels; a ~700-750MB increase is tiny, and the game was still easily playable on the 3GB 780 Ti cards they were benchmarking with. Most other games see a tiny impact too. Upscale Dark Souls 2 to 4K? It uses about 400MB extra vRAM (capping under 1.4GB even at 4K). Sniper Elite V2 at 1080p with 4.0x SSAA (a 4K downsample) plus "anti-aliasing" set to "high" doesn't crack 2GB of vRAM either. BF4 on ultra uses 2.2GB and doesn't pass 2.5GB if you turn the in-game resolution scale to 200% (unless you turn on AA as well, in which case it totals near 3GB or less). Honestly? Most games up until 2013 are going to be fine with 2-3GB of vRAM, even at 4K or triple-monitor 1080p. Unoptimized AAA ports from 2014 onward, though, appear to require ridiculous amounts of vRAM, so you may wish to REALLY consider that. Please note that using SSAA is akin to bumping the resolution up manually: 4x SSAA at 1080p accurately shows the vRAM usage of running a game at 4K. PLEASE NOTE, HOWEVER: multisample-type AA at higher resolutions resamples those higher resolutions, so the vRAM increase for using it is a bit bigger (though proportionally so). This does NOT apply to post-process and temporal filters such as SMAA 1x, 1Tx, 2x, 2Tx, FXAA, MLAA, etc. SMAA 4x, however, has a decent amount of multisampling in it, so it may impact vRAM a little more (though not as much as 8x MSAA).
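As a back-of-envelope illustration of why resolution alone barely moves vRAM, here's a Python sketch of raw framebuffer sizes (assuming 32-bit colour and triple buffering, and ignoring driver overhead, render targets and AA):

```python
# Back-of-envelope sketch: the framebuffers themselves are small
# next to texture data, which is why resolution has a minor impact.
# Assumes 4 bytes/pixel (32-bit colour); ignores AA and driver overhead.

def framebuffer_mb(width, height, num_buffers=3, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * num_buffers / 1024**2

print(round(framebuffer_mb(1920, 1080), 1))  # ~23.7 MB triple-buffered
print(round(framebuffer_mb(3840, 2160), 1))  # ~94.9 MB at 4K
```

Even quadrupling the pixel count only adds tens of MB of framebuffer; the larger real-world jumps come from games resampling textures and render targets at the higher resolution.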

 

Next, the operating system you're using has an impact on how much vRAM you're using. WHUT? Yup! Your OS has an impact. Windows 7 uses a minimum of 128MB of your vRAM, and can use extra if it needs to and you can spare it (once Aero is enabled). Windows 8 and 8.1 can use more, up to 400MB in total. Switching Aero off, like some people do, will reduce the vRAM used by your desktop, but on Windows 8 or 8.1, switching Aero off without causing some issues isn't possible (that I know of). So be wary of this too! Of course, fullscreening a game removes the desktop's rendering (for that screen only) and frees up some resources, in case you're vRAM-starved with an older 1-2GB card. I also believe that multiple monitors mean more vRAM used while sitting at the desktop (via a correlation with the number of pixels rendered), but I cannot conclusively prove this, and the information is harder to find (or test successfully) than one would think. The reason I am unsure is that my vRAM usage seems to change every time I restart my PC and programs: I've seen it idle at 800MB in Win 8.1, and I've also seen it idle at 300MB.

 

Bonus! I recently discovered that opening pictures in Windows Photo Viewer WILL increase your vRAM used, at least on Windows 8 (not tested on Win 7 and earlier, nor on Win 10). So if you find you're vRAM-starved and have a couple of pictures open and minimized, you should close them.

 

 

And now about multiple monitors

 


 

So, how do multiple monitors benefit from more vRAM? You see, having extra screens uses vRAM even while gaming fullscreened. If you have two screens and fullscreen your game on one, you're still rendering the desktop on the other(s), and thus using extra vRAM. This is why some games in the past (think back to 1GB cards and Just Cause 2, for example) would pop up saying you're out of memory and ask to switch Windows to the basic theme (to save on vRAM). As said above, fullscreening frees up the vRAM for the screens the game occupies, but if you have 3-4 monitors connected, you're going to want some excess vRAM so that neither your games nor Windows are starved; let alone if you're running in regular or borderless windowed mode (where the desktop is still rendered behind the game on the monitor running it). PLEASE NOTE THAT THIS IS DIFFERENT FROM GAMING ON MULTIPLE SCREENS. In this case, more is better, and more helps, but it's entirely dependent on the games you play. If your game is one of these new unoptimized AAA titles like Watch Dogs, which can easily pull 3GB of vRAM at 1080p, having three screens connected and running it maxed out on a 3GB card may run into memory issues. For a game like BF4, however, which grabs at most 2.2GB of vRAM in DX11 at 1080p on the ultra preset, having two extra screens and a single 3GB card would be fine. A 2GB card might run into problems at that point, though, while having only one monitor attached would work on the 2GB card in that scenario. Also note: the resolution of each monitor is indeed a factor in how much vRAM it uses when not running a game.

 

Now, while 2-3 monitors can use up a decent bit of vRAM (game + 256MB while fullscreened, on Win 7), usually it'll get compressed if you, say, only have a 2GB GPU and are running a game using 1.8-2GB of vRAM on its own. But of course, the more headroom you have the better, and the less chance Windows gives you an out-of-memory error in your game, as this WILL happen if your game cannot compress vRAM usage any further at your current settings. This is why people recommend more vRAM for multiple monitors, though they probably blow this reason a little out of proportion in their heads (I've seen people say a weaker 4GB card will do better for multiple monitors than a stronger 3GB card when purely talking about gaming... it's very rare that this would be the case).

 

Now, as far as gaming on multiple monitors goes: it DOES use a bit more vRAM, but it's only an incremental increase, as you saw with resolution. 5760 x 1080 is actually FEWER pixels than 3840 x 2160, so if you can run a game at 4K (or at 1080p with 4x supersampling), you can run it more easily on three 1080p monitors, both vRAM-wise and performance-wise ^_^. In that case, you have less to worry about than the user who is NOT going to use all three screens, gaming on one and using the others for productivity. Three 1440p screens, however, is a HUGE number of pixels; more than a single 4K monitor. For gaming at that resolution, under no circumstances would I suggest less than 4GB of vRAM if you plan to play any game that uses 2GB at 1080p. 1080p --> 3840 x 2160 may only be a 600-800MB increase, but 1080p --> 7680 x 1440 would likely use over a full GB extra, and more if you throw even low-level multisample AA at the game, so I'd suggest having more than 3GB (or larger) as a safety net for this particular setup.
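The pixel-count comparisons above check out with simple arithmetic:

```python
# Quick check of the pixel-count comparisons in this section.
def pixels(w, h):
    return w * h

triple_1080p = pixels(5760, 1080)   # three 1920x1080 side by side
uhd_4k       = pixels(3840, 2160)
triple_1440p = pixels(7680, 1440)   # three 2560x1440 side by side

print(triple_1080p, uhd_4k, triple_1440p)
# 6220800 < 8294400 < 11059200: triple 1080p is FEWER pixels than 4K,
# while triple 1440p is well beyond a single 4K panel.
```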

 

 

Now, onto memory bandwidth and memory bus and such. You may wanna skip this if you know already and are only here for the vRAM size portion above, but I might as well be thorough if I'm doing this. Spoiler tags save the day!

 

vRAM types & memory clocks

 


 

Usually, cards today come with one of three kinds of memory: GDDR3, GDDR5 and HBM (High Bandwidth Memory). They also have a memory clock and a memory bus. All of these variables combine into the memory bandwidth. Just like clock speed, more memory bandwidth is a good thing. Games do not usually benefit much from increased memory bandwidth, though, so don't expect huge gains from overclocking memory in most games. Some games do benefit, but I don't remember their names off-hand.

 

HBM is different from GDDR3 and GDDR5 in mostly physical ways. Calculation-wise it's very similar (as I expand on below), and thus I am not giving it its own section. HBM 1.0 (currently on R9 Fury and R9 Fury X cards) is limited to 4GB. HBM 2.0 will not be. Since googling about HBM provides many articles explaining how it works physically, I will defer to those rather than attempt to explain it again here (if you've noticed, I did not explain about GDDR3/GDDR5's physical makeup more than was necessary).

 

Your memory clock is represented in multiple different ways. There is your base clock, which is usually an exceedingly low number. Since Kepler (the GTX 600/700 series) came out, nVidia's desktop gaming-class cards have had a standard memory speed of 1500MHz, and Maxwell (the GTX 900 series) has a standard of 1750MHz. AMD has been using less than 1500MHz for the most part (with the 7000 and R9 2xx series) but has bumped the speed to 1500MHz recently (R9 3xx series). This clock speed is not what you're going to be too concerned with; you should be concerned with your effective memory clock. Your effective memory clock depends on the type of video memory you have, which I will explain below:

 

- GDDR3 memory (which you won't find in midrange or high-end cards these days) doubles that clock speed. So a card with 1500MHz memory clock using GDDR3 RAM will have a 3000MHz effective memory clock.

- GDDR5 memory (which you will find everywhere in midrange and high-end cards these days) doubles GDDR3's doubler. In other words, it multiplies the clock speed by 4. So a card with 1500MHz memory clock using GDDR5 RAM will have a 6000MHz effective memory clock.

- HBM memory (only present in three cards right now) also doubles the clock speed, similarly to GDDR3. So a card with "500MHz" memory clock (like this) will have an effective memory clock of 1000MHz (despite that link ironically claiming the effective clock is 500MHz).
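Those three rules boil down to one multiplier per memory type. A minimal sketch (the function and table names are mine):

```python
# Multiplier from base memory clock to effective clock, per memory type
EFFECTIVE_MULTIPLIER = {"GDDR3": 2, "GDDR5": 4, "HBM": 2}

def effective_clock_mhz(base_mhz, mem_type):
    """Effective memory clock from the base clock and memory type."""
    return base_mhz * EFFECTIVE_MULTIPLIER[mem_type]

print(effective_clock_mhz(1500, "GDDR5"))  # GTX 680: 6000
print(effective_clock_mhz(1500, "GDDR3"))  # 3000
print(effective_clock_mhz(500, "HBM"))     # R9 Fury X: 1000
```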

 

Now there are three ways one usually reads the memory clock from a card with GDDR5 RAM. Let's use the GTX 680 as an example. Some programs and people list the actual clock, which is 1500MHz. One such program is GPU-Z. Other programs list the doubled clock speed, which would be 3000MHz; those programs are often overclockers such as nVidia Inspector. MSI Afterburner also works on the doubled clock speed, though it does not list the clocks themselves. Finally, the effective clock speed is often seen in sensor-type parts of programs, such as GPU-Z's sensor page. Please remember which clock your program works with when overclocking. If you want to go from 6000MHz to 7000MHz, for example, you would need a +500MHz boost in MSI Afterburner.

 

HBM is read by both GPU-Z and MSI Afterburner at its default clock rate, and is indeed overclockable via MSI Afterburner (though not by Catalyst Control Center). I am unsure of other tools people use for AMD OCing that aren't Catalyst Control Center or MSI Afterburner, but there is a chance it may be OC-able by other programs.

 

N.B. - Apparently, monitoring software using AMD cards in Crossfire seem to add the vRAM being used across each card. This results in two 4GB cards using something like 6000MB of vRAM and you scratching your head wondering if your PC is in fact skynet. Don't worry. Just cut the vRAM counter in half (in your head, please don't cut your monitor) and it should accurately reflect how much vRAM is being used in crossfire.

 

 

Next, memory bus and memory bandwidth!

 

  Reveal hidden contents

 

Each card has a memory bus. This memory bus is like the number of lanes in a highway, and the memory speed is how fast the cars travel. Most recent gaming cards from nVidia have a 256-bit memory bus or a 384-bit memory bus. AMD's a bit similar with 256-bit and 384-bit mem bus cards; but their R9 290x has a 512-bit memory bus. Dat be a wide highway, boys.

 

Anyway, memory bandwidth is calculated pretty easily. What you do is you take the effective memory clock of a card, multiply it by the bus width and then divide by 8 (because 8 bits = 1 byte). So see those GTX 680 cards with their fancy schmancy 192GB/s memory bandwidth? Here's how you tell if it's true:

256 bits / 8 = 32 bytes; 32 * 6000 MHz = 192,000 MB/s = 192 GB/s.

Simple, no?

 

It's just as easy for HBM as well. Take the R9 Fury X:

4096 bits / 8 = 512 bytes; 512 * 1000 MHz = 512,000 MB/s = 512 GB/s.

 

Now, as I said before, memory clock and memory bus together tell the whole story; neither one alone does. AMD's R9 290X has a 512-bit memory bus but only a 5000MHz memory clock, whereas nVidia's GTX 780Ti has only a 384-bit memory bus but a 7000MHz memory clock! So it works out like this:

290X: 512 bits / 8 = 64 bytes; 64 * 5000 MHz = 320,000 MB/s = 320 GB/s

780Ti: 384 bits / 8 = 48 bytes; 48 * 7000 MHz = 336,000 MB/s = 336 GB/s

So here we see that even though the bus width is larger on AMD's bad boy, the bandwidth is in fact less due to significantly slower clock speeds. Now you know not to buy a card just because it's got a huge memory clock over the other, or in reverse, because it has a huge memory bus over the other. Did you know the GTX 285 from about 7 years ago had a 512-bit, GDDR3 memory setup? When AMD brought out GDDR5 and nVidia hadn't moved to it yet, nVidia competed by increasing the bus width a bunch, and it was able to compete easily. Interesting tidbit few remember =D.
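The hand calculations above fit in a one-liner; a minimal Python sketch (the function name is mine):

```python
def bandwidth_gbs(bus_bits, effective_mhz):
    """Memory bandwidth: (bus width in bytes) x (effective clock), in GB/s."""
    return bus_bits / 8 * effective_mhz / 1000  # MB/s -> GB/s

print(bandwidth_gbs(256, 6000))   # GTX 680:   192.0
print(bandwidth_gbs(4096, 1000))  # R9 Fury X: 512.0
print(bandwidth_gbs(512, 5000))   # R9 290X:   320.0
print(bandwidth_gbs(384, 7000))   # GTX 780Ti: 336.0
```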

 

nVidia's Maxwell GPUs are slightly different from the above formulas. They have memory engine optimizations that let them surpass their rated bandwidth. nVidia touted this as a feature, and it was proven during the GTX 970 scandal, when users on NotebookReview took 980M cards (256-bit, 5000MHz memory) and 780M cards (256-bit, 5000MHz memory) and tested them with the CUDA benchmark designed to prove the 970 faulty. The result was that Maxwell GPUs had a higher actual bandwidth than was mathematically determinable from memory bus and clock speeds. As far as I remember, this is on average a 15% bonus in memory bandwidth (minimum of 10%). A 224GB/s card like the GTX 980 would actually have something like 257GB/s. The GTX Titan X and GTX 980Ti with 336GB/s should actually have near 386GB/s of bandwidth. I do not know if this affects all types of information usually stored in vRAM, but it has a distinct benefit and closes the gap with the fabled "HBM 512GB/s", especially with memory overclocks on the nVidia cards. The jump from 336GB/s to 512GB/s is far larger than from 386GB/s to 512GB/s, and if someone manages to OC a 980Ti or Titan X's memory from 7000MHz to, say, 7400MHz? That puts them at 355.2GB/s mathematically, and around 408.5GB/s with Maxwell's speed improvements.
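Applying that ~15% average bonus is simple multiplication. A sketch of the estimate (the function name is mine; the 10-15% figure is the rough range from the NotebookReview testing described above, not an official spec):

```python
def maxwell_effective_gbs(rated_gbs, bonus=0.15):
    """Estimate Maxwell's real-world bandwidth from the rated figure,
    assuming the ~15% average bonus observed in community testing."""
    return round(rated_gbs * (1 + bonus), 1)

print(maxwell_effective_gbs(224))    # GTX 980:          257.6
print(maxwell_effective_gbs(336))    # 980Ti / Titan X:  386.4
print(maxwell_effective_gbs(355.2))  # with a 7400MHz OC: 408.5
```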

 

Now, there's another important thing you should know: there is a difference between a vRAM bottleneck and a memory bandwidth bottleneck. Some games are not designed to use more than a certain amount of vRAM, like Crysis 1. Instead, they make heavy use of your card's memory bandwidth to keep the relevant information in the vRAM buffer. You can tell a game would benefit from more memory bandwidth if it runs your memory controller's load at high percentages; GPU-Z's sensor page can show this. It's important to note that you CAN run into a memory bandwidth choke without running out of vRAM, and vice versa.

 

What I discovered is that increasing memory bandwidth does not do a whole lot for games once above ~160GB/s. My GPUs default to 160GB/s, and it's easy for me to overclock them to 192GB/s (5000MHz to 6000MHz), yet I've never really noticed any games benefiting from it. 160GB/s has been around for years; the GTX 285 already had roughly 160GB/s, and bandwidth did not improve massively for years afterward, until recently, when nVidia's Maxwell GPUs and AMD's R9 300 series opted for larger out-of-the-box memory bandwidths. Generally, it's better to have more memory bandwidth than less, but it's a "slight increase if any" and "better to have more than less" situation, not a "this memory OC will give me a solid 5-10fps in <insert game>!" one. At least, there's no tech RIGHT NOW that will need a lot more memory bandwidth than is currently available, as far as I can see.

 

Also, I mentioned this above but I'll repeat it again here: SLI and CrossFireX systems for the most part "add" the memory bandwidth for vRAM access. 2-way SLI of 192GB/s cards? 384GB/s. 3-way SLI of 192GB/s cards? A cool 576GB/s. Tossed three 980Ti cards in SLI? Enjoy a sexy ~1TB/s memory access bandwidth. Now, this doesn't affect memory "fill" time (that is still limited to each card's bandwidth, your RAM and your data storage, and likely your paging file too), and the multiGPU overhead will not allow you to see a true doubling of bandwidth, but the benefits definitely exist. Please note however that if "Split Frame Rendering" with DX12 becomes a thing (or any mode where multiple GPUs act as "one big GPU") then memory bandwidth improvements are likely going out the window (maybe why they're bumping mem bandwidth now?).
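Those "adds up" numbers are just multiplication. A sketch of the best case described above (the function name is mine; remember real-world gains are lower due to multi-GPU overhead, and fill time doesn't scale):

```python
def sli_bandwidth_gbs(per_card_gbs, n_cards):
    """Best-case aggregate vRAM access bandwidth in SLI/CrossFireX:
    per-card bandwidth times the number of cards."""
    return per_card_gbs * n_cards

print(sli_bandwidth_gbs(192, 2))  # 2-way SLI of 192GB/s cards: 384
print(sli_bandwidth_gbs(192, 3))  # 3-way: 576
print(sli_bandwidth_gbs(336, 3))  # three 980Ti cards: 1008, the "~1TB/s" case
```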

 

 

Extra: Memory bus + mismatched memory size section

 

  Reveal hidden contents

 

Now, here's the funny part. Since writing this guide I was told that I was wrong about a couple of things. I've updated and fixed them, and you can see how stuff works down below. But here's where things get tricky. Apparently, there should technically be a direct hard limit on the amount of memory one can put on a card depending on the memory bus. By this rule, cards like the GTX 660Ti should not have 2GB of vRAM; they ought to have either 1.5GB or 3GB attached to them. Instead, nVidia gets to 2GB somehow. This apparently could have been done in one of two ways.

 

The first way, as described to me, would be to use mismatched-yet-compatible memory chip sizes to get ALMOST as much vRAM on the card. In this case, the GTX 660Ti could use a 768MB chip + a 1152MB chip, totaling 1920MB. This is apparently NOT what nVidia uses. What nVidia apparently does is a bit more complex. Basically, a 192-bit memory bus has three 64-bit memory controllers, and to get 1.5GB of vRAM you would attach 512MB to each 64-bit controller; nVidia instead adds an extra 512MB block to one of them. The catch with this "trick" is that not all of the memory runs at the full bus speed. Only the first 1.5GB of vRAM runs at ~144GB/s; the last 512MB block runs at only ~48GB/s, due to the asymmetrical design. This is present in other nVidia cards as well, such as the GTX 650Ti Boost, GTX 560 SE, GTX 550Ti and the GTX 460 v2. So hey, the more you know, right? Proof of this happening. I'd like to point out that the GTX 550Ti has a different layout from the 660/660Ti (and possibly the rest of the cards I've listed with mismatched memory bandwidth), so thanks to a helpful forum user, I've got a picture to better explain the difference right here. The article already linked does explain it, but not well enough in my eyes.

 

The good thing is that overclocking your memory will still improve your transfer rate even with such a memory configuration, since the memory clock applies across the whole bus. And as most benchmarks pointed out, the 660Ti was able to perform fairly similarly to the 670 in a lot of games, which shows that memory bandwidth is nowhere near as important for gaming as clock speeds are. But in the games where it DOES matter, even the weaker GTX 760 has been shown to pull ahead of the 660Ti by a little bit, according to the sheer number of people who tell me the 760 beats the 660Ti in some games but not all. Now I know the reason. Go figure, huh?

 

So what can I say here? If you see mismatched memory sizes to the memory bus from nVidia, now you know what it be. If you can, get matched memory =D.

 

 

And the GTX 970 gets its own section! Hooray!

 

  Reveal hidden contents

 

The GTX 970 is wired like Frankenstein's monster. Let me try to explain. You've no doubt seen multiple articles about how it has 4GB of memory, and people treating it like a 3.5GB card. That's wrong, and you shouldn't do that, but not for the reasons you're probably thinking. As I mentioned earlier in the guide, when you hit the maximum vRAM on a GPU, it starts to compress memory, and when it can compress no longer, it sends some of it into virtual memory, and so on. The problem is that when you hit the 3.5GB vRAM mark on the GTX 970, this does not happen. Because the card actually DOES have the extra vRAM, instead of compressing what's in the buffer, it attempts to make use of the ungodly slow last bit of vRAM. This causes the stuttering and slowdowns many people witnessed while playing. It would actually be better for everyone if nVidia used drivers or a vBIOS to lock off the final 512MB forever. Alternatively, they could tweak drivers to force all of Windows' memory requirements (Aero, multiple monitors, etc.) into the slow 512MB portion, give non-Windows-native applications access to the fast 3.5GB outright, AND block them from ever accessing the final 512MB, so that games start compressing memory at the 3584MB mark and eventually hunt for virtual memory, like they do at 3072MB on a GTX 780, for example. For some proof of what happens at the 3.5GB vRAM mark and how it differs from a 3GB card's limit, HERE is a comparison video of a GTX 780 running Dying Light next to a GTX 970 running Dying Light. In the video, if you set it to 720p60 or 1080p60, you can clearly see that the 970's footage is somewhat less smooth than the 780's, despite the card having more vRAM for the game to make use of.
The software also shows only 3.5GB of vRAM being used because, as nVidia said, normal programs can't see the last 512MB; it's in a different partition, so to speak. But it's clear the game is attempting to use more, and the transfer over the slow memory bus is the problem (I want to point out that since this video was released, the game was updated to use less video memory, and you won't see this bug happening anymore, as the game was unoptimized and had extra view distance at the time of that video). Please note: no other Maxwell card to date has this bug. The 950, 960, 960 OEM (192-bit mem bus, 3GB/6GB vRAM), 980, 980Ti, Titan X, 940M, 950M, 960M, 965M, 970M, 980M and mobile 980 ALL lack this flaw.

 

Also, please make note: the 970 does NOT have the same issue as the 660Ti and the other mismatched-memory cards. The 28GB/s slow vRAM portion is beyond slow; even the old GeForce 7800 had more bandwidth than that. Also, the rest of the card only has access to seven 32-bit memory controllers (unlike the 8 it should have), meaning the fast 3.5GB is actually on a 224-bit memory bus (notice it's falsely marketed as a 256-bit card?). I also do not know if the access bandwidth for the slow memory doubles in SLI, which would at LEAST alleviate the problem somewhat. The CUDA memory tester used to check the 970 does not test differently in SLI; the bandwidth does not change with SLI on or off in that test. So we have no real way of knowing (at least that I can test) whether SLI helps. At the very least, increasing the memory clock will help the slow vRAM portion of the card, but that is very little relief. In this case, I can only recommend the 970 to users who are CERTAIN of what they are going to play, and can say with 100% certainty that they will not approach that 3.5GB vRAM mark. If you know you will, a 980 (too expensive), 980Ti, R9 290/390, R9 290X/390X or R9 Fury would be a far better buy. If you're the kind of guy who's only gonna play BF4/BF Hardline and some older titles, then you'll be perfectly happy with a 970, as you'll likely never hit the vRAM bottleneck that only applies to it. Also, please note: FPS counts do NOT tell the whole story here. As shown in the video I linked, the 970 was actually getting higher FPS most of the time, even though the 780 had the smoother experience.
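To see why that partition layout matters, you can compute each partition's bandwidth from its controller count with the same formula as before. A sketch (the function name is mine; 7000MHz is the 970's stock effective GDDR5 clock):

```python
EFFECTIVE_MHZ = 7000  # GTX 970: 1750MHz base GDDR5, x4 = 7000MHz effective

def partition_gbs(n_controllers):
    """Bandwidth of a GTX 970 memory partition; each controller is 32 bits wide."""
    return n_controllers * 32 / 8 * EFFECTIVE_MHZ / 1000  # GB/s

print(partition_gbs(7))  # fast 3.5GB partition (224-bit): 196.0
print(partition_gbs(1))  # slow 0.5GB partition (32-bit):  28.0
```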

 

Also, according to an article by PCGamer that I was recently linked to, it seems that when the slow portion of vRAM is accessed, the rest of the memory on the 970 slows down too: the seventh 32-bit memory controller that runs in tandem with the others to make up the 224-bit bus has to act as a "middleman" for the slow portion, crippling the card's overall memory bandwidth. To quote the relevant part of the article (which was originally quoted from PCPer):

 

“if the 7th port is fully busy, and is getting twice as many requests as the other port, then the other six must be only half busy, to match with the 2:1 ratio. So the overall bandwidth would be roughly half of peak. This would cause dramatic underutilization and would prevent optimal performance and efficiency for the GPU.”

 

You can read the full article here, and that might explain other issues with accessing the slow portion with only a few games.

 

Also, a little addendum here. Many users have been telling me that most games don't use near 3.5GB of vRAM, etc. This is true... but as mentioned earlier in this guide, running a second monitor and/or windowed/borderless games, ESPECIALLY on Windows 8, 8.1 and 10, will also be competing for space in the vRAM buffer. If your OS is using 600MB before you even game and you load up a 3GB game in borderless windowed mode, you *WILL* encroach on the 3.5GB vRAM mark, potentially causing hitches and stuttering in your game without actually playing a "4GB game" or whatever. Single-monitor-only gamers need not worry as much, but multitasking gamers? Their worries are different and valid. I should know, because I am one of them myself. If I can help it, my games go borderless windowed. So a 970 would be a bad buy for me if I wanted to play games that use anywhere near 3GB of vRAM, given how much else Windows itself takes.

 

*NB - I previously recommended the 780 6GB here, but since devs seem to no longer code for the current + previous generation like they usually did, Maxwell's better tessellation engine and general driver improvements have the 970 pulling far enough ahead of the 780 for me to remove it from the direct recommendation. That leaves me in the awkward position of only being able to recommend stronger (and of course, more expensive) nVidia cards or equal/stronger AMD cards. Even if you could get a 780Ti on the cheap (which would mitigate the performance gap), that card was never made with a 6GB vRAM buffer, likely because the Titan Black, which is still $1000 OR MORE at the time of this writing, was essentially a 6GB 780Ti with a double-precision rendering block.

 

 

FAQ

 

  Reveal hidden contents

 

If I could afford it, should I get the version of the card with more vRAM?

YES. A lot of games lately are coming out using much, much more vRAM than is necessary, and someday soon they might actually use all that vRAM for a good purpose. When that time comes, you'll be glad you got the card with 4GB instead of 2GB a year ago =3. Look at me: I thought 4GB of vRAM was overkill for my 780Ms, until Ghosts, Titanfall, Wolfenstein, Watch Dogs... and more games are going to do the same thing. Then I found out about Arma 2's "default" setting for vRAM, and that Skyrim with my mods uses 2.7GB in fullscreen, etc. Eventually I realized I was lucky as hell to have 4GB on these cards, and how much people should go for higher-vRAM cards if they can. It really can't hurt to have more. So as long as the version with extra vRAM doesn't have another downside, go for it.

 

What about those dual-GPU cards like the R9 295X2 with 8GB or the GTX Titan Z with 12GB vRAM? Should I get those instead of two separate cards?

NO. Cards like that are marketed underhandedly. Dual-GPU cards are almost unanimously listed with the sum of the vRAM on each GPU. When you use them in their intended SLI/CrossFireX configurations, the vRAM data is copied across both GPUs, so you end up with 1/2 the listed vRAM in effect. The 295X2 is simply two 290X 4GB cards. The Titan Z is even worse; it's two downclocked GTX Titan Black 6GB cards. Most often, too, they carry an inexplicable markup in price (the Titan Z launched at $3000). NEVER buy them unless you know EXACTLY what you're doing (in which case you wouldn't be reading this guide). Don't let anybody you know buy them. Stab 'em if you have to. Do *NOT* let them waste that kind of cash. Even if you make the argument that you could buy two of them and get 4-way SLI/Xfire going on a board not normally built for 4-way card setups, the money you save ALONE from not buying them could likely get you an enthusiast Intel board and high-end CPU anyway.

 

I have a 4GB R9 290X, should I sell this and get the 8GB R9 390X?

I CAN'T SAY. Developers have calmed down from going overboard with the vRAM buffer lately (kind of)... nope, they're still at it. 4GB appears to have become the standard "minimum" amount of vRAM necessary for AAA titles (beyond "low" graphics, anyway), and it also seems to be a bit of a sweet spot, except in a handful of games. Shadow of Mordor wants 6GB for the texture quality and draw distance of a 2011 game, for example; Black Ops 3 does not allow "extra" texture quality without 6GB of vRAM or more; and it's easy to pass 4GB of vRAM in Assassin's Creed Syndicate (especially above 1080p). But 4GB cards play these mostly fine. I STILL recommend the highest amount of vRAM a card offers if you can afford it, but selling a 4GB card to buy the same one with a bigger vRAM buffer may not be worth the hassle (though the 390X is overclocked a bit on core and a lot on memory versus the 290X). This is case-by-case, and it might make more sense to simply buy a stronger card with lots of vRAM, like a 980Ti.

 

 

Windows 10 and nVidia 353.62

 

  Reveal hidden contents

 

Windows 10 has made changes under the hood, including some to the WDDM (Windows Display Driver Model). This has programs reporting vRAM usage/utilization incorrectly: 2GB GPUs appearing to use 2800MB of vRAM, benchmarks that use set amounts of vRAM appearing to use double, etc. If you're on Windows 10, as far as I know there is no way AS OF THIS WRITING to properly check the amount of used vRAM. A large reason why I don't know is that I refuse to use Windows 10 for another few months, mainly because of these kinds of stupid issues, so I haven't run a million and one programs to see if things line up with how they were on Win 8.1. If anyone knows a way to accurately get the readings, let me know in the thread and I'll amend it here.

 

Apparently, Windows 10 and the 353.62 nVidia drivers are stealing vRAM from GPUs. In particular, about 1/8 of the vRAM seems to be reserved and not usable by the system for whatever reason. A user on NBR reported his 980M only having 3.5GB of vRAM available, and after some further checking, it seems all cards exhibit this behaviour, even when using CUDA programs to check how much memory is available to the system. HERE is the post with some proof. If you have a relatively low-vRAM card and are suddenly running into stuttering in games that use just about the amount of vRAM you have (let's say... BF4 on a 2GB GPU), then here's your problem. =D

 

I am not aware of whether AMD GPUs have vRAM reserved like this, or whether it's a DX12 thing rather than an nVidia driver thing, but if ANYONE can check for me (I don't remember exactly how you'd check on AMD cards, since you don't have access to CUDA, but there is a way I've seen before) and report it, that would be fantastic. Also, I'm not sure if AMD cards experience the vRAM reporting bug, but I believe they might. Again, anyone who can check, let me know! Most of NBR uses nVidia because AMD has no laptop GPUs worth even remembering the existence of these days, so I haven't had much chance to ask people using AMD cards.

 

Finally, Windows 10 and DirectX 12 ARE NOT GIVING YOU EXTRA VIDEO RAM. Some users have noticed their GPUs suddenly showing a large amount of shared memory and assumed this is DX12. No, it is not. Windows has done this as far back as Vista, and it doesn't really seem to do anything in practice. People are blowing this whole DX12 thing out of proportion. Here's my Windows 8.1 system showing the same thing.

 

When these Windows 10 issues are CONFIRMED resolved, I'll remove this section from the guide. Since I'm not using Windows 10, I'll need people to report to me.

 

 

Final tidbits and stuff

 

  Reveal hidden contents

 

A lot of games are going to come out soon using a whole lot more vRAM. They're probably not gonna look 300% better even though they use 300% of the vRAM, but eventually 2GB is just gonna be a bare minimum requirement for video cards. I highly suggest getting the most vRAM you can and just... leaving it there. You're probably not like me, who uses PlayClaw 5 with system info overlays all the time so I know which game uses how much memory, how much GPU usage, how hot my parts are, etc., so you probably won't be as interested in GPU memory usage as I am. But it's better to have it when the need arises than to not have it. People with 2GB GPUs can't even play Wolfenstein with ultra texture size; the option doesn't even appear in the menu unless you have 3GB or more memory. Call of Duty: Black Ops 3 limits 2GB vRAM users to "medium" textures, and 4GB card users can only use "high"... you need 6GB or above to even access "extra" texture quality.

 

As for memory bandwidth? It's not that important, really, at least not for games. Most games don't really care; they're more interested in core processing power. Of course, higher-resolution textures will eventually need more memory bandwidth to load quickly enough, but we're not quite at that point yet, as far as I've seen. Hell, some games will max out your memory controller while using small amounts of vRAM, because they were designed in a time when vRAM was scarce, so they abuse the speed of the memory; Crysis 1, for example. So while more bandwidth is always better, don't kill your wallet for it. If there's a better, stronger card out there with more vRAM but less memory bandwidth, go for it, unless you want the other one for a specific reason. If you're wondering why there'd be a stronger card with more vRAM and less memory bandwidth for the same or a cheaper price, remember that nVidia and AMD make different cards.

 

 

I started writing this guide mainly for the top section, to counter misinformation people seem to have regarding vRAM and its relation to memory bandwidth, but I figured I might as well go the full mile and explain as best I can what most people need to know about GPU memory. If I've somehow screwed up somewhere, let me know. I probably have. I'll fix whatever I get wrong. And thank you to everyone who has contributed and corrected things I didn't get right! Unlike my SLI guide, much of the information here was confirmed post-writing.

 

If you want the SLI information or the mobile i7 CPU information guide, they're in my sig!

 

Moderator note: If you believe any information found in this guide is incorrect, please message me or D2ultima and we will investigate it, thank you. - Godlygamer23

so many colours

 


On 8/1/2016 at 6:37 PM, Bhav said:

Soon I will be enjoying the extra brilliance of +2.5 Gb when I can play with my GTX 980 Tis.

 

 

(Awaiting replacement mobo). 

Lucky.....


On 9/6/2016 at 7:38 AM, JaffaTheCat said:

this explained alot

On 9/6/2016 at 3:02 PM, IwannaWant said:

Very nice, thank you.

19 hours ago, DudeRanchBrosta said:

That was really helpful!

1 hour ago, GettinBissi said:

Excellent post, thank you D2ultima!

No problem, you're welcome

I have finally moved to a desktop. Also my guides are outdated as hell.

 

THE INFORMATION GUIDES: SLI INFORMATION || vRAM INFORMATION || MOBILE i7 CPU INFORMATION || Maybe more someday


  • 1 month later...

This probably sounds like a dumb question, but I'm just starting out with picking parts that're compatible with each other (according to pcpartspicker.com), and I noticed something about the specs for the video card I chose. It says that it has 8GB of GDDR5 memory, but the motherboard I chose supports DDR4, and the RAM sticks are also DDR4. Will this video card work with the setup I chose, or do I need to pick another video card? Or am I just worrying about nothing?


Just now, PlatonicRhino31 said:

This probably sounds like a dumb question, but I'm just starting out with picking parts that're compatible with each other (according to pcpartspicker.com), and I noticed something about the specs for the video card I chose. It says that it has 8GB of GDDR5 memory, but the motherboard I chose supports DDR4, and the RAM sticks are also DDR4. Will this video card work with the setup I chose, or do I need to pick another video card? Or am I just worrying about nothing?

RAM is for your motherboard. vRAM is on your card and doesn't matter to the motherboard.



Just now, D2ultima said:

RAM is for your motherboard. vRAM is on your card and doesn't matter to the motherboard.

Phew! Thanks! I thought i was screwed for a second there. 


  • 1 month later...

I wanna ask something: what is the difference between the GTX 1060 3GB variant for laptops and the GTX 1060 6GB for laptops? Is there any difference between them? How do they compare in benchmarks, especially in gaming? Thanks ^^


4 hours ago, Nagisa Shiota said:

I wanna ask something: what is the difference between the GTX 1060 3GB variant for laptops and the GTX 1060 6GB for laptops? Is there any difference between them? How do they compare in benchmarks, especially in gaming? Thanks ^^

I haven't seen a 3GB 1060 for notebooks, but if it exists, it probably has a similar performance drop to the desktop cards.



  • 1 month later...
On 12/7/2016 at 0:16 AM, D2ultima said:

I haven't seen a 3GB 1060 for notebooks, but if it exists, it probably has a similar performance drop to the desktop cards.

please fix post for dark theme.



47 minutes ago, AluminiumTech said:

please fix post for dark theme.

I... have. The text is on "automatic" colour. Literally any non-coloured text is on "automatic" colour. The forum broke when they updated to IPB 4.

 

I would need to completely start a new post and write the entire guide from scratch if I wanted to make it properly work with Dark Theme.

 

And I DO mean "write the guide from scratch". I can't copy. I've wiped this guide entirely and "copied" from a tab already. It still breaks. I can't even do nested spoilers anymore. Hell, look at my last edit reason on the main post.



  • 3 weeks later...
On 2/6/2017 at 10:33 AM, D2ultima said:

I... have. The text is on "automatic" colour. Literally any non-coloured text is on "automatic" colour. The forum broke when they updated to IPB 4.

 

I would need to completely start a new post and write the entire guide from scratch if I wanted to make it properly work with Dark Theme.

 

And I DO mean "write the guide from scratch". I can't copy. I've wiped this guide entirely and "copied" from a tab already. It still breaks. I can't even do nested spoilers anymore. Hell, look at my last edit reason on the main post.

Plz try to fix for dark mode tho

