D2ultima

The video RAM information guide

Recommended Posts

On 3/10/2019 at 4:30 AM, D2ultima said:

Read it a few times; I think we're talking about the same thing.

 

Well, like I said, it's technically correct.

The issue I have is that it might be somewhat misleading to just add up the bandwidth - kind of like just adding up VRAM. You can't do either; that's not how it works. A card's performance is a card's performance. You can sort of add up the results in SLI, but if a card has some issue, SLI won't fix it.
Anyway, that's just nitpicking on my part. I learned quite a bit here and will definitely be redirecting people with questions this way. 👍

Posted · Original Poster (OP)
18 hours ago, Sett said:

Well, like I said, it's technically correct.

The issue I have is that it might be somewhat misleading to just add up the bandwidth - kind of like just adding up VRAM. You can't do either; that's not how it works. A card's performance is a card's performance. You can sort of add up the results in SLI, but if a card has some issue, SLI won't fix it.
Anyway, that's just nitpicking on my part. I learned quite a bit here and will definitely be redirecting people with questions this way. 👍

Haha, that's all fine; I'm good with the technicalities. There's often a line where you simply CANNOT explain something in more depth or people just won't get it, and this would be one of those things. But yeah, it doesn't make one card faster at all.
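To put rough numbers on why adding up bandwidth (or VRAM) across SLI cards is misleading, here is a minimal sketch with assumed figures (the 8GB, 320GB/s, 60fps, and 0.85 AFR scaling numbers are illustrative, not measurements): in alternate-frame rendering each GPU works only out of its own memory, so frame throughput roughly adds while the resources any single frame sees do not.

```python
# Rough illustration (assumed round numbers, not benchmark data):
# two identical cards in AFR SLI roughly increase frame throughput,
# but each individual frame still sees only one card's VRAM and bandwidth.

vram_per_card_gb = 8          # hypothetical 8 GB card
bandwidth_per_card_gbps = 320 # GB/s, hypothetical
fps_single_card = 60          # hypothetical single-card result
sli_scaling = 0.85            # assumed AFR scaling factor, game-dependent

fps_sli = fps_single_card * (1 + sli_scaling)

print(f"Throughput: ~{fps_single_card} fps -> ~{fps_sli:.0f} fps with two cards")
print(f"VRAM visible to any one frame: {vram_per_card_gb} GB (not {2 * vram_per_card_gb} GB)")
print(f"Bandwidth feeding any one frame: {bandwidth_per_card_gbps} GB/s "
      f"(not {2 * bandwidth_per_card_gbps} GB/s)")
```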



We do not need 16GB HBM2 for 1080p? O3O


Posted · Original Poster (OP)
12 hours ago, Nena360 said:

We do not need 16GB HBM2 for 1080p? O3O

Give it another 2 years; with the new consoles, 8GB will soon become the minimum for 1080p or something.



One thing I've been looking at recently is whether or not VRAM usage as reported by utilities like GPU-Z is an accurate representation of what is required. That is, if you run a game and GPU-Z says 5GB is being used, does that mean the game requires 5GB of VRAM for whatever settings you are using, and that having less results in memory-swapping issues? As much research as I can find online suggests that games may not actually use all of the RAM that's reported as "used"; rather, the game requested to reserve that space, and the reporting mechanism simply counts it as used.

 

The thing that made me start to question the accuracy of VRAM reporting was this: https://www.techspot.com/article/1600-far-cry-5-benchmarks/

 

The tl;dr is that Far Cry 5 reported using up to 4GB of VRAM at 4K, so you'd expect a card with less memory to perform horribly. Yet the author states that a card with 3GB of VRAM (notably the 1060 3GB, since it's easy to compare to the 6GB version) wasn't a stuttery mess at 1440p and 4K, the resolutions where the game claimed to use more VRAM than the card had. What's even stranger is that on a 2GB card like the GT 1030, performance didn't suffer beyond what was expected and more or less kept a linear drop at higher resolutions.

 

The website also ran a test with the RTX 2060, a 6GB VRAM card: https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/

 

They picked the games that reported using more VRAM on the RTX 2070 (an 8GB card) than the RTX 2060 has. They also found that the RTX 2060 can handle those games, reporting no issues with stuttering or the like.

 

So there appears to be a gap between how much video RAM a game requests and how much it actually needs before performance is negatively affected.
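One way to see exactly what such tools report is to poll the driver directly. The minimal sketch below assumes an NVIDIA card with nvidia-smi on the PATH; like GPU-Z, it can only show driver-side allocation, not how much of that allocation the game actually touches each frame.

```python
# Minimal VRAM-usage logger (sketch). Assumes an NVIDIA GPU and that
# nvidia-smi is on the PATH; it reports driver-side allocation, which is
# the same "used" figure tools like GPU-Z show - not true working-set size.
import subprocess
import time

def used_vram_mib() -> int:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used",
        "--format=csv,noheader,nounits",
    ], text=True)
    return int(out.splitlines()[0].strip())  # first GPU only

if __name__ == "__main__":
    peak = 0
    try:
        while True:
            used = used_vram_mib()
            peak = max(peak, used)
            print(f"used: {used} MiB (peak: {peak} MiB)")
            time.sleep(2)  # sample every 2 seconds while the game runs
    except KeyboardInterrupt:
        print(f"Peak reported allocation: {peak} MiB")
```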

Posted · Original Poster (OP)
On 4/9/2019 at 5:43 PM, Mira Yurizaki said:

One thing I've been looking at recently is whether or not VRAM usage as reported by utilities like GPU-Z is an accurate representation of what is required. That is, if you run a game and GPU-Z says 5GB is being used, does that mean the game requires 5GB of VRAM for whatever settings you are using, and that having less results in memory-swapping issues? As much research as I can find online suggests that games may not actually use all of the RAM that's reported as "used"; rather, the game requested to reserve that space, and the reporting mechanism simply counts it as used.

 

The thing that made me start to question the accuracy of VRAM reporting was this: https://www.techspot.com/article/1600-far-cry-5-benchmarks/

 

The tl;dr is that Far Cry 5 reported using up to 4GB of VRAM at 4K, so you'd expect a card with less memory to perform horribly. Yet the author states that a card with 3GB of VRAM (notably the 1060 3GB, since it's easy to compare to the 6GB version) wasn't a stuttery mess at 1440p and 4K, the resolutions where the game claimed to use more VRAM than the card had. What's even stranger is that on a 2GB card like the GT 1030, performance didn't suffer beyond what was expected and more or less kept a linear drop at higher resolutions.

 

The website also ran a test with the RTX 2060, a 6GB VRAM card: https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/

 

They picked the games that reported using more VRAM on the RTX 2070 (an 8GB card) than the RTX 2060 has. They also found that the RTX 2060 can handle those games, reporting no issues with stuttering or the like.

 

So there appears to be a gap between how much video RAM a game requests and how much it actually needs before performance is negatively affected.

Okay, so, it's a bit convoluted. But basically, it's similar to how Windows uses more RAM the more you have in your system.

 

It's caching, really. But the problem with testing high resolutions on fairly weak video cards like the 1060 3GB is that if your GPU's fps is already lower than what the vRAM choke would bring it down to, you'd see no change. The best way to test is with low-resolution games that have high memory requirements - games that can eat up craptons of vRAM at low resolutions... for example, Killing Floor 2 with texture streaming disabled. That will DEVOUR vRAM like nothing else, up to and over 7GB at 1080p. On a 2GB or 3GB card, compared to higher-vRAM variants, the difference should be visible. Try an RX 580 4GB vs 8GB, or a 980M 4GB vs 8GB, or a 1050 2GB vs 4GB, etc., and see if performance differs between them, given that they perform similarly when texture streaming is enabled.

 

Basically, it's hard to test. But high resolutions are NOT the way to test properly, because the GPU needs to have room to flex before it can be limited by vRAM. Also, it may not be stutters all the time; it could be pop-in issues and other such things. For example, if you've ever managed to drive fast enough for long enough in a GTA game, eventually only a basic road texture loads and no people, cars, or buildings take form, because assets can't be streamed in quickly enough due to the way the engine is designed (GTA: SA and GTA 4 do this reliably).
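If someone wants to run that kind of A/B comparison, one way to make the hitching measurable is to log frame times on each card and compare the average against the 1% lows, since VRAM starvation tends to hurt the lows first. The sketch below assumes a PresentMon-style CSV with an MsBetweenPresents column; the column name and file names are assumptions, so adjust them for whatever capture tool is used.

```python
# Frame-time comparison sketch. Assumptions: PresentMon-style CSV logs with
# an "MsBetweenPresents" column; the file names below are placeholders.
import csv
import statistics

def load_frametimes_ms(path: str, column: str = "MsBetweenPresents") -> list[float]:
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f) if row.get(column)]

def summarize(frametimes_ms: list[float]) -> dict:
    ordered = sorted(frametimes_ms)                  # slowest frames at the end
    worst_1pct = ordered[int(len(ordered) * 0.99):]  # the worst 1% of frames
    return {
        "avg_fps": 1000.0 / statistics.mean(frametimes_ms),
        "1pct_low_fps": 1000.0 / statistics.mean(worst_1pct),
    }

if __name__ == "__main__":
    for label, path in [("4GB card", "card_4gb.csv"), ("8GB card", "card_8gb.csv")]:
        stats = summarize(load_frametimes_ms(path))
        print(f"{label}: {stats['avg_fps']:.1f} avg fps, "
              f"{stats['1pct_low_fps']:.1f} fps 1% low")
```

If the averages match but the 1% lows collapse on the smaller card, that is the hitching showing up in numbers rather than in feel.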


16 minutes ago, D2ultima said:

It's caching, really. But the problem with testing high resolutions on fairly weak video cards like the 1060 3GB is that if your GPU's fps is already lower than what the vRAM choke would bring it down to, you'd see no change. The best way to test is with low-resolution games that have high memory requirements - games that can eat up craptons of vRAM at low resolutions... for example, Killing Floor 2 with texture streaming disabled. That will DEVOUR vRAM like nothing else, up to and over 7GB at 1080p. On a 2GB or 3GB card, compared to higher-vRAM variants, the difference should be visible. Try an RX 580 4GB vs 8GB, or a 980M 4GB vs 8GB, or a 1050 2GB vs 4GB, etc., and see if performance differs between them, given that they perform similarly when texture streaming is enabled.

I don't see how a lower resolution would mean anything other than freeing up VRAM. Though I can see how disabling texture streaming would, except that isn't a feature you can toggle in most games these days.

 

16 minutes ago, D2ultima said:

Basically, it's hard to test. But high resolutions are NOT the way to test properly, because the GPU needs to have room to flex before it can be limited by vRAM. Also, it may not be stutters all the time; it could be pop-in issues and other such things. For example, if you've ever managed to drive fast enough for long enough in a GTA game, eventually only a basic road texture loads and no people, cars, or buildings take form, because assets can't be streamed in quickly enough due to the way the engine is designed (GTA: SA and GTA 4 do this reliably).

Either way, this is telling me that whatever the game or tools report as VRAM consumption doesn't exactly correlate with how it affects overall performance. All this tells me is that even if a game wants, say, 7GB and you only have 6GB, it won't result in appreciably degraded performance unless you're severely starving the game of VRAM in some other way.

Posted · Original Poster (OP)
23 hours ago, Mira Yurizaki said:

I don't see how a lower resolution would mean anything other than freeing up VRAM. Though I can see how disabling texture streaming would, except that isn't a feature you can toggle in most games these days.

 

Either way, this is telling me that whatever the game or tools report as VRAM consumption doesn't exactly correlate with how it affects overall performance. All this tells me is that even if a game wants, say, 7GB and you only have 6GB, it won't result in appreciably degraded performance unless you're severely starving the game of VRAM in some other way.

A low render resolution (fewer pushed pixels) will reduce vRAM usage A LITTLE but improve performance a LOT. If your GPU is producing 30fps, then a vRAM choke is probably not going to show up; if your game is getting 100fps, it's more likely you'll notice the hitching. Raising texture resolution, shadow map resolution, etc. can vastly increase vRAM requirements without increasing the graphical power required to run the game. This is why a low render resolution plus high-resolution assets is the way to check what vRAM limits do to games. If you're already at a VERY hard GPU bottleneck, then vRAM downsides won't even need to show themselves, because the time between each frame is long enough to get over the lapse in readily available assets in memory.

 

And yes, what a game says it's using isn't what it *needs*, but it can only improve your experience to let it have more. I still feel 4GB is the bare minimum though, and 6GB is a comfortable floor, with 8GB being nice and 12GB or more being a good top-out. Also remember that things like using NVENC add vRAM usage, whether through OBS or ShadowPlay, etc.
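To put rough numbers on why render resolution moves vRAM usage only a little while asset quality moves it a lot, here is a back-of-the-envelope sketch; the 4 bytes per pixel, three render targets, ~1.33x mipmap overhead, and 150-texture count are simplifying assumptions, not engine data.

```python
# Back-of-the-envelope VRAM estimate (sketch, simplified assumptions:
# 32-bit colour, uncompressed textures, ~1.33x mipmap overhead).
MIB = 1024 * 1024

def framebuffer_mib(width: int, height: int, buffers: int = 3) -> float:
    """Colour/depth/etc. render targets: scale with render resolution."""
    return width * height * 4 * buffers / MIB

def texture_mib(size: int, count: int) -> float:
    """Resident textures: scale with asset resolution and count, not render resolution."""
    return size * size * 4 * 1.33 * count / MIB

for label, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
    print(f"{label:5s} render targets: ~{framebuffer_mib(w, h):6.0f} MiB")

for label, (size, count) in [("2K textures", (2048, 150)), ("4K textures", (4096, 150))]:
    print(f"{label}: ~{texture_mib(size, count) / 1024:4.1f} GiB")
```

Dropping from 4K to 1080p render resolution frees tens of megabytes of render targets; dropping texture quality frees gigabytes, which is why low resolution plus maxed assets isolates the VRAM limit.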


On 4/15/2019 at 3:06 AM, D2ultima said:

A low render resolution (fewer pushed pixels) will reduce vRAM usage A LITTLE but improve performance a LOT. If your GPU is producing 30fps, then a vRAM choke is probably not going to show up; if your game is getting 100fps, it's more likely you'll notice the hitching. Raising texture resolution, shadow map resolution, etc. can vastly increase vRAM requirements without increasing the graphical power required to run the game. This is why a low render resolution plus high-resolution assets is the way to check what vRAM limits do to games. If you're already at a VERY hard GPU bottleneck, then vRAM downsides won't even need to show themselves, because the time between each frame is long enough to get over the lapse in readily available assets in memory.

 

And yes, what a game says it's using isn't what it *needs*, but it can only improve your experience to let it have more. I still feel 4GB is the bare minimum though, and 6GB is a comfortable floor, with 8GB being nice and 12GB or more being a good top-out. Also remember that things like using NVENC add vRAM usage, whether through OBS or ShadowPlay, etc.

Well, I have some confusion about this. There is a video where both the CPU and GPU are not being utilized to the max because of RAM. Is this the fault of the RAM not being fast enough to read from the drives and display it? Or is this off topic? To put it simply, faster gives more performance; what I want to know is why.

Posted · Original Poster (OP)
3 hours ago, Oalei said:

Well, I have some confusion about this. There is a video where both the CPU and GPU are not being utilized to the max because of RAM. Is this the fault of the RAM not being fast enough to read from the drives and display it? Or is this off topic? To put it simply, faster gives more performance; what I want to know is why.

You're going to need to explain yourself more; I don't get what you're asking. It sounds like you're talking about system RAM, which is not the point of this topic, and while I could write a fairly competent RAM guide, it would not really go anywhere.


2 hours ago, D2ultima said:

You're going to need to explain yourself more; I don't get what you're asking. It sounds like you're talking about system RAM, which is not the point of this topic, and while I could write a fairly competent RAM guide, it would not really go anywhere.

OK, so there are some benchmarks with 2133MHz and 3000MHz RAM. The 2133MHz run is only showing about 70% CPU and 60% GPU usage, while the 3000MHz run is showing 90-100% CPU and 90% GPU. Well, maybe the game is the problem here.

Posted · Original Poster (OP)
8 hours ago, Oalei said:

OK, so there are some benchmarks with 2133MHz and 3000MHz RAM. The 2133MHz run is only showing about 70% CPU and 60% GPU usage, while the 3000MHz run is showing 90-100% CPU and 90% GPU. Well, maybe the game is the problem here.

This is the video RAM guide, as in memory on your video card.

 

You are talking about system RAM. And yes, system RAM is a big factor in CPU performance and can be a massive bottleneck, but this isn't the thread for it.


3 minutes ago, D2ultima said:

This is the video RAM guide, as in memory on your video card.

 

You are talking about system RAM. And yes, system RAM is a big factor in CPU performance and can be a massive bottleneck, but this isn't the thread for it.

Well, sorry for wasting your time, honestly.

 

Well, for vRAM I think I've got a grasp of it now. I literally read all of the comments and they're giving me more knowledge (I'm just too lazy to read the stuff you wrote). I hope you will update it if you find something that needs to be changed.

Posted · Original Poster (OP)
57 minutes ago, Oalei said:

Well, sorry for wasting your time, honestly.

 

Well, for vRAM I think I've got a grasp of it now. I literally read all of the comments and they're giving me more knowledge (I'm just too lazy to read the stuff you wrote). I hope you will update it if you find something that needs to be changed.

I have things I could change and add, mainly adding GDDR5X/GDDR6/etc., but this thread is broken because it was written on IPB v3 and the IPB v4 transition broke the article for good. I would have to start an entirely new thread and copy everything over manually for it to save correctly.


19 minutes ago, D2ultima said:

I have things I could change and add, mainly adding GDDR5X/GDDR6/etc., but this thread is broken because it was written on IPB v3 and the IPB v4 transition broke the article for good. I would have to start an entirely new thread and copy everything over manually for it to save correctly.

Yeah, I saw that you complained about the new update. It would be very annoying to redo it all; even just reading it all is a lot, so who would redo it if they didn't need or want to? Well, I'll still wait for a new topic, maybe a v2, since this one just broke.

On 8/16/2014 at 10:52 AM, AlwaysFSX said:

I'm serious, I can't read text that's blue.

I started writing this guy mainly for the top section, to denounce misinformation people seem to have regarding vRAM and its relation to memory bandwidth, but I figured I might as well just go the full mile and explain as best I can about what most people need to know about GPU memory anyway. If I've somehow screwed up somewhere, let me know. I probably have. I'll fix whatever I get wrong. And thank you to everyone who has contributed and corrected things I didn't get right! Unlike my SLI guide, much of the information here was confirmed post-writing.

2 hours ago, ALIBENWA said:

I started writing this guy mainly for the top section, to denounce misinformation people seem to have regarding vRAM and its relation to memory bandwidth, but I figured I might as well just go the full mile and explain as best I can about what most people need to know about GPU memory anyway. If I've somehow screwed up somewhere, let me know. I probably have. I'll fix whatever I get wrong. And thank you to everyone who has contributed and corrected things I didn't get right! Unlike my SLI guide, much of the information here was confirmed post-writing.

Why do my posts always get picked for necros


5 minutes ago, AlwaysFSX said:

Why do my posts always get picked for necros

cOnTeXT



Looking for some advice here related to GPUs.

I need to assemble several PCs as part of an education program, and I'm bound to a low budget, like 300-450 bucks each.

I'm focused on FM2/FM2+ iGPUs (2 and 4 cores) (corporate donations).

Already got 8GB of 1600MHz RAM for each (donation).

The PCs will be used for office, web, intranet, and 1080p/30 streaming like YouTube.

Do I need to look for a dedicated GPU, even a cheap one?


This is a good guide, OP. You should do another one on video card hardware: the GPU itself, shaders, the supposed multiple cores (which are just shader units?), compute capabilities.

For example, why do so many video cards have around the same clock speed but a huge difference in performance?

Posted · Original Poster (OP)
6 hours ago, blaze909 said:

Looking for some advice here related to GPUs.

I need to assemble several PCs as part of an education program, and I'm bound to a low budget, like 300-450 bucks each.

I'm focused on FM2/FM2+ iGPUs (2 and 4 cores) (corporate donations).

Already got 8GB of 1600MHz RAM for each (donation).

The PCs will be used for office, web, intranet, and 1080p/30 streaming like YouTube.

Do I need to look for a dedicated GPU, even a cheap one?

iGPUs should be perfectly fine.

5 hours ago, masethekiller said:

This is a good guide, OP. You should do another one on video card hardware: the GPU itself, shaders, the supposed multiple cores (which are just shader units?), compute capabilities.

For example, why do so many video cards have around the same clock speed but a huge difference in performance?

I'm not sure what specifically I could do; I don't know enough about that, since each architecture is very different and it's hard to quantify changes or specify when they will bring improvements. I could make general statements, like how the backend barely changed between Maxwell and Pascal but Turing is significantly faster in a lot of aspects that don't show up in tests like Fire Strike, and how, as more games take advantage of that hardware, Turing cards will pull far ahead of their Pascal counterparts (like how a 2070 Super > 1080 Ti in COD: MW, Superposition, and Apex Legends but sticks near a 1080 in Sekiro or something), but that would just be too many things I can't prove.


On 8/16/2014 at 3:53 AM, jmaster299 said:

So yes, you can stick 8192MB of VRAM on a 128-bit bus, but you will never in a hundred years achieve a memory clock rate capable of using all that VRAM on such a small bus.

A bit off topic, but this comment is rapidly approaching being proven false with the RX 5500. (It will have a 128-bit bus and a version with 8GB of GDDR6; I do believe there will be games able to use it all, and it's only been 5 years.)
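For reference, the bandwidth half of that argument falls straight out of bus width and effective data rate: bandwidth = (bus width in bits / 8) x transfer rate. A minimal sketch, where the 14Gbps GDDR6 figure for an RX 5500-class card and the other configurations are assumptions rather than exact product specs:

```python
# Memory bandwidth from bus width and effective data rate (sketch).
def bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * data_rate_gtps

# Assumed example configurations (not exact product specs):
print(bandwidth_gbps(128, 14.0))  # 128-bit bus, 14 Gbps GDDR6  -> 224.0 GB/s
print(bandwidth_gbps(256, 7.0))   # 256-bit bus, 7 Gbps GDDR5   -> 224.0 GB/s
print(bandwidth_gbps(384, 11.0))  # 384-bit bus, 11 Gbps GDDR5X -> 528.0 GB/s
```

The point is that a narrow bus with fast memory can match a wide bus with slow memory, so bus width alone says nothing about whether a given capacity can be kept fed.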


Posted · Original Poster (OP)
On 10/13/2019 at 11:36 AM, BTGbullseye said:

A bit off topic, but this comment is rapidly approaching being proven false with the RX 5500. (It will have a 128-bit bus and a version with 8GB of GDDR6; I do believe there will be games able to use it all, and it's only been 5 years.)

That comment was false from the start


On 10/14/2019 at 11:41 AM, D2ultima said:

That comment was false from the start

Not entirely...


Posted · Original Poster (OP)
On 10/15/2019 at 6:26 PM, BTGbullseye said:

Not entirely...

Nope, it was always false, 100%. It was never not false. Filling a memory buffer is never impossible. Whether the card's memory bandwidth is useful in the situations likely to fill that buffer (large shadow map resolutions, high resolutions and multisampling techniques, etc.) is another story. But filling it was never impossible.
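A quick way to see the capacity-versus-bandwidth split in numbers: at a given bandwidth there is only so much memory the GPU can even touch within one frame, while nothing stops the rest from sitting resident until it is needed. A minimal sketch, reusing the assumed 224GB/s figure from the earlier example:

```python
# Capacity vs. bandwidth sketch: how much VRAM can even be read in one frame?
# Assumes the 224 GB/s figure from the earlier example and ideal streaming.
bandwidth_gb_per_s = 224.0
vram_gb = 8.0

for fps in (30, 60, 144):
    frame_time_s = 1.0 / fps
    readable_gb = bandwidth_gb_per_s * frame_time_s  # upper bound on bytes touched per frame
    print(f"{fps:3d} fps: at most ~{readable_gb:.1f} GB touched per frame "
          f"of the {vram_gb:.0f} GB that can sit resident")
```

So a 128-bit card can absolutely fill 8GB with resident assets; whether it can also stream and sample enough of them per frame at the settings that fill it is the separate bandwidth question.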



Create an account or sign in to comment

You need to be a member in order to leave a comment

Create an account

Sign up for a new account in our community. It's easy!

Register a new account

Sign in

Already have an account? Sign in here.

Sign In Now


×