Regarding the recent Tech Quickie

42 minutes ago, Enderman said:

What comments?

All I said was that faster RAM isn't worth it if it costs a lot more, but it is if it's only a little more.

Like, what's wrong with what I said? Are you upset for no reason? Having a bad day?

I can't be arsed to find them. Also, faster RAM is almost always a good investment; a 4-10 FPS improvement for $15 is easily worth it. I'm not upset. From what I've seen in the past, you don't seem to put much thought into your posts, and I was calling you out on that.

Pixelbook Go i5 | Pixel 4 XL

3 minutes ago, Citadelen said:

I can't be arsed to find them. Also, faster RAM is almost always a good investment; a 4-10 FPS improvement for $15 is easily worth it. I'm not upset. From what I've seen in the past, you don't seem to put much thought into your posts, and I was calling you out on that.

I never said it wasn't $15.

I said that if it's a lot more, like a hundred, then it's not.

Because, FYI, there are many people buying 2800MHz Dominator Platinum RAM for hundreds of dollars more than regular RAM because they think it's better.

NEW PC build: Blank Heaven (minimalist white and black PC) | Old S340 build log "White Heaven" | The "LIGHTCANON" flashlight build log | Project AntiRoll (prototype) | Custom speaker project


Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


14 minutes ago, Enderman said:

-snip-

If people are building extreme PCs then it doesn't matter, as they'll still get better performance. This is more aimed at DDR4 by the way, it's cheaper for higher clocks.


6 hours ago, ONOTech said:

Umm..?

[five benchmark screenshots]

6-10 FPS, which is still pretty minimal under certain circumstances, but it's definitely not as small as you claim.

Those benchmarks pair a Titan X with an i3, so the GPU is bottlenecked by the CPU. That's why there is a big difference. The people who buy i3s probably can't afford a Titan X. If that were an i5, the difference would be 2 FPS.

LOOK AT MY NEW FLAG DESIGNS FOR PA AND VOTE ON YOUR FAVORITE

LOOK AT MY FIRST BATCH OF DESIGNS HERE


4690K @ 4.5GHz

GTX 970 FTW

MSI Z97 PC MATE

Define R5 windowed

Cooler Master Seidon 240m

EVGA SuperNOVA 650 G1

Kingston 120gb SSD

SanDisk 480Gb SSD

Seagate 1Tb Hard drive


1 minute ago, SuperCookie78 said:

Those benchmarks pair a Titan X with an i3, so the GPU is bottlenecked by the CPU. That's why there is a big difference. The people who buy i3s probably can't afford a Titan X. If that were an i5, the difference would be 2 FPS.

My Core i5-6600T clocked at 4.4GHz sees a 10% improvement in minimum FPS when paired with a GTX 770. I'm willing to bet that an i3 will see the same benefit when paired with a comparable GPU.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


40 minutes ago, MageTank said:

My Core i5-6600T clocked at 4.4GHz sees a 10% improvement in minimum FPS when paired with a GTX 770. I'm willing to bet that an i3 will see the same benefit when paired with a comparable GPU.

A 10% improvement from doing what? 10% over what baseline, and in what game? And what about average FPS? Minimum is important, but so is average.



1 minute ago, SuperCookie78 said:

A 10% improvement from doing what? 10% over what baseline, and in what game? And what about average FPS? Minimum is important, but so is average.

A 10% improvement in minimum FPS when gaming. The games are Tomb Raider, Thief, AC:S, Fallout 4, etc. The difference, and the specific RAM speed that gives said difference, depends on the game itself. For Tomb Raider and Thief, 2800MHz was where diminishing returns kicked in, where more speed simply stopped giving extra performance. For Fallout 4, it scaled to 3200MHz, but made no difference when I went to 3500MHz. For AC:S, I couldn't see much of a difference beyond 2666MHz; 2800MHz actually took a small step backwards (by 1-2 FPS) but went a little faster at 3000MHz. I chalk this up to slight margin of error when benching, as the main result is still 9-10 FPS higher than stock memory speed across the board.

 

As for average FPS, I do not see it as being important, not even in the slightest. Minimum FPS is what determines the worst-case scenario; getting that number as high as possible is what will actually improve your play. That being said, I keep V-Sync on at all times, so I can't see anything beyond 60. If you need me to run tests with it off, I can provide that information tomorrow. Either way, I did not notice much of a difference in average FPS in the tests in my signature when using a G4400 and GTX 770. I doubt I will notice much of a difference with a much stronger i5.



44 minutes ago, MageTank said:

A 10% improvement in minimum FPS when gaming. The games are Tomb Raider, Thief, AC:S, Fallout 4, etc. The difference, and the specific RAM speed that gives said difference, depends on the game itself. For Tomb Raider and Thief, 2800MHz was where diminishing returns kicked in, where more speed simply stopped giving extra performance. For Fallout 4, it scaled to 3200MHz, but made no difference when I went to 3500MHz. For AC:S, I couldn't see much of a difference beyond 2666MHz; 2800MHz actually took a small step backwards (by 1-2 FPS) but went a little faster at 3000MHz. I chalk this up to slight margin of error when benching, as the main result is still 9-10 FPS higher than stock memory speed across the board.

 

As for average FPS, I do not see it as being important, not even in the slightest. Minimum FPS is what determines the worst-case scenario; getting that number as high as possible is what will actually improve your play. That being said, I keep V-Sync on at all times, so I can't see anything beyond 60. If you need me to run tests with it off, I can provide that information tomorrow. Either way, I did not notice much of a difference in average FPS in the tests in my signature when using a G4400 and GTX 770. I doubt I will notice much of a difference with a much stronger i5.

What resolution are you playing at? And what RAM do you have?



7 hours ago, AshleyAshes said:

I'm mostly just enjoying that last video, where the i5 2500K is shown to more or less still hold its own five years later. Sure, they point out some weaknesses, but those weaknesses are pretty minimal when you consider that this is a CPU that cost only $200 five freakin' years ago. To get 'pretty good performance' out of a CPU five years down the line is a massive return on your investment in a piece of tech.

Agreed. I only *JUST* replaced my Xeon W3520 (i7-920 equivalent), and that was mostly just to get modern chipset features like USB 3.0. The performance for gaming was still totally adequate.

 

On topic, I agree: LMG should make a greater effort to keep videos brand-agnostic when making generic comments, or try to give representation to all the major parties.

 

While I don't personally think they're doing it intentionally, LMG does mention NVIDIA more in their videos, and tends to use them more for builds that could easily use any card (e.g. not a special use case).

 

I'll take the OP at his word about Linus being wrong in the RAM video though, as I haven't watched it yet.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


8 minutes ago, dalekphalm said:

Agreed. I only *JUST* replaced my Xeon W3520 (i7-920 equivalent), and that was mostly just to get modern chipset features like USB 3.0. The performance for gaming was still totally adequate.

 

On topic, I agree: LMG should make a greater effort to keep videos brand-agnostic when making generic comments, or try to give representation to all the major parties.

 

While I don't personally think they're doing it intentionally, LMG does mention NVIDIA more in their videos, and tends to use them more for builds that could easily use any card (e.g. not a special use case).

 

I'll take the OP at his word about Linus being wrong in the RAM video though, as I haven't watched it yet.

Oh, trust me. Linus is so wrong it is cringeworthy.

Literally half of his "facts" are plain wrong.

The moment you, as an influencer with 2.4 million viewers, recommend good-looking sticks of RAM over empirically better-performing RAM is the moment you should realize your mistake and remove the video before it ruins your good reputation.


4 minutes ago, Prysin said:

Oh, trust me. Linus is so wrong it is cringeworthy.

Literally half of his "facts" are plain wrong.

The moment you, as an influencer with 2.4 million viewers, recommend good-looking sticks of RAM over empirically better-performing RAM is the moment you should realize your mistake and remove the video before it ruins your good reputation.

While I agree with your sentiment, I would also argue that there is nothing wrong with choosing aesthetics over performance, as long as you're open about there being a better-performing alternative. As enthusiasts, we often choose aesthetics first, through things like color-themed builds, etc.

Nevertheless, I still agree with your main point. I look forward to seeing the video and then comparing it to the ones you posted.



8 hours ago, SuperCookie78 said:

What resolution are you playing at? And what RAM do you have?

Resolution was 1080p in the tests in my signature. The RAM in those tests was 2133MHz CL13-13-13-35 Panram DDR4 (2x4GB) overclocked to 3200MHz CL15-15-15-30-CR1. I have since switched to G.Skill Ripjaws V 3200MHz CL14-14-14-34-CR2, manually overclocked to 3500MHz CL14-14-14-28-CR1.

 

From my personal tests, I would say going beyond 2800MHz in a single-GPU configuration is not that beneficial from a price:performance standpoint, but manually overclocking RAM for free is always a good idea if someone has the time to do so. Beyond 2800MHz in single-card configurations, I just do not see any additional performance.



12 hours ago, SuperCookie78 said:

Those benchmarks pair a Titan X with an i3, so the GPU is bottlenecked by the CPU. That's why there is a big difference. The people who buy i3s probably can't afford a Titan X. If that were an i5, the difference would be 2 FPS.

That's what I was thinking too. The setup in the video is not really great evidence for real-world performance, because nobody in their right mind will pair an i3 with a Titan X. That's not to say that faster RAM is useless, though. There are other tests (like this one) which show that with modern processors (the difference from RAM speed was much smaller a few generations ago) there is a benefit to going with faster RAM. And yes, I know my source is Corsair, which obviously wants you to buy expensive RAM, but I trust Dustin, and the benchmarks seem very legit.

 

"RAM speed does not matter" is outdated info, but I think those ~15% FPS gains in games from ~25% faster RAM that Digital Foundry shows are not really representative of what people will actually see in their personal computers, which will probably have a CPU twice as powerful and a GPU around 40% slower than their really unbalanced setup.
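One rough way to frame that claim is to compute how much of the RAM speedup actually shows up as FPS. This is purely illustrative arithmetic using the approximate figures above, not a benchmark:

```python
# Illustrative arithmetic only: what fraction of a RAM speedup shows up
# as an FPS gain, using the rough figures quoted above.
fps_gain = 0.15   # ~15% higher FPS reported by Digital Foundry
ram_gain = 0.25   # ~25% faster RAM in the same comparison
scaling = fps_gain / ram_gain
print(f"roughly {scaling:.0%} of the RAM speedup appears as FPS")
```

So even in their CPU-bound setup, only a bit over half of the memory speedup translates into framerate; a more balanced system would translate less.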


19 minutes ago, LAwLz said:

That's what I was thinking too. The setup in the video is not really great evidence for real-world performance, because nobody in their right mind will pair an i3 with a Titan X. That's not to say that faster RAM is useless, though. There are other tests (like this one) which show that with modern processors (the difference from RAM speed was much smaller a few generations ago) there is a benefit to going with faster RAM. And yes, I know my source is Corsair, which obviously wants you to buy expensive RAM, but I trust Dustin, and the benchmarks seem very legit.

 

"RAM speed does not matter" is outdated info,

LAwLz, let me ask you.

How would you go about testing the definitive impact RAM speed can have?

 

Because, to be honest, there really is just one way outside of synthetics.

That is to eliminate all other variables: the CPU must be a bottleneck, because that is where we would see potential benefits; the GPU cannot be a bottleneck; and the chipset must not be a bottleneck.

 

So if we are not to pair a weak CPU with an overkill GPU, how are you going to show definitive results?

 

You can pair a lower-spec CPU with a lower-spec GPU and 4266MHz DDR4 and see what happens (your wallet crying is what happens), but only in some circumstances will you see the benefit clearly. The rest of the time, it will be a rather murky thing that may not be possible to show clearly on video.

 

However, there IS one more way to test memory speed impact: frame latency, a.k.a. stuttering.

And while Digital Foundry does have the tools to test this, they would have to isolate areas of a game where the game is CPU-bound, then isolate the changes brought by improved memory clock speeds.

While totally possible, AND more accurate in terms of actual performance gain and when/where we can expect to see a difference, it is also less interesting AND more time-consuming to do.

 

While DF's video is a bit "quick and dirty" in terms of system setup, it is not wrong, unlike Linus's video, which is total bullshit from one end to the other.

 

Quote

but I think those ~15% FPS gains in games from ~25% faster RAM that Digital Foundry shows are not really representative of what people will actually see in their personal computers, which will probably have a CPU twice as powerful and a GPU around 40% slower than their really unbalanced setup.

Actually, most people are using older or weaker hardware.

Only a few are using unlocked i5s. Most people are using locked i5s, laptop i5s, or lower-end CPUs. Most people's GPUs aren't 40% slower; they are 240-340% slower. The average user is probably sporting a 500- or 600-series GTX card / equivalent AMD card, or worse, Intel HD.

 

However, I encourage you to watch the video about the i5 2500K, which is a very popular part, highly adopted and still very relevant.


I was waiting for this discussion to appear sooner or later. I am curious about @MageTank's tests, as I'm interested in buying another setup next year and upgrading this one, RAM included, for gaming purposes.

But to be fair, it's not the first time Linus has been wrong, and even though he has admitted it a few times, like in LAwLz's 64-bit thread, he has never responded to the RAM speed questions, and they've been around for quite some time.

The ability to google properly is a skill of its own. 


1 minute ago, Bouzoo said:

I was waiting for this discussion to appear sooner or later. I am curious about @MageTank's tests, as I'm interested in buying another setup next year and upgrading this one, RAM included, for gaming purposes.

But to be fair, it's not the first time Linus has been wrong, and even though he has admitted it a few times, like in LAwLz's 64-bit thread, he has never responded to the RAM speed questions, and they've been around for quite some time.

I think he really doesn't WANT to be wrong here, because it would undermine a lot of his integrity, as he has spent a whole lot of time scoffing at faster memory.

I wonder, though: why do Corsair/G.Skill/Kingston/Avexir/Crucial still send him stuff to test when he is literally undermining their high-end market with his bullshit?


1 hour ago, LAwLz said:

That's what I was thinking too. The setup in the video is not really great evidence for real-world performance, because nobody in their right mind will pair an i3 with a Titan X. That's not to say that faster RAM is useless, though. There are other tests (like this one) which show that with modern processors (the difference from RAM speed was much smaller a few generations ago) there is a benefit to going with faster RAM. And yes, I know my source is Corsair, which obviously wants you to buy expensive RAM, but I trust Dustin, and the benchmarks seem very legit.

 

"RAM speed does not matter" is outdated info, but I think those ~15% FPS gains in games from ~25% faster RAM that Digital Foundry shows are not really representative of what people will actually see in their personal computers, which will probably have a CPU twice as powerful and a GPU around 40% slower than their really unbalanced setup.

My tests with an i5 6600T + GTX 770 still yield roughly 10% on average in my minimum FPS in the games I play. @SteveGrabowski0 uses a Haswell Xeon and a GTX 970, if I remember correctly, and he sees a decent boost in certain titles, very similar to what I see.

 

I wish I could speak more about memory speed's impact on gaming at higher resolutions, but my GTX 770 struggles enough with 1080p, and 1440p is not fun to test on this older 2GB card. When newer GPUs come out, I'll try some 1440p and 4K benchmarks with different RAM speeds (from 2133 up to 3600) and get more information. I do not own a single-card solution that can handle 1440p properly, unless I use my brother's 4790K and GTX 970 with his DDR3 memory. Sadly, his IMC is awful, and he can only go to 2000MHz CL8 (2133MHz won't boot, even at C10), so I can't show raw bandwidth impact.

 

If you have an XMP kit, it's fairly easy to test. Just use normal JEDEC SPD speeds and pay attention to minimum framerates. Record the lowest value in a benchmark, then load a faster XMP profile. Run the same benchmark, record the lowest value, and do the math to see exactly what % more FPS you get with faster memory. Compare the FPS % against the memory OC %, along with the price difference %, to factor in price:performance. That's what I will be doing in my guide, along with teaching people how to overclock memory manually, so that the price argument can (hopefully) be thrown out of the window.
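The "do the math" step above can be sketched in a few lines of Python. The function names and all numbers here are hypothetical placeholders for whatever minimum-FPS values, clock speeds, and prices you actually record, not measured results:

```python
# Hypothetical sketch of the comparison described above: benchmark minimum
# FPS at JEDEC speed and again at the XMP/overclocked speed, then compare
# the FPS gain against the memory-clock gain and the price difference.

def percent_gain(baseline: float, improved: float) -> float:
    """Percentage improvement of `improved` over `baseline`."""
    return (improved - baseline) / baseline * 100.0

def summarize(min_fps_base, min_fps_oc, mhz_base, mhz_oc, price_base, price_oc):
    """Collect the three percentages to compare for price:performance."""
    return {
        "fps_gain_pct": round(percent_gain(min_fps_base, min_fps_oc), 1),
        "clock_gain_pct": round(percent_gain(mhz_base, mhz_oc), 1),
        "price_delta_pct": round(percent_gain(price_base, price_oc), 1),
    }

# Placeholder example: minimum FPS 50 -> 55 going from 2133MHz to 3200MHz,
# with a $60 kit vs a $75 kit.
print(summarize(50, 55, 2133, 3200, 60, 75))
# {'fps_gain_pct': 10.0, 'clock_gain_pct': 50.0, 'price_delta_pct': 25.0}
```

With those made-up numbers, a 25% price premium buys a 10% minimum-FPS gain; plug in your own recorded values to judge whether the trade is worth it.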

 

Well, break time is over, back to work.



1 hour ago, Prysin said:

How would you go about testing the definitive impact RAM speed can have?

I would test it in the way it is actually relevant to consumers. So I would get a typical medium/high-end computer, let's say an i5-6600K and an AMD 290, and then only swap out the RAM between tests. The reason I would do that, instead of pairing a Titan X with an i3, is that while the latter might show that faster RAM makes a difference, it only shows that it is true in an unrealistic scenario.

 

1 hour ago, Prysin said:

Because, to be honest, there really is just one way outside of synthetics.

That is to eliminate all other variables: the CPU must be a bottleneck, because that is where we would see potential benefits; the GPU cannot be a bottleneck; and the chipset must not be a bottleneck.

No, Prysin... That's not how you test. What you are describing is how to design a test to show the results you are expecting. That's not how you do science, and like I said before, the results would not be comparable to what the average consumer expects. The results might end up being misleading because you are testing in unrealistic scenarios.

 

1 hour ago, Prysin said:

So if we are not to pair a weak CPU with an overkill GPU, how are you going to show definitive results?

Your goal should not be to "show definitive results" (by which I assume you mean the conclusion you reached before even doing the test). Your goal should be to design a test which reflects what your viewers should expect if they decide to get faster RAM for their computers.

 

You should not design a test with a result already in mind, because that will probably end up being a biased test.

 

1 hour ago, Prysin said:

You can pair a lower-spec CPU with a lower-spec GPU and 4266MHz DDR4 and see what happens (your wallet crying is what happens), but only in some circumstances will you see the benefit clearly. The rest of the time, it will be a rather murky thing that may not be possible to show clearly on video.

So what you are saying is that you won't see a clear benefit from fast RAM unless you do the test on a computer nobody in their right mind would use? Then maybe Linus has a point when he says RAM speed doesn't matter? I haven't seen the video, so I don't know what claims he makes, but you are doing a terrible job of defending your stance if you say you can't see a clear difference unless you pair a Titan X with an i3 and then get 4000+ MHz RAM.

 

1 hour ago, Prysin said:

And while Digital Foundry does have the tools to test this, they would have to isolate areas of a game where the game is CPU-bound, then isolate the changes brought by improved memory clock speeds.

Why isolate the test to only the parts of games which are CPU-bound? Again, you are designing a biased test with the intent of reaching a predefined conclusion. That is not how you do science.

 

 

So what I want to see is how big of a difference going from, let's say, 1333MHz RAM to 2400MHz RAM makes on a typical gaming system, like a 4770K and a 660 Ti.

Oh shit, Linus already did that, and the faster RAM made next to no difference in those two tests:

 

 

Not to say that the video is perfect, but I think their tests are far more comparable to what your average viewer will see than the i3 + Titan X test.

Maybe it is a subject worth revisiting.

 

 

 

5 minutes ago, MageTank said:

That's what I will be doing in my guide, along with teaching people how to overclock memory manually, so that the price argument can (hopefully) be thrown out of the window.

Just be very careful. When you already have a conclusion in mind, it is very easy to make the test biased unintentionally. Do the test with an open mind and don't go looking for the results you expect.


7 minutes ago, LAwLz said:

So what I want to see is how big of a difference going from, let's say, 1333MHz RAM to 2400MHz RAM makes on a typical gaming system, like a 4770K and a 660 Ti.

Oh shit, Linus already did that, and the faster RAM made next to no difference in those two tests:

 

 

Not to say that the video is perfect, but I think their tests are far more comparable to what your average viewer will see than the i3 + Titan X test.

Maybe it is a subject worth revisiting.

That video is just a GPU test. Linus either doesn't know what he's doing or he's steering his test toward a desired answer. Testing 8xMSAA on FC3 and 4xSSAA on Metro LL is absurd; my 970 can't even handle those settings. And Digital Foundry has shown gains using an i5-6500, i5-6600K, and i7-6700K with a Titan X as well. It was only their first two or three videos on the subject that were done with an i3.


21 hours ago, ONOTech said:

-snip-

That is pretty much irrelevant, as in this case there is a serious CPU bottleneck (i3 + Titan X?). In a more balanced configuration, there typically isn't much of an increase in performance.

AMD Ryzen R7 1700 (3.8GHz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4x4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SSD + 7TB HDD, Phanteks Enthoo Pro case


I would like to point out that in all the videos where Digital Foundry shows "proof" that faster RAM helps a lot, they are always using a setup consisting of an i3 and an overclocked Titan X. Not to mention the games tested are usually CPU-intensive. The CPU is a huge bottleneck here, and no sane person has a configuration like this.

In a standard gaming PC with balanced parts, the difference would be minor at best.



19 minutes ago, Coaxialgamer said:

I would like to point out that in all the videos where Digital Foundry shows "proof" that faster RAM helps a lot, they are always using a setup consisting of an i3 and an overclocked Titan X. Not to mention the games tested are usually CPU-intensive. The CPU is a huge bottleneck here, and no sane person has a configuration like this.

In a standard gaming PC with balanced parts, the difference would be minor at best.

Why should they not test CPU-intensive games? Do people not play Witcher 3 and GTA V? And the very first video in the very first post of this thread shows them testing with an i5, but I'll repost it.

 

 

 


Just now, SteveGrabowski0 said:

Why should they not test CPU-intensive games? Do people not play Witcher 3 and GTA V? And the very first video in the very first post of this thread shows them testing with an i5, but I'll repost it.

That was not my point. My point is that they are creating and maximizing a bottleneck (which, realistically, nobody has in their systems). They are playing at 1080p with a GPU way out of the CPU's league (the GPU was overclocked, and the game was played at 1080p, further increasing the bottleneck).

A typical setup with a more powerful CPU and/or a less powerful GPU will not see this kind of performance increase, if any at all.

 

People constantly bring up this video as proof that faster RAM makes a big difference in gaming, while all it actually shows is a bottleneck being alleviated.

 

I will say this again: this is in NO WAY representative of the real world and of what people will get by using faster RAM.

 

So why should Linus and other tech channels care about it?



8 minutes ago, Coaxialgamer said:

That was not my point. My point is that they are creating and maximizing a bottleneck (which, realistically, nobody has in their systems). They are playing at 1080p with a GPU way out of the CPU's league (the GPU was overclocked, and the game was played at 1080p, further increasing the bottleneck).

A typical setup with a more powerful CPU and/or a less powerful GPU will not see this kind of performance increase, if any at all.

People constantly bring up this video as proof that faster RAM makes a big difference in gaming, while all it actually shows is a bottleneck being alleviated.

I will say this again: this is in NO WAY representative of the real world and of what people will get by using faster RAM.

So why should Linus and other tech channels care about it?

I can't guess what your point is; your words claimed they were just testing with an i3 and testing CPU-intensive games. I also don't understand the objection you're raising here; it's the exact same thing you do when testing CPUs. Otherwise you're just doing a GPU test, like Linus did in his memory speed test with 8xMSAA on FC3 and 4xSSAA on Metro LL, or when he tested the 2500K vs. Skylake by gaming at 4K.


55 minutes ago, LAwLz said:

snip

I do not see how I can be unintentionally biased given my testing methodology, but I am human, so I am not immune to bias. In fact, I used to be one of the "memory speed doesn't matter" people, and went into the tests thinking it wouldn't matter. I was shocked not only that it mattered, but by the degree of difference it made.

 

As for that LTT video, no, that is not what a typical user would do. No typical user is going to attempt to max Metro with a ton of AA on a GPU that can't even handle it; 30 FPS on ultra is not playable to your "typical user", at least not when it comes to PC gaming. Not only that, but the reason we advise people to use faster RAM is not average framerates. Its impact on average framerates is not that great unless you have a high-end GPU like a Titan X, as you see in the DF videos. When using normal hardware such as an i5 and an x70-series GPU, you will still see a 10% difference in minimum FPS. This might not sound that great to most people, as it basically means that where your minimum FPS dropped to 50 FPS, with faster memory it would only drop to 55 FPS. However, 55 FPS is closer to 60 than 50 is, and given how cheap faster RAM is, it's a worthy investment from a price:performance standpoint.

 

All of my tests can be replicated. I leave no details out, and I even post screenshots of my BIOS and each individual timing. Assuming you have a CPU with an IMC on par with mine, you can replicate my work quite easily. This in and of itself helps remove any bias on my part, as it would be extremely easy to call me out on it. It's why I often beg people to test for themselves. At worst, it makes little to no difference to them, and they move on. At best, it actually helps them, and they get additional performance from something they didn't know would help.

 

I highly recommend people read this thread on OCN: http://www.overclock.net/t/1487162/an-independent-study-does-the-speed-of-ram-directly-affect-fps-during-high-cpu-overhead-scenarios

 

So much information (with screenshot proof) is included along with his testing methodology, including tests of latency and even ranks. It is the foundation my guide is based on. And it's not just the main post: others throughout the thread run their own tests and find similar results on vastly different pieces of hardware. Again, this is with the primary focus on minimum FPS (the most important number in gaming), not average FPS, which is where people seem to be focusing when claiming it makes no difference.


