DDR4 3000 (PC4 24000) vs DDR4 3200 (PC4 25600)

Go to solution: Solved by AnonymousGuy,
29 minutes ago, vagabond139 said:

It's not a big difference, and RAM speed does make a difference.

Nope.  Not outside synthetic benchmarks.  If you're talking 1333 vs. 2133 then sure, but otherwise it's negligible.

Is there much difference between these two, and if so, is it worth paying an extra CAD/AUD $34 for the DDR4-3200 (PC4 25600) kit?

 

G.SKILL Ripjaws V Series 32GB (4 x 8GB) 288-Pin DDR4 SDRAM DDR4 3000 (PC4 24000) Desktop Memory Model F4-3000C14Q-32GVR

G.SKILL Ripjaws V Series 32GB (4 x 8GB) 288-Pin DDR4 SDRAM DDR4 3200 (PC4 25600) Desktop Memory Model F4-3200C14Q-32GVR

 

Thanks for your help



200 MHz isn't that much of a difference. You won't notice any difference between these two kits.

Save your $.


Just now, Dabasepc said:

200 MHz isn't that much of a difference. You won't notice any difference between these two kits.

Save your $.

Okay, thanks



Save the money completely and stick to 2133 unless you're trying to win benchmarks, which you probably won't under any circumstance anyway.

Also, I'm a single data point, but I've had multiple G.Skill kits disappoint me. My storage server's kit started misbehaving after a few years in service. That was a bitch to troubleshoot.

EDIT: To elaborate, Puget Systems and I are on the same page. Extreme memory speed kits are pointless nowadays. You get no real-world performance difference, and you risk mobo incompatibility, memory controller incompatibility, general shakiness, etc.

Workstation:  14700nonk || Asus Z790 ProArt Creator || MSI Gaming Trio 4090 Shunt || Crucial Pro Overclocking 32GB @ 5600 || Corsair AX1600i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


1600MHz!

duh

Ryzen 5 3600 stock | 2x16GB C13 3200MHz (AFR) | GTX 760 (Sold the VII)| ASUS Prime X570-P | 6TB WD Gold (128MB Cache, 2017)

Samsung 850 EVO 240 GB 

138 is a good number.

 


3 minutes ago, themctipers said:

1600MHz!

duh

Um... I don't think I should downgrade to that; I was trying to pick between 3000MHz and 3200MHz.



29 minutes ago, vagabond139 said:

It's not a big difference, and RAM speed does make a difference.

Nope.  Not outside synthetic benchmarks.  If you're talking 1333 vs. 2133 then sure, but otherwise it's negligible.



1 hour ago, AnonymousGuy said:

Nope.  Not outside synthetic benchmarks.  If you're talking 1333 vs. 2133 then sure, but otherwise it's negligible.

http://www.techspot.com/review/1089-fallout-4-benchmarks/page6.html

 

http://www.bit-tech.net/hardware/2014/07/31/amd-a-10-7800-review/7


8 hours ago, AnonymousGuy said:

Save the money completely and stick to 2133 unless you're trying to win benchmarks, which you probably won't under any circumstance anyway.

Also, I'm a single data point, but I've had multiple G.Skill kits disappoint me. My storage server's kit started misbehaving after a few years in service. That was a bitch to troubleshoot.

EDIT: To elaborate, Puget Systems and I are on the same page. Extreme memory speed kits are pointless nowadays. You get no real-world performance difference, and you risk mobo incompatibility, memory controller incompatibility, general shakiness, etc.

 

6 hours ago, AnonymousGuy said:

Nope.  Not outside synthetic benchmarks.  If you're talking 1333 vs. 2133 then sure, but otherwise it's negligible.

Do you know this for a fact? Is this something you tested? Or are you just repeating nonsense that you read elsewhere and passing it off as fact? Seeing as I, along with several other forum members, Digital Foundry, and a comprehensive study done on OCN, beg to differ, I would say memory speed has quite an impact on gaming performance.

http://www.overclock.net/t/1487162/an-independent-study-does-the-speed-of-ram-directly-affect-fps-during-high-cpu-overhead-scenarios

Memory speed has a huge impact on gaming performance depending on the amount of CPU overhead involved. Enough for me to see a 10% improvement in minimum framerates in the older games I play, and enough for @SteveGrabowski0 to see upwards of 15% improvement in some of his modern titles.

Next time, test this for yourself before giving someone false advice. It really damages someone's price:performance ratio and their perception of memory in general, which only creates yet another person to spread these false "facts" around.

 

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 

Link to comment
Share on other sites

Link to post
Share on other sites

8 hours ago, MageTank said:

 

Do you know this for a fact? Is this something you tested? Or are you just repeating nonsense that you read elsewhere and passing it off as fact? Seeing as I, along with several other forum members, Digital Foundry, and a comprehensive study done on OCN, beg to differ, I would say memory speed has quite an impact on gaming performance.

http://www.overclock.net/t/1487162/an-independent-study-does-the-speed-of-ram-directly-affect-fps-during-high-cpu-overhead-scenarios

Memory speed has a huge impact on gaming performance depending on the amount of CPU overhead involved. Enough for me to see a 10% improvement in minimum framerates in the older games I play, and enough for @SteveGrabowski0 to see upwards of 15% improvement in some of his modern titles.

Next time, test this for yourself before giving someone false advice. It really damages someone's price:performance ratio and their perception of memory in general, which only creates yet another person to spread these false "facts" around.

 

 

RE: Your OCN benchmarks, a 5% framerate change with 30% faster memory frequency => the memory speed doesn't matter. Minimum frame rate is also cherry-picking, since it's a worst-case scenario.

>Enough for me to see a 10% improvement in minimum framerates in the older games I play

And those same older games are going to get 200 fps anyway on modern hardware, so enjoy your 220 vs. 200.

Oh hey, look, an LTT vid on the subject agreeing with me:

Lastly, an entire article showing sub-5% gains going from DDR4-2133 to DDR4-3200:

http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial/5

You lose, son.



1 minute ago, AnonymousGuy said:

Oh hey, look, an LTT vid on the subject agreeing with me:

 

That video is garbage; did you watch it? It's just a GPU test, since he's playing games with ridiculous amounts of AA on a lower-end graphics card. The gains in gaming performance from faster RAM seem to come when you're CPU-bound, which is why it can be very good for lifting minimum framerates even on a well-balanced system but doesn't do a whole lot for average framerates most of the time, unless you have an overkill GPU for your setup.

The gains to minimum framerates in GTA V and Fallout 4 since I went from a DDR3-1600 kit to a DDR3-2400 kit are plain as day. They're not there in every game, but in some CPU-heavy games the fast RAM was the difference between what felt like a locked 60 fps (drops only into the high 50s) and plainly laggy drops into the low 50s. The Witcher 3 is another game that often dropped below 60 fps for me on DDR3 (I'm running a Xeon E3-1231v3 + GTX 970), but I haven't tested the difference there because I don't remember exactly what kind of drops I was seeing. Digital Foundry does show Witcher 3 framerates to be very sensitive to memory speed in CPU-heavy areas like Novigrad. The one game I have that often drops below 60 fps on my system and shows no improvement whatsoever with DDR3-2400 vs. DDR3-1600 is Crysis 3 (which also agrees with Digital Foundry's findings, though I tested on the first level while they test on the second).
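For anyone wanting to reproduce this kind of comparison, here's a minimal sketch of how minimum-framerate figures are typically derived from a frame-time log (the file name and format are placeholders; tools like FRAPS or PresentMon produce similar data):

```python
# Minimal sketch: derive average FPS and the 1%-low (a common stand-in for
# "minimum framerate") from a frame-time log in milliseconds.
# "frametimes.csv" is a placeholder -- one frame time per line.
import numpy as np

frame_times_ms = np.loadtxt("frametimes.csv")

avg_fps = 1000.0 / frame_times_ms.mean()
slowest_1pct_ms = np.percentile(frame_times_ms, 99)  # slowest 1% of frames
low_1pct_fps = 1000.0 / slowest_1pct_ms

print(f"average: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")
```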


1 hour ago, AnonymousGuy said:

A 5% framerate change with 30% faster memory frequency => the memory speed doesn't matter.

By that argument, why do we buy faster CPUs and overclock the life out of them? Beyond some point, faster CPUs don't make much of a difference to measured fps either.

Back to the OP's original question, I'd answer: it depends on what applications are used. Maybe there are situations where the faster RAM would help. I search for prime numbers, and the software that does that shows a ~30% speed improvement going from 2133 to 3200 on a Skylake at 4.2 GHz. I don't bother overclocking the CPU harder since I'm still limited by the RAM. I'd admit this is a niche case, but it is unlikely to be the only one.

I do agree that once you get much above 3200 or so, you're getting into the region where compatibility and stability may require more effort. Up to around 3000 doesn't seem to pose any problems, though.
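As a rough illustration of why a workload like this scales with memory frequency: a STREAM-style loop over arrays far larger than the CPU cache is limited by RAM throughput rather than core speed. A minimal sketch (array size and results are machine-dependent):

```python
# Rough sketch of a bandwidth-bound kernel: a STREAM-style triad over arrays
# far larger than the CPU cache is limited by RAM throughput, not core speed,
# which is why workloads like this scale with memory frequency.
import time
import numpy as np

N = 50_000_000                      # ~400 MB per float64 array, well past cache
a = np.zeros(N)
b = np.random.rand(N)
c = np.random.rand(N)

start = time.perf_counter()
a[:] = b + 2.0 * c                  # ~2 reads + 1 write per element (plus temporaries)
elapsed = time.perf_counter() - start

bytes_moved = 3 * N * 8             # lower bound: 3 arrays x 8 bytes per element
print(f"effective throughput: {bytes_moved / elapsed / 1e9:.1f} GB/s")
```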

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, AnonymousGuy said:

 

RE: Your OCN benchmarks, a 5% framerate change with 30% faster memory frequency => the memory speed doesn't matter. Minimum frame rate is also cherry-picking, since it's a worst-case scenario.

>Enough for me to see a 10% improvement in minimum framerates in the older games I play

And those same older games are going to get 200 fps anyway on modern hardware, so enjoy your 220 vs. 200.

Oh hey, look, an LTT vid on the subject agreeing with me:

Lastly, an entire article showing sub-5% gains going from DDR4-2133 to DDR4-3200:

http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial/5

You lose, son.

The world of hurt you just opened on yourself is an unforgiving one. That LTT video proves it. You linked a video from a man who has absolutely no idea what he is talking about when it comes to RAM to try to prove ME wrong? Ask around; that is not going to end well. His quad-channel memory is slower than my dual-channel memory. Can you possibly explain that? I sure can.

Also, the "older" games I play are still demanding ones. Try maxing out The Witcher's ubersampling; even modern hardware will struggle. The Metro series is still used in benchmarks, as is the Crysis series. Older titles, but still useful for performance metrics. No "200+ FPS" numbers either.

Your AnandTech article is using 4x4GB sticks, completely losing rank interleaving, which tells me their bandwidth is suffering as a result. Not to mention we are not getting any screenshots or videos, just random charts with useless data. I provided proof with my claims, as did the dozen people on that OCN thread, along with our testing methodology, which anyone can replicate to test for themselves.

You act like a performance boost to minimum framerates is a bad thing; it's the most important FPS number when gaming. There are people overclocking CPUs and buying expensive coolers who don't get a 5% boost in that number half of the time, yet you complain about a boost that costs little to nothing (the difference between 16GB of DDR4-2133 and DDR4-2800 is $14, and only $4 if you go with DDR4-2666). Not to mention manual overclocking is free, and nearly any kit can hit 2800MHz once you dump 1.35V into the DIMMs.

Your argument is pointless. You've been proven wrong by multiple members and sources, and not a single overclocker takes that LTT video seriously. Ask @SteveGrabowski0, @Prysin, etc. if you will not take my word (and my mountain of evidence) for it.

It's not about winning or losing, "son". It's about making sure people get the right information. The sooner you learn that, the closer you will get to "beating" me.



2 hours ago, AnonymousGuy said:

 

RE: Your OCN benchmarks, a 5% framerate change with 30% faster memory frequency => the memory speed doesn't matter. Minimum frame rate is also cherry-picking, since it's a worst-case scenario.

>Enough for me to see a 10% improvement in minimum framerates in the older games I play

And those same older games are going to get 200 fps anyway on modern hardware, so enjoy your 220 vs. 200.

Oh hey, look, an LTT vid on the subject agreeing with me:

Lastly, an entire article showing sub-5% gains going from DDR4-2133 to DDR4-3200:

http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial/5

You lose, son.

If you believe the results of that video, I must question the integrity of your intellect, let alone the purpose of your entire being.


11 hours ago, MageTank said:

 

Do you know this for a fact? Is this something you tested? Or are you just repeating nonsense that you read elsewhere and passing it off as fact? Seeing as I, along with several other forum members, Digital Foundry, and a comprehensive study done on OCN, beg to differ, I would say memory speed has quite an impact on gaming performance.

http://www.overclock.net/t/1487162/an-independent-study-does-the-speed-of-ram-directly-affect-fps-during-high-cpu-overhead-scenarios

Memory speed has a huge impact on gaming performance depending on the amount of CPU overhead involved. Enough for me to see a 10% improvement in minimum framerates in the older games I play, and enough for @SteveGrabowski0 to see upwards of 15% improvement in some of his modern titles.

Next time, test this for yourself before giving someone false advice. It really damages someone's price:performance ratio and their perception of memory in general, which only creates yet another person to spread these false "facts" around.

 

Though what this discussion was mostly about was whether the difference between 3000 and 3200 is significant.



2 hours ago, AnonymousGuy said:

 

RE: Your OCN benchmarks, a 5% framerate change with 30% faster memory frequency => the memory speed doesn't matter. Minimum frame rate is also cherry-picking, since it's a worst-case scenario.

>Enough for me to see a 10% improvement in minimum framerates in the older games I play

And those same older games are going to get 200 fps anyway on modern hardware, so enjoy your 220 vs. 200.

Oh hey, look, an LTT vid on the subject agreeing with me:

Lastly, an entire article showing sub-5% gains going from DDR4-2133 to DDR4-3200:

http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial/5

You lose, son.

Though in this situation, between 3000 and 3200, is it worth paying an extra CAD/AUD $34?



Just now, rm -rf said:

Though what this discussion was mostly about was whether the difference between 3000 and 3200 is significant.

I know what the discussion was about. I took issue with him saying you were better off sticking with 2133MHz because it "makes little to no difference". That's just not true. As for your question, I can simplify it for you.

Assuming primary timings are exactly the same, and we live in a perfect world, the following is true:

Max theoretical peak bandwidth for DDR4-3000 is 24,000MB/s single channel, 48,000MB/s dual channel, 96,000MB/s quad channel.

Max theoretical peak bandwidth for DDR4-3200 is 25,600MB/s single channel, 51,200MB/s dual channel, 102,400MB/s quad channel.

In general usage (gaming, browsing, etc.) you will more than likely not notice much of a difference at all between 3000MHz and 3200MHz, as the law of diminishing returns kicks in very heavily after 2800MHz in my personal testing. When it comes to video games, it depends entirely on the CPU overhead: the more CPU overhead, the more faster memory helps. SLI/Crossfire also likes even faster, lower-latency memory, so with multi-GPU configurations, going with fast RAM is a good idea. However, with that kind of price difference, it's simply not worth it from a price:performance standpoint.

You can also manually overclock that 3000MHz kit to 3200 if need be, though you would see much more performance from adjusting tertiary timings and RTL/IO-L. Hope this helps.
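For anyone who wants to check those bandwidth figures, the arithmetic is just the transfer rate times 8 bytes per transfer (a standard 64-bit channel) times the channel count. A quick sketch:

```python
# The arithmetic behind the peak-bandwidth figures above: DDR4 moves 8 bytes
# per transfer on a 64-bit channel, so peak MB/s = MT/s x 8 x channels.
def peak_bandwidth_mb_s(transfer_rate_mt_s: int, channels: int = 1) -> int:
    return transfer_rate_mt_s * 8 * channels

for rate in (3000, 3200):
    s, d, q = (peak_bandwidth_mb_s(rate, ch) for ch in (1, 2, 4))
    print(f"DDR4-{rate}: {s:,} / {d:,} / {q:,} MB/s (single/dual/quad)")
# DDR4-3000: 24,000 / 48,000 / 96,000 MB/s
# DDR4-3200: 25,600 / 51,200 / 102,400 MB/s
```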



3 minutes ago, MageTank said:

I know what the discussion was about. I took issue with him saying you were better off sticking with 2133MHz because it "makes little to no difference". That's just not true. As for your question, I can simplify it for you.

Assuming primary timings are exactly the same, and we live in a perfect world, the following is true:

Max theoretical peak bandwidth for DDR4-3000 is 24,000MB/s single channel, 48,000MB/s dual channel, 96,000MB/s quad channel.

Max theoretical peak bandwidth for DDR4-3200 is 25,600MB/s single channel, 51,200MB/s dual channel, 102,400MB/s quad channel.

In general usage (gaming, browsing, etc.) you will more than likely not notice much of a difference at all between 3000MHz and 3200MHz, as the law of diminishing returns kicks in very heavily after 2800MHz in my personal testing. When it comes to video games, it depends entirely on the CPU overhead: the more CPU overhead, the more faster memory helps. SLI/Crossfire also likes even faster, lower-latency memory, so with multi-GPU configurations, going with fast RAM is a good idea. However, with that kind of price difference, it's simply not worth it from a price:performance standpoint.

You can also manually overclock that 3000MHz kit to 3200 if need be, though you would see much more performance from adjusting tertiary timings and RTL/IO-L. Hope this helps.

Didn't they just say, "If you're talking 1333 vs. 2133 then sure, but otherwise it's negligible"? What I assume they mean is that it matters for larger differences, but with the smaller 7% difference here, they're saying it doesn't really matter.



Just now, rm -rf said:

Didn't they just say, "If you're talking 1333 vs. 2133 then sure, but otherwise it's negligible"? What I assume they mean is that it matters for larger differences, but with the smaller 7% difference here, they're saying it doesn't really matter.

No, he is still wrong. I still noticed a worthwhile difference from 2666 to 2800, as did some of my other sources. Your largest performance difference comes from 2133 to 2666. After 2666, you will notice less of a performance boost, but a boost nonetheless. Once you hit about 3000, anything after that on a single-card configuration feels pretty much the same. Once you add in SLI/Crossfire (where memory latency becomes extremely important), you notice the impact beyond 3000MHz. The reason is that faster clock speeds = lower latency, assuming you are not sacrificing your primary/tertiary timings to get said clock speed.

I am currently using a custom OC of 3500MHz CL14-14-14-28-1. With multi-GPU configurations, performance scaling has improved up to 3500MHz, but to get 3600MHz I have to go C15, and upon doing so, latency doesn't change. The raw bandwidth changes slightly in favor of the 3600MHz, but since latency has not changed, and given the diminishing returns on bandwidth, my performance does not change. Remember, multi-GPU configurations care a lot more about latency than raw bandwidth (but both are important, as low latency on DDR4 cannot exist without high bandwidth).

Again, I hope this is not too confusing; memory itself is very convoluted at times.
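To make the clock-speed/latency trade-off concrete: first-word latency in nanoseconds is the CAS count divided by the actual clock, and since DDR transfers twice per clock, that works out to CL x 2000 / (MT/s). A quick sketch of why 3600 C15 buys nothing over 3500 C14:

```python
# First-word latency in ns: CAS cycles / actual clock. DDR transfers twice
# per clock, so the clock is half the MT/s rating: ns = CL * 2000 / (MT/s).
def first_word_latency_ns(transfer_rate_mt_s: int, cas: int) -> float:
    return cas * 2000.0 / transfer_rate_mt_s

print(first_word_latency_ns(3500, 14))  # 8.00 ns
print(first_word_latency_ns(3600, 15))  # 8.33 ns -- no better than 3500 C14
print(first_word_latency_ns(2133, 15))  # 14.07 ns for a JEDEC-speed kit
```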



11 hours ago, MageTank said:

Though you would see much more performance from adjusting tertiary timings and RTL/IO-L.

Recognising it may be drifting from the thread a bit, could you suggest a guide to adjusting those other settings?



2 hours ago, porina said:

Recognising it may be drifting from the thread a bit, could you suggest a guide to adjusting those other settings?

I am currently working on a comprehensive guide covering nearly every RAM timing that matters, along with each timing's individual impact on general-usage performance (browsing, gaming, etc.). It's taking a bit of time, as I am also trying to release a video alongside it to make it easier for people to follow along.



10 hours ago, MageTank said:

I am currently working on a comprehensive guide covering nearly every RAM timing that matters, along with each timing's individual impact on general-usage performance (browsing, gaming, etc.). It's taking a bit of time, as I am also trying to release a video alongside it to make it easier for people to follow along.

If my build has an

  • Intel Core i7-5820K (heavily overclocked, with a water cooler)
  • MSI GTX 980 Ti GAMING 6G GOLDEN EDITION

will buying the 3200 over the 3000 be a good decision if it costs an additional CAD/AUD $34?

If not, would it be useful in the name of future-proofing? I will be upgrading to 2-way SLI eventually (in 1-3 years).



  • 11 months later...
On 3/27/2016 at 12:05 AM, AnonymousGuy said:

Nope.  Not outside synthetic benchmarks.  If you're talking 1333 vs. 2133 then sure, but otherwise it's negligible.

Do me a favor: run Cemu (a Wii U emulator) and load Breath of the Wild from a RAMDisk. Make sure you own both a Wii U and Breath of the Wild to stay within a legal gray area. Use a precompiled shader pack that has 6000+ shaders.

 

If you launch the application from a very fast SSD, it takes about 20 minutes to load. From a RAMDisk at 2133MHz it would take maybe 2 minutes; at 3200MHz, maybe under a minute.

 

These numbers are not "synthetic"; it depends on how you use your RAM and why.
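If you want to sanity-check the storage side of this claim yourself without Cemu, one simple approach is to time reading the same large file from your SSD and from a RAM-backed path (e.g. a tmpfs mount on Linux or a RAM-disk volume on Windows). A sketch, with placeholder paths:

```python
# Sketch: compare sequential read time of the same large file from an SSD
# path vs. a RAM-disk path. Both paths below are placeholders -- point them
# at a copy of your own shader cache or any multi-GB file.
import time

def timed_read(path: str) -> float:
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(16 * 1024 * 1024):   # stream in 16 MB chunks until EOF
            pass
    return time.perf_counter() - start

for label, path in [("SSD", "/mnt/ssd/shaders.bin"),
                    ("RAM disk", "/mnt/ramdisk/shaders.bin")]:
    print(f"{label}: {timed_read(path):.2f} s")
```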

