For a Ryzen 2400G build: DDR4 3200MHz CL16 or 3000MHz CL15?

More specifically, between these two:

https://www.gskill.com/specification/165/185/1567584549/F4-3200C16D-16GIS-Specification

https://www.gskill.com/product/165/184/1536054483/F4-3000C15D-16GVKB-Overview

 

Mobo: MSI B450M PRO-VDH; CPU: 2400G, with a possible upgrade to a 3600 at the end of the year.

 

The 3000 CL15 kit is 10€ more. My suspicion is that these are exactly the same chips, and that if you clock the 3200 CL16 kit down to 3000 you can run it at CL15, but we can never be sure :)

 

Any guesses as to which would perform better, or should I just get the cheaper one? I'm already disappointed because I wanted a 3600 CL16 kit, but it's been out of stock for a while now.


Mira Yurizaki (marked as the solution):

CL16 and CL15 at those respective speeds work out to effectively the same absolute latency. Get the faster memory: GPUs prefer bandwidth over latency because they're geared toward crunching a ton of data at once.
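To put numbers on "effectively the same latency": first-word latency is the CAS count divided by the memory clock, which is half the DDR transfer rate. A quick back-of-the-envelope check in Python, using the two kits linked above (purely illustrative):

# First-word latency: CL cycles / memory clock (MHz), where the memory
# clock is half the DDR transfer rate. Bandwidth: 8 bytes per transfer
# on a 64-bit channel.
def latency_ns(transfer_mts, cl):
    return cl / (transfer_mts / 2) * 1000

def bandwidth_gbs(transfer_mts):
    return transfer_mts * 8 / 1000

for mts, cl in [(3200, 16), (3000, 15)]:
    print(f"DDR4-{mts} CL{cl}: {latency_ns(mts, cl):.2f} ns, "
          f"{bandwidth_gbs(mts):.1f} GB/s per channel")

# Output:
# DDR4-3200 CL16: 10.00 ns, 25.6 GB/s per channel
# DDR4-3000 CL15: 10.00 ns, 24.0 GB/s per channel

Identical 10 ns either way; the 3200 kit just moves about 7% more data per second, which is exactly what the 2400G's integrated Vega graphics cares about since it runs out of system RAM.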

Just get the cheaper one, 3200MHz CL16. 

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


I did a bit more research into this whole DDR4 / Ryzen performance question and crunched the numbers from this article: https://www.tomshardware.com/reviews/amd-ryzen-3000-best-memory-timings,6310.html

 

[Chart: 1440p gaming benchmark results per memory kit, compiled from the Tom's Hardware data]

 

 

My main takeaway is that nothing matters much as long as you stay within 'sane bounds'. Real-world gaming performance won't change between 3200 CL16 and 3600 CL18, and probably not even with 3000 CL16.
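Running the same first-word-latency math as above over the kits mentioned in this thread backs that up (my own quick check, not from the article):

def latency_ns(transfer_mts, cl):
    # CL cycles / memory clock (MHz); memory clock is half the transfer rate
    return cl / (transfer_mts / 2) * 1000

for mts, cl in [(3000, 15), (3000, 16), (3200, 16), (3600, 16), (3600, 18)]:
    print(f"DDR4-{mts} CL{cl}: {latency_ns(mts, cl):.2f} ns")

# Output:
# DDR4-3000 CL15: 10.00 ns
# DDR4-3000 CL16: 10.67 ns
# DDR4-3200 CL16: 10.00 ns
# DDR4-3600 CL16: 8.89 ns
# DDR4-3600 CL18: 10.00 ns

Everything in the 'sane' range lands within roughly a nanosecond of everything else, which is why the benchmark differences are so small.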

 

Also, these were very specific tests, and I only analyzed the 1440p results. They clearly show there are a lot of variables that can affect all of this, so just don't worry and get the cheapest sticks within reason :) It's practically impossible to guess whether, for your specific CPU, game, resolution, and settings, better synthetic DDR performance will matter or not.

 

One thing is for sure: 'premium' DDR4 doesn't seem to be worth it in terms of price/performance. By premium I mean anything that costs roughly 25€ or more over the 'standard' 3200 CL16.

