2x8GB Ram sticks rated at 3200 run at 1600

Solved by mariushm

Does this mean the two sticks each run at 1600 and together add up to 3200, or is something wrong?


Dual channel means the processor reads from and writes to TWO memory sticks at the same time. This doubles the width of the memory bus, so the peak transfer rate between memory and CPU is doubled.

 

The memory sticks run at 1600 MHz. The 3200 value is marketing, a white lie of sorts... it's a way to express the effective transfer rate of the stick (properly written as 3200 MT/s, mega-transfers per second).

 

In the past we had SDRAM, which put 1 bit of information on every data pin per clock cycle - a memory stick has 64 such pins.

So, on every clock cycle, a memory stick could transfer 64 bits of information.

 

When DDR memory was invented, it became possible to put 2 bits of information on every pin per clock cycle (one on the rising edge, one on the falling edge), while the number of pins stayed the same. So, on every cycle of the memory's clock, the stick could now transfer 128 bits of information.

 

The first DDR memory sticks ran at lower frequencies than SDRAM - for example, 200 MHz SDRAM versus 166 MHz DDR - and customers couldn't grasp that even though the frequency was lower, the DDR was better because of those 128-bit versus 64-bit transfers per clock cycle.

In the example above, you had 200 x 64 = 12,800 versus 166 x 128 = 21,248 megabits per second: the DDR wins.
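The arithmetic in that example can be checked with a couple of lines of Python (the clock speeds are the illustrative numbers from this post, not exact historical parts):

```python
# Peak transfer rate (megabits/s) = clock (MHz) x bus width (pins) x bits per pin per cycle
sdram_mbits = 200 * 64 * 1   # SDR SDRAM: 1 bit per pin per clock cycle
ddr_mbits   = 166 * 64 * 2   # DDR: 2 bits per pin per clock cycle

print(sdram_mbits)  # 12800
print(ddr_mbits)    # 21248 -> DDR wins despite the lower clock
```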

 

To work around this, the manufacturers simply decided to advertise DDR memory at double the frequency.

In the example above, the 166 MHz DDR was advertised as 333 MHz, and now customers saw 200 MHz versus 333 MHz and thought "of course DDR is better, it's a higher number".

 

DDR2, DDR3 and DDR4 only make incremental improvements in performance; they still transfer only 2 bits of information per pin per clock cycle.

So that's why your sticks have a real clock of 1600 MHz but are sold as 3200 (really 3200 MT/s).
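In other words, the number on the box is just the real clock times the 2 transfers per cycle that DDR performs - a quick sketch:

```python
real_clock_mhz = 1600
transfers_per_cycle = 2  # DDR: one transfer on each clock edge

effective_rate = real_clock_mhz * transfers_per_cycle
print(effective_rate)  # 3200 -- the "speed" printed on the box, in MT/s
```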

 

GDDR5 memory, which is used on video cards, puts 4 bits on every pin per clock cycle, and that's why you see frequencies like 7000 MHz - the chips in reality run at 7000 / 4 = 1750 MHz.

Also, each tiny memory chip has 32 data pins, so a video card with 8 memory chips already has a 256-bit wide bus, and on every clock cycle you get 256 pins x 4 bits per pin = 1024 bits... a big number compared to 64 pins x 2 bits per pin x 2 (if used in dual channel) = 256 bits per cycle.
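The same bits-per-cycle comparison, written out (using the 8-chip GDDR5 card and dual-channel DDR setup from the post):

```python
# GDDR5 card: 8 chips x 32 data pins each = 256-bit bus, 4 bits per pin per cycle
gddr5_bits_per_cycle = 8 * 32 * 4

# Dual-channel DDR: 64-bit bus per channel, 2 bits per pin per cycle, 2 channels
ddr_dual_bits_per_cycle = 64 * 2 * 2

print(gddr5_bits_per_cycle)     # 1024
print(ddr_dual_bits_per_cycle)  # 256
```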

That's one of the reasons video cards are so fast.


1 minute ago, mariushm said:

snip

That's a very long answer for a simple question. Nice.

Ryzen 5700g @ 4.4ghz all cores | Asrock B550M Steel Legend | 3060 | 2x 16gb Micron E 2666 @ 4200mhz cl16 | 500gb WD SN750 | 12 TB HDD | Deepcool Gammax 400 w/ 2 delta 4000rpm push pull | Antec Neo Eco Zen 500w


Just now, SupaKomputa said:

That's a very long answer for a simple question. Nice.

Kinda pointless to stress about it as well, since tomorrow we'll have another thread on the same issue...

 

This is one of those things that will be part of the tech world to the end.

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)
Link to comment

4 minutes ago, mariushm said:

snip

Thanks for taking the time to explain this to me. I appreciate the effort you put into this text, thanks a lot!

