Hi everyone,
Recently something interesting occurred to me after watching one of Jayztwocents' RTX 3080 overclocking videos. In it he mentions that modern graphics memory often has auto-correction features, so we no longer see on-screen artifacts as often as we used to; instead, raising the frequency increases the number of corrections needed, which lowers overall fps and performance.
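To make the "errors cost performance instead of artifacts" idea concrete, here is a purely illustrative sketch (not the actual GDDR6X protocol) of a link that detects corrupted transfers with a checksum and retries them. In a real implementation the sender transmits the CRC alongside the data; here the receiver simply recomputes and compares. All names and numbers are made up for the demo:

```python
import random
import zlib

def transfer(payload: bytes, error_rate: float, rng: random.Random) -> int:
    """Send payload over a lossy 'link'; retry until the CRC matches.
    Returns the number of attempts the transfer took."""
    attempts = 0
    while True:
        attempts += 1
        received = bytearray(payload)
        if rng.random() < error_rate:           # simulate a bit flip in flight
            i = rng.randrange(len(received))
            received[i] ^= 1
        # receiver checks the data against the sender's checksum;
        # a mismatch means the transfer is silently resent
        if zlib.crc32(received) == zlib.crc32(payload):
            return attempts

rng = random.Random(0)
data = bytes(32)
low  = sum(transfer(data, 0.01, rng) for _ in range(1000))
high = sum(transfer(data, 0.30, rng) for _ in range(1000))
print(low, high)  # more errors -> more retries -> less effective bandwidth
```

So nothing visibly corrupts, but every retry is bandwidth you don't get back, which is why the error rate shows up as lost fps rather than artifacts.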
I know the RTX 3080 uses GDDR6X, which is a different kind of memory than DDR4, but as I understand it some basic error handling is already built into DDR4 as well, to let it reach its higher frequencies.
So I was wondering: do DDR4, or the upcoming DDR5, already have some level of built-in error correction to reach higher frequencies? If so, doesn't that make dedicated ECC memory for servers somewhat obsolete and not worth the extra cost? ECC memory usually runs at lower frequencies than non-ECC memory, but if non-ECC memory already corrects errors internally, why buy ECC at all? I understand it was helpful in the past, when there was no built-in correction, but what about today, when modern memory seems to need some correction anyway to achieve high frequencies?
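For reference, the kind of single-error correction that ECC schemes are built on can be sketched with the textbook Hamming(7,4) code: 3 parity bits protect 4 data bits, and the parity-check "syndrome" directly names the flipped bit. This is only a toy illustration of the principle, not the actual code any DRAM standard uses (real ECC DIMMs and DDR5's on-die ECC use wider codes):

```python
def hamming74_encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Return (corrected codeword, error position 1..7, or 0 if clean)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # reads out the bad bit's position
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the single bad bit back
    return c, syndrome

data = [1, 0, 1, 1]
code = hamming74_encode(data)
corrupted = list(code)
corrupted[4] ^= 1                     # simulate a single bit flip
fixed, pos = hamming74_correct(corrupted)
print(fixed == code, pos)             # -> True 5: flip found and repaired
```

The relevant nuance for the ECC-vs-non-ECC question is *where* the correction happens: DDR5's on-die ECC only corrects errors inside the chip's own array, while ECC DIMMs add extra chips so errors on the way to and from the CPU are caught too.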
Has anyone here ever actually needed DDR4 ECC in their servers? Or is it nowadays just a habit carried over from older memory standards that genuinely needed it?