
Is CPU cache different from SSD caching?


1. Isn't the cache of a CPU what feeds information into it? Is this correct?

2. And how is this different from the SSD caching found on SSHDs and Intel Optane?

3. How do SSHDs work? Is the cache (DRAM) of an SSHD the same as SSD caching or CPU caching (feeding info to the HDD part)?


1 and 2 I'm not an expert on, so I'll leave those to the experts. For 3: an SSHD is basically a normal HDD with a small SSD on it that uses an algorithm to figure out which files you use the most. Those files are kept on the SSD for quicker access, while less commonly used files are left on the HDD.
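A rough way to picture that promotion logic is a toy sketch like the one below (Python, purely illustrative, not actual drive firmware; names like `ssd_slots` and the example file names are made up):

```python
from collections import Counter

# Toy model of an SSHD's promotion logic: count how often each file is
# accessed, and keep only the hottest ones in the limited SSD portion.
access_counts = Counter()
ssd_slots = 3  # pretend the SSD portion only fits 3 items

def record_access(name):
    access_counts[name] += 1

def files_on_ssd():
    # The most frequently accessed items get promoted to the SSD portion;
    # everything else stays on the HDD platters.
    return {name for name, _ in access_counts.most_common(ssd_slots)}

for f in ["game.exe", "game.exe", "browser.exe", "game.exe",
          "browser.exe", "old_photos.zip", "os_files.dll", "os_files.dll"]:
    record_access(f)

print(files_on_ssd())  # e.g. {'game.exe', 'browser.exe', 'os_files.dll'}
```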


The cache of a CPU holds whatever the CPU is processing at the moment.

The cache of an SSD holds data that is being moved around. SSDs move data around to faster parts of the drive to keep performance higher.

SSHDs move the data the user accesses most to an SSD inside the SSHD, so that data can be accessed faster; applications the user runs a lot will therefore load faster than others.

DRAM is not the cache of an SSHD; it holds the data that the CPU needs for the current application but isn't processing at this exact moment.
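One way to picture that "move data to the faster part of the drive" idea is a small fast region that absorbs writes and gets emptied into the slower bulk storage in the background. Here's a toy Python sketch of that, with made-up names like `fast_cache` and `FLUSH_THRESHOLD` (real SSD controllers are far more complex):

```python
# Toy model: writes land in a small fast region first, then get flushed
# to the larger, slower region once the fast region fills up.
fast_cache = {}          # stands in for the drive's fast cache area
slow_storage = {}        # stands in for the slower bulk flash / platters
FLUSH_THRESHOLD = 4      # made-up size limit for the fast region

def write(block, data):
    fast_cache[block] = data          # fast: this is the latency the host sees
    if len(fast_cache) >= FLUSH_THRESHOLD:
        flush()

def flush():
    # Background work: migrate data out of the fast region.
    slow_storage.update(fast_cache)
    fast_cache.clear()

def read(block):
    # Recently written data is still in the fast region, so it's quicker.
    return fast_cache.get(block, slow_storage.get(block))

for i in range(6):
    write(i, f"data-{i}")
print(read(5), read(0))  # block 5 is still in fast_cache, block 0 was already flushed
```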


4 minutes ago, Gameking002 said:

The cache of a CPU holds whatever the CPU is processing at the moment.

The cache of an SSD holds data that is being moved around. SSDs move data around to faster parts of the drive to keep performance higher.

SSHDs move the data the user accesses most to an SSD inside the SSHD, so that data can be accessed faster; applications the user runs a lot will therefore load faster than others.

DRAM is not the cache of an SSHD; it holds the data that the CPU needs for the current application but isn't processing at this exact moment.

Oooh, so the act of SSD caching is moving data around to keep performance high. And in the case of an SSHD, it's moving the more frequently used files to the SSD portion of the drive, which is the equivalent of HDD + Optane (HDD + Optane = SSHD). Is everything here correct?


1. Cache is like short-term, quick storage for the CPU that holds data and instructions that will be needed in the near future. So yes, the cache feeds the CPU with info, but it's not the only thing doing so.

 

2. CPU cache and the SSD cache in SSHDs sit on different components of the computer, though they do similar things: hold the commonly used stuff for faster access. Intel Optane is more like a super-fast, low-capacity SSD than a cache.

 

3. An SSHD has a small SSD inside that holds commonly used info for faster access. The CPU doesn't send info to the HDD or SSHD directly; it creates/receives info and dumps it into RAM, and the RAM then sends it on to the HDD/SSHD.


Just now, Xdrone said:

Oooh, so the act of SSD caching is moving data around to keep performance high. And in the case of an SSHD, it's moving the more frequently used files to the SSD portion of the drive, which is the equivalent of HDD + Optane (HDD + Optane = SSHD). Is everything here correct?

Yes, that's correct.


[Image: how RAM, cache, and HDD work together]

This will maybe make it clearer (if it wasn't clear enough yet): you have these levels of storage. The input sources are basically the mouse and keyboard; I won't focus on those at the moment.

Then you have the SSD/HDD (permanent storage). Whenever you start an application, data gets pulled from the SSD/HDD into the RAM for quicker access by the CPU.

Then the cache holds what the CPU is processing at that specific moment (for example, a few lines of code). Part by part, those lines of code get fed into the CPU's registers for the CPU to process. Once it is done processing, it sends that data back to where it needs to be. Eventually, when you save a file and/or exit, the saved file gets written back to the SSD/HDD for the next time you start that specific application. Options and preferences also overwrite the previous data that was stored for the application, so you always have the options/preferences that you set.
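Putting that ladder of levels into a tiny sketch (illustrative Python only; the latency numbers are rough assumed ballpark figures in nanoseconds, not measurements):

```python
# Toy memory hierarchy: look for data in the fastest level first and fall
# back to slower levels, paying a bigger latency cost each time.
hierarchy = [
    ("CPU cache", {}, 1),
    ("RAM",       {}, 100),
    ("SSD/HDD",   {"savegame": "level 3 progress"}, 100_000),
]

def load(key):
    cost = 0
    for name, store, latency in hierarchy:
        cost += latency
        if key in store:
            # Copy the data into every faster level on the way back up,
            # so the next access is cheaper (the "pulled into RAM / cache"
            # step described above).
            for _, faster_store, _ in hierarchy:
                if faster_store is store:
                    break
                faster_store[key] = store[key]
            return store[key], cost
    return None, cost

print(load("savegame"))  # first access: found on the drive, slow
print(load("savegame"))  # second access: now sitting in the CPU cache, fast
```

The second lookup is cheap because the data got copied into the faster levels on the way back up, which is exactly the "pulled into RAM" step described above.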

 


1.)

 

Cache refers to any memory within the CPU. There are 3 levels: L1-L3.

 

To the best of my knowledge, L3 cache is shared memory across all cores.  This might be where information from the DRAM is directly queued.  

 

L2 cache is specific to each core, where I think it holds information prior to the x86/x64 decoding.

 

L1 may be the latches or the registers in the execution core.


44 minutes ago, Xdrone said:

And how is this different from the SSD caching found on SSHDs and Intel Optane?

The only difference I know of would be that CPU cache is volatile, but SSD cache and Optane memory are not.

 


2 hours ago, xentropa said:

1.)

 

Cache refers to any memory within the CPU. There are 3 levels: L1-L3.

 

To the best of my knowledge, L3 cache is shared memory across all cores.  This might be where information from the DRAM is directly queued.  

 

L2 cache is specific to each core, where I think it holds information prior to the x86/x64 decoding.

 

L1 may be the latches or the registers in the execution core.

The CPU has the fastest access to L1 cache; when it goes to L2 it gets slower, and even more so when it reaches L3. L1 is also the most expensive to manufacture, which is why, up to now, L1 cache has always been the smallest. Back then you could actually run tests to see how CPU cache affects performance; the BIOS had the option to enable or disable it.
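You can still get a rough feel for it with a quick test like the sketch below (Python, heavily hedged: interpreter overhead dominates here, so any gap you see is only partly down to cache misses; proper tests of this kind are done in lower-level languages or with the BIOS toggle mentioned above):

```python
import array, random, time

# Rough illustration: walking a big array in order is friendlier to the
# CPU cache than jumping around it at random.
N = 4_000_000
data = array.array("q", range(N))   # ~32 MB, bigger than most CPUs' cache

sequential = list(range(N))
shuffled = sequential[:]
random.shuffle(shuffled)

def walk(order):
    total = 0
    for i in order:
        total += data[i]
    return total

t0 = time.perf_counter(); walk(sequential); t1 = time.perf_counter()
t2 = time.perf_counter(); walk(shuffled);   t3 = time.perf_counter()
print(f"sequential: {t1 - t0:.2f}s, random: {t3 - t2:.2f}s")
```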

 


5 hours ago, Xdrone said:

1. Isn't the cache of a CPU what feeds information into it? Is this correct?

Yes and no. The cache holds data that the CPU frequently asks for from main memory (RAM). The idea is that the more often the CPU asks for the same data, the closer the effective access time and transfer speed get to cache speed.
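As a loose software analogy of that "keep what was asked for recently" behaviour, here's a minimal Python sketch (not how the hardware actually does it; `CAPACITY` and `slow_fetch` are made-up names for the example):

```python
from collections import OrderedDict

CAPACITY = 4            # made-up cache size
cache = OrderedDict()   # remembers use order, like a tiny LRU cache

def slow_fetch(addr):
    # Stand-in for going all the way out to RAM (or disk).
    return f"value@{addr}"

def load(addr):
    if addr in cache:
        cache.move_to_end(addr)        # hit: data was asked for recently
        return cache[addr]
    value = slow_fetch(addr)           # miss: pay the slow access once
    cache[addr] = value
    if len(cache) > CAPACITY:
        cache.popitem(last=False)      # evict the least recently used entry
    return value

for addr in [1, 2, 3, 1, 1, 4, 5, 1]:
    load(addr)
print(list(cache))  # the most recently used addresses survive
```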

Quote

2. And how is this different from the SSD caching found on SSHDs and Intel Optane?

SSD caching is still considered secondary memory. The processor will not touch it unless what it wants can't be found in main memory. However, the idea is the same: the more times you access data from the slower secondary memory, the closer its access time and transfer speed get to the SSD's performance.

Quote

3. How do SSHDs work? Is the cache (DRAM) of an SSHD the same as SSD caching or CPU caching (feeding info to the HDD part)?

The drive monitors what data you've been frequently accessing and decides, based on some algorithm, whether to keep that data in the SSD portion or not.

 

Basically, the whole gist of this is that caching is a way to add a smaller amount of faster memory to improve the performance of slower memory. The amount of data you actually need at any moment is tiny compared to the amount of data you can store permanently, so by prioritizing the small bits of data you use often, you can get much faster performance for not a whole lot more in cost. There's also the fact that if your cache is too large, the time you spend checking whether the data is even in the cache becomes too long, and it would have been comparatively as fast to just look it up in the slower memory anyway.
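That trade-off is often summed up as an average access time. A hedged back-of-the-envelope calculation in Python (all latency and hit-rate numbers below are assumptions for illustration, not measurements):

```python
# Back-of-the-envelope "average memory access time":
#   average = hit_time + miss_rate * miss_penalty
# All numbers below are assumed ballpark figures in nanoseconds.
def average_access_time(hit_time, hit_rate, miss_penalty):
    return round(hit_time + (1 - hit_rate) * miss_penalty, 1)

# A small, fast cache with a decent hit rate:
print(average_access_time(hit_time=1, hit_rate=0.90, miss_penalty=100))  # 11.0

# A bigger cache raises the hit rate, but the lookup itself gets slower,
# so the benefit shrinks and can even reverse:
print(average_access_time(hit_time=5, hit_rate=0.95, miss_penalty=100))  # 10.0
print(average_access_time(hit_time=9, hit_rate=0.96, miss_penalty=100))  # 13.0
```

The last line shows the point above: once the lookup itself gets slow enough, a bigger cache stops paying off.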


A cache is a cache regardless of where it is; it holds information for faster access by whatever it serves. A drive cache holds information for the use of the drive: it could hold information about where different pieces of data are, so they are faster to find, or even the data most used on the drive. A CPU cache holds data for the CPU cores to use when needed without having to wait. For example, if you have multiple cores working on the same thing, it's faster to draw from the cache than to wait for the data to be sent from one core to RAM and for another core to fetch it from RAM.

 

RAM is not cache; it is a storage device. A CPU is flexible: it can use data from anywhere, be it RAM, drives, or even the network, as modern CPUs have flexible data controllers to manage data.

 

A practical example would be a physics simulation. The algorithm is the same, never changing, so it can be stored in cache. The objects are reused many times, so they too can be cached as long as there is space; otherwise, the RAM can continuously keep sending that data to the CPU asynchronously so the CPU isn't kept waiting. The CPU can then keep working through the loop despite not having enough registers and storage for all the pieces, not to mention that the results have to go somewhere, like back to RAM, the display, or even storage. This is one of the easier applications where cache use is helpful and there is less chance of a cache miss. Another example would be code compilation. Any application that reuses code and data many times will benefit from processor cache (even on GPUs, though NVIDIA strips that down on consumer cards, which is why AMD cards do general-purpose compute faster). These applications benefit from having a bigger CPU cache (some games too).

 

Some applications, like networking, don't benefit from caching except for tables. In networking, packets have to get from one place to another as fast as possible, so caching data or packets is a bad idea, but it is a good idea to cache connections and tables of routes. On the internet, though, these things get so big that CPU caches can't fit everything, so RAM and storage are used too. That's why, at the professional level, they use x86 servers or routers with DIMM slots: you can upgrade the RAM or get the amount of RAM you need, because certain applications like BGP store internet routes that would be good to cache but are just too big for any processor cache (they can run to gigabytes even for small companies). This is why consumer routers have only kilobytes of CPU cache for the CPUs they use, as they only need to store the code for processing packets and not much data. This sort of use doesn't benefit from having a huge cache, even on mobile CPUs where power efficiency is more important than instant access.

 

A processor cache is a lot faster than RAM in latency. RAM bandwidth is on par with the highest-level cache of the CPU, but the latency is not. In drives, RAM can be used as cache, since it is a lot faster but also a lot cheaper than CPU cache per unit of capacity. So theoretically you could have a hard drive with a SODIMM slot and many gigabytes of RAM as cache, but with drives you risk not having enough time to dump the cache in case of a power failure if the cache is too big.

 

CPU cache can have many levels, not just 3 (referring to the posts above). SPARC had 4 or more levels; I think 6 levels is the highest known for CPUs. There's no hard limit on cache levels or size, but there are practical limitations such as power and cost. A 1GB CPU cache would be very expensive, consume huge die space, and have low yields.

