Why do graphics cards clock a lot lower than CPUs?
18 minutes ago, person223 said:
Why do graphics cards clock a lot lower than CPUs?
You've gotten a good number of replies so far, but I'd like to use an analogy to really put the difference between CPUs and GPUs into easy-to-understand terms:
Think of a typical family sedan -- it's good for shopping trips in the city, commuting to work, chauffeuring the kids to and from school and hobbies, maybe even hauling small furniture. Cars like that are kinda like CPUs -- good for a lot of things, though not necessarily the best tool for any one of them.
Now, think about a rocket-car: it goes really, really fast, but you can't carry even a small bag in one, there's only room for one person (and even then just barely), and it can't even turn! It does one thing really, really well, though: it goes forward extremely fast. That's kind of like a GPU, i.e. it has one specific purpose that it's good at, and it serves that purpose better than almost anything else.
Much like with the two cars above, most direct comparisons between CPUs and GPUs just don't make much sense. Technically there are some kinda-sorta similar parts, but they're often implemented in such different ways that you just can't apply concepts from one to the other. You may not have realized it, but your Ethernet card has a clock, your SSDs and HDDs contain multiple clocks, your mobo contains a good handful of clocks, USB needs a clock at a certain speed, and so on -- comparing those numbers doesn't work, because they're for different purposes.
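To put rough numbers on why raw clock speed is a misleading comparison: a GPU clocks much lower but makes up for it with sheer core count. Here's a quick back-of-the-envelope sketch -- the core counts, clocks, and FLOPs-per-cycle figures below are illustrative made-up values, not measurements of any specific chip:

```python
# Back-of-the-envelope throughput comparison (illustrative numbers only).
# Theoretical peak ~= cores * clock (GHz) * floating-point ops per core per cycle.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    """Theoretical peak throughput in GFLOPS (billions of FP ops/second)."""
    return cores * clock_ghz * flops_per_cycle

# Hypothetical desktop CPU: 8 cores at a high 5.0 GHz clock.
cpu = peak_gflops(cores=8, clock_ghz=5.0, flops_per_cycle=32)

# Hypothetical GPU: thousands of simple cores at a much lower 1.7 GHz clock.
gpu = peak_gflops(cores=10240, clock_ghz=1.7, flops_per_cycle=2)

print(f"CPU: {cpu:.0f} GFLOPS")            # despite the higher clock...
print(f"GPU: {gpu:.0f} GFLOPS")            # ...the GPU wins on throughput
print(f"GPU/CPU ratio: {gpu / cpu:.1f}x")
```

Same idea as the sedan vs. rocket-car: the GPU's "engine" spins slower, but it has thousands of them doing the same simple job in parallel, so for that one job it leaves the CPU far behind.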