My teacher told me about cabinets with over 200 CPUs for weather calculations

Could someone teach me a little more about the following subject?

My teacher told me that in order to approximate the weather you need supercomputers, and he also told me that there's a supercomputer nearby. It has big cabinets full of CPUs, or something like that. I was thinking, isn't this similar to hooking up six GTX 1080 Tis and doing calculations for companies? My question is then: why do they use CPUs instead of GPUs?


For some applications it's easier to spread the calculations over fewer, stronger cores. They do also use GPUs for the workloads that do better with them; most installations running this kind of work will have racks of CPUs and GPUs performing different jobs.
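To make that concrete, here's a minimal Python sketch (purely illustrative; the functions and numbers are made up). The first loop is a chain where every step needs the previous result, so extra cores can't help and one fast core wins; the second is element-wise work that splits across as many cores, or GPU lanes, as you have:

```python
import numpy as np

def serial_chain(x0, steps):
    # Each iteration needs the previous result, so the chain cannot be
    # split across cores; only a faster single core shortens it.
    x = x0
    for _ in range(steps):
        x = 0.5 * (x + 2.0 / x)  # Newton's iteration for sqrt(2)
    return x

def elementwise(a):
    # Every element is independent, so this splits across as many
    # cores (or GPU lanes) as are available.
    return np.sqrt(a) + np.sin(a)

print(serial_chain(1.0, 50))             # latency-bound: wants one fast core
print(elementwise(np.arange(1.0, 1e6)))  # throughput-bound: wants many cores
```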


4 minutes ago, zindan said:

My question is then: why do they use CPUs instead of GPUs?

Highly parallel, graphics-like tasks want GPUs; other tasks demand CPU computation power instead.


GPUs and CPUs are good at different types of calculations, but my guess here is that they primarily use CPUs because of the sheer amount of RAM these calculations require.
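As a rough back-of-envelope (every number below is my own assumption, not the real model's), even a fairly modest global grid implies tens of gigabytes per snapshot:

```python
# Illustrative figures only; real operational models differ.
earth_surface_km2 = 510e6  # Earth's surface area, ~510 million km^2
cell_km = 10               # assumed horizontal resolution
levels = 100               # assumed vertical levels
variables = 10             # wind, temperature, pressure, humidity, ...
bytes_per_value = 8        # double precision

columns = earth_surface_km2 / cell_km**2
snapshot_bytes = columns * levels * variables * bytes_per_value
print(f"{snapshot_bytes / 1e9:.0f} GB per time step")  # ~41 GB
```

A single GTX 1080 Ti only has 11 GB of VRAM, while a dual-socket CPU node can be fitted with hundreds of gigabytes of RAM.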


5 minutes ago, zindan said:

Could someone teach me a little more about the following subject?

My teacher told me that in order to approximate the weather you need supercomputers, and he also told me that there's a supercomputer nearby. It has big cabinets full of CPUs, or something like that. I was thinking, isn't this similar to hooking up six GTX 1080 Tis and doing calculations for companies? My question is then: why do they use CPUs instead of GPUs?

Well, mostly because they are fundamentally different.

CPUs have much more powerful cores, and I'd guess that with the workload that software generates, you benefit more from fewer, more powerful cores than from many weaker ones.

In the end it all depends on what the workload can make use of.


Thanks for the info, I think I understood most of it. These sorts of things are very interesting; hardware combined with software leads to extremely useful outcomes. I've also been told that they use the same kind of system for chemistry, simulating millions and millions of different mixtures and so on. They do this with CPUs/GPUs too, right?


22 minutes ago, RandomGuy13 said:

It's also worth noting that GPU computation is a fairly new thing. Back in the day, a GPU was only for 2D and later 3D video. I think it must have started with hashing, when people began writing algorithms for GPU compute; something like prime number finding or password cracking is where I imagine it started, because historically everything was done on the CPU. Even today, a lot of work that would clearly benefit from GPU acceleration, such as audio rendering and even some video rendering, is still done on the CPU.

This is a great answer, very well written.

 

This Wikipedia page about general-purpose computing on GPUs (GPGPU) goes into more detail about why it's a newish thing, what kinds of data can be processed on GPUs, and why. I decided to post it because my answer would have been very similar.



I know that the supercomputer in Norway that is used to calculate the weather (and probably some other things) has 2808 eight-core Intel Xeon E5-2670 CPUs, mounted two per dual-socket node, with 16 GB of memory per CPU.
As far as I know, it has no GPUs. Weather calculations don't actually work that well on GPUs; a lot of the data depends on other data.

It's at NTNU, by the way, and the heat is used to warm some of the buildings there in the winter.
It's not new.

It does 396 Tflops.

It's going to be replaced soon with something better, though.
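To illustrate the dependency point with a toy sketch (nothing like a real atmosphere model): the simulation marches forward in time, and step t+1 cannot start until step t has finished, so only the work inside each step can be spread across cores:

```python
import numpy as np

def step(state):
    # One toy time step: each cell is averaged with its neighbours.
    # Inside the step every cell is independent; that is where the
    # thousands of CPU cores get used.
    return (state + np.roll(state, 1) + np.roll(state, -1)) / 3.0

state = np.random.rand(1_000_000)  # toy 1-D "atmosphere"
for t in range(100):     # the steps themselves are strictly sequential:
    state = step(state)  # step t+1 depends on the result of step t
```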



11 hours ago, Mihle said:

I know that the supercomputer in Norway that is used to calculate the weather (and probably some other things) has 2808 eight-core Intel Xeon E5-2670 CPUs, mounted two per dual-socket node, with 16 GB of memory per CPU.
As far as I know, it has no GPUs. Weather calculations don't actually work that well on GPUs; a lot of the data depends on other data.

It's at NTNU, by the way, and the heat is used to warm some of the buildings there in the winter.
It's not new.

It does 396 Tflops.

It's going to be replaced soon with something better, though.

What are the costs for these? What about the World of Warcraft/Counter-Strike servers? Is it the same kind of setup with expensive CPUs?


44 minutes ago, zindan said:

What are the costs for these? What about the World of Warcraft/Counter-Strike servers? Is it the same kind of setup with expensive CPUs?

The cost is a lot; I don't know how much.
It's way more than you would need for CS servers (I think).



They do different things. GPUs do many similar math operations at once. Some things benefit greatly from GPU acceleration, but weather prediction involves many complicated functions requiring many varied calculations, which calls for CPUs, though certain aspects are GPU-accelerated. In short, a petaflop on CPUs ≠ a petaflop on GPUs. Hope this helps.
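One way to see why the petaflops aren't interchangeable: GPUs execute threads in fixed-width SIMD groups (warps), and when the lanes of a warp take different branches, the warp runs every taken side with the inactive lanes masked off. A toy model in Python (the 32-lane width and the workload are assumptions for illustration):

```python
import random

LANES = 32  # assumed SIMD width of one GPU warp

def gpu_work(branches):
    # A warp whose lanes disagree executes every distinct branch
    # for all lanes, masking off the inactive ones.
    work = 0
    for i in range(0, len(branches), LANES):
        warp = branches[i:i + LANES]
        work += LANES * len(set(warp))
    return work

def cpu_work(branches):
    # Independent cores each execute only their own branch.
    return len(branches)

branches = [random.choice("AB") for _ in range(1 << 20)]
print(gpu_work(branches) / cpu_work(branches))  # ~2x the raw operations
```

With random two-way branches nearly every warp diverges, so the GPU issues roughly twice the operations for the same result, and branchier code fares worse; uniform element-wise math has no such penalty.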


1 hour ago, Gravesnear said:

They do different things. GPUs do many similar math operations at once. Some things benefit greatly from GPU acceleration, but weather prediction involves many complicated functions requiring many varied calculations, which calls for CPUs, though certain aspects are GPU-accelerated. In short, a petaflop on CPUs ≠ a petaflop on GPUs. Hope this helps.

Also, the data sets for weather prediction tend to be absolutely enormous, tens or hundreds of TB at a minimum, so programs have to handle data access as well as heavy floating-point operations. That's why the current target systems (i.e., the hardware the program is written to take advantage of) tend to be CPU-based rather than GPU-based. However, with GPU speeds and core counts increasing over the last couple of years (remember, it has only really been in the last 3-5 years that GPUs have caught up to CPUs in compute performance), this may be changing.
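As a quick sanity check of that balance (every figure below is invented for the example), data movement and arithmetic can easily take comparable time, so the machine has to be engineered for both:

```python
# Invented example figures, not any real machine's specs.
dataset_bytes = 100e12  # 100 TB of input data
flops_needed = 1e17     # total floating-point operations for the run

io_bandwidth = 500e9    # 500 GB/s aggregate storage bandwidth
compute_rate = 400e12   # 400 Tflop/s aggregate compute

print(f"I/O time:     {dataset_bytes / io_bandwidth:.0f} s")  # 200 s
print(f"Compute time: {flops_needed / compute_rate:.0f} s")   # 250 s
```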

 

The design/build times for supercomputer-class machines are in the 5-10 year range, so the machines you see going into production now were designed around the prototypes and timelines the chip manufacturers published years ago. You don't design and build a multi-million-dollar computer in a weekend or even a year; just getting the funding takes years. Yes, the final hardware may change in detail, but the base architecture and software will be the same, so you can't just drop in GPU computing at the last minute when its performance overtakes your planned multi-CPU array, even if it would technically be faster for some of your cases.

 

The GPU-based supercomputers coming online now are the first machines designed at the very leading edge of the 2010-2012 early indications that GPU compute was going to take off, based on NVIDIA's and AMD's projected timelines for their enterprise products. These supercomputers were huge risks for the companies building them: if GPU compute had not advanced as fast as projected, they would be failing right now rather than delivering machines that are faster than most others at certain tasks.

