Nvidia CEO - Moore's Law is dead; GPUs to replace CPUs

kagarium
25 minutes ago, MadyTehWolfie said:

It wouldn't, on the sole fact that it relies on being subsidized by taxpayers. The day Elon can make his business model work without the government paying close to half to 80% of the cost of his products, then I'll believe gas engines are dying.

So strange that I heard the FF6 fanfare theme play in my head after reading your post.

That said, it is government business to be involved in infrastructure and big R&D because no one else can or will.  Aside from governments, only banks really can get involved, but those guys are just huge assholes and ruin everything.

 

 

:edit: OT: If Nvidia GPUs are so great, why are they so reliant on Intel CPUs to manage their scheduling?

[/end cheap poke-jab]


I believe we had actually fallen too far behind the curve for Moore's Law by around 2001. Improvements in certain areas will still be extremely rapid, but we've reached the limits of what silicon can do. There's a reason everything is multi-core now.

 

Moore's analysis of production, from the initial stage through deep maturity of any silicon (or similar) product, was a deep insight into the scaling behavior of certain types of manufacturing and design. We still see the effect any time a new display panel technology comes online, but there's always a point where the effect breaks: everything has a zero bound, so you eventually reach the end of it.


5 hours ago, JoostinOnline said:

Nobody can decide if Moore's Law is dead.  It is.  End of story.  I explained it on Facebook, but I don't want to type it out again.

[Attached screenshot: Capture.PNG, a Facebook conversation alternating between me and someone else; the last reply is mine, addressed to a name I forgot to block out.]

Exactly. I don't get why people keep associating Moore's Law with computation (yes, there are gains in computation and such, but that's not the point of the observation).


22 minutes ago, MoonSpot said:

So strange that I heard the FF6 fanfare theme play in my head after reading your post.

That said, it is government business to be involved in infrastructure and big R&D because no one else can or will.  Aside from governments, only banks really can get involved, but those guys are just huge assholes and ruin everything.

 

 

:edit: OT: If Nvidia GPUs are so great, why are they so reliant on Intel CPUs to manage their scheduling?

[/end cheap poke-jab]

It's not the job of the government to subsidize R&D, or products for the end consumer, on behalf of a private company.

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


He needs to figure out a way to transition people, though. I agree that for the user experience and the administration of other hardware we're approaching diminishing returns in CPUs versus what we realistically need. But companies like Nvidia need to do more than come up with better and faster GPUs; they need ways to make it substantially easier for coders to take advantage of those resources. Otherwise, outside of certain workloads, it's kind of moot: so many things could take advantage of CUDA but don't, because you can't just say "here, use this other resource" and magically have your code optimized enough for it to matter.
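A minimal sketch of why "just use the GPU" isn't free (a hypothetical example of mine, not anything from the article): even a trivial array operation has to be rewritten as a CUDA kernel, with explicit device allocation and copies, before the GPU can touch it.

#include <cuda_runtime.h>
#include <cstdio>

// CPU version: one thread walks the whole array.
void scale_cpu(float* data, int n, float factor) {
    for (int i = 0; i < n; ++i)
        data[i] *= factor;
}

// GPU version: the same work, restructured so each thread handles one element.
__global__ void scale_gpu(float* data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    float* host = new float[n];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // None of this exists in the CPU version: allocate device memory,
    // copy the data over, launch the kernel, copy the result back.
    float* dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
    scale_gpu<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("host[0] = %f\n", host[0]);  // 2.000000
    delete[] host;
    return 0;
}

And that's the trivial case; anything with branching or irregular memory access needs real restructuring before the GPU is actually faster.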

-------

Current Rig

-------


In a way Intel shot themselves in the foot... Phi? It looks exactly like a GPU.

CanTauSces: x5675 4.57ghz ~ 24GB 2133mhz CL10 Corsair Platinum ~ MSI X58 BIG BANG ~ AMD RADEON R9 Fury Nitro 1155mhz ~ 2x Velociraptor 1TB RAID 0 ~ 960GB x3 Crucial SSD ~ Creative SB Audigy FX ~ Corsair RM series 850 watts ~ Dell U2715H 27" 2560x1440.


8 minutes ago, Potato*Salad said:

In a way Intel shot themselves in the foot... Phi? It looks exactly like a GPU.

Xeon Phi started out as an x86-based GPU.

But Xeon Phi is a self-contained computer that number-crunches whatever the main CPU tells it to. It is the closest thing we'll see to the Nvidia CEO's claims, IMO.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


6 hours ago, Spenser1337 said:

Thank you 

Cummins engines are starting to become near zero emissions. They release more water and carbon dioxide, which is much better for the environment. Diesel engines have more torque at a lower RPM than a gas engine does, meaning the gas engine will pollute more; the diesel doesn't have to work as hard in the end. So I can definitely see the gas engine disappearing someday soon. They are starting to put diesel engines in the Chevrolet Equinox and other SUVs and cars. Ford is adding a 3.5 liter or so Powerstroke to their F-150: more torque, less power. Dodge uses an Italian EcoDiesel engine which suits the needs of most vehicles, mostly used in Jeep and Ram vehicles. The world is shifting to diesel, and there will be those vehicles with electricity instead. But the bottom line is, diesel and electric together are the best for the environment.

The geek himself.


4 minutes ago, gabrielcarvfer said:

Actually, the latest Xeon Phi (Knights Landing) can be the main CPU, with up to 72 cores and 288 threads.

I think we will be getting an LTT video on that soon (it was briefly discussed in a livestream).


How would you operate a system on all GPUs? Don't you need a CPU to tell the GPU what to do, or to handle tasks that would otherwise eat into precious GPU time?

 

This makes no sense to me. IIRC, GPUs are just highly specialized microprocessors that have a shit ton more cores (threads?) than CPUs. Are they talking about replacing the typical low-core design of CPUs with systems that have a lot more cores, all highly specialized but not necessarily for the same task?

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


34 minutes ago, Being Delirious said:

Dodge uses an Italian EcoDiesel engine..

VM Motori.

34 minutes ago, Being Delirious said:

 But the bottom line is, diesel and electric together are the best for the environment.

They've a long way to go on affordability.  EGR, SCR, particulate filters and exhaust aftertreatment systems are hugely expensive and the most common points of failure.  Maintenance is generally less costly for diesels, since diesel itself is a lubricant and oil changes are half as frequent as a gas car requires; but upkeep and repairs on a diesel cost more than a little more than on your typical gasser.  Until a diesel's cost of ownership is equivalent to that of a gas-powered car, it won't replace them.  And a diesel without all the exhaust treatment may still have been better for the environment, but they were very bad for people's lungs.  There are reasons why governments were pissed at VW in particular for their emissions-test cheating.  NOx from untreated diesel exhaust means ground-level ozone, smog and airborne particles that embed in people's lungs.  That brings respiratory problems, like lung cancer.

 

I'm with ya though, I absolutely wanted a diesel/electric vehicle.  Just don't want to have to pay for one.  Actually don't think I could afford to own one even if I wanted to.


1 minute ago, MoonSpot said:

VM Motori.

They've a long way to go on affordability.  EGR, SCR, particulate filters and exhaust aftertreatment systems are hugely expensive and the most common points of failure.  Maintenance is generally less costly for diesels, since diesel itself is a lubricant and oil changes are half as frequent as a gas car requires; but upkeep and repairs on a diesel cost more than a little more than on your typical gasser.  Until a diesel's cost of ownership is equivalent to that of a gas-powered car, it won't replace them.  And a diesel without all the exhaust treatment may still have been better for the environment, but they were very bad for people's lungs.  There are reasons why governments were pissed at VW in particular for their emissions-test cheating.  NOx from untreated diesel exhaust means ground-level ozone, smog and airborne particles that embed in people's lungs.  That brings respiratory problems, like lung cancer.

 

I'm with ya though, I absolutely wanted a diesel/electric vehicle.  Just don't want to have to pay for one.  Actually don't think I could even if I wanted to.

I hate VW lol. I see your point, and I figure we need to wait for the price of diesel repairs specifically to come down to gas-engine levels. Thanks for the response, mate. Cummins made a great 5.0L engine for Nissan though, can't lie about that.

The geek himself.


Well, there are limits to silicon-based chip technology. You can't go infinitely smaller, because at some point quantum effects like tunneling take over in your chip's circuitry; once that happens you can't predict and/or reproduce anything on your chip, so it stops working. But you'd have to keep shrinking your circuitry in order to keep doubling the transistors on the die. Of course, you can increase die sizes, but that creates different problems. The longer the tracks, the longer it takes for signals to reach the next unit; the longer the bus, the lower the max frequency. Efficiency is another issue. So you can't just make chips bigger and bigger in order to double the number of transistors.
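A rough back-of-the-envelope (my own numbers, assuming the classic two-year doubling cadence and taking the "14 nm" label at face value) shows why the shrinking has a hard floor:

N(t) = N_0 \cdot 2^{t/T},           T ≈ 2 years   (transistors per die)
\ell(t) = \ell_0 \cdot 2^{-t/(2T)}                (linear feature size: each density doubling shrinks features by \sqrt{2})

Eight more density doublings, roughly 16 years at that pace, would mean 14 nm / (\sqrt{2})^8 ≈ 0.9 nm, only a few silicon atoms across (the lattice constant is about 0.54 nm). Tunneling and leakage become dominant well before that point.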

So yeah, it's already dead.

But that doesn't mean anything. There's still plenty of room for improvement. CPUs aren't dead and won't be dead for a long time. They will change, yes, and parallelization will become more important, so the differences between CPUs and GPUs will become smaller. But they will not disappear. A major instruction set switch at this point is more than unlikely, so unless there's an x86-compatible GPU, they won't take over.

GPUs are in that respect similar to (the concept of) quantum computers: they can be blazingly fast in certain applications, but they're not general-purpose chips and will never be universally fast across all applications. That's just a fact.

Use the quote function when answering! Mark people directly if you want an answer from them!


GPUs won't replace CPUs, but CPU architecture will change. x86 is close to its end. Multi-core is the way to go. But it will still be a multi-core CPU, not a GPU.


13 hours ago, Spenser1337 said:

Tesla CEO says engines are dying

 

Almost as if they believe in the product they produce.

Musk is a bit of a dreamer anyway...

 

Spoiler

CPU:Intel Xeon X5660 @ 4.2 GHz RAM:6x2 GB 1600MHz DDR3 MB:Asus P6T Deluxe GPU:Asus GTX 660 TI OC Cooler:Akasa Nero 3


SSD:OCZ Vertex 3 120 GB HDD:2x640 GB WD Black Fans:2xCorsair AF 120 PSU:Seasonic 450 W 80+ Case:Thermaltake Xaser VI MX OS:Windows 10
Speakers:Altec Lansing MX5021 Keyboard:Razer Blackwidow 2013 Mouse:Logitech MX Master Monitor:Dell U2412M Headphones: Logitech G430

Big thanks to Damikiller37 for making me an awesome Intel 4004 out of trixels!


14 hours ago, samcool55 said:

Did we come full circle? This sounds very similar to what ATI yelled 10 years ago. CPU and GPU would work together at such a close level you wouldn't notice what runs on what.

 

Nvidia, run Linux on an Nvidia GPU ONLY first, then we'll talk again.

Won't happen. This is just PR at its best.

 

He doesn't mean Windows will boot off a GPU.

 

He actually means that certain workloads which currently run on CPUs will run much faster on GPUs, i.e. be accelerated. For example, Premiere Pro, etc. We'll just see more and more programs taking advantage of GPU horsepower, especially in enterprise, machine learning, etc.

 

But it makes a flashier headline and sparks more interest and debate if he says something akin to "GPU replaces CPU", because people will read things into it.

 

I'm not criticizing him, just pointing it out. He's actually smart about it.

\\ QUIET AUDIO WORKSTATION //

5960X 3.7GHz @ 0.983V / ASUS X99-A USB3.1      

32 GB G.Skill Ripjaws 4 & 2667MHz @ 1.2V

AMD R9 Fury X

256GB SM961 + 1TB Samsung 850 Evo  

Cooler Master Silencio 652S (soon Calyos NSG S0 ^^)              

Noctua NH-D15 / 3x NF-S12A                 

Seasonic PRIME Titanium 750W        

Logitech G810 Orion Spectrum / Logitech G900

2x Samsung S24E650BW 16:10  / Adam A7X / Fractal Axe Fx 2 Mark I

Windows 7 Ultimate

 

4K GAMING/EMULATION RIG

Xeon X5670 4.2Ghz (200BCLK) @ ~1.38V / Asus P6X58D Premium

12GB Corsair Vengeance 1600Mhz

Gainward GTX 1080 Golden Sample

Intel 535 Series 240 GB + San Disk SSD Plus 512GB

Corsair Crystal 570X

Noctua NH-S12 

Be Quiet Dark Rock 11 650W

Logitech K830

Xbox One Wireless Controller

Logitech Z623 Speakers/Subwoofer

Windows 10 Pro


Didn't Intel themselves announce it was dead not that long ago?

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


On 9/28/2017 at 2:01 PM, Flavio hc 16 said:

But in reality, you can't stack up as many GPU cores as CPU cores without needing a nuclear power plant

 

Uh, not very true.

Join the Appleitionist cause! See spoiler below for answers to common questions that shouldn't be common!

Spoiler

Q: Do I have a virus?!
A: If you didn't click a sketchy email, haven't left your computer physically open to attack, haven't downloaded anything sketchy/free, know that your software hasn't been exploited in a new hack, then the answer is: probably not.

 

Q: What email/VPN should I use?
A: Proton mail and VPN are the best for email and VPNs respectively. (They're free in a good way)

 

Q: How can I stay anonymous on the (deep/dark) webzz???....

A: By learning how to de-anonymize everyone else; if you can do that, then you know what to do for yourself.

 

Q: What Linux distro is best for x y z?

A: Lubuntu for things with little processing power, Ubuntu for normal PCs, and if you need to do anything else then it's best if you do the research yourself.

 

Q: Why is my Linux giving me x y z error?

A: Have you not googled it? Are you sure StackOverflow doesn't have an answer? Does the error tell you what's wrong? If the answer is no to all of those, message me.

 


On 9/28/2017 at 2:03 PM, Trixanity said:

I agree that we probably need a radical re-design of modern CPUs and that we need to dump a lot, if not all, of the legacy baggage.

Intel tried that once.  It was often referred to as the Itanic.  That "legacy baggage" means software doesn't have to be rewritten from scratch.  Would it be better if it were?  Probably, but the costs would also be significantly higher, meaning the purchase price would be higher.


10 minutes ago, Jito463 said:

Intel tried that once.  It was often referred to as the Itanic.  That "legacy baggage" means software doesn't have to be rewritten from scratch.  Would it be better if it were?  Probably, but the costs would also be significantly higher, meaning the purchase price would be higher.

I'm aware of that, but sometimes you've got to make a radical change to move forward. In the beginning it would suck for all parties, but given time people would probably ask themselves why we didn't do it sooner. We have that same "things would be so much better if we did this, but it costs so much money and time" problem in a lot of areas.

 

It seems x86 is in need of an amputation in order to start moving again.


On 9/28/2017 at 7:57 PM, kagarium said:

Nvidia CEO Jensen Huang said at the GTC conference in Beijing that GPUs will replace CPUs.

http://www.pcgamer.com/nvidia-ceo-says-moores-law-is-dead-and-gpus-will-replace-cpus/

Intel denies that Moore's Law is dying, and continues to back it up.

It's an interesting thing to consider, GPUs replacing CPUs. We see it happening a lot already in modern gaming consoles.

 

What's everyone else think? Are CPUs failing?

 

-Ken

 

 

I wouldn't say Moore's Law is dead, per se, but Adored brings up a valid point: it won't survive without serious innovation. AMD is currently spearheading this. Intel has said their next full architecture will be modular like Zen is (10nm will be a new arch; the Core architecture ends with Coffee Lake).


"RAM will replace gpu's" says random forum poster. I want to see a demo of some sort.

GPU drivers giving you a hard time? Try this! (DDU)


It doesn't matter how much faster GPUs are or how parallelized the task is, GPUs will never replace CPUs, and the reason is simple: CPUs are very good and fast at branching, switching between things, and performing logic and other decisions, something a GPU can't do fast enough.

 

There are 3 things I want to point out:

1) CPUs have lower latencies than GPUs; they are much smaller too, and their memory architecture is all about low latency, including their caches. GPUs have loads of bandwidth and lots of paths, but much higher latencies. This directly impacts decision making and logic, which mostly depend on other data, and which a GPU would be absolutely slow at (try running unoptimized workloads on a GPU: usage will go up to 100% on Nvidia, but processing is very slow, far from the stated numbers). A rough code sketch of this follows after point 3.

 

2) Upgradeability: many CPUs, even Intel's, let you swap between a couple of generations, and AMD has much greater compatibility; it makes board and CPU pairing easier and cheaper versus a GPU that is built onto a specific BGA or LGA package. This makes using a GPU as a general-purpose CPU expensive, as not only will everything cost more (GPUs cost way more than mainstream CPUs), but what about memory? Will you be able to put in your own RAM? Make us some GDDR5 sticks first, Nvidia, before trying to dump GPUs in our faces to replace CPUs.

 

3) Most tasks are logic and decision making. Data crunching is a big workload, but there is more code out there doing decision making than number crunching. GPUs are specialized processors.
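To make point 1 concrete, here's a minimal, hypothetical CUDA sketch (the kernel and the two "work" functions are made up for illustration). When threads in the same 32-wide warp take different branches, the hardware runs both paths and masks off inactive lanes, so branchy, data-dependent code loses most of the GPU's throughput advantage:

#include <cuda_runtime.h>
#include <cstdio>

// Two dummy work paths; in real code these would be different computations.
__device__ float even_path(float x) { return x * 0.5f; }
__device__ float odd_path(float x)  { return x * 2.0f + 1.0f; }

// Each thread picks a branch based on its data. Threads in one warp that
// disagree get serialized: the warp executes even_path AND odd_path,
// with the non-participating lanes sitting idle each time.
__global__ void branchy(const int* flags, float* vals, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (flags[i] % 2 == 0)
        vals[i] = even_path(vals[i]);
    else
        vals[i] = odd_path(vals[i]);
}

int main() {
    const int n = 1 << 16;
    int* flags;  float* vals;
    cudaMallocManaged(&flags, n * sizeof(int));
    cudaMallocManaged(&vals,  n * sizeof(float));
    for (int i = 0; i < n; ++i) { flags[i] = i; vals[i] = 1.0f; }

    branchy<<<(n + 255) / 256, 256>>>(flags, vals, n);
    cudaDeviceSynchronize();

    printf("vals[0] = %f, vals[1] = %f\n", vals[0], vals[1]);  // 0.500000, 3.000000
    cudaFree(flags); cudaFree(vals);
    return 0;
}

A CPU with a branch predictor chews through this kind of interleaved if/else without breaking stride; a warp pays for both sides every time.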

