
AMD's Hawaii Is Officially The Most Efficient GPGPU In The World To Date, Tops Green500 List

I'm pretty sure Maxwell cut compute even further :/

Yeah they have, but not to the point where it's not good at it. If you look at the theoretical TFLOPS for the GM204, it's quite a bit higher than the GK104's. And the x04 chips are never compute beasts, while the x10 chips are usually pretty rad at it xD
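For reference, here's a quick sketch of how those theoretical peaks are usually estimated (shader counts and clocks below are approximate reference-card figures, not measurements, and the formula ignores real-world utilisation):

```python
# Theoretical single-precision peak: shaders x 2 FLOPs per clock (FMA) x clock speed.
# The specs below are approximate public reference figures, used only for illustration.
def peak_sp_tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

gpus = {
    "GK104 (GTX 680)":  (1536, 1006),
    "GM204 (GTX 980)":  (2048, 1126),
    "Hawaii (R9 290X)": (2816, 1000),
}

for name, (shaders, mhz) in gpus.items():
    print(f"{name}: ~{peak_sp_tflops(shaders, mhz):.2f} TFLOPS single precision (theoretical)")
```

On paper that puts GM204 roughly 50% ahead of GK104 in single precision, which is the gap the post above is referring to.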

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]


Maxwell/Kepler suck giant asshole at compute; the last laugh for NV was Fermi.

i7 5930k . 16GB Corsair Vengeance LPX 2666 DDR4 . Gigabyte GA-X99-Gaming G1-WIFI . Zotac GeForce GTX 980 AMP! 4GB SLi . Crucial M550 1TB SSD . LG BD . Fractal Design Define R2 Black Pearl . SuperFlower Leadex Gold 750w . BenQ GW2765HT 2560x1440 . CM Storm QF TK MX Blue . SteelSeries Rival 
i5 2500k/ EVGA Z68SLi/ FX 8320/ Phenom II B55 x4/ MSI 790FX-GD70/ G.skill Ripjaws X 1600 8GB kit/ Geil Black Dragon 1600 4GB kit/ Sapphire Ref R9 290/ XFX DD GHOST 7770 

Maxwell/Kepler suck giant asshole at compute; the last laugh for NV was Fermi.

They're still waaay faster than Fermi, but that's because they have a shitton more cores :P

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]


No, I don't think they would take Intel's GPU. First of all, they like their in-house solutions, and second, Intel is trying to shove their D into the mobile market. Just giving all that performance to a competitor would be the equivalent of chopping that D off ;)

I've been surprised in the past; I think this is something Apple would do given the right reasons, and could pull off with the control they have over their ecosystem.


I've been surprised in the past; I think this is something Apple would do given the right reasons, and could pull off with the control they have over their ecosystem.

It may be something Apple would do, but it's deffo not something Intel would do. And I'm not sure I can see NV selling a licence either, with their Tegras gaining ground; I think the Tegra M1 (or whatever it will be), if it is the 1 SMM with 2 powerful cores, could do quite well, speed- and power-wise.

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]


It's been known for a long time that AMD is king when it comes to GPGPU compute performance. Gamers seem to forget how advanced these things really are, because nVidia has optimised their architecture mostly for graphics and, to a much lesser extent, compute. This is why nVidia has the reputation of being more efficient, when in reality the emphasis is just different.

To all those who seem surprised: don't be. AMD is working tirelessly on a GPU architecture designed to do everything, including accelerating applications in everyday tasks through OpenCL and soon HSA. nVidia doesn't really compete here, as they've created a market for themselves with the proprietary CUDA language. So they can focus their efforts on graphics performance, whereas AMD needs to have power all round.
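Since OpenCL keeps coming up as the vendor-neutral counterpart to CUDA, here is a minimal, illustrative sketch (using the third-party pyopencl package) of the kind of kernel that runs unchanged on AMD, NVIDIA or Intel OpenCL devices. It's just a toy vector add to show the portability point, not anyone's production code:

```python
import numpy as np
import pyopencl as cl  # third-party package: pip install pyopencl

# A trivial kernel in standard OpenCL C; the same source builds on any vendor's OpenCL driver.
src = """
__kernel void vadd(__global const float *a, __global const float *b, __global float *c) {
    int i = get_global_id(0);
    c[i] = a[i] + b[i];
}
"""

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)
out = np.empty_like(a)

ctx = cl.create_some_context()        # picks whichever OpenCL device is available
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
c_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)

prog = cl.Program(ctx, src).build()
prog.vadd(queue, a.shape, None, a_buf, b_buf, c_buf)   # launch 1024 work-items
cl.enqueue_copy(queue, out, c_buf)

assert np.allclose(out, a + b)
```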

I cannot be held responsible for any bad advice given.

I've no idea why the world is afraid of 3D-printed guns when clearly 3D-printed crossbows would be more practical for now.

My rig: The StealthRay. Plans for a newer, better version of its mufflers are already being made.


It may be something Apple would do, but it's deffo not something Intel would do. And I'm not sure I can see NV selling a licence either, with their Tegras gaining ground; I think the Tegra M1 (or whatever it will be), if it is the 1 SMM with 2 powerful cores, could do quite well, speed- and power-wise.

Oh I wouldn't expect Intel to at all, but if Apple wanted the performance I'd see them moving in a heartbeat :)


General relativity is not proven. Nothing is proven. Congratulations, you've just proven yourself unworthy of any more of my attention. You speak of science, yet you don't understand the basic concept of it.

 

And no, there are not 3 competing theories, all unproven. There is one theory right now that is being worked on. It's not really even a theory yet; it's just a bunch of hypotheses that haven't been shot down by our observations yet. It's called M-theory, and it attempts to unify and translate each of the string theories and supergravity as the infinite limits of a bigger theory. http://en.wikipedia.org/wiki/M-theory

 

Seriously? You're gonna nitpick 2%? I'm a physicist. To me pi^2 is 10, and a year has pi * 10^7 seconds. You won't make me accept 11.8 as different from 10.
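For what it's worth, those back-of-the-envelope physicist approximations do check out numerically (just checking the arithmetic):

$$\pi^2 \approx 9.87 \approx 10, \qquad 1\ \text{year} = 365.25 \times 86{,}400\ \text{s} \approx 3.156 \times 10^{7}\ \text{s} \approx \pi \times 10^{7}\ \text{s}.$$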

You have loop quantum gravity, quantum gravitation, and unified field theory (bound up in M-theory) all looking for explanations of gravitational effects at the quantum level. Thanks for proving you know nothing beyond your physics textbooks and lectures.

 

Second, we have absolute proof of how air resistance, thermodynamic flow, and other physical phenomena occur and act. We have equations which perfectly map some interactions at some scales, but what we lack is that which encompasses everything, the hidden jewel everyone's looking for.

 

You're a child as much as you treat me like one.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


It's been known for a long time that AMD is king when it comes to GPGPU compute performance. Gamers seem to forget how advanced these things really are, because nVidia has optimised their architecture mostly for graphics and, to a much lesser extent, compute. This is why nVidia has the reputation of being more efficient, when in reality the emphasis is just different.

To all those who seem surprised: don't be. AMD is working tirelessly on a GPU architecture designed to do everything, including accelerating applications in everyday tasks through OpenCL and soon HSA. nVidia doesn't really compete here, as they've created a market for themselves with the proprietary CUDA language. So they can focus their efforts on graphics performance, whereas AMD needs to have power all round.

Um, no. Nvidia outnumbers AMD in supercomputer market share by almost 3.5 to 1. If AMD were actually king of GPGPU compute, it would show up in the market.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


In compute. A 980 is quite bad at DPFP compute ;)

 

 

I'm pretty sure Maxwell cut compute even further :/

 

At least at double precision
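To put rough numbers on the double-precision point, a small sketch; the FP64:FP32 ratios below are the commonly cited ones for these chips and should be treated as approximate reference specs rather than measurements:

```python
# Theoretical double-precision throughput = single-precision peak x the chip's FP64 ratio.
# All figures are approximate public reference specs, listed only for illustration.
def peak_dp_tflops(sp_tflops: float, fp64_ratio: float) -> float:
    return sp_tflops * fp64_ratio

cards = {
    "GTX 680 (GK104)":        (3.09, 1 / 24),
    "GTX 980 (GM204)":        (4.61, 1 / 32),
    "R9 290X (Hawaii)":       (5.63, 1 / 8),
    "FirePro S9150 (Hawaii)": (5.07, 1 / 2),
}

for name, (sp, ratio) in cards.items():
    print(f"{name}: ~{peak_dp_tflops(sp, ratio):.2f} TFLOPS double precision (theoretical)")
```

The 1/2-rate FirePro S9150 is presumably the Hawaii configuration behind the Green500 result in the article's title.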

Ahh I see, interesting

Computer Specifications:

AMD Ryzen 5 3600  Gigabyte B550M Aorus Elite | ADATA XPG SPECTRIX D50 32 GB 3600 MHz | Asus RTX 3060 KO Edition CoolerMaster Silencio S400 Klevv Cras C700 M.2 SSD 256GB 

1TB Crucial MX500 | 1 TB SanDisk SSD Corsair RM650W

Camera Equipment:

Camera Bodies: 

Olympus Pen-F Panasonic GH3 (Retired)

Lenses:

Sigma 30mm F1.4 | Sigma 16mm F1.4 | Sigma 19mm F2.8 | Laowa 17mm F1.8 | Olympus 45mm F1.8


Something something space heater

 

Fuck you guys, I don't see you making GPUs. That joke is really old and annoying now.

I really wish I could come on here to read some interesting news and read a good discussion about it in the comments, but nooooooooo, you've all got your heads so far up Nvidia and Intel's asses that you couldn't even think for one second that AMD could do something right.

 

Except for the reasonable people who I've been able to talk about things with reasonably, you know who you are. You guys are awesome. Problem is, on here, you're part of the 10%.

waffle waffle waffle on and on and on


Something something space heater

 

Fuck you guys, I don't see you making GPUs. That joke is really old and annoying now.

I really wish I could come on here to read some interesting news and read a good discussion about it in the comments, but nooooooooo, you've all got your heads so far up Nvidia and Intel's asses that you couldn't even think for one second that AMD could do something right.

 

Except for the reasonable people who I've been able to talk about things with reasonably, you know who you are. You guys are awesome. Problem is, on here, you're part of the 10%.

I'll admit AMD did something right when they produce a good product across all bases. AMD is the absolute loser when it comes to market share. If the market thought AMD was better (which would show up in said market share), then the market would buy! The thermals are worse, the power draw is worse, and compute is at best equal between Nvidia and AMD, while on the CPU side AMD just completely loses to Intel. What about this is wrong?!

 

I'm rooting for AMD! I'd love to be able to drop my Intel stock, which has no more room to grow, and maybe when the benchmarks for Carrizo come out and the 300 series launches I will, if the results are good. Until then, we have only AMD's current, lousy products to evaluate them on.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


General relativity is not proven. Nothing is proven. Congratulations, you've just proven yourself unworthy of any more of my attention. You speak of science, yet you don't understand the basic concept of it.

 

And no, there are not 3 competing theories, all unproven. There is one theory right now that is being worked on. It's not really even a theory yet; it's just a bunch of hypotheses that haven't been shot down by our observations yet. It's called M-theory, and it attempts to unify and translate each of the string theories and supergravity as the infinite limits of a bigger theory. http://en.wikipedia.org/wiki/M-theory

 

Seriously? You're gonna nitpick 2%? I'm a physicist. To me pi^2 is 10, and a year has pi * 10^7 seconds. You won't make me accept 11.8 as different from 10.

 

 

"Einstein's relativity work is a magnificent mathematical garb which fascinates, dazzles and makes people blind to the underlying errors. The theory is like a beggar clothed in purple whom ignorant people take for a king... its exponents are brilliant men but they are metaphysicists rather than scientists."
New York Times (11 July 1935), p. 23, c.8
 
"Today's scientists have substituted mathematics for experiments, and they wander off through equation after equation, and eventually build a structure which has no relation to reality."
 
-Nikola Tesla

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


I really wish I could come on here to read some interesting news and read a good discussion about it in the comments, but nooooooooo, you've all got your heads so far up Nvidia and Intel's asses that you couldn't even think for one second that AMD could do something right.

 

Except for the reasonable people who I've been able to talk about things with reasonably, you know who you are. You guys are awesome. Problem is, on here, you're part of the 10%.

This forum has gotten so toxic that I went from posting over 10 times a day to visiting the forum maybe twice a week, getting disgusted and leaving.


This forum has gotten so toxic that I went from posting over 10 times a day to visiting the forum maybe twice a week, getting disgusted and leaving.

 

I'd love to find a different forum with a good news section that has a more mature userbase but... I have no idea where to start.

waffle waffle waffle on and on and on


This forum has gotten so toxic that I went from posting over 10 times a day to visiting the forum maybe twice a week, getting disgusted and leaving.

 

Kinda in the same boat. Any kind of discussion quickly becomes a battle of egos or just devolves into general shitflinging.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


No, but first they will probably start implementing an AIO on all their GPUs, and eventually move to thermoelectric or phase-change cooling :P

Not really what I was asking about. If heat output and power consumption will always be a secondary thought, will it be a viable long-term strategy?


Not really what I was asking about. If heat output and power consumption will always be a secondary thought, will it be a viable long-term strategy?

No. They will have to do something about it sooner or later. They said a while ago they were expecting a 75% reduction in power consumption and heat output over the next 20 years or something ridiculous.

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project


Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


People seem to be confused about pew pew and computing being two entirely different things. Also, as already mentioned, AMD's stock coolers are crap. When we focus on compute performance and a *good* cooling solution, AMD is impressive. If you want to play video games all day, Nvidia is solid. I was planning on getting a GTX 970, but I'm curious about the 3XX series of cards.

 

 


Um, no. Nvidia outnumbers AMD in supercomputer market share by almost 3.5 to 1. If AMD were actually king of GPGPU compute, it would show up in the market.

I've already explained that nVidia created their own market share because they got in first with the proprietary CUDA language. A lot of projects that you'd need a supercomputer for have been in development for a long time, and the developers aren't going to rewrite the whole thing in OpenCL just for the small performance increase that would enable. Also, supercomputer manufacturers know that some projects out there are written in CUDA and some in OpenCL; they want to support both.

 

 

 

People seem to be confused about pew pew and computing being two entirely different things. Also, as already mentioned, AMD's stock coolers are crap. When we focus on compute performance and a *good* cooling solution, AMD is impressive. If you want to play video games all day, Nvidia is solid. I was planning on getting a GTX 970, but I'm curious about the 3XX series of cards.

They're not entirely different, but there are optimisations that apply to one and not the other. That new texture compression stuff Maxwell has going on, for example, works wonders for saving memory bandwidth in video game rendering tasks, but it doesn't do you any good at all in the compute department. nVidia's cards are just much more specifically geared towards graphics performance due to that and other optimisations they make (you can clearly see the balance shift when you compare their older 400-series cards to the newer ones), whereas AMD focuses on raw power. Neither is a bad thing; just remember that, as a result, the two excel at different things.
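For context on the bandwidth side of that argument, a tiny sketch of raw memory bandwidth (approximate reference specs; the effective gain from Maxwell's colour compression depends on the workload, so it isn't modelled here):

```python
# Raw memory bandwidth = effective memory clock x bus width / 8 bits per byte.
# Approximate reference specs, for illustration only. Delta colour compression raises
# *effective* bandwidth for rendering, but a generic compute buffer doesn't benefit.
def raw_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(f"GTX 980 (GM204):  ~{raw_bandwidth_gbs(7010, 256):.0f} GB/s raw")
print(f"R9 290X (Hawaii): ~{raw_bandwidth_gbs(5000, 512):.0f} GB/s raw")
```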

I cannot be held responsible for any bad advice given.

I've no idea why the world is afraid of 3D-printed guns when clearly 3D-printed crossbows would be more practical for now.

My rig: The StealthRay. Plans for a newer, better version of its mufflers are already being made.


No. They will have to do something about it sooner or later. They said a while ago they were expecting a 75% reduction in power consumption and heat output over the next 20 years or something ridiculous.

I'm pretty sure it was by 2020, so 5 years, and also power consumption is meant to be slashed by 25X, not 75%.

My PC specs; Processor: Intel i5 2500K @4.6GHz, Graphics card: Sapphire AMD R9 Nano 4GB DD Overclocked @1050MHz Core and 550 MHz Memory. Hard Drives: 500GB Seagate Barracuda 7200 RPM, 2TB Western Digital Green Drive, Motherboard: Asus P8Z77-V , Power Supply: OCZ ZS series 750W 80+ Bronze certified, Case: NZXT S340, Memory: Corsair Vengance series Ram, Dual Channel kit @ 1866 Mhz, 10-11-10-30 Timings, 4x4 GB DIMMs. Cooler: CoolerMaster Seidon 240V


I'm pretty sure it was by 2020, so 5 years, and also power consumption is meant to be slashed by 25X, not 75%.

Are you sure you mean 25X, and not 25%? 25X would mean a reduction of 2500%, which doesn't make sense to me (Unless someone can explain the math around that? I thought about it, and now my head hurts :P lol)

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Are you sure you mean 25X, and not 25%? 25X would mean a reduction of 2500%, which doesn't make sense to me (Unless someone can explain the math around that? I thought about it, and now my head hurts :P lol)

 

Here is the press release from AMD's website: http://www.amd.com/en-us/press-releases/Pages/amd-accelerates-energy-2014jun19.aspx
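Assuming the press release's "25x" refers to a 25x improvement in energy efficiency (performance per watt) rather than a literal percentage cut, the arithmetic works out as follows (a quick sketch, not AMD's own wording):

```python
# A 25x efficiency improvement means the energy needed per unit of work drops to 1/25th.
baseline_energy = 1.0                    # arbitrary units of energy per unit of work today
energy_after_25x = baseline_energy / 25  # same work done at 25x the efficiency

reduction = 1 - energy_after_25x / baseline_energy
print(f"Energy per unit of work: {energy_after_25x:.2f}x the baseline")
print(f"That's a {reduction:.0%} reduction, not 2500%")   # prints: That's a 96% reduction...
```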


Thanks, you beat me to it.

My PC specs; Processor: Intel i5 2500K @4.6GHz, Graphics card: Sapphire AMD R9 Nano 4GB DD Overclocked @1050MHz Core and 550 MHz Memory. Hard Drives: 500GB Seagate Barracuda 7200 RPM, 2TB Western Digital Green Drive, Motherboard: Asus P8Z77-V , Power Supply: OCZ ZS series 750W 80+ Bronze certified, Case: NZXT S340, Memory: Corsair Vengance series Ram, Dual Channel kit @ 1866 Mhz, 10-11-10-30 Timings, 4x4 GB DIMMs. Cooler: CoolerMaster Seidon 240V


Are you sure you mean 25X, and not 25%? 25X would mean a reduction of 2500%, which doesn't make sense to me (Unless someone can explain the math around that? I thought about it, and now my head hurts :P lol)

Yes. It's possible.

My PC specs; Processor: Intel i5 2500K @4.6GHz, Graphics card: Sapphire AMD R9 Nano 4GB DD Overclocked @1050MHz Core and 550 MHz Memory. Hard Drives: 500GB Seagate Barracuda 7200 RPM, 2TB Western Digital Green Drive, Motherboard: Asus P8Z77-V , Power Supply: OCZ ZS series 750W 80+ Bronze certified, Case: NZXT S340, Memory: Corsair Vengance series Ram, Dual Channel kit @ 1866 Mhz, 10-11-10-30 Timings, 4x4 GB DIMMs. Cooler: CoolerMaster Seidon 240V

