
AMD: APUs to get 25x more efficient by 2020

hal9001

FFS. Just make it so APUs can run in CrossFire with EVERY graphics card instead of just the low-end ones.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


*looks at grandma's A6*

 

*laughs like a motherfucker*

 

Not seeing it.

Main rig on profile

VAULT - File Server

Spoiler

Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C

Spoiler

Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)

Spoiler

Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


That would mean the rate of efficiency improvement dropping by 75% over time,

so basically bad news? It also only seems to apply to mobile chips (if power draw decreased by 60%, performance should have increased to 400% over the last 6 years... not sure which processor lineup they are talking about).
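The arithmetic behind that complaint can be sanity-checked with a quick back-of-the-envelope calculation. Note the 10x historical figure used below is an assumption (it is the number commonly attributed to AMD's own slides for 2008-2014); only the 25x target and the 6-year window come from the announcement itself.

```python
# Back-of-the-envelope check of AMD's "25x by 2020" efficiency target.
# Assumption: the prior six years (2008-2014) delivered roughly 10x,
# per the figure commonly cited from AMD's own presentations.

def annual_rate(total_gain, years):
    """Compound annual improvement implied by a total gain over N years."""
    return total_gain ** (1 / years)

past = annual_rate(10, 6)    # ~1.47x per year historically
target = annual_rate(25, 6)  # ~1.71x per year needed to hit 25x by 2020

print(f"historical: {past:.2f}x/yr, target: {target:.2f}x/yr")

# Efficiency = performance / power, so a 60% power cut alone (0.4x power)
# at constant performance is only 1/0.4 = 2.5x efficiency; reaching 10x
# would require performance to also rise 4x, as the post above notes.
perf_gain_needed = 10 * 0.4
print(f"performance gain implied by 10x efficiency at 40% power: {perf_gain_needed}x")
```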


why is everyone so negative?

 

remember the 480? that shit was even worse than the 290X

 

they had to CUT HOLES IN THE PCB because it didn't get enough air... 

ITX Monster: CPU: I5 4690K GPU: MSI 970 4G Mobo: Asus Formula VI Impact RAM: Kingston 8 GB 1600MHz PSU: Corsair RM 650 SSD: Crucial MX100 512 GB HDD: laptop drive 1TB Keyboard: logitech G710+ Mouse: Steelseries Rival Monitor: LG IPS 23" Case: Corsair 250D Cooling: H100i

Mobile: Phone: Broken HTC One (M7) Totally Broken OnePlus One Samsung S6 32GB  :wub:  Tablet: Google Nexus 7 2013 edition
 


Yeah, well, whatever you are aiming for, please aim higher, as Nvidia is giving you a good spanking in that department right now. 

You're talking about a new generation of GPUs. Three gens ago, NVIDIA was behind AMD in terms of power usage. I'm sure AMD's new generation of GPUs will use less power than their current offerings.

 

why is everyone so negative?

 

remember the 480? that shit was even worse than the 290X

 

they had to CUT HOLES IN THE PCB because it didn't get enough air... 

Indeed, using over 400 watts. Just a bunch of NVIDIA fanboys being negative towards AMD.


FFS. Just make it so APUs can run in CrossFire with EVERY graphics card instead of just the low-end ones.

There wouldn't be a benefit with high-end cards. In fact, it would probably be slower in a lot of cases. It would be nice with an R7 260X (or its successor with Carrizo), but anything higher than that is pointless.


*looks at grandma's A6*

 

*laughs like a motherfucker*

 

Not seeing it.

*Looks at Xeon*

*Giggles*

Good luck AMD, you have a long way to go..

Psst, that A6 has similar power consumption to my old i5 650. *Click the image for benchmarks*

[benchmark image]

 

Spoiler

Senor Shiny: Main- CPU Intel i7 6700k 4.7GHz @1.42v | RAM G.Skill TridentZ CL16 3200 | GPU Asus Strix GTX 1070 (2100/2152) | Motherboard ASRock Z170 OC Formula | HDD Seagate 1TB x2 | SSD 850 EVO 120GB | CASE NZXT S340 (Black) | PSU Supernova G2 750W  | Cooling NZXT Kraken X62 w/Vardars
Secondary (Plex): CPU Intel Xeon E3-1230 v3 @1.099v | RAM Samsung Wonder 16GB CL9 1600 (sadly no oc) | GPU Asus GTX 680 4GB DCII | Motherboard ASRock H97M-Pro4 | HDDs Seagate 1TB, WD Blue 1TB, WD Blue 3TB | Case Corsair Air 240 (Black) | PSU EVGA 600B | Cooling GeminII S524

Spoiler

(Deceased) DangerousNotDell- CPU AMD FX 8120 @4.8GHz 1.42v | GPU Asus GTX 680 4GB DCII | RAM Samsung Wonder 8GB (CL9 2133MHz 1.6v) | Motherboard Asus Crosshair V Formula-Z | Cooling EVO 212 | Case Rosewill Redbone | PSU EVGA 600B | HDD Seagate 1TB

DangerousNotDell New Parts For Main Rig Build Log, Señor Shiny

 


Mantle and HSA alone would account for 20 of that 25x figure. If the software doesn't support either, we're talking about a much more modest improvement. The hardware can only take you so far if you're running ancient code.

 

PS Maxwell isn't as efficient as people might imagine.
 

[power consumption charts from TechSpot's GTX 970 SLI 4K review]
http://www.techspot.com/review/898-geforce-gtx-970-sli-4k-gaming/page6.html

Also please take note of this when looking at any power consumption graph.

Power consumption under the effects of a gaming workload also turns out to be lower than what older/slower equipment would have us believe. Those massive disparities between our gear and slower equipment only showed up in the last two generations of AMD's hardware, so it's a fairly recent phenomenon. But it does mean the company gets beaten up more than it should in most reviews.

http://www.tomshardware.com/reviews/graphics-card-performance-benchmarks,3784-3.html
Due to the unique way AMD PowerTune works, most power consumption tests out there end up unfairly favoring Nvidia.


The only reason I went with a GTX 780 instead of an R9 290 was the power consumption. Once AMD has that fixed, it's sayonara, Nvidia. Fuck your price gouging.

•  i7 4770k @ 4.5ghz • Noctua NHL12 •  Asrock Z87 Extreme 4 •  ASUS GTX 780 DCII 1156/6300 •

•  Kingston HyperX 16GB  •  Samsung 840 SSD 120GB [boot] + 2x Seagate Barracuda 2TB 7200RPM •

•  Fractal Design Define R4  •  Corsair AX860 80+ Platinum •  Logitech Wireless Y-RK49  •  Logitech X-530  •


There wouldn't be a benefit with high-end cards. In fact, it would probably be slower in a lot of cases. It would be nice with an R7 260X (or its successor with Carrizo), but anything higher than that is pointless.

With the current architecture, yes, that's true. I'm saying they need to make changes and, using HSA, make it so that you can run the onboard GPU alongside the high-end expansion card GPU, and find a way to make it beneficial overall. Have the onboard GPU render things that don't require as much power, like backgrounds, the sky, etc. Remove some of the easier stuff from the workload of the high-end GPU so that it can concentrate on the more intensive workloads.

 

Have them work together, but individually on their own tasks. Granted, that would be harder to do and would require support from the devs (I would think).
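The split being described boils down to routing cheap draw tasks to the integrated GPU and expensive ones to the discrete card. A toy sketch of that idea, with made-up task names and a made-up cost threshold; real CrossFire/HSA scheduling works nothing like this simple rule:

```python
# Toy illustration of the hybrid split described above: light tasks go
# to the iGPU queue, heavy tasks to the dGPU queue. The task list and
# the cost threshold are entirely hypothetical.

def split_workload(tasks, threshold=3):
    """Partition (name, cost) tasks between iGPU and dGPU queues."""
    igpu, dgpu = [], []
    for name, cost in tasks:
        (igpu if cost <= threshold else dgpu).append(name)
    return igpu, dgpu

frame_tasks = [
    ("skybox", 1), ("background", 2), ("ui_overlay", 1),
    ("character_models", 8), ("shadows", 6), ("post_processing", 5),
]

igpu, dgpu = split_workload(frame_tasks)
print("iGPU:", igpu)  # the cheap scenery work
print("dGPU:", dgpu)  # the intensive rendering work
```

As the post says, the hard part isn't the partitioning itself but keeping the two GPUs synchronized per frame, which is why it needs developer support.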

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


I actually hope this is possible, not only for the environmental impact: with more efficient SoCs, AMD may be able to get into the mobile phone and tablet business. In addition, I would like AMD to catch up to Intel so we can finally get some competition (innovation) in the CPU market. I think Moore's law states that processing power will double every 18-24 months. It has nothing to do with efficiency. Go AMD!!!!!!

The law is that the number of transistors doubles, not that computing power doubles, which is a pipe dream.

 

Furthermore, there's a ton of innovation in Intel's hardware. Programmers and consumers are just too slow to pick it up. Intel has 4 ALU clusters per core in its CPUs. AMD has 2 even in its most recent architecture. Intel has extremely good power efficiency. Intel hosts the bleeding edge network interface technology. Intel has the fastest floating point calculator in the industry too. Just because you have software that stupidly uses spin locks (infinite loops until a condition is met) which send your CPU usage to 100% for no good reason on junk calculations doesn't mean Intel isn't innovating. Now Intel is focusing on iGPU and innovations that go beyond the simple calculations, such as wireless charging with Skylake, unified memory with Skylake for greater iGPU performance, and much more while trying to cram it into smaller and smaller thermal and power packages. Don't get me wrong: Intel was a highly anti-competitive company for a while, but it's highly innovative. Programmers just suck that much.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I don't know why people are so negative. If they start chasing power efficiency, I'm sure they could bring down the power consumption, maybe not all the way to their goal, but probably by a considerable amount. Plus, they are saying 2020; none of us can see into the future, so only time will tell.  

CPU amd phenom ii x4 965 @ 3.4Ghz | Motherboard msi 970a-g46 | RAM 2x 4GB Team Elite | GPU XFX Radeon HD 7870 DD | Case NZXT Gamma Classic | HDD 750 GB Hitachi | PSU ocz modxstream pro 600w


FFS. Just make it so the APU's can run in crossfire with EVERY graphics card instead of just the low end ones.

The graphics architecture has to exist before it can be integrated. I'm pretty sure APUs will always be a graphics architecture behind discrete cards for exactly this reason.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Lots of bullshit useless posting.

Seems to be a theme here: hate on a product or company you don't like because it's not popular, in favor of your other preferred corporate brand.


Whether you feel you have an excuse for your ignorant posting or not: useless comments in no way related to the original post do not help this community. You guys really betray your own age, or at the least, your maturity.


The graphics architecture has to exist before it can be integrated. I'm pretty sure APUs will always be a graphics architecture behind discrete cards for exactly this reason.

Well, if you look at what AMD has talked about with HSA, the idea is sound. You would need software to support it, yes, but other than that there is no reason the onboard GPU on a processor can't handle the lower-intensity workloads while the "discrete" GPU (still not sure why the hell it's called that) handles the higher-priority stuff. The only thing needed is for the two to communicate in such a way that each knows what it needs to do.

 

Lots of bullshit useless posting.

Seems to be a theme here: hate on a product or company you don't like because it's not popular, in favor of your other preferred corporate brand.

Whether you feel you have an excuse for your ignorant posting or not: useless comments in no way related to the original post do not help this community. You guys really betray your own age, or at the least, your maturity.

 

Your post is equally useless.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


Whether AMD delivers or not, it's nice to see this sort of action coming from both Nvidia and AMD, as that's only going to be good for us.

CPU- 4690k @4.5ghz / 1.3v    Mobo- Asus Maximus VI Gene   RAM- 12GB GSkill Assorted 1600mhz   GPU- ASUS GTX 760 DCUII-OC 

Storage- 1TB 7200rpm WD Blue + Kingston SSDNow 240GB   PSU- Silverstone Strider ST75F-P

 


But think about it. That means in 2020, an APU would rival a high-end GPU of today (not necessarily a 290X/780 Ti, but a 280X/770). It may only play games released in 2020 on low, but imagine all the games from 2000-2015 you'll be able to max out on an APU.

That's pretty crazy.

It'll be a lot more than that, dude. And in 2020, the gap between APUs and GPUs will be much closer, just like how mobile GPUs today are catching up to desktop GPUs.

Finally my Santa hat doesn't look out of place


Love how Apple, Nvidia, Intel, etc. can say "we'll be more efficient" and everyone's like "Awesome, better <insert product here>, faster and more efficient", whereas everyone on this thread is like:
A: AMD's lying
B: Doubt it
C: More efficient doesn't mean better
D: It'll probably be slower than...

from the company responsible for the FX-9590 and R9 290X

idk man, Nvidia made the 480, which was hotter than the 9590 and 290X >.> someone's having some fanboy tunnel vision. Nvidia's 480 was soooo hot and inefficient, even for the time, that Nvidia made the 500 series less than a year later. THEY RUSHED IT OUT and even delivered an apology saying it was a failure and that they only released it because they were DESPERATE. Do note: they never released the 300 series because it was also labeled a failure, so bad they skipped it and refused to use the name, almost as if they thought it would be a curse... And now look at the 980... And Intel was (and kind of still is) a d***; they used to release very hot CPUs, and for a while there AMD was king of CPUs. The thing is, Intel was just good at marketing and making profits; AMD would sell CPUs at nearly no profit. And remember, AMD invented 64-bit, dual-core CPUs, quad-core CPUs, hexa-core CPUs, octo-core CPUs, and was also the first to break 1GHz... Just pointing this out because companies can change. I mean, the company that made the 480 also made the 980... And Intel, the company that's as stubborn as a mule, made Haswell...

5820k4Ghz/16GB(4x4)DDR4/MSI X99 SLI+/Corsair H105/R9 Fury X/Corsair RM1000i/128GB SM951/512GB 850Evo/1+2TB Seagate Barracudas


Love how Apple, Nvidia, Intel, etc. can say "we'll be more efficient" and everyone's like "Awesome, better <insert product here>, faster and more efficient", whereas everyone on this thread is like:

A: AMD's lying

B: Doubt it

C: More efficient doesn't mean better

D: It'll probably be slower than...

idk man, Nvidia made the 480, which was hotter than the 9590 and 290X >.> someone's having some fanboy tunnel vision. Nvidia's 480 was soooo hot and inefficient, even for the time, that Nvidia made the 500 series less than a year later. THEY RUSHED IT OUT and even delivered an apology saying it was a failure and that they only released it because they were DESPERATE. Do note: they never released the 300 series because it was also labeled a failure, so bad they skipped it and refused to use the name, almost as if they thought it would be a curse... And now look at the 980... And Intel was (and kind of still is) a d***; they used to release very hot CPUs, and for a while there AMD was king of CPUs. The thing is, Intel was just good at marketing and making profits; AMD would sell CPUs at nearly no profit. And remember, AMD invented 64-bit, dual-core CPUs, quad-core CPUs, hexa-core CPUs, octo-core CPUs, and was also the first to break 1GHz... Just pointing this out because companies can change. I mean, the company that made the 480 also made the 980... And Intel, the company that's as stubborn as a mule, made Haswell...

Yeah, I know, but among modern PC parts those are the worst, and the topic is AMD.

Please follow your topics, guys, it's very important! CoC F.A.Q. Please use the corresponding PC part picker link for your country: USA, UK, Canada, Australia, Spain, Italy, New Zealand and Germany

also, if you find anyone with this handle in games, it's most likely me, so say hi

 


Actually, you do bring up an interesting point: why doesn't AMD just create a HUGE die for its next socket? We've come pretty far in miniaturization for those ITX boards and such, so why not make a die double the size or more and fit a truly great-performing CPU alongside the GPU?

Twice the size of LGA 2011-3, half the performance! :D



This is good news: a short time period for something stronger than the Athlon 5350 to become power-efficient enough to fit into a smartphone.


Love how Apple, Nvidia, Intel, etc. can say "we'll be more efficient" and everyone's like "Awesome, better <insert product here>, faster and more efficient", whereas everyone on this thread is like:

A: AMD's lying

B: Doubt it

C: More efficient doesn't mean better

D: It'll probably be slower than...

idk man, Nvidia made the 480, which was hotter than the 9590 and 290X >.> someone's having some fanboy tunnel vision. Nvidia's 480 was soooo hot and inefficient, even for the time, that Nvidia made the 500 series less than a year later. THEY RUSHED IT OUT and even delivered an apology saying it was a failure and that they only released it because they were DESPERATE. Do note: they never released the 300 series because it was also labeled a failure, so bad they skipped it and refused to use the name, almost as if they thought it would be a curse... And now look at the 980... And Intel was (and kind of still is) a d***; they used to release very hot CPUs, and for a while there AMD was king of CPUs. The thing is, Intel was just good at marketing and making profits; AMD would sell CPUs at nearly no profit. And remember, AMD invented 64-bit, dual-core CPUs, quad-core CPUs, hexa-core CPUs, octo-core CPUs, and was also the first to break 1GHz... Just pointing this out because companies can change. I mean, the company that made the 480 also made the 980... And Intel, the company that's as stubborn as a mule, made Haswell...

Remember, it's Intel who gave AMD a license to make x86 CPUs; if they had decided to be a "d***" about it, we would only have ATi now. Also, inventing a quad-core or hexa-core isn't inventing anymore; any idiot can stack cores next to each other while ignoring the power they consume. Inventing the dual core, however, is inventing. AMD wasn't the first with 6 cores, though: Intel was 2 years earlier with a 6-core, but one day later with an 8-core.

First Intel 6 core: http://ark.intel.com/products/36941/Intel-Xeon-Processor-E7450-12M-Cache-2_40-GHz-1066-MHz-FSB

First AMD 6 core: Opteron 2419

First AMD 8 core: Opteron 61KS

First Intel 8 core: http://ark.intel.com/products/46495

On the desktop side, Intel was earlier with an 8-core but decided to be, like you said, a "d***" with the 3930K, laser-cutting 2 cores. We haven't seen anything higher than an 8-core from AMD yet; no wonder, if you struggle getting your power consumption under control. Their only 8-core atm is their Bulldozer-based Opteron that's advertised as a 16-core. Intel is at 18 already, right? 

Dominating x86 is quite straightforward: pair the highest single-core performance with the lowest power consumption and you win. Let's not call Intel a "d***" and somehow think AMD is pure love. If they felt like it, Intel could instantly finish AMD off by simply offering their 5960X for the price of an 8320 and the 5930K for the price of a 4300; that would destroy the competition completely, and then we would all be hating the Intel monopoly. Whatever made you think Intel is stubborn, blame AMD. Anyway, most people will be skeptical about this claim; we have to wait about 2 years to finally see their Zen architecture, so I've kind of lost hope in AMD catching up with Intel.


Lots of bullshit useless posting.

Seems to be a theme here: hate on a product/company you don't like because it's not popular in place for your other favorite corporate brand.

Whether you feel you have an excuse or not for your ignorant posting: useless comments in no way related to the original post does not help this community. You guys really betray your own age, or at the least, your maturity.

 

Fanboys will be fanboys. The people who bought into the Titan video cards will still stand by them, even though they cost more and perform worse than the 295X2.

 

Being objective is unpopular, apparently. Intel performs better, but we have a whole flock of people who buy i7s for no reason. Spec kiddies, man.

