Polaris Reportedly Doubles Performance Per Watt

26 minutes ago, SuperShires said:

But that makes no sense. Some people like less heat emitted, me being one of them, and also less noise because of said heat reduction. I don't get your argument of "if you want a lower wattage GPU, go buy a console or laptop".

What I'm saying is that we should be wanting increased performance along with this lower power consumption advantage, not same-performance, lower-consumption side-grades. A 390 and 970 can barely do 60 fps in ROTTR at 1080p with the settings people would prefer to use, and obviously games are going to get more demanding in the future as graphics tech improves. We should be wanting the next $300-$350 cards to perform like a Fury.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


GPU manufacturers always seem to make really grand claims, then fall short on delivery.

 

I'll just wait for the launch to see what it actually does instead of arguing based on speculation.

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


3 minutes ago, ivan134 said:

What I'm saying is that we should be wanting increased performance along with this lower power consumption, not same-performance, lower-consumption side-grades. A 390 and 970 can barely do 60 fps in ROTTR at 1080p with the settings people would prefer to use, and obviously games are going to get more demanding in the future as graphics tech improves. We should be wanting the next $300-$350 cards to perform like a Fury.

Yeah, from that angle I agree. My point was that some people like efficiency, so it wouldn't be all bad if they did the same performance at less power, but I don't think that will be the case, haha. Well, hopefully they will be on par with the Fury.

 

 


1 hour ago, MEC-777 said:

They also said the Fury X would be "an overclocker's dream"... ;)

They were right about the efficiency gains of the Fury X: 1.5 times over the R9 290X.

16 minutes ago, patrickjp93 said:

You forget Nvidia is getting a double node shrink too. I doubt we're going to see Thermi Part 3.

Thermi was about power consumption. A smaller process node means each core is more efficient, but if Nvidia shoves in a lot of cores, it will still use a lot of power.


2 minutes ago, Watermelon Guy said:

They were right about the efficiency gains of the Fury X: 1.5 times over the R9 290X.

Thermi was about power consumption. A smaller process node means each core is more efficient, but if Nvidia shoves in a lot of cores, it will still use a lot of power.

But Nvidia has also probably learned from its mistakes with Fermi. Furthermore, I see it being much more like Kepler, given we already know peak double precision is 4 teraflops. I'm guessing that will put single precision at 12 teraflops. Volta goes to 7 DP, so if anything, that will be like Fermi.
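For reference, the arithmetic behind that guess, assuming the usual 1:3 DP:SP throughput ratio for a compute-oriented part (the ratio is my assumption; only the 4 TFLOPS figure comes from the post above):

```python
# Estimating single-precision throughput from a known double-precision
# figure. The 1:3 DP:SP ratio is an assumption (typical for
# compute-focused GPUs; consumer parts are often 1:32 instead).
dp_tflops = 4.0
sp_per_dp = 3
sp_tflops = dp_tflops * sp_per_dp
print(sp_tflops)  # 12.0
```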

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


4 hours ago, patrickjp93 said:

You forget Nvidia is getting a double node shrink too. I doubt we're going to see Thermi Part 3.

No, I didn't forget. Yeah, Fermi was unfortunate to be made on a 40 nm node; otherwise, the architecture was pretty good. It was just ahead of its time.

 

What I'm talking about is the people who believe this myth that Nvidia has figured out something about efficiency that AMD hasn't. Hawaii used roughly the same amount of power as high-end Kepler, and you know how Maxwell's efficiency was achieved since you mentioned Fermi. AMD even took the same approach with GCN 1.2, but still has more compute units than Maxwell and didn't throw out its hardware schedulers.

 

Pascal and Polaris will be roughly equal in power consumption, since Nvidia themselves have already given us confirmation they're not skimping on double precision. The only thing that was worrisome about that article was that they said they were having problems bringing consumption under control, IIRC (and I'm paraphrasing). Nvidia has some good engineers, though, and they should be able to pull through.



How does power consumption (generally) scale with transistor size? I thought it was pretty linear (meaning this claim would make sense).

 

I really don't know though.


Just now, -BirdiE- said:

How does power consumption (generally) scale with transistor size? I thought it was pretty linear (meaning this claim would make sense).

 

I really don't know though.

HBM, along with optimizations in architecture, seems to be what has taken the power consumption down quite a bit.

 

Take a look at these CPUs:

http://ark.intel.com/products/33910/Intel-Core2-Duo-Processor-E8400-6M-Cache-3_00-GHz-1333-MHz-FSB

http://ark.intel.com/products/42801/Intel-Pentium-Processor-E5700-2M-Cache-3_00-GHz-800-MHz-FSB

 

Both of them have a 65 W TDP, a 3 GHz clock speed, two cores, and the same lithography (45 nm).

 

But the Core 2 Duo manages a pretty decent increase in performance over the Pentium on PassMark:

http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core2+Duo+E8400+%40+3.00GHz&id=955

http://www.cpubenchmark.net/cpu.php?cpu=Intel+Pentium+E5700+%40+3.00GHz&id=1101

 

Intel improved the architecture and nearly doubled the transistor count on the Core 2 Duo, even though the TDP and lithography were the same.
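To put rough numbers on that comparison: since both chips share a 65 W TDP, whatever gap the benchmark shows is directly a performance-per-watt gap. The scores below are placeholders for illustration, not actual PassMark results:

```python
# With equal TDPs, the ratio of benchmark scores equals the ratio of
# performance per watt. Scores are placeholders; substitute the real
# PassMark numbers from the links above.
TDP_WATTS = 65.0

e8400_score = 2200.0  # placeholder score for the Core 2 Duo E8400
e5700_score = 1600.0  # placeholder score for the Pentium E5700

e8400_ppw = e8400_score / TDP_WATTS
e5700_ppw = e5700_score / TDP_WATTS

# Same denominator, so the perf/watt ratio is just the score ratio.
print(round(e8400_ppw / e5700_ppw, 3))  # 1.375
```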


6 minutes ago, -BirdiE- said:

How does power consumption (generally) scale with transistor size? I thought it was pretty linear (meaning this claim would make sense).

 

I really don't know though.

It stopped being linear at around 90 nm. Dennard scaling was the name for it, but it stopped holding a long time ago.
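A minimal sketch of what that means, using the standard dynamic-power relation P ≈ C·V²·f (the 0.7 scale factor and the breakdown point are textbook approximations, not exact figures):

```python
# Dynamic power: P = C * V^2 * f. Under ideal Dennard scaling, a node
# shrink by k (~0.7) cuts capacitance and voltage by k while frequency
# rises by 1/k, so per-transistor power drops by k^2 and power density
# stays flat. Once supply voltage stopped scaling (leakage became
# dominant around 90 nm), the V^2 term stopped shrinking.

def dynamic_power(C, V, f):
    return C * V**2 * f

k = 0.7  # classic per-node scaling factor

ideal = dynamic_power(C=k, V=k, f=1 / k)           # k^2 = 0.49: power falls
post_dennard = dynamic_power(C=k, V=1.0, f=1 / k)  # ~1.0: power stays flat

print(round(ideal, 2), round(post_dennard, 2))  # 0.49 1.0
```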



3 minutes ago, patrickjp93 said:

It stopped being linear at around 90 nm. Dennard scaling was the name for it, but it stopped holding a long time ago.

Thanks!


Looks like some people don't understand how performance per watt is worked out...
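For anyone who wants it spelled out, here's the whole metric in a few lines (the card numbers are made up purely for illustration):

```python
# Performance per watt is just performance divided by power draw.
# All numbers below are illustrative, not real benchmark results.

def perf_per_watt(fps, watts):
    return fps / watts

old_card = perf_per_watt(fps=60.0, watts=250.0)  # 0.24 fps/W
new_card = perf_per_watt(fps=90.0, watts=150.0)  # 0.60 fps/W

# Note that "2x performance per watt" only pins down the ratio: it could
# mean the same fps at half the power, double the fps at the same power,
# or anything in between.
print(round(new_card / old_card, 2))  # 2.5
```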


I personally think efficiency is a bad argument, unless it's that your power supply can't power the card and you would have to buy a new PSU too.

You'd be much better off buying a few LED light bulbs and replacing some bulbs with those!

Of course if you already have LED bulbs then....

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Segate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


I used to say I didn't care about power consumption and mainly only about performance. That was when I ran a single R9 290. Then I ran a pair of 290's in CF and oh my word, the heat! I had to water cool them to keep temps under control (and in a case with very good airflow). I've since switched to a single 980 (which delivers about 75-90% of the performance of the CF 290's) and oh my word, the silence! lol. It runs low-demanding games while remaining totally passive, and temps are very well controlled with tons of cooling overhead available. Performance still takes a high priority, but I now care a lot more about power efficiency than I did before, and I think a lot of other people do as well.

 

It boils down to personal preference, but there is something to be said about the benefits of power efficiency. ;)

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


4 hours ago, N3v3r3nding_Newb said:

According to the website "mobipicker," AMD's head, Srinivasan M., said that Polaris will double performance per watt.

http://www.mobipicker.com/amds-polaris-to-double-performance-per-watt-india-head-srinivasan-m/

Please edit your post to follow the Posting Guidelines for the Tech News and Reviews Section.

 

 


But can it compute?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Great. Now let's all wake up and stop believing rumors. On a side note, it would be nice, but they did say the Fury X would be an "overclocker's dream", not to mention all the BS surrounding the FX launch.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


So the nuclear GPUs now go thermonuclear. 2x PPW and 2x the heat output.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)



6 hours ago, Delicieuxz said:

Does lower wattage also mean quieter fan cooling?

 

I hope Polaris is a home-run for ATI, by way of dominating in performance over Nvidia. Graphics hardware has been stagnating, and increasing by meager amounts, for far too many years.

I wouldn't say that. A 980 Ti is like 25-30% faster than a 980. Just think how much faster a 980 Ti is than, say, a 580. Compare that to Skylake over Haswell... yeah, I mean, imagine if Intel gave us a 25-30% IPC jump every generation.


25 minutes ago, Deli said:

I wouldn't say that. A 980 Ti is like 25-30% faster than a 980. Just think how much faster a 980 Ti is than, say, a 580. Compare that to Skylake over Haswell... yeah, I mean, imagine if Intel gave us a 25-30% IPC jump every generation.

 

A 580 to 980 Ti is like 5 years to achieve just 2.2 times the performance. That's not all that great. Graphics technology progression really slowed down with the 360 / PS3 console generation, as everything was targeted for those machines - and the major slowdown in graphics tech progression hasn't stopped from then until now. I was hoping Pascal / Polaris would finally be the end of it, but Nvidia seems to be really dragging its feet on doing anything meaningful for graphics power. The new normal is all about milking consumers with minimal progression, while the consequence is that graphics could have been more than twice what they are at this point if the 360 / PS3 slowdown hadn't occurred.
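As a sanity check on that pace: 2.2x over roughly 5 years works out to about a 17% compound annual improvement, which you can verify in a couple of lines:

```python
# Express a 2.2x total speedup over ~5 years as a compound annual rate.
total_speedup = 2.2
years = 5
annual_rate = total_speedup ** (1 / years) - 1
print(f"{annual_rate:.1%}")  # 17.1%
```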

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


1 hour ago, Delicieuxz said:

 

A 580 to 980 Ti is like 5 years to achieve just 2.2 times the performance. That's not all that great. Graphics technology progression really slowed down with the 360 / PS3 console generation, as everything was targeted for those machines - and the major slowdown in graphics tech progression hasn't stopped from then until now. I was hoping Pascal / Polaris would finally be the end of it, but Nvidia seems to be really dragging its feet on doing anything meaningful for graphics power. The new normal is all about milking consumers with minimal progression, while the consequence is that graphics could have been more than twice what they are at this point if the 360 / PS3 slowdown hadn't occurred.

You sound like Nvidia and AMD owe us if they do not double their GPU performance every year.

 

How about this: there's only a 30% IPC improvement from Sandy to Skylake. Boycott Intel?


3 hours ago, DocSwag said:

I personally think efficiency is a bad argument unless it is because your power supply can't power it and you would have to buy a new PSU too.

You'd be much better off buying a few LED light bulbs and replacing some bulbs with those!

Of course if you already have LED bulbs then....

Tell that to server and supercomputer architects. Sorry, but modern dGPUs are not designed around gaming.



6 minutes ago, Deli said:

You sound like Nvidia and AMD owe us if they do not double their GPU performance every year.

 

How about there is only 30% IPC improvement from Sandy to Skylake. Boycott Intel?

There's a doubling of performance from Sandy Bridge to Skylake. Blame consumer software developers for not keeping up with newer instruction sets.



6 hours ago, ivan134 said:

No thanks. I'd rather have more performance for the same wattage. We're talking about having one less light bulb in your house worth of savings.

Agreed on more performance at the same wattage... but I've fully switched over to LED bulbs, sooo... at 7 watts per bulb, a 150-watt reduction (half of 300 watts) puts me at about 21 light bulbs' worth of power usage.
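That bulb count checks out, sketched with the poster's own numbers:

```python
# A 150 W cut in GPU power draw, expressed in 7 W LED bulbs.
gpu_savings_watts = 150
led_bulb_watts = 7
bulbs = gpu_savings_watts / led_bulb_watts
print(round(bulbs))  # 21
```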


53 minutes ago, Delicieuxz said:

 

A 580 to 980 Ti is like 5 years to achieve just 2.2 times the performance. That's not all that great. Graphics technology progression really slowed down with the 360 / PS3 console generation, as everything was targeted for those machines - and the major slowdown in graphics tech progression hasn't stopped from then until now. I was hoping Pascal / Polaris would finally be the end of it, but Nvidia seems to be really dragging its feet on doing anything meaningful for graphics power. The new normal is all about milking consumers with minimal progression, while the consequence is that graphics could have been more than twice what they are at this point if the 360 / PS3 slowdown hadn't occurred.

Well, a 780 Ti is roughly equal to a GTX 970... which, when put into SLI, is a bit more powerful than a 980 Ti... so from 780 Ti to 980 Ti is maybe 1.7x the performance? Not quite double, but that's darn impressive on the same manufacturing node!

