[UPDATE] AMD Announces Polaris Architecture - 4th Generation GCN Arriving in Mid 2016

HKZeroFive

Interesting name from a company hell-bent on Polaris-ing the community.

Don't quit the day job :)

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Well, I hope to see a flagship card with this soon, then.

9900K  / Noctua NH-D15S / Z390 Aorus Master / 32GB DDR4 Vengeance Pro 3200Mhz / eVGA 2080 Ti Black Ed / Morpheus II Core / Meshify C / LG 27UK650-W / PS4 Pro / XBox One X


I'm confused... Is it 16nm or 14nm?

 

The TL;DR says 16nm but AMD's video says 14nm, so I have no idea which it is now...

I'm thinking 14? Help me D: someone...

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


I'm kinda disappointed in everyone who's convinced AMD's test is rigged.

 

Yes, the power draw looks low on the Polaris system, but doesn't it also look low on the 950-powered system? I think they did some trickery with the meters to measure just the GPU power consumption. Maybe they put the cards on a separate PSU from the rest of the systems?

The Cheap SFF Wonder -- Turning a Core 2 Duo SFF machine into a useful member of computer society!

i5 3570 -- AS.Rock Z77 Pro3 -- 12GB DDR3 -- R9 380X -- 840Evo 500GB/1TB HDD 

I speak the honest truth with a bit of snark mixed in. I'm sorry if it doesn't line up with your beliefs.


Should I feel bad for buying a 390 last year on Black Friday?

Depends on what you had before that.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


I'm confused... Is it 16nm or 14nm?

 

The TL;DR says 16nm but AMD's video says 14nm, so I have no idea which it is now...

I'm thinking 14? Help me D: someone...

It's both 16nm and 14nm.

Ryan explains it in the AnandTech article.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Depends on what you had before that.

 

a laptop... with a 650M

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


a laptop... with a 650M

I wouldn't feel bad then, lol. Your 390 won't stop working just because new cards came out. It's still a solid 1080p/1440p card.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Let's get back to AMD's bullcrap testing, shall we?

 

A non-reference GTX 950 has a peak power draw (just the video card) of 110W: http://media.bestofmicro.com/O/G/520576/gallery/Power_w_600.png

System 2, as shown, drew 152W at the wall; that leaves 42W for the rest of the system to use.

 

Now, system 1 draws 85W at the wall; minus 42W, that would put the Polaris-based card at 43W on its own?!

 


 

Note: the reference GTX 950 is rated for 90W power draw (not TDP!!!) - http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-950/specifications
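For anyone who wants to sanity-check that subtraction, here is the same back-of-the-envelope math as a small Python sketch. Only the 110W card figure and the 152W/85W wall readings come from the post above; everything else is plain arithmetic, and it all hinges on both systems being otherwise identical.

gtx950_card_w  = 110   # peak board power of a non-reference GTX 950 (Tom's Hardware chart above)
system2_wall_w = 152   # GTX 950 system, measured at the wall
system1_wall_w = 85    # Polaris system, measured at the wall

rest_of_system_w = system2_wall_w - gtx950_card_w     # 152 - 110 = 42 W for CPU, board, drives, PSU loss
polaris_card_w   = system1_wall_w - rest_of_system_w  # 85 - 42 = 43 W implied for the Polaris card

print(rest_of_system_w, polaris_card_w)  # 42 43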

By the way, both cards had their power limit set to 80%, so the 950 was not using 110W but around 90W max.


By the way, both cards had their power limit set to 80%, so the 950 was not using 110W but around 90W max.

The CPUs were also both set to 80% power limit.
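Worth noting what that 80% power limit does to the arithmetic above. A quick, purely illustrative sweep; the 90-110W range is just the span between the power-limited and peak figures quoted in this thread, not measured data:

system2_wall_w = 152   # GTX 950 system at the wall
system1_wall_w = 85    # Polaris system at the wall

# Implied Polaris card draw for a range of assumed GTX 950 board power figures.
for gtx950_card_w in (90, 100, 110):
    rest_w = system2_wall_w - gtx950_card_w
    print(gtx950_card_w, "W for the 950 ->", system1_wall_w - rest_w, "W implied for Polaris")
# 90 W -> 23 W, 100 W -> 33 W, 110 W -> 43 W: the estimate swings a lot with the assumed 950 draw.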

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


The Polaris GPU is running at 850 MHz / 0.8375 V.

I don't see their numbers as being completely out of reach.
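As a rough illustration of why that operating point matters: dynamic power scales roughly with frequency times voltage squared, so 850 MHz at 0.8375 V is already a big saving over a more typical operating point before the node shrink is even counted. A quick sketch; the 1000 MHz / 1.05 V reference point is an assumed comparison, not an AMD figure:

# Dynamic power roughly follows P ~ f * V^2 (leakage ignored).
f_demo, v_demo = 850e6, 0.8375    # clock and voltage quoted for the Polaris demo card
f_ref,  v_ref  = 1000e6, 1.05     # hypothetical "typical" operating point for comparison

scale = (f_demo * v_demo**2) / (f_ref * v_ref**2)
print(f"demo operating point draws roughly {scale:.0%} of the reference dynamic power")  # ~54%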

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


So I can assume this new architecture won't be good, because they've been alternating between good and bad every generation:

Tahiti - good

Hawaii - bad

Fiji - good

Polaris - bad?

 

What? Hawaii was great. At the time, it nearly matched the MUCH bigger GK110, and nowadays it actually tends to beat GK110. Hawaii still competes well against the much newer GM204.


By the way, both cards had their power limit set to 80%, so the 950 was not using 110W but around 90W max.

 

 

The CPUs were also both set to 80% power limit.

 

 

The Polaris GPU is running at 850 MHz / 0.8375 V.

I don't see their numbers as being completely out of reach.

 

You three do realize no actual proof was presented, right? When AnandTech digs into this and busts AMD for lying their tails off, I'm gonna laugh all the way to the bank.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


You three do realize no actual proof was presented, right? When AnandTech digs into this and busts AMD for lying their tails off, I'm gonna laugh all the way to the bank.

14nm could very well yield low-power GPUs. Beating a 950 isn't exactly hard.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


Let's get back to AMD's bullcrap testing, shall we?

 

A non-reference GTX 950 has a peak power draw (just the video card) of 110W: http://media.bestofmicro.com/O/G/520576/gallery/Power_w_600.png

System 2, as shown, drew 152W at the wall; that leaves 42W for the rest of the system to use.

 

Now, system 1 draws 85W at the wall; minus 42W, that would put the Polaris-based card at 43W on its own?!

 

Note: the reference GTX 950 is rated for 90W power draw (not TDP!!!) - http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-950/specifications

 

The GTX 750 Ti draws about 50W when gaming. This is basically a 14nm equivalent of a GTX 750 Ti. The power consumption is not unrealistic, though I'm sure AMD has taken a bit of a best-case reading. So where you're calculating a 43W draw, maybe it'll come in closer to 50W in reviews.

 

Edit: This particular GPU seems to be 14nm, not 16.


You three do realize no actual proof was presented, right? When AnandTech digs into this and busts AMD for lying their tails off, I'm gonna laugh all the way to the bank.

Oh, most certainly. I'm very aware that this is an example meant to put Polaris efficiency gains in the best light possible.

I'm doubtful, however, that AnandTech will bust AMD for anything, as nothing was really presented.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Can't wait to see what FinFET gives us.

CONSOLE KILLER: Pentium III 700mhz . 512MB RAM . 3DFX VOODOO 3 SLi

 

 


14nm could very well yield low-power GPUs. Beating a 950 isn't exactly hard.

 

No, Dennard Scaling died back at 90nm. This expectation is insane.

 

Oh, most certainly. I'm very aware that this is an example meant to put Polaris efficiency gains in the best light possible.

I'm doubtful, however, that AnandTech will bust AMD for anything, as nothing was really presented.

They presented that, at equal performance levels and (presumed but not proven) equal system setups, AMD's GPU architecture is 65% more energy efficient. By my statement above, that's utter BS. It's the same reason why ARM stagnated at the A57 core and had to pursue clock speeds, sacrificing low power, to get better performance out of the A72. Power scaling with node shrinkage slowed down to a crawl after 90nm. Further, it's GlobalFoundries. Even with Samsung's 14nm tech (designed for low-power SoCs, not performance GPUs), do you honestly believe this given its track record of consistent disappointment against TSMC?
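For readers wondering what the Dennard-scaling point means in practice: dynamic power goes as C*V^2*f, and classic Dennard scaling assumed supply voltage shrank along with feature size, which, as the post above says, stopped holding around 90nm. A rough, purely illustrative sketch of what that does to a 28nm-to-14nm-class jump; the scaling factors here are assumptions for the sake of comparison, not foundry data, and marketing node names are not true 2x linear shrinks:

# Dynamic power per transistor goes roughly as P ~ C * V^2 * f (frequency held constant here).
node_shrink   = 28 / 14          # nominal linear scale factor for a 28nm -> "14nm"-class move
ideal_v_scale = 1 / node_shrink  # classic Dennard assumption: supply voltage scales with the node
real_v_scale  = 0.9              # illustrative guess: supply voltage barely drops on modern nodes

ideal_power = (1 / node_shrink) * ideal_v_scale**2   # capacitance down, V^2 down -> ~0.12x
real_power  = (1 / node_shrink) * real_v_scale**2    # capacitance down, V^2 nearly flat -> ~0.41x
print(f"ideal Dennard: {ideal_power:.2f}x power per transistor, post-Dennard: {real_power:.2f}x")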

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No, Dennard Scaling died back at 90nm. This expectation is insane.

 

They presented that, at equal performance levels and (presumed but not proven) equal system setups, AMD's GPU architecture is 65% more energy efficient. By my statement above, that's utter BS. It's the same reason why ARM stagnated at the A57 core and had to pursue clock speeds, sacrificing low power, to get better performance out of the A72. Power scaling with node shrinkage slowed down to a crawl after 90nm. Further, it's GlobalFoundries. Even with Samsung's 14nm tech (designed for low-power SoCs, not performance GPUs), do you honestly believe this given its track record of consistent disappointment against TSMC?

If so, explain Fermi to Kepler for us, please: that was 40nm to a lower process while making big gains in performance. Did NVIDIA pull efficiency from nowhere?

CONSOLE KILLER: Pentium III 700mhz . 512MB RAM . 3DFX VOODOO 3 SLi

 

 


I'm cheering for you, RTG. Please don't let me down; I have fond memories of ATI going back to the X1900 XTX days.


If so, explain Fermi to Kepler for us, please: that was 40nm to a lower process while making big gains in performance. Did NVIDIA pull efficiency from nowhere?

 

No, Fermi was just bad design. GCN isn't bad design; it's just dense (HDL, high-density libraries, and all that). While FinFET voltage control is very different, did we all forget what happened to Intel upon moving to FinFET? Heat went up, and overclocking headroom came down. That was true for both the mainstream and HEDT platforms, where HEDT chips were still soldered to their heat spreaders.

 

Nvidia didn't get it from nowhere; they underwent a paradigm shift in design philosophy. While Koduri hasn't been with AMD for a while, GCN is still based on his previous work. GCN 2.0/4.0 won't be a drastic architectural change other than allowing more TMUs and ROPs, as Fiji should have had.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


They presented that, at equal performance levels and (presumed but not proven) equal system setups, AMD's GPU architecture is 65% more energy efficient. By my statement above, that's utter BS. It's the same reason why ARM stagnated at the A57 core and had to pursue clock speeds, sacrificing low power, to get better performance out of the A72. Power scaling with node shrinkage slowed down to a crawl after 90nm. Further, it's GlobalFoundries. Even with Samsung's 14nm tech (designed for low-power SoCs, not performance GPUs), do you honestly believe this given its track record of consistent disappointment against TSMC?

Have you taken Nvidia's node disadvantage into account?

It really doesn't seem so unlikely. The systems were locked at medium settings with a 60 fps frame cap.

ARM has stagnated because it is in a period of transition, and these things take time. There are a few ARM architectures coming that target higher performance; all they have in the meantime are low-power designs to tweak.

Also, this seems to be a low-end part. Who is to say the big dies won't be fabbed at TSMC?

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


You three do realize no actual proof was presented, right? When AnandTech digs into this and busts AMD for lying their tails off, I'm gonna laugh all the way to the bank.

I am simply discussing the information that was in the OP. Whether they were lying or not doesn't matter; neither of us has any proof of that. My opinion is that the results they got are plausible.

No, Dennard Scaling died back at 90nm. This expectation is insane.

 

Expecting a low-end 14nm GPU to draw a bit less power than a low-end 28nm GPU (GM107) is not insane.

 

 

Power scaling with node shrinkage slowed down to a crawl after 90nm.

 

Yeah, that's why the GTX 660 couldn't deliver over 90% of the performance of a GTX 580 at half the power consumption.

 

Oh wait. It could.
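To put a number on that comparison, taking the round figures above at face value (they're ballpark forum numbers, not measured data): ~90% of the performance at ~50% of the power works out to roughly a 1.8x perf-per-watt jump from one node shrink plus an architecture change.

# GTX 580 (40nm Fermi) -> GTX 660 (28nm Kepler), using the round numbers above.
rel_performance = 0.9   # GTX 660 at ~90% of GTX 580 performance
rel_power       = 0.5   # at roughly half the power

print(f"~{rel_performance / rel_power:.1f}x perf per watt across one node + architecture jump")  # ~1.8x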

