AMD have begun shipping Polaris GPUs - 2 different Polaris chips and 4 different GPUs spotted

Mr_Troll

AMD shows off 14nm Polaris desktop GPU at CES

As expected, the power efficiency is far better than Nvidia's Maxwell, and it looks set to go head-to-head with Pascal.

Higher-end models will use HBM; lower-end models will use GDDR5.

 


It would make sense for them to produce the lower-end cards at GlobalFoundries so they can release them earlier than if they made them at TSMC, which will be swamped with Nvidia's orders as well.

 

My guess is that they will release lower-end GPUs first to showcase the new architecture and get some advantage in portable devices, and at a later date they'll release the high-end stuff from TSMC, who are used to producing GPUs that complex.

 

I personally doubt we will see any 16nm TSMC-produced AMD products. AMD has probably secured all of GloFo's 14nm FF production, and we know that Samsung will produce some of the products too (probably the low-end/mobile parts).

 

That's what makes this year so interesting. Not only do we get two node shrinks, HBM, and DP 1.3, we also get two different manufacturers for AMD and Nvidia for the first time in forever.

 

Pretty sure x86 CPUs are a lot more complex to produce than GPUs, so GloFo would have a leg up there. However, I think what matters more is the actual node and the way the chip is produced.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


AMD had the Crimson drivers mess with fan speeds and GPUs were overheating. That's not good either, and it was recent.

That was fixed within a week. 


That was fixed within a week.

Within a day.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


It would make sense for them to produce the lower-end cards at GlobalFoundries so they can release them earlier than if they made them at TSMC, which will be swamped with Nvidia's orders as well.

 

My guess is that they will release lower-end GPUs first to showcase the new architecture and get some advantage in portable devices, and at a later date they'll release the high-end stuff from TSMC, who are used to producing GPUs that complex.

 

Plus a smaller GPU is easier to pull off with regard to yields.


Why would reference and non-reference make a difference? Put either PCB under a water block and compare.

Non-reference PCBs usually have improved designs in terms of power delivery and components ;)

Spartan 1.0


CPU: Intel Core i7-4770K 3.5GHz Quad-Core Processor

CPU Cooler: Cooler Master Seidon 120XL 86.2 CFM Liquid CPU Cooler

Motherboard: Asus Maximus VI Extreme ATX LGA1150 Motherboard
Memory: Corsair Dominator 32GB (4 x 8GB) DDR3-1600 Memory
Storage: OCZ Vector Series 512GB 2.5" Solid State Drive
Storage: Seagate Desktop HDD 4TB 3.5" 7200RPM Internal Hard Drive

Video Card: EVGA GeForce GTX 980 4GB Classified ACX 2.0 Video Card
Case: Thermaltake Urban S41 ATX Mid Tower Case
Power Supply: Corsair 1200W 80+ Platinum Certified Fully-Modular ATX Power Supply
Optical Drive: LG BH16NS40 Blu-Ray/DVD/CD Writer
Optical Drive: LG BH10LS30 Blu-Ray/DVD/CD Writer
Operating System: Microsoft Windows 10 Pro 64-bit
Sound Card: Creative Labs ZXR 24-bit 192 KHz Sound Card
Monitor: 2x Asus VG278HE 27.0" 144Hz Monitor
Keyboard: Logitech G19s Wired Gaming Keyboard
Keyboard: Razer Orbweaver Elite Mechanical Gaming Keypad Wired Gaming Keyboard
Mouse: Logitech G700s Wireless Laser Mouse
Headphones: Creative Labs EVO ZxR 7.1 Channel  Headset
Speakers: Creative Labs GigaWorks T40 Series II 32W 2ch Speakers

Hades 1.0


Laptop: Dell Alienware 15 2015

CPU: i7-4720HQ CPU

Memory: 16GB DDR3 SODIMM RAM

Storage: 256GB M.2 SSD

Storage: 1TB 5400rpm 2.5" HDD

Screen: 15.6" FHD Display

Video Card: Nvidia GTX 970M with 3GB

Operating System: Windows 10 Pro

Project: Spartan 1.2 PLEASE SUPPORT ME NEW CHANNEL > Tech Inquisition


For example, they don't try to stop people from overclocking their mobile GPUs and then pull a 360 after consumer backlash. Note it's a 360, not a 180.

Inb4 someone defends Nvidia like the sheep who defend Apple.

 

As a consumer, yeah, that is shitty, but in practice it means very little, since we are talking about highly volatile computers that shouldn't be overclocked at all. They have had other moves like this (like G-Sync not being open to AMD cards and asking for $$$), and things like this are why Nvidia and Apple are good companies for their shareholders/owners, while AMD today is a clusterfuck being untangled that hopefully might finally catch up. Yeah, I like that AMD has open-source technologies. I don't like Nvidia's GameWorks clusterfucks. But none of these arguments for either side detract from the fact that AMD GPUs are absolute crap for laptops. They are non-existent in that market. If you have a new AMD APU+GPU (cool technology, though sadly it still lags behind a single Nvidia mobile GPU), you either wasted way too much money or caught a killer deal. Most likely the former. So you can bash and talk shit all you want, but that isn't something to be taken lightly.

 

And in the desktop world of GPUs, they are competitive, but at a net loss on their investment, which, to be honest, shouldn't mean much to us as customers. It was their conscious decision to stay in the market somehow because they are getting hammered in other areas. They are also non-existent in the higher tiers of desktop CPUs. If AMD were competitive in all the mentioned markets, not just the lower-to-medium CPU market and desktop GPUs, and at a profit for its owners, Nvidia would be forced to up its game. Instead, AMD rebrands old CPUs at lower prices and Intel improves every generation by exactly 10% because they control the market. So you get to pay a premium for every little improvement. Oh well. Better go bash on the Internet while knowing next to nothing.

 

Oh, and the whole Apple-bashing thing on this forum. It is almost as if my Apple stock rises every time somebody bashes them, while all the other companies are still oblivious to how exactly they got their asses handed to them. And all that while having a fuck-the-consumer policy. Brilliant. 10 points to Gryffindor.


It just wouldn't make sense for NVIDIA to intentionally gimp Kepler. If they got caught, the best-case outcome is that massive numbers of consumers won't buy their products for a very long time. The worst case is that they get sued and have to pay potentially billions of dollars. Seeing how everyone, their dogs, and their puppies got a GTX 900 series card, it wasn't exactly like NVIDIA was hurting for sales.

There isn't any proof that Nvidia is gimping Kepler; we could say they're not as focused on Kepler as they are on Maxwell.

There isn't any proof that Nvidia is gimping Kepler; we could say they're not as focused on Kepler as they are on Maxwell.

Fanboys like to parade around nonsense and FUD. It's best left to the absent-minded.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


It just wouldn't make sense for NVIDIA to intentionally gimp Kepler. If they got caught, the best-case outcome is that massive numbers of consumers won't buy their products for a very long time. The worst case is that they get sued and have to pay potentially billions of dollars. Seeing how everyone, their dogs, and their puppies got a GTX 900 series card, it wasn't exactly like NVIDIA was hurting for sales.

NVIDIA didn't gimp Kepler; countless reviews have proven that. What has happened is that video games have become, slowly but surely, more compute-oriented over time. GCN is a better compute architecture, so GCN has gotten progressively better over time.

This trend will continue throughout 2016, where we should see a Fury X outpace a GTX 980 Ti. Who cares, though, as most people will be sporting Polaris- or Pascal-based GPUs.

"Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Arthur Conan Doyle (Sherlock Holmes)


NVIDIA didn't gimp Kepler; countless reviews have proven that. What has happened is that video games have become, slowly but surely, more compute-oriented over time. GCN is a better compute architecture, so GCN has gotten progressively better over time.

This trend will continue throughout 2016, where we should see a Fury X outpace a GTX 980 Ti. Who cares, though, as most people will be sporting Polaris- or Pascal-based GPUs.

Kepler is better than Maxwell at compute.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Fanboys like to parade around nonsense and FUD. It's best left for the absent minded.

Thing is, Kepler is better than Maxwell IMO anyway.

Once support for Maxwell ends, it won't perform as well as it would have.

Maxwell is about cutting down the GPU and leaning on software optimization... since Fermi is a "hardware" card, it still stands up today.


Thing is, Kepler is better than Maxwell IMO anyway.

Once support for Maxwell ends, it won't perform as well as it would have.

Maxwell is about cutting down the GPU and leaning on software optimization... since Fermi is a "hardware" card, it still stands up today.

Maxwell is fine. It does what nVidia focused it on: gaming. It's cool, it's efficient, and it overclocks like a champ. And that's all that matters.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Maxwell is fine. It does what nVidia focused it on: gaming. It's cool, it's efficient, and it overclocks like a champ. And that's all that matters.

I'm just going to leave it there before any more "heated" debates come up. 


I'm just going to leave it there before any more "heated" debates come up.

I'm not debating, nor looking for an argument. Just stating my opinion.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Classic fanboy. Says the guy with a Radeon card avatar. lmao what a hoot.

And I'm still waiting on proof. Not a bunch of conjecture and hypotheticals. Just pure, solid proof.

1. Just be quiet. You asked for proof, there was a YouTube video (or two) posted, and you claimed 'cherry-picked'. Stock vs. stock, the 980 Ti and Fury X are equal; an overclocked 980 Ti does win (and factory overclocks are normally higher, or actually exist, on the 980 Ti). The burden of proof is now on YOU, as YOU claimed the benchmarks provided were cherry-picked. If you are unable to find any others that use recent drivers, you have every right to remain skeptical, just don't go acting like a monkey and start throwing your shit everywhere.

 

2. Claims bias for having Radeon Avatar, yet has "Intel / nVidia Enthusiast" as forum title. "Your a hypocrite, 'arry".

 

3. If you want to be taken seriously, refute the central point of other arguments (https://upload.wikimedia.org/wikipedia/commons/7/7c/Graham%27s_Hierarchy_of_Disagreement.svg).


NVIDIA didn't gimp Kepler; countless reviews have proven that. What has happened is that video games have become, slowly but surely, more compute-oriented over time. GCN is a better compute architecture, so GCN has gotten progressively better over time.

This trend will continue throughout 2016, where we should see a Fury X outpace a GTX 980 Ti. Who cares, though, as most people will be sporting Polaris- or Pascal-based GPUs.

 

 

Kepler is better than Maxwell at compute.

 

Yeah. If anything, it's that games have put more emphasis on tessellation, where GCN IIRC beats Kepler but Maxwell beats GCN. That's also why some people suspect Nvidia is intentionally influencing developers to put more tessellation in their games, even where it doesn't improve the visuals.

 

Just like with Crysis 2, back when Fermi was better at tessellation than VLIW5/VLIW4.


Yeah. If anything, it's that games have put more emphasis on tessellation, where GCN IIRC beats Kepler but Maxwell beats GCN. That's also why some people suspect Nvidia is intentionally influencing developers to put more tessellation in their games, even where it doesn't improve the visuals.

 

Just like with Crysis 2, back when Fermi was better at tessellation than VLIW5/VLIW4.

Bingo.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Yeah. If anything, it's that games have put more emphasis on tessellation, where GCN IIRC beats Kepler but Maxwell beats GCN. That's also why some people suspect Nvidia is intentionally influencing developers to put more tessellation in their games, even where it doesn't improve the visuals.

Just like with Crysis 2, back when Fermi was better at tessellation than VLIW5/VLIW4.

GCN 1.1 > Kepler > GCN 1.0 in terms of tessellation performance. Kepler is a better design than Maxwell.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


Yeah. If anything, it's that games have put more emphasis on tessellation, where GCN IIRC beats Kepler but Maxwell beats GCN. That's also why some people suspect Nvidia is intentionally influencing developers to put more tessellation in their games, even where it doesn't improve the visuals.

 

Just like with Crysis 2, back when Fermi was better at tessellation than VLIW5/VLIW4.

This is essentially what I and some others mean when we say Kepler is being gimped, but for some reason people translate it as us saying Nvidia is changing code in the drivers to make Kepler run worse. Just about every GameWorks game since the launch of Maxwell is tessellated up the ass with no actual visual improvement from such high levels of tessellation. Combine that with not bothering with performance optimization for anything below the 780. It's insane how badly the 770/680 and 760 perform today, but apparently it doesn't matter, since buying a new card every generation is now considered normal.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Great to see AMD closing in on the release date; at the current rate they might be able to release the new GPUs before, or around the same time as, Nvidia.

 

Yup, the 980 Ti > Fury X bandwagon ended 3 months ago. Just like how damn near everyone recommended a CM 212 Evo a month ago. Maybe two people and I would recommend a Pure Rock or Cryorig H7.

 

Well, the Evo is cheaper, isn't it?

If you want to reply back to me or someone else USE THE QUOTE BUTTON!                                                      
Pascal laptops guide


1. Just be quiet. You asked for proof, there was a YouTube video (or two) posted, and you claimed 'cherry-picked'. Stock vs. stock, the 980 Ti and Fury X are equal; an overclocked 980 Ti does win (and factory overclocks are normally higher, or actually exist, on the 980 Ti). The burden of proof is now on YOU, as YOU claimed the benchmarks provided were cherry-picked. If you are unable to find any others that use recent drivers, you have every right to remain skeptical, just don't go acting like a monkey and start throwing your shit everywhere.

2. Claims bias for having Radeon Avatar, yet has "Intel / nVidia Enthusiast" as forum title. "Your a hypocrite, 'arry".

3. If you want to be taken seriously, refute the central point of other arguments (https://upload.wikimedia.org/wikipedia/commons/7/7c/Graham%27s_Hierarchy_of_Disagreement.svg).

If you don't understand the context of the discussion between me and another poster, please excuse yourself and don't butt into other people's business.

Thank you.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


There isn't any proof that Nvidia is gimping Kepler; we could say they're not as focused on Kepler as they are on Maxwell.

Yeah, a lot of Nvidia users are convinced it is happening, but I don't think so (not an Nvidia user, though).

It's just that modern AAA games are very reliant on game-specific driver optimizations for maximum performance, and Nvidia's driver team has been focused on the GPUs they are currently selling (Maxwell).

In AMD's case, on the other hand, with continuous GCN revisions (and some rebrands), even people who purchased first-gen GCN cards are still getting game-specific driver optimizations for the newest games (even if it takes a few more days).

 

So this ends up with people asking questions like: if the GTX 680 used to be on par with an HD 7970 in 2011 games, why isn't it the same in 2015 games?

After The Witcher 3 fiasco and mass protests on the support forum, though, Nvidia did make some improvements to Kepler performance, so the situation has actually improved.

 

One of the objectives of DX12 and Vulkan is to become less reliant on game-specific driver tuning. Here's to hoping...

