
AMD officially cancels 20nm chips and takes a $33 million charge

Bouzoo

- snip - Not sure why they should pay; was TSMC paying AMD and Nvidia for not delivering 20nm on schedule?

Not sure about the latter, but it may be some form of security in case they cancel, probably some small percentage of the full budget that was supposed to be used on the project. Maybe TSMC did some work in advance and AMD has to cover the expenses. That sort of thing. It's not really uncommon.

The ability to google properly is a skill of its own. 


The 260X was not a rebrand; it was a new chip, as it supported TrueAudio and FreeSync.

 

Not really.

In fact, the 260X was a rebrand of the 7790's Bonaire chip.

And yes, Bonaire already supported TrueAudio, so TrueAudio wasn't exactly new.



It was a "refresh", not a "rebrand": it had newer features, and better DX12 and Mantle support.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


So, it's not a new chip.

And new features? Nope.



FreeSync. TrueAudio. Better DX12 support. More RAM. Better drivers.




Apart from the RAM, the rebranded cards already had all of those last gen. The drivers point is kind of funny, tbh: there's no reason they had to start caring about drivers only now; that was an option before.



You keep saying "better DX12 support", but what exactly does that mean? The R7/R9 200 series should, in theory, have the exact same DX12 feature support as the old 7000 series. In fact, the only cards rumored to take advantage of the latest DX12 features are the Fury X and 980 Ti (and Titan X). I have not followed the 300 series closely, but they are new enough for me to assume they can as well.

 

In fact, Nvidia has already stated that even cards as old as Fermi will support most of DX12, and AMD has stated that all GCN cards will as well. Very little has changed from DX 11.1 to DX 12, so cards that could support 11.1 should be able to support 12. 

 

https://en.wikipedia.org/wiki/Direct3D#Direct3D_12

 

 

 

Direct3D 12 requires graphics hardware conforming to feature levels 11_0 and 11_1 which support virtual memory address translations and requires WDDM 2.0 drivers. There are two new feature levels, 12_0 and 12_1, which include some new features exposed by Direct3D 12 that are optional on levels 11_0 and 11_1.

 

If you could explain to me specifically what parts of DX12 the 200 series will be able to do that the older 7k series cannot, I would appreciate it. As far as I am concerned, all DX11.1-based cards will perform the same as every other DX11.1-based card when it comes to DX12. Seeing as the 7k series was already DX11.2, I can't for the life of me believe that something changed in the 200 series, when DX12 was not even public.
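Since the disagreement above really comes down to feature levels, it may help that Direct3D feature levels are just ordered integer constants (the values below are the real ones from `d3dcommon.h`), so "supports 12_1" boils down to a numeric comparison against the cap the driver reports. A minimal Python sketch of the idea; `max_supported` is a hypothetical helper mimicking what `ID3D12Device::CheckFeatureSupport` does with `D3D12_FEATURE_FEATURE_LEVELS`, not the actual API:

```python
# Real D3D_FEATURE_LEVEL enum values from d3dcommon.h.
D3D_FEATURE_LEVEL_11_0 = 0xB000
D3D_FEATURE_LEVEL_11_1 = 0xB100
D3D_FEATURE_LEVEL_12_0 = 0xC000
D3D_FEATURE_LEVEL_12_1 = 0xC100

def max_supported(requested, device_cap):
    """Return the highest requested level the device cap allows, or None.

    Illustrative stand-in for the runtime's feature-level check: a card
    capped at 11_1 can still run the DX12 API, it just never reports
    level 12_0 or 12_1.
    """
    supported = [lvl for lvl in requested if lvl <= device_cap]
    return max(supported) if supported else None

# Hypothetical GCN 1.0 card capped at 11_1, asked for everything up to 12_1:
requested = [D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
             D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1]
print(hex(max_supported(requested, D3D_FEATURE_LEVEL_11_1)))  # → 0xb100
```

Which is the point being argued: the older cards don't gain or lose DX12 support as a whole, they simply top out at a lower feature level.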

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 



DX12.1 is not available on the 7000 series.




Where is it stated that it is not available on the 7000 series? Also, where is it stated that it is available on the 200 series?



This should not come as a surprise to anyone. AMD really only uses TSMC for their GPU production, and TSMC has failed monumentally on their 20nm node, meaning neither AMD nor NVidia has ever been able to release any high-performance desktop/laptop GPU on it.

 

It does surprise me that AMD has to pay for it though. Fiji was supposed to be 20nm, but TSMC just could not deliver.

 

As for AMD in the future, I think this is more of a farewell, go to hell, kinda thing. Greenland will be 14nm FinFET at Global Foundries next year, so I doubt AMD will have any business left at TSMC. I even believe Grenada is made on Global Foundries' 28nm process, which is why it's a little better than Hawaii, but I'm not 100% certain of that.

 

If NVidia doesn't get a contract with Global Foundries soon, they will be stuck waiting for 16nm FinFET+ next year, or, if we know TSMC, until the very end of 2016 or beginning of 2017. Pascal might be very delayed.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro



There are a lot of recent rumours that Pascal will be using TSMC's 16nm.

Also rumours about both GDDR5 and HBM2 versions.

Rumours also suggest a Q3 2016 release (September/October), but this can easily change.


The whole 14nm-and-HBM2 thing looming over the current Fury X is giving me "wait for the next thing" ideas. But I am still waiting to see how badass the Fury X2 will be.

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 



 

Indeed. I have not heard about the GDDR5 version, though. Fiji X was limited in supply in the first run, and we heard that SK Hynix had ramped up production, so I assume demand for HBM is very high. I would believe SK Hynix will be fully able to meet demand for HBM2 next year, although I have a sneaking suspicion that AMD will get first pick and NVidia will get the leftovers.

If a GDDR5 version is indeed in the works, that might support that suspicion. Also, we still haven't seen NVidia come out with anything HBM, so the only thing we have to cling to is the odd Pascal image they released, which does not use a silicon interposer. I wonder if they even know how to make it work?

 

If NVidia is forced to use TSMC's 16nm FinFET+, then it's anyone's guess when Pascal will be out. Global Foundries is already mass-producing 14nm FinFET right now, but TSMC has not started production of 16nm yet. We could see Pascal release in Q4 2016 or, worst-case scenario, in Q1 2017. As we have seen with the news in this thread, TSMC failed hard on 20nm, and we know TSMC will focus on getting 10nm ready in 2017 (good luck), so honestly I don't think 16nm FF+ is their major focus.

 

Next year is not only going to be amazing in terms of PC games, but also on the hardware side, with AMD's Zen releasing, AMD going all 14nm FF on everything high end, and the Oculus Rift releasing. I can't wait :D




Well, the GDDR5 version will be for the low end, and the high end will have HBM2.

Kinda like AMD is doing currently.

I doubt there will be any first pick; I just doubt there is a huge benefit to HBM in the low end.

Nvidia is known to struggle with new memory. There are a lot of changes coming, and a lot of the issues are ones they ran into before with predecessor technologies.

There are certainly some interesting things coming in the next couple of years.



 

Ah yes, I keep forgetting that Pascal will be an entire line of cards, not just a high-end one. Yeah, I agree that HBM will be a bit overkill for the lower-end cards, so that makes sense. As for getting first pick, we know that AMD has exclusive rights for a full generation, which is Fiji, but we don't know if AMD will have any "VIP" status after that. It's speculation, of course.

 

What is so odd about HBM, or at least the implementation we've seen in Fiji, is the use of a silicon interposer, which is a first for the GPU market. It seems to be a necessity for keeping yields high. I wonder how NVidia will deal with this issue, or if AMD is going to find a different solution.

 

What exactly does this affect from a consumer standpoint? I'm a bit lost.

 

None whatsoever. TSMC's 20nm node has never been able to produce useful yields of high-performance chips, so no graphics card for PCs could be made on it. That's why neither AMD nor NVidia ever released a 20nm GPU, even though they both thought they would.



@alpenwasser are we going to see the banhammer on every touchy feely subject now?

They are taking their time and it's the one thing they don't have. I really hope this is the right decision. :)

I think the discussion they had about it was a long time ago, but we are only now hearing about it.


I'm on vacation at the moment, so maybe not on every one, but quite a few probably, yes.

EDIT:

I like the picture, and we (the mods) are pretty tired of all the flaming, so things are likely to get a bit stricter regarding this kind of stuff.

BUILD LOGS: HELIOS - Latest Update: 2015-SEP-06 ::: ZEUS - BOTW 2013-JUN-28 ::: APOLLO - Complete: 2014-MAY-10
OTHER STUFF: Cable Lacing Tutorial ::: What Is ZFS? ::: mincss Primer ::: LSI RAID Card Flashing Tutorial
FORUM INFO: Community Standards ::: The Moderating Team ::: 10TB+ Storage Showoff Topic


So does this mean AMD will still be bad and no real rival to Intel?

 

Nah, 20nm was only supposed to be used by AMD and NVidia for graphics cards. AMD will use 14nm FinFET for Zen, which is pretty competitive with Intel's 14nm. Next year will be very interesting indeed.




That's fine. I'm undecided on the idea: I feel it may limit conversation out of fear of being banned, but on the flip side it will hopefully stop the flaming (I feel like the community's respect for each other has taken a slight dive). Fingers crossed.

Just remember with great ban hammering comes great responsibility.



No worries, we're not going to start banning people just for having opinions (this thread has been pretty okay so far, as was the religion thread last I checked :) ). It's more about stuff like this:

"You're a fanboy refusing to acknowledge my evidence!"

"Stop trotting out your damn straw man arguments and answer my question!"

"You guys are obviously both peasants! #640pMasterRace"

etc. etc. etc.

Basically, any halfway sane human being should be able to bring across their points in a civil and respectful manner, and disagree with people without behaving like the world is about to end because somebody doesn't share their viewpoint. And if they can't, well, tough luck.

Anyway, we're getting a tad off-topic here, but I think this should make things clear enough. Stick to what the CoC says ("Don't be a dick.") and you should be pretty safe. :)



How are they going to defeat Intel?

With psychological warfare

The weird kid in the corner eating glue
“People think that I must be a very strange person. This is not correct. I have the heart of a small boy. It is in a glass jar on my desk.” - Stephen King


Bold move, AMD; hopefully it pays off for you in the end. In a year or two I'll be looking to upgrade from my i7-3770, and my system might be all AMD if they keep it up.

 

This is going to be an interesting couple of years.


 

 

Aside from that, if someone wants a cheap GPU, at the 750 Ti end, are they really going to want to spend money on a power supply that can handle a 290X in five years' time? Power efficiency is so important at the low end because those cards are for people who are really on a budget. The GTX 480 does not make sense as a low-to-mid-end purchase now for this reason.

 

A quality 550W-600W power supply is hardly breaking the bank. They usually sit at around $70 for a good Corsair or EVGA one. My sister runs a 290 perfectly fine on a CX600.

4K // R5 3600 // RTX2080Ti


@alpenwasser

Use spoiler tags, maybe? Like, please.


Sounds like a good decision as far as I can see.

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650

