AMD Readying Two New 20nm Cards To Go Against Maxwell, 380X Will Take on 980, 390X Will Take on Titan 2 / 980 Ti

CarlosRex

If these are 20nm I'll eat my hat.

Considering the release is for February and not December, this shouldn't come as a surprise... Of course, Nvidia will also do a Maxwell refresh at that process node, further milking Maxwell's value. Honestly I'm shocked AMD and Nvidia haven't started following Intel's tick-tock cycle. It's just damn intelligent to produce chips that way. Now if only Intel bought out Nvidia...

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

Considering the release is for February and not December, this shouldn't come as a surprise... Of course, Nvidia will also do a Maxwell refresh at that process node, further milking Maxwell's value. Honestly I'm shocked AMD and Nvidia haven't started following Intel's tick-tock cycle. It's just damn intelligent to produce chips that way. Now if only Intel bought out Nvidia...

Granted, Intel's engineering team is quite awesome, but I can't imagine Intel buying NVidia would be good for consumers at all... We need more competition, not consolidation. Besides, Intel and NVidia directly compete in the mobile sector (even if Intel's presence is still very small).

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 

Let's get GTX 780 or R9 290 performance at below 300 USD!

And so GabeN has told us to march forthwith unto the Land of Holy wielding our swords of mice, shields of keyboards, and helmets of Oculus Rifts where we shall reclaim it-which is rightfully ours-from the PUNY Console Peasants from whom armed only with mere controllers we shall decimate in all forms of battle and we shall dominate even in their most ridiculous tradition and infatuation of CoD. Yes, my brothers- sisters and trans sexuals too- we shall destroy the inferior of races with our might and majesty. And if any Peasants wish to join us they must speak now or forever perish. -Ancient Speech from a Leader of Old, Book of Murratri section 2

Granted, Intel's engineering team is quite awesome, but I can't imagine Intel buying NVidia would be good for consumers at all... We need more competition, not consolidation. Besides, Intel and NVidia directly compete in the mobile sector (even if Intel's presence is still very small).

They don't really compete in the mobile sector. Intel is nowhere in phones, and the Tegra K1 is so far used only in Nvidia's own tablets/handhelds. I'm not necessarily saying it would be "good" for consumers, but considering AMD approached Nvidia first before settling on ATI, and considering Nvidia got so scared of Intel's Larrabee graphics project that it yanked the cross-licensing agreements for most of its GPU IP, I can only imagine how powerful computer graphics would become in the span of 2 short years if Intel had all of Nvidia's IP at its disposal.

Let's get GTX 780 or R9 290 performance at below 300 USD!

The 970 is pretty close to that price already :)

20nm?  Highly unlikely.

I was under the impression that 28nm to 20nm was a full node step down, not a half-node step... no?

I was under the impression that 28nm to 20nm was a full node step down, not a half-node step... no?

 

It is. 22nm is in between. I think he's referring to the endless delay of TSMC's 20nm process. However, the first products with TSMC 20nm silicon just shipped, in the form of the iPhone 6 and 6 Plus, so getting it into GPUs by February seems realistic.
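The "full node" terminology comes down to area scaling: a full node step classically halves transistor area, and an ideal linear shrink from 28nm to 20nm gets very close to that. A quick back-of-the-envelope check (illustrative only, real foundry scaling never hits the ideal ratio):

```python
# Ideal (linear-shrink) area scaling between two process nodes.
# A classic "full node" step halves transistor area; 28nm -> 20nm
# gives (20/28)^2 ~ 0.51, i.e. roughly half the area per transistor,
# which is why it counts as a full node rather than a half-node step.

def area_scale(old_nm: float, new_nm: float) -> float:
    """Return the ideal area ratio when shrinking from old_nm to new_nm."""
    return (new_nm / old_nm) ** 2

print(round(area_scale(28, 20), 2))  # 0.51: about half the area
```

In practice 20nm planar delivered the density but not the power/performance gains GPUs needed, which is part of why the node stayed phone-oriented.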

Considering the release is for February and not December, this shouldn't come as a surprise... Of course, Nvidia will also do a Maxwell refresh at that process node, further milking Maxwell's value. Honestly I'm shocked AMD and Nvidia haven't started following Intel's tick-tock cycle. It's just damn intelligent to produce chips that way. Now if only Intel bought out Nvidia...

Why would it be milking Maxwell's value if Nvidia makes 20nm Maxwell cards? This is an honest question, because I don't understand GPUs as well as you clearly do. Maxwell was the microarchitecture Nvidia developed to replace Kepler, and it was meant to be 20nm, so I get that doing the 970 and 980 on 28nm could be considered milking the value instead of just waiting for 20nm to be ready. But wouldn't a 20nm card be what was supposed to happen from the start?

The Mistress: Case: Corsair 760t   CPU:  Intel Core i7-4790K 4GHz(stock speed at the moment) - GPU: MSI 970 - MOBO: MSI Z97 Gaming 5 - RAM: Crucial Ballistic Sport 1600MHZ CL9 - PSU: Corsair AX760  - STORAGE: 128Gb Samsung EVO SSD/ 1TB WD Blue/Several older WD blacks.

Why would it be milking Maxwell's value if Nvidia makes 20nm Maxwell cards? This is an honest question, because I don't understand GPUs as well as you clearly do. Maxwell was the microarchitecture Nvidia developed to replace Kepler, and it was meant to be 20nm, so I get that doing the 970 and 980 on 28nm could be considered milking the value instead of just waiting for 20nm to be ready. But wouldn't a 20nm card be what was supposed to happen from the start?

It means they don't have to tape out a new architecture for longer: they maintain the same designs, cut overall R&D costs, etc. It helps their bottom line, and they might fit more cores at possibly higher clock rates (though 1.4-1.5GHz is already pretty damn high).

 

They are making the very best of a bad situation.

It means they don't have to tape out a new architecture for longer: they maintain the same designs, cut overall R&D costs, etc. It helps their bottom line, and they might fit more cores at possibly higher clock rates (though 1.4-1.5GHz is already pretty damn high).

 

They are making the very best of a bad situation.

The bad situation being the delay of the new nodes, just to be clear?

The bad situation being the delay of the new nodes, just to be clear?

Yes.

The 970 is pretty close to that price already :)

Make it lower!

Make it lower!

Have some damn patience.

TSMC's 20nm is more of a node for phones and such, not high-performance devices like GPUs. Both AMD and Nvidia will likely hold off until 2016 and 16FF+: http://www.tsmc.com/english/dedicatedFoundry/technology/16nm.htm

Yeah, I would be very impressed if they actually made TSMC 20nm planar work on a big GPU. Phone SoCs and such, sure; high-performance big GPUs, not so much. I do not believe that AMD will release a high-performance card on TSMC 20nm planar in February.

Neither well, nor normal.

Take my money nvi...

 

Oh, intriguing -stows wallet-

 

This could be interesting; I'll wait for these before upgrading. It's not like my 670 is struggling at present, and perhaps I can get that new monitor cheaper in the post-Xmas sales.

I am the Black Mage, I cast the magic that makes the people fall down

Considering the release is for February and not December, this shouldn't come as a surprise... Of course, Nvidia will also do a Maxwell refresh at that process node, further milking Maxwell's value. Honestly I'm shocked AMD and Nvidia haven't started following Intel's tick-tock cycle. It's just damn intelligent to produce chips that way. Now if only Intel bought out Nvidia...

Because they don't own fabs.

Come on AMD, beat Nvidia's money-hungry ass!

 

(not an AMD fanboy)

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC

I used to be an AMD fanboy, I'll admit. Still rooting for them; I love underdogs, and competition is great for consumers. But I was about to start overclocking my monitor for resolution or fps, and the GTX 970 gave me the DSR feature, so it was a pretty easy decision. Now to mess with the fps...

I used to be an AMD fanboy, I'll admit. Still rooting for them; I love underdogs, and competition is great for consumers. But I was about to start overclocking my monitor for resolution or fps, and the GTX 970 gave me the DSR feature, so it was a pretty easy decision. Now to mess with the fps...

 

Funny how people rave on about DSR, when we've had GeDoSaTo for a while now. Catalyst and the Nvidia Control Panel can also inject SSAA in DX11 titles.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling

" AMD has three new graphics cards in the works. The R9 290X successor based on the Bermuda GPU , The R9 380X based on Fiji and the R9 370X based on Treasure Island. "

 

" Bermuda, Fiji, Treasure " Lol those GPU name. Can't wait to see those new AMD GPU tho' !

Because they don't own fabs.

Your point? They can still do the same process/cycle if they convince TSMC that it saves them money in the long run, which is easy to do given Intel's wild success.

Damn, same reference cooler. I wonder how long befo-

Well, that took no time flat.

 

You all joke, but I really don't think people understand why the cards heat up so much. Unlike normal GPUs, which actively cool themselves down by increasing the fan speed as soon as the temps rise, AMD tried another approach: letting the card heat up all the way into the 80-90s before speeding the fan up to cool it down. Sure, it runs warmer, but honestly it doesn't really matter; we're just used to "lower is better", and I never hear people complain about how hot laptop CPUs get. I think they were going for a quiet-but-warmer approach but really fudged up the cooler. Personally, I run two reference 290Xs, leave everything on default, and even when I'm pushing them super hard I just don't find them loud at all. The GTX 480 got louder by default than these.
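The behavior described above is essentially a target-temperature fan policy: sit at a quiet minimum speed until the card nears its temperature target, then ramp the fan aggressively to hold it there. Here is a minimal sketch of that idea; the function name, target, and gain values are illustrative assumptions, not AMD's actual firmware logic:

```python
# Minimal sketch of a target-temperature fan controller: stay quiet
# below the target temperature, then ramp the fan hard above it.
# All numbers here are illustrative, not real AMD fan-curve values.

def fan_speed(temp_c: float,
              target_c: float = 90.0,
              min_pct: float = 20.0,
              max_pct: float = 100.0,
              gain: float = 8.0) -> float:
    """Return fan duty cycle in percent for a given GPU temperature."""
    if temp_c <= target_c:
        # Below target: hold the quiet minimum, letting the card warm up.
        return min_pct
    # Above target: ramp proportionally with the overshoot, capped at max.
    return min(max_pct, min_pct + gain * (temp_c - target_c))

for t in (70, 90, 95, 100):
    print(t, fan_speed(t))  # quiet at 70 and 90, ramping at 95 and 100
```

The trade-off is exactly the one described: steady-state temperatures look scary next to a conventional "ramp early" curve, but the fan stays quiet for much longer.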

 

---

I always enjoy the retorts from one company to the other. It's almost guaranteed that Nvidia will wait for AMD's new cards before releasing the 980 Ti (so they can OC it just a bit more and beat AMD)... and so it goes, back and forth. But really, it doesn't matter who wins: the more pressure one company can put on the other (especially on prices), the faster and cheaper hardware gets for everyone.

Intel I9-9900k (5Ghz) Asus ROG Maximus XI Formula | Corsair Vengeance 16GB DDR4-4133mhz | ASUS ROG Strix 2080Ti | EVGA Supernova G2 1050w 80+Gold | Samsung 950 Pro M.2 (512GB) + (1TB) | Full EK custom water loop |IN-WIN S-Frame (No. 263/500)

Your point? They can still do the same process/cycle if they convince TSMC that it saves them money in the long run, which is easy to do given Intel's wild success.

Wouldn't that require some form of collusion?