
AMD employee confirms new GPU with HBM and 300W

Kowar

Point being made: I don't think there are many consumers waiting on a $1500 card. Where the R9 390X falls in terms of pricing will decide Nvidia's fate in the enthusiast-grade market until their next architecture. Right now you can grab a single R9 295X2 for half of what a TITAN Black costs and maintain a large margin of performance leadership. Nvidia has to be extremely careful from this point onward about where they price their products. From the sounds of it, Nvidia wants to hold out and use 16nm for GM200, which would push them back to a Q4 2015 launch with limited availability, meaning most consumers won't get their hands on one until 2016. There's a lot of stir about AMD using either 20nm or 28nm for their 300-series GPUs. Personally I think we will see 20nm, as GloFo has been capable of mass-producing 20nm 2.5D for a little while now.

 

 

It won't be $1500. Was the 980 or 970 some ridiculous price? No, it wasn't. The only reason the Titan Black hasn't dropped in price is that they aren't targeting solely gamers with that card anymore, but also people who need a cheap workstation card (which the Titan is, in comparison to most workstation cards).

 

GM200 already exists, in the form of a Quadro card (this validation would mean a 50% increase in everything [shaders, ROPs, memory bandwidth], and a 40-50% performance increase over the GTX 980 for a fully unlocked variant):

 

[attachment: 2a.jpg]

 

They already have it. This will most likely still be on 28nm; Pascal is pretty much reserved for 16nm. This (GM200) isn't coming out in 2016, it will be out in the summer if not sooner.


Still don't understand why they don't just announce it to end all these rumors.

Steam: Drift and Drift_2 (name is hunter) Let's play :)


Shhhhhh, we're here to hate AMD and raise up our godking NVIDIA. We also like to make fun little jokes about AMD heating our homes while pretending that any of us actually give a damn about power consumption.

"WE DEMAND 4k CONTENT BUT IT BETTER NOT TAKE MORE THAN 4 WATTS OF POWER TO GET IT"

-Nobody ever, 01/01/never

In all honesty it's not much money.

Assume you max out the card for 12 hours of playtime a day (an extreme case), and use a shitty 80 Plus (no Bronze) PSU that's only ~80% efficient:

300 W / 0.8 = 375 W drawn at the wall.

375 W × 12 h = 4,500 Wh, i.e. 4.5 kWh per day.

Do this all month, 31 days: 31 × 4.5 = 139.5 kWh.

Say you pay 15¢ per kWh (the high end of average): 139.5 × $0.15 = $20.93 in electricity.

So about $21 a month if you literally max out the card 12 hours a day, every day, on a cheap PSU in a state with fairly pricey electricity.
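For anyone who wants to plug in their own numbers, here's the same back-of-the-envelope math as a quick Python sketch. The wattage, hours, efficiency, and price per kWh are the assumptions from the math above, not measurements:

```python
# Rough monthly electricity cost for a 300 W GPU (assumed numbers from the post).
GPU_DRAW_W = 300        # rumored board power, at full load
PSU_EFFICIENCY = 0.80   # plain 80 Plus (no Bronze) unit
HOURS_PER_DAY = 12      # extreme-case gaming habit
DAYS_PER_MONTH = 31
PRICE_PER_KWH = 0.15    # USD; high end of average US rates

wall_draw_w = GPU_DRAW_W / PSU_EFFICIENCY                            # 375 W at the wall
kwh_per_month = wall_draw_w * HOURS_PER_DAY * DAYS_PER_MONTH / 1000  # 139.5 kWh
cost = kwh_per_month * PRICE_PER_KWH
print(f"{kwh_per_month:.1f} kWh/month -> ${cost:.2f}")               # ~ $20.93
```

Swap in your local rate and your real hours and the number only gets smaller.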

It's not bad considering this is probably the 390X, which is AMD's Titan Black equivalent,

versus the 380X, which will go up against the 980.

A riddle wrapped in an enigma, shot to the moon and made in China


I was about to buy an 850W PSU until I saw this.

 

I guess it won't be enough for Crossfire 390Xs then.
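For what it's worth, here's a rough sanity check in Python. Every number besides the rumored 300W per card is a guess (CPU/board draw and headroom vary a lot by build), so treat it as a sketch, not PSU-buying advice:

```python
# Hypothetical PSU sizing for CrossFire 390Xs (assumed figures throughout).
GPU_W = 300               # rumored per-card power
NUM_GPUS = 2
REST_OF_SYSTEM_W = 200    # ballpark for CPU, board, drives, fans
HEADROOM = 1.2            # ~20% margin so the PSU isn't run flat out

estimated_load_w = GPU_W * NUM_GPUS + REST_OF_SYSTEM_W   # 800 W
comfortable_psu_w = estimated_load_w * HEADROOM          # 960 W
print(f"estimated load: {estimated_load_w} W, comfortable PSU: {comfortable_psu_w:.0f} W")
```

By that estimate an 850W unit would technically cover the load, but with almost no margin, so a 1000W-class PSU would be the safer pick.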

CPU: i7 2600 @ 4.2GHz  COOLING: NZXT Kraken X31 RAM: 4x2GB Corsair XMS3 @ 1600MHz MOBO: Gigabyte Z68-UD3-XP GPU: XFX R9 280X Double Dissipation SSD #1: 120GB OCZ Vertex 2  SSD #2: 240GB Corsair Force 3 HDD #1: 1TB Seagate Barracuda 7200RPM PSU: Silverstone Strider Plus 600W CASE: NZXT H230
CPU: Intel Core 2 Quad Q9550 @ 2.83GHz COOLING: Cooler Master Eclipse RAM: 4x1GB Corsair XMS2 @ 800MHz MOBO: XFX nForce 780i 3-Way SLi GPU: 2x ASUS GTX 560 DirectCU in SLi HDD #1: 1TB Seagate Barracuda 7200RPM PSU: TBA CASE: Antec 300

I don't know about you nerds, but I don't really give a fuck about the monster TDP if it performs really well. If it beats the 980 by 25% at the same price tag then I would have no issues with it.

Desktop - i5 4670K, GTX 770, Maximus VI Hero, 2x Kingston HyperX 3K in RAID 0.

Laptop - Lenovo X230, Intel 535 480GB, 16GB G.Skill memory, Classic Keyboard Mod, Triple USB 3.0 ExpressCard.


Hmm, something that should perform better than my 970, and by the looks of things puts out just as much heat. A 1.5GHz boost clock is what I have until the card overheats or the display driver crashes. (I'm going to try the first driver for the 900 series, since I had a 170MHz-stable OC back then.) Setting the fans to 100% does nothing; the temps still rise and easily exceed 90°C, so Nvidia cards not being heaters is BS.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I don't know about you nerds, but I don't really give a fuck about the monster TDP if it performs really well. If it beats the 980 by 25% at the same price tag then I would have no issues with it.

It's an issue when Nvidia releases, a month later, a card with the same performance that uses half the power while running cooler and quieter, which is very likely to happen.

The GTC is in a few months, so a GPU announcement from Nvidia is almost certain. It'll be either a GTX 960 or a Titan successor, and if it's the latter, AMD will have a really bad time, and so will we, because prices will skyrocket again.

 

RTX2070OC 


I'd still buy it if it's 300W.

Steam: Drift and Drift_2 (name is hunter) Let's play :)


We likely won't see GM200 until 2016, and it will come in the form of another TITAN and possibly a GTX 980 Ti. AMD has a card for combating both of them in the form of the R9 390X.

Doesn't have to be a full GM200; a cut-down version could work out as well. One thing is clear: if AMD's 390X performs better than the 980 (which it definitely will), Nvidia will just release a new card that outperforms it.

 

 

Right now you can grab a single R9 295X2 for half of what a TITAN Black costs and maintain a large margin of performance leadership

Or two 970s, which will be much quieter while consuming half as much. The 295X2 is about 900 EUR here at the moment; you can get three 970s for that price.

 

 

Nvidia has to be extremely careful from this point onward about where they price their products.

How so? They released the 780 a month or two after the Titan came out. Nvidia will price their cards cheaper if AMD outperforms them; you don't need to be extremely careful for that. Just like when the 290X came out, Nvidia cut $100-150 off the 780. They're not going to price a new GPU higher than AMD's equivalent while it performs worse.


We likely won't see GM200 until 2016, and it will come in the form of another TITAN and possibly a GTX 980 Ti. AMD has a card for combating both of them in the form of the R9 390X.

Volta is pretty much "confirmed" for 2017, and Pascal will probably come in 2016, which leaves GM200 coming out in 2015.

http://www.nvidia.com/content/gtc/documents/sc09_dally.pdf

 

Right now Volta: 

2017 GPU Node – 300W (GPU + memory + supply)

2,400 throughput cores (7,200 FPUs), 16 CPUs – single chip
40TFLOPS (SP) 13TFLOPS (DP)
Deep, explicit on-chip storage hierarchy
Fast communication and synchronization 

"Oh, it's 300 watts, the electricity bill will cry!" No! They don't care; it's for servers. And do you really think one 300W GPU will heat your room and warm your home? Volta right now is estimated at 300W per node, in systems with around 10MW of other components!

 
Considering their W9100 pulls 275W maximum, 300W is nothing, and actually pretty impressive.
 

The efficiencies are there. This is a 4096-GCN-core GPU: a 45% performance increase over Hawaii for a 3% increase in TDP.

 

I doubt it'll be 4096 cores on 28nm; even on 20nm it's pushing it a bit. But Tonga might scale really well in wattage per CU.

 

What happened to all of the new power efficiencies from Tonga? This is beyond even Hawaii...

AMD FirePro W7100 graphics

GPU: 1792 Stream Processors organized into 28 Compute Units (Tonga)
Power: 150W
Cooling: Active

The 285 has a TDP of 190W.

That pretty much says it all by itself.
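Out of curiosity, here's what a naive watts-per-CU extrapolation from those W7100 numbers looks like. This assumes power scales linearly with CU count at fixed clocks and ignores memory (HBM vs GDDR5), binning, and voltage differences, so it's a conversation piece rather than a prediction:

```python
# Crude scaling from Tonga (FirePro W7100) to a rumored 4096-SP part.
W7100_POWER_W = 150
W7100_CUS = 28                 # 1792 stream processors / 64 per CU

watts_per_cu = W7100_POWER_W / W7100_CUS   # ~5.4 W per CU
big_chip_cus = 4096 // 64                  # 64 CUs for a 4096-SP GPU
print(f"{watts_per_cu:.1f} W/CU -> ~{watts_per_cu * big_chip_cus:.0f} W for 64 CUs")
```

That lands around 343W before any savings from HBM or lower FirePro-style clocks, which is roughly where the rumored 300W figure sits once you allow for some efficiency gains.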

Computing enthusiast. 
I use to be able to input a cheat code now I've got to input a credit card - Total Biscuit
 


... and no PhysX. Saved only for the company that makes video cards to play games the way they were meant to be played. NVIDIA!

[attachment: post-5177-0-64869900-1421225674.jpg]

 

 


Too many ****ing games!  Back log 4 life! :S


It's good to see that both the AMD defense force and the Nvidia fanclub are alive and well.

Just some random thoughts I got while reading this thread:

 

When Fermi was released it was the exact opposite: AMD users were saying that heat output and power consumption were really, really important, and Nvidia users said it didn't matter. Now that the tables have turned, so have the opinions of the fanboys, apparently.

 

We don't know if this is TDP, average or max consumption. If it's TDP or average then it is very high compared to other cards. If it's max then it's not that high. It's pretty pointless to argue about it right now though since we don't know which one it is.

 

High power consumption is never better than low power consumption if everything else is equal; high power consumption is bad. Just how bad 300W is depends on performance and heat output, and we don't know either of those yet. If it performs really, really well then 300W might be excusable. If it uses, let's say, 30% more power but only performs 10% better, then it might not be alright for some people. Some people will not take a 10% performance increase if their computer becomes a lot noisier, their room becomes hotter, and their power bill goes up.
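To put a number on that last scenario, a quick perf-per-watt check (the 10% and 30% figures are the hypothetical ones from the paragraph above):

```python
# Relative efficiency for a card that's 10% faster but draws 30% more power.
perf_ratio = 1.10    # hypothetical performance gain
power_ratio = 1.30   # hypothetical power increase

efficiency = perf_ratio / power_ratio
print(f"relative perf/W: {efficiency:.2f}")   # ~0.85, i.e. ~15% worse efficiency
```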

 

Don't post stress tests and act like they are normal usage. Don't post total system power consumption and act like it's the GPU alone. Both of these things just make you look like a fanboy desperately trying to justify a real drawback. You might trick some people and you might give false hope, but neither of those things is good.


... and no PhysX. Saved only for the company that makes video cards to play games the way they were meant to be played. NVIDIA!

[attachment: nvidia.jpg]

 

I am at a loss for words.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


I am at a loss for words.

I however am not. Nvidia fan-boy alert.

 

... and no PhysX. Saved only for the company that makes video cards to play games the way they were meant to be played. NVIDIA!

[attachment: nvidia.jpg]

When will people learn that it's better to go by what you want in a graphics card, not by the manufacturer alone, because you're too damn narrow-minded to consider other options?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


It won't be $1500. Was the 980 or 970 some ridiculous price? No, it wasn't. The only reason the Titan Black hasn't dropped in price is that they aren't targeting solely gamers with that card anymore, but also people who need a cheap workstation card (which the Titan is, in comparison to most workstation cards).

 

GM200 already exists, in the form of a Quadro card (this validation would mean a 50% increase in everything [shaders, ROPs, memory bandwidth], and a 50% performance increase over the GTX 980 for a fully unlocked variant):

 

-snip-

 

They already have it. This will most likely still be on 28nm; Pascal is pretty much reserved for 16nm. This (GM200) isn't coming out in 2016, it will be out in the summer if not sooner.

The GTX 980 and 970 had aspects of the card cut back to help cut costs. Even then they landed in the market at $550 and $350 respectively, whereas an R9 290X is only $269 today. You can start to see the trend where AMD has the upper hand in pricing. You can go CrossFire R9 290X instead of a single GTX 980 for less money and, with proper scaling, easily double the game performance. Pricing has always been Nvidia's downfall, as it is for Intel when AMD has a competitive architecture. People will always go for the cheaper option even if it only offers equivalent performance.

 

Doesn't have to be a full GM200; a cut-down version could work out as well. One thing is clear: if AMD's 390X performs better than the 980 (which it definitely will), Nvidia will just release a new card that outperforms it.

A cut-down GM200 would land you in the GTX 980 Ti market, which would more than likely run alongside the R9 390X in pricing, though I certainly wouldn't expect it to in performance. I don't think GM200 will be able to outperform the R9 390X if it tops out at just 3072 shaders. That would be a 50% gain in shader count over the GTX 980, whereas the R9 390X will be a 60% gain over the R9 290X, while also bringing a faster architecture and a wide bus with massive memory bandwidth from HBM. Not to mention the HBM will be on-package, communicating through TSVs, which cuts out the latency of the old GDDR5 interface.
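For context on the bandwidth claim, here's the simple peak-bandwidth arithmetic. The HBM figures (four 1024-bit stacks at 1 Gbps per pin) are the ones circulating in these rumors, not confirmed specs; the GDDR5 line matches the R9 290X for comparison:

```python
# Back-of-the-envelope peak memory bandwidth: rumored first-gen HBM vs GDDR5.
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8."""
    return bus_width_bits * gbps_per_pin / 8

hbm = bandwidth_gb_s(4 * 1024, 1.0)     # 512 GB/s (rumored 390X config)
gddr5_290x = bandwidth_gb_s(512, 5.0)   # 320 GB/s (R9 290X)
print(f"HBM: {hbm:.0f} GB/s vs 290X GDDR5: {gddr5_290x:.0f} GB/s")
```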

 

Or two 970s, which will be much quieter while consuming half as much. The 295X2 is about 900 EUR here at the moment; you can get three 970s for that price.

Or go for triple-CrossFire R9 290X for the win. Power consumption means nothing to most of us, as electricity is dirt cheap.

 

How so? They released the 780 a month or two after the Titan came out. Nvidia will price their cards cheaper if AMD outperforms them; you don't need to be extremely careful for that. Just like when the 290X came out, Nvidia cut $100-150 off the 780. They're not going to price a new GPU higher than AMD's equivalent while it performs worse.

AMD has already outperformed Nvidia's offerings time and time again, so don't think for a moment that Nvidia is top dog; AMD fires back with winning solutions every generation. It's an ongoing back-and-forth battle that really can't yield a true king of the industry. Yet Nvidia is still stagnant on pricing. This is one reason why I haven't ever bought any of their cards out of pocket: I refuse to pay more for an underperforming card when I can get higher frame rates for less. AMD has been banking on this by refreshing the same architecture for a couple of generations. The R9 300 series brings a new architecture, which opens the door for new competition. So far it's looking like a full-featured GM200 core will top out at 3072 shaders. If that's the case, AMD may have a better card for each and every Maxwell, backed by AMD's competitive pricing.

 

Volta is pretty much "confirmed" for 2017, and Pascal will probably come in 2016, which leaves GM200 coming out in 2015.

http://www.nvidia.com/content/gtc/documents/sc09_dally.pdf

 

Right now Volta: 

2017 GPU Node – 300W (GPU + memory + supply)

2,400 throughput cores (7,200 FPUs), 16 CPUs – single chip
40TFLOPS (SP) 13TFLOPS (DP)
Deep, explicit on-chip storage hierarchy
Fast communication and synchronization 

"Oh, it's 300 watts, the electricity bill will cry!" No! They don't care; it's for servers. And do you really think one 300W GPU will heat your room and warm your home? Volta right now is estimated at 300W per node, in systems with around 10MW of other components!

 
Considering their W9100 pulls 275W maximum, 300W is nothing, and actually pretty impressive.

It's really not up to the manufacturer when, after spending a year designing a FinFET architecture, the production line is backed up by other clients. If GM200 rolls out this year it will be on 28nm, the same as current Maxwell cards. Though keep in mind I said that if Nvidia decides to push GM200 down to 16nm, then we likely won't see it until 2016.


 

It's really not up to the manufacturer when, after spending a year designing a FinFET architecture, the production line is backed up by other clients. If GM200 rolls out this year it will be on 28nm, the same as current Maxwell cards. Though keep in mind I said that if Nvidia decides to push GM200 down to 16nm, then we likely won't see it until 2016.

Well, if they were to do so then Pascal would have to be released alongside Volta. IBM and Nvidia want their supercomputer.

 

Well, I really don't believe they'll skip 20nm just like that. According to some sources (reliability unknown), we can't get good yields even on the smaller Apple chips, let alone 300mm²-class chips. But if this were to happen I doubt we'd see consumer Volta until 2019, if at all.

Computing enthusiast. 
I use to be able to input a cheat code now I've got to input a credit card - Total Biscuit
 


Like the 980, they'll release 4GB versions first and then release 8GB versions a few months later.

 

4GB of VRAM is still plenty even for 1440p gaming. It's only when you get into 4K and multi-monitor setups that you need more. Also, with the latest generation of GPUs from both companies, the memory interface/architecture is getting faster and more optimized, which means that even if you start running out of VRAM in-game, you'll barely notice any difference.

Dude, no. 4GB is fine even for 4K at the moment, and for 5K in 90% of games. Remember the ultrawide setup of the Autobahnhammer? That had roughly the same amount of pixels as a 5K display, and the only time it ran out of its 4-gig buffer was when Shadow of Mordor was played with all those ultra textures and whatnot ;) So be sure that 4 gigs is enough.

 

And from a developer's point of view now:

 

If the coders have any idea how the engine fetches stuff from RAM and VRAM, 4 gigabytes will be enough to get us through 4K comfortably :) If the optimisation is crap (like it seems to be lately), then yes, we will need crazy 8GB cards soon.
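To put rough numbers behind that, here's what the render targets themselves cost at 4K. The buffer count is illustrative, not from any real engine; the point is that framebuffers are a small slice of VRAM, and textures are what actually fill it:

```python
# Approximate VRAM used by 4K render targets (illustrative buffer counts).
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4      # RGBA8

target_mib = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1024**2
print(f"one RGBA8 target at 4K: {target_mib:.1f} MiB")        # ~31.6 MiB

# Even a deferred renderer with half a dozen full-res targets plus depth:
print(f"seven full-res buffers: {target_mib * 7:.0f} MiB")    # ~221 MiB
```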

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]


Or go for triple-CrossFire R9 290X for the win. Power consumption means nothing to most of us, as electricity is dirt cheap.

Yeah, 1000W of heat dumping into your room, totally acceptable. And you'd want a custom loop for that. Three 970s should be around 450W, and with each of them at 1200-1400 RPM they would be significantly quieter than three 290Xs.


What beats me is that most gaming laptops are coming with 8GB of RAM and 1080p screens. Ridiculous, right? So why didn't they put 8GB on the new-gen Maxwells and the upcoming AMD 300 series, when they know there are several 4K options out there, and make the cards more future-proof?

I5 3570K@ 4.4 - GIGABYTE Z77- Kingston 8G 2400 RAM - MSI GTX980 - HAF-X - 27'' ROG SWIFT + 32'' LG IPS - OCZ 250G SSD + WD 4TB HDD - ASUS XONAR DX -Noctua NH-D14


What beats me is that most gaming laptops are coming with 8GB of RAM and 1080p screens. Ridiculous, right? So why didn't they put 8GB on the new-gen Maxwells and the upcoming AMD 300 series, when they know there are several 4K options out there, and make the cards more future-proof?

It's a marketing ploy.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Could 2.5D processes not be used to increase performance without die shrinks? If so, could this mean a move toward slower die shrinks but an increased focus on 2.5D, then 3D die stacking? Seems like a logical way to go, as each die shrink becomes more difficult due to electron quantum tunneling.



I am at a loss for words.

... and obviously losing with the video card you have, the R9 280X, as it has no PhysX, so you don't get to play all games the way they were meant to be played. So sad. :(

Too many ****ing games!  Back log 4 life! :S

