
AMD employee confirms new GPU with HBM and 300W

Kowar

You people who OC your CPUs and GPUs, thus increasing your power consumption, are arguing about how much power AMD cards draw, when in fact it doesn't matter at all. Plus, we still don't really know how much power it will draw or how well it will perform until the card is released. I'm more excited about HBM than anything else.


Since when did everyone become so power conscious about desktop components?

As I mentioned earlier, if power consumption and heat output are always a secondary thought, will that be a viable long-term strategy?


Are fanboys that desperate that they're bashing AMD over the phrase "300W" in a CV?


Are fanboys that desperate that they're bashing AMD over the phrase "300W" in a CV?

 

Yes?

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

As I mentioned earlier, if power consumption and heat output are always a secondary thought, will that be a viable long-term strategy?

That depends... If the performance increase is 40-50% and power consumption stays the same, then it's fine. The problem would be a negligible performance increase with no reduction, or even a slight increase, in power draw.

As I mentioned earlier, if power consumption and heat output are always a secondary thought, will that be a viable long-term strategy?

 

You don't draw out a blueprint for a car meant to be extremely fast with the primary thought being fuel efficiency. Does efficiency play an important role? Definitely. But if you neuter the design just to make it slightly more efficient, you'll never have the fastest car.

 

So if you can push it harder and still get a stable performance increase, why not?


Are fanboys that desperate that they're bashing AMD over the phrase "300W" in a CV?

I bought my R9 290 because it was the best card for the money at the time. The only reason I consider myself part of "team red" is because there are so many obnoxious 12-year-old fanboys on "team green".


I bought my R9 290 because it was the best card for the money at the time. The only reason I consider myself part of "team red" is because there are so many obnoxious 12-year-old fanboys on "team green".

 

I assure you there are lots of 12-year-old fanboys on team red too. :)

 

Being a fan of something is cool, but pure fanboyism is deeming your side undeniably right regardless of what they do. That's toxic and just not good in general.


Whine all you want; that's the worst-case scenario, when you apply a constant, demanding load to the GPU, similar to running Prime95 nonstop. Under a gaming load, the actual real-world use case for this card, total power consumption is much, much lower. And the ~280W figure is from an overclocked card with a massive cooler and a boatload of fans on it. Fans don't consume power to move air, right? /s

All in all, everything is rumor until proven otherwise, but I won't be putting this card in my rig any time soon.

The 970 G1 has a 250W TDP BIOS on it by default; it can definitely pull more than the 145W spec they give it.

Yeah, right. The GTX 970 is a 145W TDP card. Whine about it as much as you want.

 

Also, per http://www.techpowerup.com/reviews/ASUS/GTX_970_STRIX_OC/23.html: average power consumption is ~160W.

---

 

WOW, 300W. That's dual-GPU-range TDP. I think the stock card will either have a liquid cooler or a very, very loud stock cooler. Again.

Stuff:  i7 7700k @ (dat nibba succ) | ASRock Z170M OC Formula | G.Skill TridentZ 3600 c16 | EKWB 1080 @ 2100 mhz  |  Acer X34 Predator | R4 | EVGA 1000 P2 | 1080mm Radiator Custom Loop | HD800 + Audio-GD NFB-11 | 850 Evo 1TB | 840 Pro 256GB | 3TB WD Blue | 2TB Barracuda

Hwbot: http://hwbot.org/user/lays/ 

FireStrike 980 ti @ 1800 Mhz http://hwbot.org/submission/3183338 http://www.3dmark.com/3dm/11574089


about time

I5 3570K@ 4.4 - GIGABYTE Z77- Kingston 8G 2400 RAM - MSI GTX980 - HAF-X - 27'' ROG SWIFT + 32'' LG IPS - OCZ 250G SSD + WD 4TB HDD - ASUS XONAR DX -Noctua NH-D14


You people who OC your CPUs and GPUs, thus increasing your power consumption, are arguing about how much power AMD cards draw, when in fact it doesn't matter at all. Plus, we still don't really know how much power it will draw or how well it will perform until the card is released. I'm more excited about HBM than anything else.

It's actually a huge deal, and a really bad sign.

If AMD is releasing that card, they're already at the limit of what they can do with the architecture on 28nm. Nvidia, on the other hand, isn't; they will either hold cards back or release them at insanely high prices, like they did with the 700 series for months.

 

RTX2070OC 


I couldn't care less if it draws 1000W. As long as it's cheaper and performs on par with the 980, it'll be a win in my books :D.


Looking forward to this new card from AMD. 3D stacking with HBM is the future, and as a PC enthusiast I'm not really concerned with its TDP. I'm also looking forward to AMD releasing future APUs with HBM as L3 cache, or on-chip HBM. Too bad AMD has so much debt; it's hard to innovate with poor cash flow.

Test ideas by experiment and observation; build on those ideas that pass the test, reject the ones that fail; follow the evidence wherever it leads and question everything.


It doesn't matter if it draws 300W or more; all that matters is performance per watt and, of course, price. To answer that we need a review! Otherwise we can all speculate as much as we want and it won't mean anything; it just adds fuel to the fanboy war.


My previous speculation about this happening is starting to prove true. This could very well mean that Zen may have HBM on-package, if pricing comes down by then.

 

You clearly don't pay your own power bills.

Electricity is cheap.
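For what it's worth, the actual cost difference is easy to ballpark. A quick sketch (the 100W delta, 3 hours/day, and $0.12/kWh below are illustrative assumptions, not figures from this thread):

```python
# Rough yearly cost of a GPU's extra power draw.
# All input figures are illustrative assumptions.
def yearly_cost(extra_watts, hours_per_day, dollars_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

# e.g. a card drawing 100W more than a rival, gamed on 3h/day at $0.12/kWh
print(round(yearly_cost(100, 3, 0.12), 2))  # -> 13.14 dollars a year
```

Around a dollar a month for a 100W gap, under those assumptions, which is why few desktop buyers notice it on the bill.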


Meh... my plasma draws more power, and if I'm gaming, it's turned off.


TOTAL SYSTEM POWER CONSUMPTION, not "GPU POWER CONSUMPTION".

The eye sees what it wants to see. They have an i7 4960X at 4.2 GHz, with cooling and whatever else they stuffed in there, drawing power from the wall. Check what you post.

An overclocked i7 Extreme Edition will eat up a big chunk of that on its own, so does that make the R9 290X a 265W card?

FANBOY OF: PowerColor, be quiet!, Transcend, G.Skill, Phanteks

FORMERLY FANBOY OF: A-Data, Corsair, Nvidia

DEVELOPING FANBOY OF: AMD (GPUS), Intel (CPUs), ASRock


Couldn't this be a dual-GPU solution, with each single GPU at 150W?


Couldn't this be a dual-GPU solution, with each single GPU at 150W?

 

I somehow doubt it'll be a dual-GPU solution. I'm sure the dual-GPU card will be named the 395X2, as it was last generation.

Or maybe they'll give us a tri-GPU card, the 395X3, so we can make more home-heating jokes than ever. Like the clever bastards we are.

Plus, "AMD ASIC engineer Shternshain mentioned the 'Radeon R9 380X GPU' as the largest chip of the 'King of the Hill' product line" makes it sound like a single chip.


I feel bad for AMD. They are going to drop this and take the GPU crown away from the 980, and then BOOM, NVIDIA will release GM200.

We likely won't see GM200 until 2016, when it will come in the form of another TITAN and possibly a GTX 980 Ti. AMD has a card for combating those two in the form of the R9 390X.


We likely won't see GM200 until 2016, when it will come in the form of another TITAN and possibly a GTX 980 Ti. AMD has a card for combating those two in the form of the R9 390X.

 

Not likely at all. NVIDIA released the original Titan 10-11 months after the release of the GTX 680, and the 980 released in September, so we will likely see the next Titan by the summer.

 

Also, the 290X didn't release until 8 months after the first Titan, so the Titan held the crown for that long. AMD took the crown again for a brief period; then NVIDIA released the 780 Ti literally a month later (the 290X launched in October, the 780 Ti in November). So, following that pattern... I still feel bad for AMD.


What happened to all of the new power efficiencies from Tonga? This is beyond even Hawaii...


Not likely at all. NVIDIA released the original Titan 10-11 months after the release of the GTX 680, and the 980 released in September, so we will likely see the next Titan by the summer.

Point being, I don't think there are many consumers waiting on a $1500 card. Where the R9 390X falls in terms of pricing will be what decides Nvidia's fate in the enthusiast-grade market until their next architecture. Right now you can grab a single R9 295X2 for half of what a TITAN Black costs and maintain a large margin of performance leadership. Nvidia has to be extremely careful from this point onward about where they price their products.

From the sounds of it, Nvidia wants to hold out and use 16nm for GM200, which would push them back to a Q4 2015 launch with limited availability, meaning most consumers won't get their hands on one until 2016. There's a lot of stir about AMD using either 20nm or 28nm for their 300-series GPUs; personally I think we will see 20nm, as GloFo has been capable of mass-producing 20nm 2.5D for a little while now.

 

What happened to all of the new power efficiencies from Tonga? This is beyond even Hawaii...

Keep in mind the large boost in compute units. The R9 380X is rumored at 48 compute units (3072 SP), while the R9 280X has 32 compute units (2048 SP). The R9 280X can pull close to 300W as well, if he's referring to actual power consumption; meanwhile its TDP is 250W, so that's only a 50W boost in TDP to add 16 compute units (1024 SP).
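The arithmetic behind those figures is easy to verify: GCN packs 64 stream processors per compute unit. A quick check of the numbers quoted above (the 48-CU / 300W figures are rumored, not confirmed):

```python
# GCN architecture: 64 stream processors (SP) per compute unit (CU).
SP_PER_CU = 64

r9_380x_cus = 48   # rumored
r9_280x_cus = 32   # known

print(r9_380x_cus * SP_PER_CU)  # -> 3072 SP
print(r9_280x_cus * SP_PER_CU)  # -> 2048 SP

# TDP delta for the extra 16 CUs (1024 SP), per the figures above
print(300 - 250)                # -> 50 W
```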


What happened to all of the new power efficiencies from Tonga? This is beyond even Hawaii...

The efficiencies are there. This is a 4096-core GCN GPU: a 45% performance increase over Hawaii for only a 3% increase in TDP.
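Taken at face value, those two figures imply a sizeable efficiency gain; a quick check (treating the quoted 45% and 3% as given, not verified):

```python
# Relative performance-per-watt vs. Hawaii, from the figures quoted above.
perf_gain = 1.45   # +45% performance (rumored)
tdp_gain = 1.03    # +3% TDP (rumored)

perf_per_watt_gain = perf_gain / tdp_gain - 1
print(f"{perf_per_watt_gain:.0%}")  # -> 41% better perf/watt
```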
