
AMD Wants To Crash Nvidia's GTX 980 & 970 Launch Party

We're talking about GPUs here, so your "fucking FX CPU" statement is irrelevant. :D

 

There's nothing wrong with the naming scheme. If you don't have enough brain cells to understand AMD's naming scheme, then that's not AMD's fault, is it?

 

Too bad the naming scheme lacks consistency and therefore is bad.

 

The R9 295X2's name suggests that its GPU core(s) are somehow better than the one in the R9 290X. However, they have exactly the same resources.

The R9 285 is not faster than the 280X the way the R7 265 is faster than the 260X.

The 270 and the 270X again have exactly the same GPU resources, while the -X suffix suggests that the card carrying it has more.

Kaveri's GPU is simply dubbed "R7". A layperson or uninformed buyer could assume it is anything from an R7 240 to an R7 265. They also do not distinguish between the 384 SP "R7" and the 512 SP "R7".

CPU: Intel Core i3 4370 (3.8GHz, 2C/4T) GPU: AMD R9 380X 4GB


Too bad the naming scheme lacks consistency and therefore is bad.

 

The R9 295X2's name suggests that its GPU core(s) are somehow better than the one in the R9 290X. However, they have exactly the same resources.

The R9 285 is not faster than the 280X the way the R7 265 is faster than the 260X.

The 270 and the 270X again have exactly the same GPU resources, while the -X suffix suggests that the card carrying it has more.

Kaveri's GPU is simply dubbed "R7". A layperson or uninformed buyer could assume it is anything from an R7 240 to an R7 265. They also do not distinguish between the 384 SP "R7" and the 512 SP "R7".

 

The R9 285 does indeed look like a strange move.

However, it is not that strange, because it's just a preview peek at the new R9 300 series.

Tonga is going to replace Tahiti, that's all.

The R9 370X is going to be based on Tonga as well.

Pitcairn (the current 270/270X, previously the 7870) will probably be abandoned as well.

Or it will be demoted to an R7 card.

So it isn't as messed up as you think it is.


I'd happily wear a blue Intel iGPU/APU shirt to the event if I could afford to go.


Tonga is only a few percent better in power consumption and heat output. Not even close to Nvidia's claimed 50%, sadly.

 

http://techgage.com/article/amd-radeon-r9-290x-nvidia-geforce-gtx-780-ti-review/10/

 

Where's the 50% more efficient power usage?

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


He's referring to Maxwell, not Kepler.

 

Considering the two Maxwell desktop cards we have use only about 15-25 watts less than their competitors, the R7 265 and R7 260X, that doesn't add up to 50% either.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


Considering the two Maxwell desktop cards we have use only about 15-25 watts less than their competitors, the R7 265 and R7 260X, that doesn't add up to 50% either.

We haven't seen power usage for the 970 and 980 yet, nor TDP. If you mean the 750 Ti, it easily uses 50% less electricity on average than the R7 270, and the power/clock scaling on it is beautiful.

Competition is measured by more than price point. The 750 Ti matches the 270 in performance.
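A quick way to sanity-check a "50% less electricity for the same performance" claim is to work it out as performance per watt from a review's own measurements. A minimal sketch in Python, with placeholder numbers rather than real benchmark data:

```python
# Compare performance per watt. The fps and wattage figures below are made-up
# placeholders, NOT measurements; substitute numbers from a review you trust.
cards = {
    "GTX 750 Ti": {"avg_fps": 40.0, "board_power_w": 60.0},   # hypothetical values
    "R7 270":     {"avg_fps": 42.0, "board_power_w": 120.0},  # hypothetical values
}

for name, data in cards.items():
    efficiency = data["avg_fps"] / data["board_power_w"]
    print(f"{name}: {efficiency:.2f} fps per watt")
```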

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


We haven't seen power usage for the 970 and 980 yet, nor TDP. If you mean the 750 Ti, it easily uses 50% less electricity on average than the R7 270, and the power/clock scaling on it is beautiful.

Competition is measured by more than price point. The 750 Ti matches the 270 in performance.

 

The GTX 750 Ti is, in fact, below the R7 265 in performance.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


The GTX 750 Ti is, in fact, below the R7 265 in performance.

Only if you're stupid enough to buy reference and/or run at stock speeds. The thing overclocks like a champ: put a maxed-out Gigabyte Windforce (you hit Nvidia's voltage barrier without hitting the thermal one) up against the R7 270, and the 270 loses handily.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Only if you're stupid enough to buy reference and/or run at stock speeds. The thing overclocks like a champ: put a maxed-out Gigabyte Windforce (you hit Nvidia's voltage barrier without hitting the thermal one) up against the R7 270, and the 270 loses handily.

 

http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/23

 

Doesn't quite look like that.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


Wtf? Is AMD seriously this immature? This is a really low point for AMD if they have to bash Nvidia just to get more fanboys.

Another reason why I don't consider AMD to be a serious company.

It's just a little bit of fun.

 

You don't stop playing because you get old, you get old because you stop playing.


Companies should compete with their products, not by disturbing or trolling the other company. I think it is childish.


Companies should compete with their products, not by disturbing or trolling the other company. I think it is childish.

 

How is it disturbing if they're there just to enjoy the show?

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


Yea... This is a little childish IMHO. Just release competitive GPUs and be done with it.

^^^This. I just lost more respect for AMD as a company. Grow up.

ヽ༼ຈل͜ຈ༽ノ raise your dongers ヽ༼ຈل͜ຈ༽ノ


It feels as though no games ever leave the BETA stage anymore, until about 3 years after it officially releases. - Shd0w2 2014


Green team will crush red team! Go green team go! Proud NVIDIA supporter xD. But seriously, this is going to be fun and funny ^_^. Imagine a whole group of red people just crashing in, haha. NVIDIA's gonna be like, "What the hell, when did we change our colours?" haha


Green team will crush red team! Go green team go! Proud NVIDIA supporter xD. But seriously, this is going to be fun and funny ^_^. Imagine a whole group of red people just crashing in, haha. NVIDIA's gonna be like, "What the hell, when did we change our colours?" haha

 

More people like you and fewer butthurt people, please.


Yup, Tonga will, but it will never match Maxwell's power consumption ratio!!! Long live Nvidia.

 

Why?

I saw that the new 980 cards still have an 8+6 pin PCIe power connector, which means it is still a ~250W-300W card, same as the current R9 290s.

So in my opinion, power consumption is irrelevant, because there won't be much of a difference between the Nvidia and AMD equivalents when it comes to power consumption.

They won't put an 8+6 pin power connector (225W from the connectors plus 75W from the PCIe slot) on a card that only consumes 180W, right?

Well, I suppose everybody now understands that all the power consumption statements are just irrelevant. Power consumption depends on so much more than only the GPU; a lot of those reviewers really have no clue what they are talking about.

That always makes me kind of giggle. :D
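For what it's worth, the connector arithmetic above can be written out explicitly. A minimal sketch in Python, assuming the usual PCI-SIG limits of 75 W for the slot, 75 W per 6-pin and 150 W per 8-pin connector:

```python
# Spec maximum power per source (PCI-SIG limits, in watts).
CONNECTOR_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(aux_connectors):
    """Spec ceiling in watts for a card fed by the slot plus the given aux connectors."""
    return CONNECTOR_LIMITS_W["slot"] + sum(CONNECTOR_LIMITS_W[c] for c in aux_connectors)

print(max_board_power(["8-pin", "6-pin"]))  # 300 W ceiling for an 8+6 pin card
print(max_board_power(["6-pin", "6-pin"]))  # 225 W ceiling for a 6+6 pin reference design
```

Of course, the connector configuration only sets a ceiling; the actual draw can sit well below it.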


Why?

I saw that the new 980 cards still have an 8+6 pin PCIe power connector, which means it is still a ~250W-300W card, same as the current R9 290s.

So in my opinion, power consumption is irrelevant, because there won't be much of a difference between the Nvidia and AMD equivalents when it comes to power consumption.

They won't put an 8+6 pin power connector (225W from the connectors plus 75W from the PCIe slot) on a card that only consumes 180W, right?

Well, I suppose everybody now understands that all the power consumption statements are just irrelevant. Power consumption depends on so much more than only the GPU; a lot of those reviewers really have no clue what they are talking about.

That always makes me kind of giggle. :D

The 980 is 6+6 pin on the reference design, with 6+8 for aftermarket cards or maybe for a 980 Ti. Please look at the actual connectors instead of just the solder points. Also, remember the overclocking headroom on the 750 Ti? It was HUGE! It's also quite possible Nvidia did this to let overclockers have a field day.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Why?

I saw that the new 980 cards still have an 8+6 pin PCIe power connector, which means it is still a ~250W-300W card, same as the current R9 290s.

So in my opinion, power consumption is irrelevant, because there won't be much of a difference between the Nvidia and AMD equivalents when it comes to power consumption.

They won't put an 8+6 pin power connector (225W from the connectors plus 75W from the PCIe slot) on a card that only consumes 180W, right?

Well, I suppose everybody now understands that all the power consumption statements are just irrelevant. Power consumption depends on so much more than only the GPU; a lot of those reviewers really have no clue what they are talking about.

That always makes me kind of giggle. :D

 

Well, the TDP of Maxwell seems to be much lower... so it should have a big impact on temps if not the total power draw... Power consumption is not irrelevant...


This is looking BIG! I have been needing a good GPU war lately; looks like I'll be getting it!

Build: Sister's new build |CPU i5 2500k|MOBO MSI h61m-p23 b3|PSU Rosewill 850w  |RAM 4GB 1333|GPU Radeon HD 6950 2GB OCedition|HDD 500GB 7200|HDD 500GB 7200|CASE Rosewill R5|Status online


Build: Digital Vengeance|CPU i7 4790k 4.8GHz 1.33V|MOBO MSI z97-Gaming 7|PSU Seasonic Xseries 850w|RAM 16GB G.skill sniper 2133|GPU Dual R9 290s|SSD 256GB Neutron|SSD 240GB|HDD 2TB 7200|CASE Fractal Design Define R5|Status online


Well, the TDP of Maxwell seems to be much lower... so it should have a big impact on temps if not the total power draw... Power consumption is not irrelevant...

 

TDP doesn't tell you jack shit about power consumption.

