
The World's Most Powerful Computer Chip Award for 2013 Goes To...

In terms of what FPS? 

Max? Average? Min? 

Question: Which of those matters most in the highest tier of single-chip GPU cards?

My answer is min, because average and max are usually above your refresh rate and so don't matter. And the 290X (and AMD in general) tends to do better at minimums than Nvidia. Just saying.
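For anyone unsure what min/avg/max actually measure, here's a minimal sketch of how they fall out of a frame-time capture (the frame times below are made-up illustrative numbers, not benchmark data):

```python
# Minimal sketch: deriving min/avg/max FPS from a frame-time capture.
# The frame times are made-up illustrative values, not real benchmark data.

frame_times_ms = [12.0, 13.5, 11.8, 30.2, 14.1, 12.9, 28.7, 13.3]  # one entry per rendered frame

fps_per_frame = [1000.0 / t for t in frame_times_ms]  # instantaneous FPS of each frame

min_fps = min(fps_per_frame)  # worst single frame (the stutter you actually feel)
avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)  # frames rendered / elapsed time
max_fps = max(fps_per_frame)  # best single frame, usually well above your refresh rate

print(f"min {min_fps:.1f} FPS, avg {avg_fps:.1f} FPS, max {max_fps:.1f} FPS")
```

The point being: two cards can post similar averages while one has noticeably worse minimums, and it's the minimums you notice.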

Fair point, I didn't think about that.

I do not feel obliged to believe that the same God who has endowed us with sense, reason and intellect has intended us to forgo their use, and by some other means to give us knowledge which we can attain by them. - Galileo Galilei
Build Logs: Tophat (in progress), DNAF | Useful Links: How To: Choosing Your Storage Devices and Configuration, Case Study: RAID Tolerance to Failure, Reducing Single Points of Failure in Redundant Storage, Why Choose an SSD?, ZFS From A to Z (Eric1024), Advanced RAID: Survival Rates, Flashing LSI RAID Cards (alpenwasser), SAN and Storage Networking


Fair point, I didn't think about that.

Yeah, most people don't. That was my argument for choosing AMD over Nvidia (back when I did).

I completely accept that the 780 Ti beats the 290X at all of them, but back when I was choosing, the 780 didn't, and the 290X was cheaper. 

Note that when I say "AMD tends to do better than Nvidia at minimum FPS", I mean at equivalent GPUs. I wouldn't equate a reference 290X to a 780 Ti. They aren't the same in price or performance. But that's just me. Others will be different. 

Mind you, the data I used to come to that conclusion about Nvidia vs. AMD was from a 7970 GHz Edition vs. a Titan. The 7970's minimums were ~10 FPS higher than the Titan's, although the Titan crushed the 7970 in all other areas.

This also isn't true when talking about ultra-high resolutions, simply because the whole "max and average FPS are usually higher than your refresh rate" thing only holds at 1080p, not at 5760x1080 or 4K (unless you lower the settings a bit).
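To put rough numbers on why that headroom disappears, here's the pixel-count arithmetic for the resolutions mentioned (standard resolutions, nothing else assumed):

```python
# Pixel counts for the resolutions mentioned above. The GPU has to shade every
# one of these pixels each frame, so the workload scales roughly with the total.
resolutions = {
    "1080p":     (1920, 1080),
    "5760x1080": (5760, 1080),  # triple-1080p surround
    "4K UHD":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>9}: {pixels:>9,} pixels ({pixels / base:.1f}x 1080p)")
```

Surround is three times the pixels of 1080p and 4K is four times, so frame rates that sit comfortably above 60 at 1080p drop well below it at those resolutions.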

Eh, it's complicated.

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


I completely accept that the 780 Ti beats the 290X at all of them, but back when I was choosing, the 780 didn't, and the 290X was cheaper.  

Also a good point, it isn't fair to pit one card against another with a $100 price difference.

 

And it might have been an even larger price difference before the coin mining frenzy took off.

I do not feel obliged to believe that the same God who has endowed us with sense, reason and intellect has intended us to forgo their use, and by some other means to give us knowledge which we can attain by them. - Galileo Galilei
Build Logs: Tophat (in progress), DNAF | Useful Links: How To: Choosing Your Storage Devices and Configuration, Case Study: RAID Tolerance to Failure, Reducing Single Points of Failure in Redundant Storage, Why Choose an SSD?, ZFS From A to Z (Eric1024), Advanced RAID: Survival Rates, Flashing LSI RAID Cards (alpenwasser), SAN and Storage Networking


The Bottom Line

 

As we have noted several times now, the Radeon R9 290X dominated the GTX 780 and GTX TITAN in Ultra HD 4K display gaming. The new GeForce GTX 780 Ti changes this ownage, and gives AMD competition at Ultra HD 4K resolution. The GeForce GTX 780 Ti gives you exactly the same gameplay experience as the Radeon R9 290X at Ultra HD 4K display gaming.

 

The statement above is very important. Though the AMD Radeon R9 290X now has competition at Ultra HD 4K display gaming, it isn't being "owned" by the GTX 780 Ti. The GTX 780 Ti, at $150 more, only equals the R9 290X. Both video cards are even, or on par with each other at Ultra HD 4K gaming.

 

The fact is that the Radeon R9 290X is delivering the same performance once again for a $150 savings. It takes the competition a $150 more expensive video card just to perform on par with the Radeon R9 290X. This reinforces our conclusion once again that the Radeon R9 290X is an incredible value right now. You really do get a lot of performance for your money.

 

The GeForce GTX 780 Ti has helped NVIDIA match the Radeon R9 290X in performance at Ultra HD 4K display gaming. However, AMD is doing it for $150 less. The only benefits the GTX 780 Ti holds over the reference R9 290X at this point, is that it is quieter and produces less heat. It's hard to justify the sound profile as being worth a $150 price premium.

http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming/7#.Us1sv7Sc5_M

 

Now also keep in mind that the Hawaii GPU is only 438 mm² while the GK110 GPU is 551 mm², which suggests AMD has the more area-efficient architecture for high-resolution gaming: it delivers the same 4K performance from a noticeably smaller die.
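A quick sketch of that die-area argument, assuming the roughly equal 4K performance the [H]ardOCP conclusion above describes (the performance-parity figure is that assumption, not a measurement):

```python
# Sketch of the "performance per die area" argument. Die sizes are the figures
# quoted above; the equal-performance assumption comes from the [H]ardOCP
# conclusion that both cards deliver the same 4K experience.
hawaii_mm2 = 438.0  # R9 290X (Hawaii)
gk110_mm2 = 551.0   # GTX 780 Ti (GK110)

relative_perf = 1.0  # assumed parity at 4K, per the review quoted above

print(f"Hawaii: {relative_perf / hawaii_mm2:.5f} perf units per mm²")
print(f"GK110:  {relative_perf / gk110_mm2:.5f} perf units per mm²")
print(f"GK110 needs {gk110_mm2 / hawaii_mm2:.2f}x the die area for the same 4K result")
# -> roughly 1.26x, which is the whole basis of the "more efficient architecture" claim
```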


http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming/7#.Us1sv7Sc5_M

 

Now also keep in mind that the Hawaii GPU is only 438 mm² while the GK110 GPU is 551 mm², which suggests AMD has the more area-efficient architecture for high-resolution gaming: it delivers the same 4K performance from a noticeably smaller die.

Isn't the opposite true for AMD vs. Intel? AMD's CPU dies are large while Intel's are smaller and perform the same or better.

And my goodness. That review was redundant. Almost painfully so.

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


@Vitalius thanks for saying practically everything I would have said whilst I was asleep.

I just thought I'd add that the R9 290 is priced at £300 here, while performing only a small amount below the 780 Ti, which costs a grand total of £520 minimum (not taking into account P&P).

When you can get CrossFired 290s for a small amount more than a single 780 Ti (£80-100, relative to the large sums of money we're dealing with), I struggle to see how Nvidia can justify such a cost.
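Spelling out the arithmetic with the UK prices quoted above (P&P ignored, as noted):

```python
# Price arithmetic using the UK prices quoted above (P&P ignored).
r9_290_gbp = 300      # single R9 290
gtx_780ti_gbp = 520   # cheapest 780 Ti mentioned

crossfire_290_gbp = 2 * r9_290_gbp  # two 290s in CrossFire

print(f"780 Ti premium over one 290:  £{gtx_780ti_gbp - r9_290_gbp}")
print(f"Two 290s vs one 780 Ti:       £{crossfire_290_gbp - gtx_780ti_gbp} more for a second GPU")
print(f"780 Ti / 290 price ratio:     {gtx_780ti_gbp / r9_290_gbp:.2f}x")
```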

I will admit that Nvidia generally has a more "premium" feel to it, but at the same time AMD has a more prosumer feel; they just feel like a nicer company overall.

Look at FreeSync, for example: while it most likely isn't perfect in comparison to G-Sync, it's most likely at least half as good, for "free".

 

Don't get me wrong, I know that every company ultimately just wants your money (all your money are belong to them), but AMD isn't afraid to take risks with its own financial position in order to build a reputation and loyalty among customers. With the number of different APUs they've got in thousands (millions) of devices, I'm sure their combined market share across all of their sectors is on par with, or even above, the competition's combined, because they not only make so many different x86 solutions but also make ARM devices. One might say they've stretched themselves too thin across the board, but since their purchase of ATI last decade they've been using developments in every sector to benefit the others.

 

Maybe it's just me, but doesn't a company that is very open about its developments and prospects appeal to you guys?

 

(Just thought I'd say I'm extremely dizzy right now, just woke up and I fell down the stairs about 15 minutes ago so please mind any punctuation/grammar mistakes, thank you.)

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820K with Corsair H110 | 32GB DDR4 RAM @ 1600MHz | XFX Radeon R9 290 @ 1.2GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair but meh, should have had a Corsair SSD and RAM | 1.3TB HDD space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


It consumes marginally less power at any decent load (a 1-20 watt difference).

 

And as said above, this source seems like a joke.

What people mean, I think, is that it's more efficient: the GK110 consumes less power (and therefore puts out less heat) yet is more powerful, so it delivers more performance per watt.
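"More efficient" boils down to performance per watt. A minimal sketch with purely illustrative placeholder numbers, not measured figures for either card:

```python
# Minimal performance-per-watt sketch. The numbers are purely illustrative
# placeholders, NOT measured figures for the 780 Ti or the 290X.
cards = {
    "Card A": {"avg_fps": 60.0, "board_power_w": 250.0},
    "Card B": {"avg_fps": 57.0, "board_power_w": 270.0},
}

for name, d in cards.items():
    print(f"{name}: {d['avg_fps'] / d['board_power_w']:.3f} FPS per watt")

# Note: essentially all electrical power a GPU draws ends up as heat, so lower
# power draw and lower heat output go hand in hand; "efficiency" is about how
# much performance you get for that power.
```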

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


Not if they can both crush 1080p to the same degree. Take a Ferrari Enzo and a Bugatti Veyron. Both will do 0-60 in under 3 seconds. Both will do 150 mph without breaking a sweat. If you want to do 200 mph then both will do that too. But the Veyron will continue up to 260 mph while the Enzo will stop accelerating at about 220.

But neither the R9 290X nor the GTX 780 Ti can handle 4K at all.

20-30 FPS is not playable.

And the whole 4K talk is bullshit, because most games don't even have 4K textures or 4K shadow maps.

Not only are the GPUs not ready for 4K, the games aren't ready either.

If they want to talk about high-resolution gaming, they should talk about 1440p or 1600p/1620p, which will be the real next gaming resolution, not 4K.

And those low FPS figures aren't even from next-gen games.

 

RTX2070OC 


But neither the R9 290X nor the GTX 780 Ti can handle 4K at all.

20-30 FPS is not playable.

And the whole 4K talk is bullshit, because most games don't even have 4K textures or 4K shadow maps.

Not only are the GPUs not ready for 4K, the games aren't ready either.

If they want to talk about high-resolution gaming, they should talk about 1440p or 1600p/1620p, which will be the real next gaming resolution, not 4K.

And those low FPS figures aren't even from next-gen games.

 

Yes. They can. 

Did you completely miss my post about that?

The game also decides how "playable" a given FPS feels; Far Cry 3, for example, is known for being smooth/playable at low FPS.

Relatively cheap 4K monitors are on the way. I have no doubt that 4K is where I'm going next from 1080p, within 6 months at most.

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


Matters aside... I'm guessing this was more of a troll post, since this website looks like it was made by some kid in an hour...

"Her tsundere ratio is 8:2. So don't think you could see her dere side so easily."


Planning to make your debut here on the forums? Read Me First!


unofficial LTT Anime Club Heaven Society


NVIDIA FTW

 

  1. GLaDOS: i5 6600 EVGA GTX 1070 FE EVGA Z170 Stinger Cooler Master GeminS524 V2 With LTT Noctua NFF12 Corsair Vengeance LPX 2x8 GB 3200 MHz Corsair SF450 850 EVO 500 Gb CableMod Widebeam White LED 60cm 2x Asus VN248H-P, Dell 12" G502 Proteus Core Logitech G610 Orion Cherry Brown Logitech Z506 Sennheiser HD 518 MSX
  2. Lenovo Z40 i5-4200U GT 820M 6 GB RAM 840 EVO 120 GB
  3. Moto X4 G.Skill 32 GB Micro SD Spigen Case Project Fi

 


I think many of you miss the point of a GPU being capable of, and "built around", 4K gaming...

Of course you are not going to get a stable 60 FPS on a 4K monitor setup at ULTRA settings with MSAA and the like...

I'm running a 5870... I'm NOT running full graphics on newer titles at 1080p... Does that mean it is not capable? Certainly not! That's my opinion at least.

Everyone has a cool signature. I don't, so I thought I would write something.

- Cool right?


I call BIAS. The 780Ti consumes less power and still performs better. Even the 290 performs better than the 290X (in Linus' benchmarks).

 

I don't get it.

 

EDIT: This site also has AMD plastered all over it. I wouldn't trust it. It's also a WIX site. Like, the creator did not even want to create a proper website anyway.


ヽ༼ຈل͜ຈ༽ノ raise your dongers ヽ༼ຈل͜ຈ༽ノ


It feels as though no games ever leave the BETA stage anymore, until about 3 years after it officially releases. - Shd0w2 2014


I call BIAS. The 780Ti consumes less power and still performs better. Even the 290 performs better than the 290X (in Linus' benchmarks).

 

I don't get it.

 

EDIT: This site also has AMD plastered all over it. I wouldn't trust it. It's also a WIX site. Like, the creator did not even want to create a proper website anyway.

 

The 290X technically has better compute performance, but the 780 Ti generally gets better FPS (although above 1440p they get a little more even). So it depends on how you want to cherry-pick your criteria, I suppose. I mean, Ryan Shrout and the guys over at PC Perspective chose the 780 as their GPU of the year. It's just people's end-of-year round-ups/lists, the idea of which is a tad silly, imo.


Like the ASUS option, the Sapphire card performed better than the GTX 780 Ti in Bioshock Infinite and Crysis 3 while the GTX 780 Ti's only definitive victory came in Battlefield 3.  In the three other games tested, the cards were so close that I'll call it a performance tie.

In 2 of the 3 games with a decisive result, the 290X beats a 780 Ti. Add that to its superior compute power, and I wouldn't feel bad about calling it the world's most powerful video card.

link

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


In 2 of the 3 games with a decisive result, the 290X beats a 780 Ti. Add that to its superior compute power, and I wouldn't feel bad about calling it the world's most powerful video card.

link

Thank you for bringing in aftermarket coolers. Everyone still assumes that 290x's are shit due to the reference cooler.

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


In 2 of the 3 games with a decisive result, the 290X beats a 780 Ti. Add that to its superior compute power, and I wouldn't feel bad about calling it the world's most powerful video card.

link

 

Out of curiosity, do we have any benchmarks comparing aftermarket 780 Ti's to aftermarket 290X's? I only glanced at PC Per's reviews, and it's tough since they didn't seem to do a full review on the EVGA GTX 780 Ti ACX. But in Metro: LL, the aftermarket 780 Ti seems to outperform the Sapphire 290X (admittedly it sucks to compare graphs of FPS vs. time in two different reviews). And does the situation look the same at 1080p as well?


Not if they can both crush 1080p to the same degree. Take a Ferrari Enzo and a Bugatti Veyron. Both will do 0-60 in under 3 seconds. Both will do 150 mph without breaking a sweat. If you want to do 200 mph then both will do that too. But the Veyron will continue up to 260 mph while the Enzo will stop accelerating at about 220.

Actually, the Enzo is listed at 3.1 seconds and higher for older models, while the Bugatti is at 2.4. Comparing low-speed acceleration with top speed makes no real sense anyway.

 

 


The 290X technically has better compute performance, but the 780 Ti generally gets better FPS (although above 1440p they get a little more even). So it depends on how you want to cherry-pick your criteria, I suppose. I mean, Ryan Shrout and the guys over at PC Perspective chose the 780 as their GPU of the year. It's just people's end-of-year round-ups/lists, the idea of which is a tad silly, imo.

I'd also say that the 780 is my GPU of the year. But in terms of sheer performance, heat output, and noise, the 780 Ti wins.

ヽ༼ຈل͜ຈ༽ノ raise your dongers ヽ༼ຈل͜ຈ༽ノ


It feels as though no games ever leave the BETA stage anymore, until about 3 years after it officially releases. - Shd0w2 2014


lol. Looks like a 14-year-old AMD fanboy found out how to use the <p> and <img> tags in HTML and made that website. Have you seen their other 5 news items? Let me mention them: "new radeons", "the amd advantage", and "amd launches 290x, beats titan and costs $450 less".

Finally my Santa hat doesn't look out of place


What a joke... That website is hardly reputable.

