Radeon R9 290X hits $900 -- AnandTech

I hate VAT T_T. 20% on everything I buy. If I want to buy a 780, I have to pay an extra £60, for no fucking reason! Seriously, that's around $100 more, just because this shitty government likes to take money from anything and everything. I pay VAT on bread and milk.

We have free, good healthcare and free education, alongside a bunch of other free things. Stop complaining.

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much VRAM do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820K with Corsair H110 | 32GB DDR4 RAM @ 1600MHz | XFX Radeon R9 290 @ 1.2GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair, but meh, should have had a Corsair SSD and RAM | 1.3TB HDD space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


We have free, good healthcare and free education, alongside a bunch of other free things. Stop complaining.

All I'm saying is

[image: tumblr_m0h40ow99W1qc7mh1.jpg]

bring it down to 15% :P or 10


 

-snip-

 

I've explained it before. Not doing it again, so just go to that post.

TL;DR:

If you can't understand why average FPS is a terrible metric for how you experience a game, then you won't get it. Minimum FPS is king at the top tier of GPUs, and AMD GPUs are generally better at it than Nvidia GPUs. A 7970 GHz Edition beat a 780 and a Titan at it. After checking the 290 (X or not), it beats them and the Ti too.

Why? Because unless you have a 120/144Hz monitor, any FPS over 60 is pointless to have, but everything below it is very relevant.

So since I'm going 4K in the future with a 60Hz IPS monitor (once they exist and are relatively cheap), AMD is better for me than Nvidia, because the minimum FPS needs to stay playable or the card can't handle High-Ultra graphics (I am crossfiring them), which would defeat the purpose, in my opinion, of having the best GPU the manufacturer offers.

Therefore, I will not buy an inconsistent GPU, because inconsistent means the minimum FPS is low, which means there will be moments that feel noticeably unplayable.
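
To put rough numbers on that (these are made up, not from any review), here's a quick sketch of how a log of per-frame render times, the kind of data tools like FRAPS record, turns into the average and minimum FPS figures:

# Rough sketch: turning made-up per-frame render times (ms) into the
# average and minimum FPS figures reviews quote. Not real benchmark data.

frame_times_ms = [14, 15, 16, 16, 17, 30, 33, 15, 14, 16]  # two nasty spikes

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # frames / total seconds
min_fps = min(fps_per_frame)                                    # worst single frame

print(f"average: {avg_fps:.1f} FPS")  # ~54 FPS, looks fine on paper
print(f"minimum: {min_fps:.1f} FPS")  # ~30 FPS, the part you actually notice

The average completely hides the two spikes, which is the point being made here.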

We have free, good healthcare and free education, alongside a bunch of other free things. Stop complaining.

See, I wouldn't necessarily consider free healthcare a positive, because by nature it is always seeking to cut costs, which means it's very possible for patients not to be screened thoroughly enough to find what may actually be wrong.

The US healthcare system, being for-profit and such, may cost a lot, but you can be dang sure they are going to find something if you go to the doctor, because they will schedule every possible scan they can get away with to get money out of you. A free healthcare system wouldn't do that, in order to minimize costs.

People can talk about costs all day, but an ounce of prevention is worth a pound of cure, and that holds true here as well, since finding a problem early makes treatment much easier.

Now, you could argue various other circumstances and options, but in this case at least, for-profit healthcare seems better. It creates a lot of waste, but in the face of all that waste and all that money, I am left to ask, "But what is the value of a human life?"

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


I hate VAT T_T. 20% on everything I buy. If I want to buy a 780, I have to pay an extra £60, for no fucking reason! Seriously, that's around $100 more, just because this shitty government likes to take money from anything and everything. I pay VAT on bread and milk.

Yo brah, there's 24% VAT here. Then, on top of that, retailers add another 5-10% because they want to make money. Also, Europe gets prices in euros. So... I have to pay $800 for a 780 here.
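
Just to show how that stacks up, here's a rough sketch; the base price, markup, and exchange rate below are placeholders I picked, not actual quotes:

# Rough sketch of how VAT plus a retailer markup stacks onto a GPU's
# pre-tax price. Base price, markup, and exchange rate are placeholders,
# not actual quotes.

base_eur = 450.0       # hypothetical pre-tax price of a GTX 780 in euros
vat = 0.24             # 24% VAT
retail_markup = 0.07   # retailers adding roughly 5-10% on top

shelf_eur = base_eur * (1 + vat) * (1 + retail_markup)
usd_per_eur = 1.35     # placeholder exchange rate

print(f"shelf price: {shelf_eur:.0f} EUR (~${shelf_eur * usd_per_eur:.0f} USD)")
# works out to roughly 600 EUR, i.e. around the $800 figure above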

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASRock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU; 1TB Seagate Barracuda; Corsair 200R case.


My welcome what?

where really sure it's talking about his welcome 

 

No but seriously, I don't understand how people still don't get the difference between your and you're or its and it's or were and where. It's elementary grade grammar that anybody could understand. 


Yo brah, there's 24% VAT here. Then, on top of that, retailers add another 5-10% because they want to make money. Also, Europe gets prices in euros. So... I have to pay $800 for a 780 here.

You should move to the UK, only 20% hahahaha


Glad I got my 290 when I did!

Intel Core i7 4770K | Gigabyte R9 290 OC 4GB Windforce Edition | MSI Z87-G45 Gaming Mobo

Noctua NF-S12A FLX 120 + 140mm Fans | Corsair H100i CPU Cooler | Samsung 840 EVO 250GB SSD | BitFenix Ghost Case

Seagate Barracuda 2TB | Corsair RM-850 PSU | G. Skill Sniper 16GB (4x4GB) | AOC G2460PQU 144Hz Monitor


I've explained it before. Not doing it again, so just go to that post.

TL;DR:

If you can't understand why average FPS is a terrible metric for how you experience a game, then you won't get it. Minimum FPS is king at the top tier of GPUs, and AMD GPUs are generally better at it than Nvidia GPUs. A 7970 GHz Edition beat a 780 and a Titan at it. After checking the 290 (X or not), it beats them and the Ti too.

Why? Because unless you have a 120/144Hz monitor, any FPS over 60 is pointless to have, but everything below it is very relevant.

So since I'm going 4K in the future with a 60Hz IPS monitor (once they exist and are relatively cheap), AMD is better for me than Nvidia, because the minimum FPS needs to stay playable or the card can't handle High-Ultra graphics (I am crossfiring them), which would defeat the purpose, in my opinion, of having the best GPU the manufacturer offers.

Therefore, I will not buy an inconsistent GPU, because inconsistent means the minimum FPS is low, which means there will be moments that feel noticeably unplayable.

 

 

Yeah, you're not going to convince anyone that a 7970 GHz Edition is better than a 780 or Titan, so just stop it. It's nonsense.

•  i7 4770k @ 4.5ghz • Noctua NHL12 •  Asrock Z87 Extreme 4 •  ASUS GTX 780 DCII 1156/6300 •

•  Kingston HyperX 16GB  •  Samsung 840 SSD 120GB [boot] + 2x Seagate Barracuda 2TB 7200RPM •

•  Fractal Design Define R4  •  Corsair AX860 80+ Platinum •  Logitech Wireless Y-RK49  •  Logitech X-530  •


Yeah, you're not going to convince anyone that a 7970 GHz Edition is better than a 780 or Titan, so just stop it. It's nonsense.

Pfff, do you even read, bro?

It makes no logical sense to measure top-tier GPU performance by average FPS when, 90% of the time, that average is above 60 FPS at 1080p and most people have 60Hz 1080p monitors.

The only time a 780 or Titan is a better choice (relative to the 290, X or not, at MSRP) is if you are gaming at a resolution greater than 1080p or have a 120/144Hz monitor. And most people don't have either of those.

I never said one was better than the other in general. I said the 7970 GHz Edition has better minimum FPS than the 780 or the Titan, and the same is true of the 290X. Which, based on the tests done by the reviewers, is a fact.

So unless you classify facts as nonsense (which is what a troll would do), it's not nonsense.

Minimum FPS is king at the top tier because it's the thing that logically matters to most gamers (the exceptions being those who play at resolutions greater than 1080p or have 120/144Hz monitors).

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


Pfff, do you even read, bro?

It makes no logical sense to measure top-tier GPU performance by average FPS when, 90% of the time, that average is above 60 FPS at 1080p and most people have 60Hz 1080p monitors.

The only time a 780 or Titan is a better choice (relative to the 290, X or not, at MSRP) is if you are gaming at a resolution greater than 1080p or have a 120/144Hz monitor. And most people don't have either of those.

 

While it's true that most people have a 60Hz 1080p monitor, I think that anyone who is considering one of those top cards has a better monitor than that.


While it's true that most people have a 60Hz 1080p monitor, I think that anyone who is considering one of those top cards has a better monitor than that.

<Owns a 290X

<Has a 1080p 60Hz monitor

Steam has figures for all of this stuff. Rather than just using anecdotal evidence (like my situation), we could use that, although it doesn't show us the percentage of people with [insert top-tier GPU] and 1080p60 monitors.

I don't agree with you on that. They might have 1440p, but I would expect that to be a rarity since it's not mainstream anyway. If you are going to go Eyefinity or 4K, odds are you are going Crossfire/SLI, which puts you outside this demographic anyway.

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


Keep in mind that the top GPUs keep up with the games of today. That most likely won't be the case in 2-3 years; you'll have to lower the settings or upgrade.

Motherboard - Gigabyte P67A-UD5 Processor - Intel Core i7-2600K RAM - G.Skill Ripjaws @1600 8GB Graphics Cards - MSI and EVGA GeForce GTX 580 SLI PSU - Cooler Master Silent Pro 1,000W SSD - OCZ Vertex 3 120GB x2 HDD - WD Caviar Black 1TB Case - Corsair Obsidian 600D Audio - Asus Xonar DG


   Hail Sithis!


<Owns a 290X

<Has a 1080p 60Hz monitor

Steam has figures for all of this stuff. Rather than just using anecdotal evidence (like my situation), we could use that, although it doesn't show us the percentage of people with [insert top-tier GPU] and 1080p60 monitors.

I don't agree with you on that. They might have 1440p, but I would expect that to be a rarity since it's not mainstream anyway. If you are going to go Eyefinity or 4K, odds are you are going Crossfire/SLI, which puts you outside this demographic anyway.

 

 

1440p monitors are a rarity, sure. But so are people willing to spend $700 on a GPU. While it's true that I have no actual numbers on that, I still think there is a fairly significant overlap between the people who have a 1440p monitor and the people who are willing to spend $700 on a GPU. But maybe I'm biased, since I own one myself.


1440p monitors are a rarity, sure. But so are people willing to spend $700 on a GPU. While it's true that I have no actual numbers on that, I still think there is a fairly significant overlap between the people who have a 1440p monitor and the people who are willing to spend $700 on a GPU. But maybe I'm biased, since I own one myself.

That's now. I'm talking about the GPUs at their MSRP, or whatever it's called (which is $550 for the 290X). The situation we are currently in kind of throws out everything I've said, as the 290X is obscenely more expensive than it normally would be, and out of stock everywhere.

But in terms of sheer FPS, if price is ignored, the 290X is best (in terms of minimum FPS, as I've said).

And I might be biased because of what I own. We just don't have numbers on this, so I just go with what I know I need.

The 290X particularly fits well with my upgrade path because I intend to Crossfire them later (once prices drop back to normal) and buy a 4K monitor (once those drop in price as well). I'm looking long term from the point when I could buy a GPU (before 290Xs were $700+ and the 780 Ti was released).

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


That's now. I'm talking about the GPUs at their MSRP, or whatever it's called (which is $550 for the 290X). The situation we are currently in kind of throws out everything I've said, as the 290X is obscenely more expensive than it normally would be, and out of stock everywhere.

But in terms of sheer FPS, if price is ignored, the 290X is best (in terms of minimum FPS, as I've said).

And I might be biased because of what I own. We just don't have numbers on this, so I just go with what I know I need.

The 290X particularly fits well with my upgrade path because I intend to Crossfire them later (once prices drop back to normal) and buy a 4K monitor (once those drop in price as well). I'm looking long term from the point when I could buy a GPU (before 290Xs were $700+ and the 780 Ti was released).

I certainly agree that the 290X is good value at the proper price. I have one of those myself. I just don't think it's fair to say that it is ridiculous to compare top-end cards on average FPS, because at higher resolutions the average FPS is still a relevant measure.


I certainly agree that the 290X is good value at the proper price. I have one of those myself. I just don't think it's fair to say that it is ridiculous to compare top-end cards on average FPS, because at higher resolutions the average FPS is still a relevant measure.

See, I don't think it's a relevant measure, because it doesn't accurately convey the gaming experience you are going to have. These numbers aren't taken from anywhere; they are just examples:

Say I get 75 max FPS, 60 average FPS, and 35 minimum FPS. 35 FPS kind of sucks, and just telling you the average isn't going to tell you the minimum. The minimums and maximums are relevant because minimums mean lag/slideshow or a noticeable degradation of your gaming experience, and maximums (unless you use some form of frame sync) mean the same degradation, but with frame tearing. And if you use Vsync you get mouse input lag, and if you use G-Sync you have to pay a lot more. "FreeSync" is practically non-existent currently.

Both min and max FPS have negatives if they are too extreme. Therefore, the more consistent GPU is best, because its extremes aren't as extreme, so the negatives are minimized.

Whereas if I had 65 max FPS, 55 average FPS, and 40 minimum FPS, and you just looked at the average, you would think the first option is better because it's "more", and that's it, even though your gaming experience would likely be worse due to the more extreme max/min FPS relative to this max/min FPS.

Most reviews (not all, but most) I've seen only bother with frame latency and average FPS, and I find that somewhat deceitful because it doesn't tell me what I need to know about a GPU.

I guess, technically, average FPS is good for measuring GPU power, but GPU power doesn't translate to a good gaming experience (because of drivers, among various other things).

I'm just saying, it doesn't make sense to me why the average is always used. It's so little information that, to me at least, it's worthless.
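
The same comparison, written out as a quick sketch with those same made-up example numbers:

# Sketch of the comparison above, using the same made-up example numbers:
# card A has the higher average, card B has the higher floor and the
# tighter (more consistent) spread.

card_a = {"max": 75, "avg": 60, "min": 35}
card_b = {"max": 65, "avg": 55, "min": 40}

for name, card in (("A", card_a), ("B", card_b)):
    spread = card["max"] - card["min"]
    print(f"card {name}: avg {card['avg']} FPS, min {card['min']} FPS, spread {spread} FPS")

# On average alone, A "wins" (60 vs 55). On the floor and on consistency,
# B wins (min 40 vs 35, spread 25 vs 40 FPS), which is the whole argument
# for judging cards by minimum FPS rather than average.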

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


See, I don't think it's a relevant measure, because it doesn't accurately convey the gaming experience you are going to have. These numbers aren't taken from anywhere; they are just examples:

Say I get 75 max FPS, 60 average FPS, and 35 minimum FPS. 35 FPS kind of sucks, and just telling you the average isn't going to tell you the minimum. The minimums and maximums are relevant because minimums mean lag/slideshow or a noticeable degradation of your gaming experience, and maximums (unless you use some form of frame sync) mean the same degradation, but with frame tearing. And if you use Vsync you get mouse input lag, and if you use G-Sync you have to pay a lot more. "FreeSync" is practically non-existent currently.

Both min and max FPS have negatives if they are too extreme. Therefore, the more consistent GPU is best, because its extremes aren't as extreme, so the negatives are minimized.

Whereas if I had 65 max FPS, 55 average FPS, and 40 minimum FPS, and you just looked at the average, you would think the first option is better because it's "more", and that's it, even though your gaming experience would likely be worse due to the more extreme max/min FPS relative to this max/min FPS.

Most reviews (not all, but most) I've seen only bother with frame latency and average FPS, and I find that somewhat deceitful because it doesn't tell me what I need to know about a GPU.

I guess, technically, average FPS is good for measuring GPU power, but GPU power doesn't translate to a good gaming experience (because of drivers, among various other things).

I'm just saying, it doesn't make sense to me why the average is always used. It's so little information that, to me at least, it's worthless.

I agree that using only one parameter for comparison is a bad idea. Ideally reviews should show all parameters, including power usage, noise and such. I guess I might have just misunderstood what you said. I thought you meant that the average should be completely disregarded. I agree that it shouldn't be the only comparison parameter.


The price of the 290X didn't really increase in my country. It's the same as when it was released.


I agree that using only one parameter for comparison is a bad idea. Ideally reviews should show all parameters, including power usage, noise and such. I guess I might have just misunderstood what you said. I thought you meant that the average should be completely disregarded. I agree that it shouldn't be the only comparison parameter.

My bad. You are right. I should've added "on its own" when saying "average FPS is a terrible way to measure GPU performance for a consumer" (paraphrased).

I just don't like it in general, because it doesn't tell me anything useful, whether I look at it alongside min/max or not. You could say it tells you what FPS you can expect most of the time while playing a game, but that's not necessarily true. It's just an average of FPS samples taken per second over a period of time.

If you want that information, they should tell us the mode FPS (the most frequently occurring value).
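
Something like this, as a rough sketch with invented per-second samples:

# Rough sketch of the "mode FPS" idea: report the most frequently
# occurring per-second FPS reading instead of the mean. Samples invented.

from statistics import mean, mode

fps_log = [58, 59, 60, 60, 60, 60, 59, 42, 60, 60, 35, 60]  # per-second samples

print(f"mean FPS: {mean(fps_log):.1f}")  # ~56, dragged down by the two dips
print(f"mode FPS: {mode(fps_log)}")      # 60: what you actually see most of the time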

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 

