
The R9 295X2 is the best card on the market?

Marcelo Delicia

Personally I have no problem with AMD drivers; I thought that Nvidia were having driver problems currently?

 

Nvidia does have the less stable driver ATM. Just try to ignore this guy; he's spitting mad fanboy fire today. Just let him be, or add him to your ignore preferences like I already did.

Updated 2021 Desktop || 3700x || Asus x570 Tuf Gaming || 32gb Predator 3200mhz || 2080s XC Ultra || MSI 1440p144hz || DT990 + HD660 || GoXLR + ifi Zen Can || Avermedia Livestreamer 513 ||

New Home Dedicated Game Server || Xeon E5 2630Lv3 || 16gb 2333mhz ddr4 ECC || 2tb Sata SSD || 8tb Nas HDD || Radeon 6450 1g display adapter ||


Says the one who claims 60Hz is "enthusiast".

 

Playing The Witcher 3 at 5K 40 fps and downsampling to 1440p, while complaining that anything that isn't 144Hz isn't "enthusiast". Logic isn't your strong point, is it?

 

You may as well have gotten a 4K 60Hz monitor and actually gotten the full benefit of the pixels you're spending computational resources rendering. You could even have gotten more fps that way, too.
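To put numbers on the "wasted pixels" point, here's a quick sketch of the raw pixel counts involved (these are just the standard resolution figures, nothing specific to any particular monitor):

```python
# Raw pixel counts for the resolutions being argued about.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Rendering at 5K but displaying on a 1440p panel means every displayed
# pixel is averaged down from 4 rendered ones (2x scale on each axis).
print(pixels["5K"] / pixels["1440p"])  # 4.0 rendered pixels per shown pixel

# Native 4K would render ~1.78x fewer pixels than 5K while actually
# displaying every one of them.
print(pixels["5K"] / pixels["4K"])     # ~1.78
```

So a 5K-to-1440p downsample does roughly 1.78x the rendering work of native 4K, and the panel only ever shows a quarter of the rendered pixels.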


Personally I have no problem with AMD drivers; I thought that Nvidia were having driver problems currently?

Oh yeah, and can you use VSR for 4K, 5K and 8K?

If I had a 295X2 now, I wouldn't be able to play The Witcher 3 at 5K downscaled to 1440p.

Open your eyes and break your chains. Console peasantry is just a state of mind.

 

MSI 980Ti + Acer XB270HU 


Playing The Witcher 3 at 5K 40 fps and downsampling to 1440p, while complaining that anything that isn't 144Hz isn't "enthusiast". Logic isn't your strong point, is it?

 

You may as well have gotten a 4K 60Hz monitor and actually gotten the full benefit of the pixels you're spending computational resources rendering. You could even have gotten more fps that way, too.

And be stuck with 60Hz for the hundreds of hours that I play Battlefield and CS:GO?

Also, I play The Witcher at 4K DSR or 5K DSR because the game simply doesn't have any AA other than FXAA, and forcing AA in the Nvidia Control Panel doesn't work for me.

 

My K/D has simply doubled since I switched from my old 27-inch 1080p 60Hz LG monitor back in April.

 

Also, you forgot that I'm currently sitting in front of an Acer XB280HK.

 

Open your eyes and break your chains. Console peasantry is just a state of mind.

 

MSI 980Ti + Acer XB270HU 


And be stuck with 60Hz for the hundreds of hours that I play Battlefield and CS:GO?

 

Also, I play The Witcher at 4K DSR or 5K DSR because the game simply doesn't have any AA other than FXAA, and forcing AA in the Nvidia Control Panel doesn't work for me.

 

I play with AA off entirely; it isn't really needed at 4K. If you want to buy a monitor specifically for two games, then whatever, that's your choice. But in a lot of games it just doesn't matter. If you play RPGs like The Witcher and Dragon Age instead of competitive FPSes, you are going to get a tonne more benefit out of 4K than 144Hz. Your vitriol at the entire concept of high res, just because it doesn't suit two games that you play a lot, doesn't make much sense.


Nvidia does have the less stable driver ATM. Just try to ignore this guy; he's spitting mad fanboy fire today. Just let him be, or add him to your ignore preferences like I already did.

 

You are the one that bought the Fury X while the 980Ti performs better. We are PC gamers; we use proof, not opinions, to justify our claims.

 

 

 

Open your eyes and break your chains. Console peasantry is just a state of mind.

 

MSI 980Ti + Acer XB270HU 


I play with AA off entirely; it isn't really needed at 4K. If you want to buy a monitor specifically for two games, then whatever, that's your choice. But in a lot of games it just doesn't matter. If you play RPGs like The Witcher and Dragon Age instead of competitive FPSes, you are going to get a tonne more benefit out of 4K than 144Hz. Your vitriol at the entire concept of high res, just because it doesn't suit two games that you play a lot, doesn't make much sense.

 

Well, I have tried the Acer XB280HK and it's shit, mainly because it's TN; but yeah, it's the only G-SYNC 4K monitor out there.

The difference between 1440p IPS and 4K TN just isn't there when you are immersed in a game. Maybe if you stop and stare at the scenery in The Witcher 3 you can see a difference.

So 4K will not enter my wishlist until next year, if AMD/Nvidia finally wake up and add DisplayPort 1.3 to their cards.

 

It's kinda funny that we're having this discussion here, since I was the one laughing at people buying 970s and 980s at the start of this year, when I recommended the 295X2 because it was $550 on Newegg and Amazon. I also recommended the 390 every single time over the shitty 3.5GB 970. Nvidia just sucks at making good cards for the budget and mainstream segments. But then, AMD is the one lacking in hardware and software for enthusiasts like me. My last card was a TRI-X R9 290, btw.

 

I just buy whatever is the best on the market; currently the Acer XB270HU is the best gaming monitor out there and the 980Ti is the most powerful single-GPU graphics card.

Open your eyes and break your chains. Console peasantry is just a state of mind.

 

MSI 980Ti + Acer XB270HU 


Well, I've SLI'd 660 Tis and didn't really see a great performance increase; most of the games I play either have terrible scaling or didn't support it at all. I just like having a single card because it has fewer issues. As for the frame timings, I just assumed they hadn't been fixed; I haven't really looked into it, since it doesn't affect me in any way whatsoever...

 

I'm not an Nvidia fanboy, I just don't like dual-GPU card solutions; for me they don't make sense. Sure, others will see the use case, but I don't. I guess the advantages of a 295X2 are the price and the Tier 3 DX12 compatibility, so bigger performance gains thanks to async shaders.

AMD fixed their timing issues... Nvidia has done some work on it, but still has some issues to date, unfortunately. It seems Nvidia is ignoring SLI issues for the most part and focusing entirely on having a strong enough single GPU so they don't need to deal with fixing SLI...

 

Microstutters do happen, yes...

 

But after nearly 3 years of using dual 7950s, and now about three quarters of a year with my 295X2, things are really nice now. It wasn't really bad before, but even the minor issues CF had have been ironed out these days. At this point, it really comes down to whether or not you have a Crossfire profile for the game. Even then, if my CF doesn't work, I still have an OC'd 290X.

Link to comment
Share on other sites

Link to post
Share on other sites

Do you own a dual-GPU card?

Have you owned one?

 

If not, shut up.

 

I own one, and I've been using CF for years... all this frame-sync bullshit is seriously annoying. Fun fact, though: guess where I see the most microstutters?

 

In Unigine Valley, Unigine Heaven and Fire Strike... which are not games. Shocking, isn't it?

 

Some games do stutter due to my OC... yeah, that's what happens when you OC an R9 295X2. Things get a bit unstable (mostly because I'm hitting the TJmax limiter at 75°C... wish I could set it to 80 or 85°C...).

 

 

I've had two dual-GPU setups in the last 3 years (two 780s and two 970s), and not once did I have any issues with microstutter. Not sure where everyone gets this shit; for every person that has problems with microstutter, I'm willing to bet there are 20+ people that don't.

 

I did benchmarks with a friend who has a Titan X back when I had my two 970s, and I was consistently beating him by over 10% in every benchmark and every game we tested, even when both parties had OC'd their card(s) and CPU. While testing I had zero issues with stutters or bad scaling; I was seeing scaling above 85% in everything I tested. (Hell, in BF4 I was seeing scaling over 90%; that game is amazingly well optimized for SLI/CF from what I have seen.)
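For reference, the "scaling" percentage people quote is usually the extra fps the second GPU delivers, relative to what a perfect doubling would add. A minimal sketch of that calculation; the fps figures below are made up for illustration, not from the benchmarks above:

```python
def sli_scaling(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Percentage of a second GPU's theoretical gain actually realized.

    100% means the second card doubled the frame rate; 0% means it
    added nothing.
    """
    return (dual_gpu_fps - single_gpu_fps) * 100.0 / single_gpu_fps

# Hypothetical numbers: one card at 60 fps, the pair at 114 fps.
print(sli_scaling(60, 114))  # 90.0 -> the second GPU delivered 90% of its potential
```

By this measure, "scaling above 85%" means a second card is adding at least 85% of a full extra card's worth of frames.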

 

Even after going back to a single-GPU setup that's watercooled and OC'd balls to the wall, it doesn't seem any smoother to me than my 970s were.

 

And the whole "some games scale badly" thing is such a bullshit excuse; 99% of the time, if a game doesn't support CF/SLI, it will run PERFECTLY fine on 1 GPU.

Stuff:  i7 7700k | ASRock Z170M OC Formula | G.Skill TridentZ 3600 c16 | EKWB 1080 @ 2100 mhz  |  Acer X34 Predator | R4 | EVGA 1000 P2 | 1080mm Radiator Custom Loop | HD800 + Audio-GD NFB-11 | 850 Evo 1TB | 840 Pro 256GB | 3TB WD Blue | 2TB Barracuda

Hwbot: http://hwbot.org/user/lays/ 

FireStrike 980 ti @ 1800 Mhz http://hwbot.org/submission/3183338 http://www.3dmark.com/3dm/11574089


You are the one that bought the Fury X while the 980Ti performs better. We are PC gamers; we use proof, not opinions, to justify our claims.

 

 

 

I bet those benchmarks were not done on W10 (which has improved AMD performance pretty much across the board).


Personally I have no problem with AMD drivers; I thought that Nvidia were having driver problems currently?

Nah, Nvidia has just had a few tiny, tiny issues with bluescreens and hard restarts lately... nothing major, though. I mean, they said they would patch it soon... then again, they've said that for the past 3 months :|


This topic is now closed to further replies.