AMD's own testing? Fury X vs GTX 980Ti

zMeul

No DVI port in this shit.

Not an issue for 99.9% of users, when you can use any number of third party DP to DVI passive adapters that cost ~$10, and even if you need an active adapter for DL-DVI, they're still only like $20-30 now.

 

I love it when people whine about 4K being unplayable. Then I have to ask: what's wrong with FHD? What's wrong with a nice 1080p? Suddenly it's not enough? Pull yourselves together, people.

There's nothing "wrong" with FHD, but we as PC gamers have been playing at 1080p for over a decade now, and in some cases even longer, if you happened to have a high-res CRT back in the day.

 

When sitting at a computer, you can visually perceive the pixels of a 1080p monitor. If you can up that to 1440p or higher, it can greatly increase visual enjoyment and immersion - assuming you have the hardware to power it.

 

Not to mention, the added desktop real estate you get with 1440p and 4K is excellent for productivity-oriented tasks.
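
To put some numbers on the "you can see the pixels" point: whether the pixel grid is resolvable comes down to angular pixel density, which depends on panel size, resolution, and viewing distance. A minimal sketch in Python, assuming a 24" 1080p panel and a 27" 1440p panel viewed from about 60 cm (assumed figures, not from this thread), against the common ~60 pixels-per-degree rule of thumb for 20/20 acuity:

```python
import math

def pixels_per_degree(diag_in, horiz_px, vert_px, distance_cm):
    """Angular pixel density for a flat panel viewed head-on."""
    aspect = horiz_px / vert_px
    width_cm = diag_in * 2.54 * aspect / math.sqrt(1 + aspect ** 2)
    pitch_cm = width_cm / horiz_px  # physical size of one pixel
    deg_per_px = math.degrees(2 * math.atan(pitch_cm / (2 * distance_cm)))
    return 1 / deg_per_px

# Assumed setups: 24" 1080p and 27" 1440p, both viewed from ~60 cm.
for name, diag, w, h in [('24" 1080p', 24, 1920, 1080), ('27" 1440p', 27, 2560, 1440)]:
    ppd = pixels_per_degree(diag, w, h, 60)
    # ~60 px/deg matches the 1-arcminute-per-pixel 20/20-acuity rule of thumb.
    verdict = "pixel grid resolvable" if ppd < 60 else "pixel grid not resolvable"
    print(f"{name}: {ppd:.0f} px/deg -> {verdict}")
```

By this rough measure, both panels sit below the ~60 px/deg threshold at desk distance, with 1440p noticeably closer, which is why the jump in sharpness is so visible.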

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Lol, how much you wanna bet they are doing their overclocked watercooled card vs a reference 980 Ti at stock (maybe even with GPU Boost locked).

Stock doesn't mean shit, I want to hear the overclock margins on the Fury X. (I really have to hope they are significantly better than the 200 series was...)

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS



Not an issue for 99.9% of users, when you can use any number of third party DP to DVI passive adapters that cost ~$10, and even if you need an active adapter for DL-DVI, they're still only like $20-30 now.

 

When sitting at a computer, you can visually perceive the pixels of a 1080p monitor. If you can up that to 1440p or higher, it can greatly increase visual enjoyment and immersion - assuming you have the hardware to power it.

It is an issue for people with Korean monitors that only accept DVI-D and do not work at all with any adapters. At least, they didn't work with adapters last time I checked.

 

And yeah, 1440p is a great resolution to game on. I've found that 1080p isn't that nice, especially on bigger monitors.

Asus B85M-G / Intel i5-4670 / Sapphire 290X Tri-X / 16GB RAM (Corsair Value 1x8GB + Crucial 2x4GB) @1333MHz / Coolermaster B600 (600W) / Be Quiet! Silent Base 800 / Adata SP900 128GB SSD & WD Green 2TB & SG Barracuda 1TB / Dell AT-101W / Logitech G502 / Acer G226HQL & X-Star DP2710LED


Every other game is listed in both tables, so why was Skyrim left out when it's still a fairly demanding game (after mods) and still hugely popular? Showcasing it running better than the competition would only draw in more Skyrim enthusiasts who enjoy the high-resolution texture packs and all of that. What I'm getting at is: the more you go over both of these tables combined, the more fabricated they appear.

Remember those 100+ mod versions of Skyrim? The ones that absolutely crushed GPUs at the time?

 

I want someone to show us it happening in real time, with modern GPUs.

I do not feel obliged to believe that the same God who has endowed us with sense, reason and intellect has intended us to forgo their use, and by some other means to give us knowledge which we can attain by them. - Galileo Galilei
Build Logs: Tophat (in progress), DNAF | Useful Links: How To: Choosing Your Storage Devices and Configuration, Case Study: RAID Tolerance to Failure, Reducing Single Points of Failure in Redundant Storage , Why Choose an SSD?, ZFS From A to Z (Eric1024), Advanced RAID: Survival Rates, Flashing LSI RAID Cards (alpenwasser), SAN and Storage Networking


Remember those 100+ mod versions of Skyrim? The ones that absolutely crushed GPUs at the time?

 

I want someone to show us it happening in real time, with modern GPUs.

IIRC, Linus and Slick benched a heavily modded version of Skyrim, right? Hopefully we'll see some in-depth coverage once they get their hands on the card.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Why aren't people complaining about this? The top-tier Nvidia AND AMD cards are both struggling to hit 60 fps at 4K. We have had "mainstream" 4K for over two years, and we still don't have a GPU powerful enough to run games at 60 fps?

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


No DVI port in this shit.

I think this is a bigger issue than everyone crying over HDMI 2.0.

Not sure who to blame though... AMD for not including common connectors, or the display manufacturers for not using the obviously superior connector (DisplayPort).

 

 

Not an issue for 99.9% of users, when you can use any number of third party DP to DVI passive adapters that cost ~$10, and even if you need an active adapter for DL-DVI, they're still only like $20-30 now.

Active adapters are terrible. All the ones I have tried (very very small sample size though) have had an enormous loss in quality.


Why aren't people complaining about this? The top-tier Nvidia AND AMD cards are both struggling to hit 60 fps at 4K. We have had "mainstream" 4K for over two years, and we still don't have a GPU powerful enough to run games at 60 fps?

4K is not mainstream, and it won't be for a long time; in fact, it may never be. GPUs are progressing much faster than CPUs are, but it takes time and R&D. I think both sides are doing the best they can and are making great products. You can't expect them to make a card that's 4x faster at the push of a button.
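
For a sense of scale, the raw pixel arithmetic alone explains the problem: 4K pushes exactly four times as many pixels per frame as 1080p, so at the same frame rate a GPU needs roughly 4x the pixel throughput, before even counting per-pixel shading cost. A quick sketch:

```python
# Raw pixel-throughput arithmetic; ignores per-pixel shading cost,
# which also tends to grow with resolution-dependent effects.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} Mpx/frame, "
          f"{px * 60 / 1e9:.2f} Gpx/s at 60 fps, "
          f"{px / base:.2f}x 1080p")
```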

i7 4770k/Maximus VII Hero/Titan X/ 500gb 850evox2 r0/1000 EVGA G2


I think this is a bigger issue than everyone crying over HDMI 2.0.

Not sure who to blame though... AMD for not including common connectors, or the display manufacturers for not using the obviously superior connector (DisplayPort).

 

 

Active adapters are terrible. All the ones I have tried (very very small sample size though) have had an enormous loss in quality.

DVI would be a huge bottleneck for this card. The Fury X is targeted at 1440p and 4K resolutions, and most monitors at those resolutions already support DP and HDMI anyway.


4K is not mainstream, and it won't be for a long time; in fact, it may never be. GPUs are progressing much faster than CPUs are, but it takes time and R&D. I think both sides are doing the best they can and are making great products. You can't expect them to make a card that's 4x faster at the push of a button.

But they are advertising it for "4K" and "5K" gaming. That is the point I am trying to make. 

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


DVI would be a huge bottleneck for this card. The Fury X is targeted at 1440p and 4K resolutions, and most monitors at those resolutions already support DP and HDMI anyway.

Then why not just have the 3 DisplayPorts, if it is for 1440p and 4K? The HDMI will hold the card back. And bottleneck is the wrong word.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


Alright, granted, the Fury X is priced the same as the 980 Ti and is doing better, so it is a Titan X killer.

But... the gap isn't as wide as I'd like it to be. It's not doing that much better; however, there's still one reason to get this over the 980 Ti, and that's Eyefinity!

The only reason I haven't bothered with AMD in the last few years is that AMD cards weren't doing as well as the high-end NVIDIA cards; thankfully, that's no longer the case with the Fury X.

But there's no HDMI 2.0, and I can't find anything on whether DisplayPort 1.3 is supported; judging by what little I can find, it does not support 1.3. <.<

DP to HDMI 2.0 adapters will eventually come, so I'm not that worried about HDMI 2.0, but I would like DP 1.3...
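
For context on why HDMI 2.0 and DP 1.3 keep coming up: whether a link can drive a mode is just payload bandwidth versus what the mode needs. A rough sketch using nominal spec rates (approximate payload figures; real timings, audio, and protocol overhead shift these a bit):

```python
# Approximate video payload capacity per link, in Gb/s.
links_gbps = {
    "HDMI 1.4 (340 MHz TMDS)": 340e6 * 24 / 1e9,  # ~8.16
    "HDMI 2.0 (600 MHz TMDS)": 600e6 * 24 / 1e9,  # ~14.4
    "DP 1.2 HBR2 (4 lanes)": 4 * 5.4 * 0.8,       # 8b/10b coding -> 17.28
    "DP 1.3 HBR3 (4 lanes)": 4 * 8.1 * 0.8,       # 8b/10b coding -> 25.92
}

def mode_gbps(w, h, hz, bpp=24, blanking=1.05):
    """Bandwidth a mode needs at 8 bpc with reduced-blanking-style timings."""
    return w * h * hz * blanking * bpp / 1e9

for name, cap in links_gbps.items():
    for w, h, hz in [(3840, 2160, 60), (5120, 2880, 60)]:
        need = mode_gbps(w, h, hz)
        ok = "OK" if need <= cap else "no"
        print(f"{name}: {w}x{h}@{hz} needs ~{need:.1f} Gb/s vs {cap:.1f} -> {ok}")
```

By these rough numbers, 4K60 at 8 bpc fits DP 1.2 and HDMI 2.0 but not HDMI 1.4, and it's 5K60 that pushes you toward DP 1.3.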

I'm Batman!

Steam: Rukiri89 | uPlay: Rukiri89 | Origin: XxRukiriXx | Xbox LIVE: XxRUKIRIxX89 | PSN: Ericks1989 | Nintendo Network ID: Rukiri

Project Xenos: Motherboard: MSI Z170a M9 ACK | CPU: i7 6700k | Ram: G.Skil TridentZ 16GB 3000mhz | PSU: EVGA SuperNova 850w G2 | Case: Caselabs SMA8 | Cooling: Custom Loop | Still in progress 


That card can support up to 375 watts of power via the 2x 8-pin connectors plus the PCIe slot (150 W per 8-pin, 75 W from the slot). Imagine what you'll be able to achieve with a custom block and voltage hacks. My god..

What if Fiji can't overclock very well? The sheer number of cores may limit headroom. Why do you think Nvidia got SUCH good overclocks on the 900 series?
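
On the power figure quoted above: the in-spec board power falls straight out of the nominal PCIe electromechanical limits (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin connector), so 2x 8-pin works out to 375 W before any out-of-spec headroom. A trivial sketch of that arithmetic:

```python
# Nominal PCIe power limits: 75 W slot, 75 W per 6-pin, 150 W per 8-pin.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_budget(six_pin=0, eight_pin=0):
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(board_power_budget(eight_pin=2))  # Fury X layout: 2x 8-pin -> 375 W
```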

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


The FC4 benchmark matches, so I would say yes.

But people will still be disappointed.

Because people are always disappointed; the forum is too green.

The only thing I'm disappointed about is the fact that 7970 equivalents and 680 equivalents are still mid-high end solutions :\ and that is just me being generally sad with the industry. 


Alright, granted, the Fury X is priced the same as the 980 Ti and is doing better, so it is a Titan X killer.

But... the gap isn't as wide as I'd like it to be. It's not doing that much better; however, there's still one reason to get this over the 980 Ti, and that's Eyefinity!

The only reason I haven't bothered with AMD in the last few years is that AMD cards weren't doing as well as the high-end NVIDIA cards; thankfully, that's no longer the case with the Fury X.

But there's no HDMI 2.0, and I can't find anything on whether DisplayPort 1.3 is supported; judging by what little I can find, it does not support 1.3. <.<

DP to HDMI 2.0 adapters will eventually come, so I'm not that worried about HDMI 2.0, but I would like DP 1.3...

I don't see why they don't just put 3 DisplayPorts, two DVI, and an HDMI port on. Nvidia did it. It works, as it still has 3 DP, and compatibility with most displays.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


Then why not just have the 3 DisplayPorts, if it is for 1440p and 4K? The HDMI will hold the card back. And bottleneck is the wrong word.

 

Bottleneck is the right word. DVI only supports up to 2560 x 1600 at 60 Hz or 1920 x 1080 at 120 Hz, and monitors at those resolutions would have DisplayPort anyway. The HDMI can be used for a secondary monitor, and DP should be used for the primary.
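
Those DVI ceilings fall out of the TMDS pixel clock: single-link tops out at 165 MHz, dual-link at double that, and a mode needs roughly active pixels x refresh rate plus blanking overhead. A minimal sketch (the ~20% blanking factor approximates standard CVT timings; reduced-blanking modes need somewhat less):

```python
# TMDS pixel-clock check for DVI: single link caps at 165 MHz, dual at 330 MHz.
SINGLE_LINK_MHZ, DUAL_LINK_MHZ = 165, 330

def needed_mhz(w, h, hz, blanking=1.20):
    """Approximate pixel clock with ~20% blanking overhead (CVT-like)."""
    return w * h * hz * blanking / 1e6

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 120), (2560, 1600, 60), (3840, 2160, 60)]:
    mhz = needed_mhz(w, h, hz)
    link = ("single-link" if mhz <= SINGLE_LINK_MHZ
            else "dual-link" if mhz <= DUAL_LINK_MHZ
            else "beyond DVI")
    print(f"{w}x{h}@{hz}: ~{mhz:.0f} MHz -> {link}")
```

Of the common modes, only 1080p60 fits single-link; 1080p120 and 2560x1600@60 already need dual-link, and 4K60 is beyond DVI entirely.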


The only thing I'm disappointed about is the fact that 7970 equivalents and 680 equivalents are still mid-high end solutions :\ and that is just me being generally sad with the industry. 

It's irritating, but it makes sense: if they still have products that need selling, why undercut them? So the 7970 is still a mid-high range card, but it's CHEAP, so it sells quite well.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


Bottleneck is the right word. DVI only supports up to 2560 x 1600 at 60 Hz or 1920 x 1080 at 120 Hz, and monitors at those resolutions would have DisplayPort anyway. The HDMI can be used for a secondary monitor, and DP should be used for the primary.

In what way does DVI reduce the speed of a computer?

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


It's irritating, but it makes sense: if they still have products that need selling, why undercut them? So the 7970 is still a mid-high range card, but it's CHEAP, so it sells quite well.

I didn't say that I don't understand it. I understand it perfectly. That doesn't mean I should be happy about almost 4-year-old cards still being considered mid-high end.


Bottleneck is the right word. DVI only supports up to 2560 x 1600 at 60 Hz or 1920 x 1080 at 120 Hz, and monitors at those resolutions would have DisplayPort anyway. The HDMI can be used for a secondary monitor, and DP should be used for the primary.

Quite a lot of monitors with resolutions above 1920x1080 do not have DisplayPort, and I am pretty sure a lot of people buying this card will use it with 1920x1080 screens.


I don't see why they don't just put 3 DisplayPorts, two DVI, and an HDMI port on. Nvidia did it. It works, as it still has 3 DP, and compatibility with most displays.

They should have just gone with quad DP, but on the bright side there's the possibility of a single-slot bracket with a waterblock.

What I want to see is benchmarks for multi-display setups, especially 4K Surround.

I'm Batman!

Steam: Rukiri89 | uPlay: Rukiri89 | Origin: XxRukiriXx | Xbox LIVE: XxRUKIRIxX89 | PSN: Ericks1989 | Nintendo Network ID: Rukiri

Project Xenos: Motherboard: MSI Z170a M9 ACK | CPU: i7 6700k | Ram: G.Skil TridentZ 16GB 3000mhz | PSU: EVGA SuperNova 850w G2 | Case: Caselabs SMA8 | Cooling: Custom Loop | Still in progress 


What if Fiji can't overclock very well? The sheer number of cores may limit headroom. Why do you think Nvidia got SUCH good overclocks on the 900 series?

This is entirely speculation. What if it barfs purple rainbow ice cream out of the DP port instead of a video signal?

 

It might overclock like crap. Or it might be a complete overclocker's DREAM card. Or somewhere in between. Also, even if one single reviewer gets a terrible OC, well, that's the silicon lottery for you.

 

I forget which card it was, but Linus and Slick mentioned there was one GPU where they got an absolutely shitty overclocker, yet in general most people reported being able to get very high overclocks. Even the best-binned GPUs can have the occasional shitty silicon that doesn't OC very well.

 

What we need is to wait for all the major publications - Anand, HWC, Tom's, LTT, TekSyndicate, TTL, HardOCP, etc. - to publish their reviews. Once that's complete, we can analyze each review for consistency and variance, and see how each review sample OC'd for each reviewer.

 

Then, we can get an idea of what the "average" OC will look like.

 

I really don't understand why you or anyone would speculate "BUT WHAT IF IT CAN'T OC????". What's the point of such doom-and-gloom speculation? There's literally no basis for it.
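
That consistency-and-variance pass is easy to run once the reviews are out; something like the following, where the clocks are made-up placeholders rather than real Fury X results:

```python
import statistics

# Hypothetical core-clock OC results (MHz) from different reviews --
# placeholder numbers, NOT real Fury X data.
review_ocs = {"SiteA": 1125, "SiteB": 1100, "SiteC": 1160, "SiteD": 1085, "SiteE": 1140}

clocks = list(review_ocs.values())
mean = statistics.mean(clocks)
spread = statistics.stdev(clocks)
print(f"mean OC: {mean:.0f} MHz, stdev: {spread:.0f} MHz")

# Flag results beyond ~2 standard deviations (silicon-lottery duds or golden chips).
for site, mhz in review_ocs.items():
    if abs(mhz - mean) > 2 * spread:
        print(f"{site} looks like an outlier: {mhz} MHz")
```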

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


This is entirely speculation. What if it barfs purple rainbow ice cream out of the DP port instead of a video signal?

 

It might overclock like crap. Or it might be a complete overclocker's DREAM card. Or somewhere in between. Also, even if one single reviewer gets a terrible OC, well, that's the silicon lottery for you.

 

I forget which card it was, but Linus and Slick mentioned there was one GPU where they got an absolutely shitty overclocker, yet in general most people reported being able to get very high overclocks. Even the best-binned GPUs can have the occasional shitty silicon that doesn't OC very well.

 

What we need is to wait for all the major publications - Anand, HWC, Tom's, LTT, TekSyndicate, TTL, HardOCP, etc. - to publish their reviews. Once that's complete, we can analyze each review for consistency and variance, and see how each review sample OC'd for each reviewer.

 

Then, we can get an idea of what the "average" OC will look like.

 

I really don't understand why you or anyone would speculate "BUT WHAT IF IT CAN'T OC????". What's the point of such doom-and-gloom speculation? There's literally no basis for it.

Everyone is saying it's an "overclocker's dream", so how is what I'm saying any different? We have waited for a decent high-end card since the 7970.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


Everyone is saying it's an "overclocker's dream", so how is what I'm saying any different? We have waited for a decent high-end card since the 7970.

To be fair, the 290X was very decent when it first came out, stock cooler aside. All the 3rd-party variants by Gigabyte, etc. were rock solid, and could match or beat NVIDIA's equivalent cards (780, etc.).

 

But you're right, no one should automatically assume it will OC super well either. The biggest problem in tech in general (which promotes fanboyism for any brand), and on this forum in particular, is people just blindly assuming SOMETHING as fact before having it verified.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


To be fair, the 290X was very decent when it first came out, stock cooler aside. All the 3rd-party variants by Gigabyte, etc. were rock solid, and could match or beat NVIDIA's equivalent cards (780, etc.).

 

But you're right, no one should automatically assume it will OC super well either. The biggest problem in tech in general (which promotes fanboyism for any brand), and on this forum in particular, is people just blindly assuming SOMETHING as fact before having it verified.

Those custom coolers were not available in the UK until January 2014!

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614

