Unofficial R9 Fury X reviews master post.

marldorthegreat:

> -snip-

 

He did post the HardwareCanucks review, which shows the Fury X using roughly 10 W more power than the Titan X. Torture-test power consumption is something you shouldn't take into account, since no gaming application produces that kind of load.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling

http://www.extremetech.com/gaming/208874-amds-radeon-fury-x-previewing-performance-power-consumption-and-4k-scaling/3

 

60 W more draw than a 980 Ti, and even more than a Titan X. Either way, close to 400 W is not OK for a single GPU.

post-236231-0-90162800-1435216129.png

MainRig- CPU: 4790k, RAM: 32gb 2400mhz, MOBO: Maximus Formula VII, COOLING: Full EK cooling, GPU: Titan X SLI, PSU: 1200w evga , STORAGE: 250gb SSD, 4TB hybrid CASE: 760T, EXTRAS: Sleeved cables

SecondRig- CPU:4690K, RAM: 16gb 1600mhz, MOBO: Maximus Gene VII, COOLING: H105, GPU: 970ftw, PSU: EVG650W,  STORAGE: 250gb SSD, 3TB, CASE: 540air 

Steam: pizzatime6 Plus two other pc rigs and a craptop.

I'm praying the 1080p and 1440p results are just driver flukes. It makes no sense to match or slightly beat the reference 980 Ti at 4K, only to lose at 1080p and 1440p.

AMD's DX11 drivers are more CPU-bound than Nvidia's. Could that be the reason?

> He did post the HardwareCanucks review, which shows the Fury X using roughly 10 W more power than the Titan X. Torture-test power consumption is something you shouldn't take into account, since no gaming application produces that kind of load.

If you look at the other graphs, there's one that shows in-game load.

MainRig- CPU: 4790k, RAM: 32gb 2400mhz, MOBO: Maximus Formula VII, COOLING: Full EK cooling, GPU: Titan X SLI, PSU: 1200w evga , STORAGE: 250gb SSD, 4TB hybrid CASE: 760T, EXTRAS: Sleeved cables

SecondRig- CPU:4690K, RAM: 16gb 1600mhz, MOBO: Maximus Gene VII, COOLING: H105, GPU: 970ftw, PSU: EVG650W,  STORAGE: 250gb SSD, 3TB, CASE: 540air 

Steam: pizzatime6 Plus two other pc rigs and a craptop.

> I trust a site that has no bias in its results. Tom's Hardware doesn't sell GPUs, so their results are more credible than someone's who sells the product.

Neither do HardwareCanucks, Tech of Tomorrow, Tek Syndicate, LinusTechTips, OC3D, and many others. Your point being?

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling

> Really, you're going to go there? Pfff, good to know you're too immature to look at facts from a reviewer who isn't selling you the product. I can see where this is going; bye, whoever you are. I'm not going to argue with someone who replies with memes because they can't find reviews specifically on power usage.

 

Dude, I gave you a graph plus the source for my quoted benchmark. I also NEVER said that the Fury X uses less power than either the Titan X or the 980 Ti. I said that HardwareCanucks found that in their system, with their sample, the Fury X pulls only 8 W more out of the wall than the Titan X, and 15 W more than the 980 Ti.

 

You went ahead and acted like it's a tragedy and the Fury X is terrible! But then again, I never expected you to act otherwise, since you have an nVidia avatar and your posts further indicated that you're a class-A fanboy. The meme was simply to depict how you act and what I think of your behaviour. No judgement on your personality; I'm sure you're a cool dude.

phanteks enthoo pro | intel i5 4690k | noctua nh-d14 | msi z97 gaming 5 | 16gb crucial ballistix tactical | msi gtx970 4G OC  | adata sp900

> http://www.extremetech.com/gaming/208874-amds-radeon-fury-x-previewing-performance-power-consumption-and-4k-scaling/3
>
> 60 W more draw than a 980 Ti, and even more than a Titan X. Either way, close to 400 W is not OK for a single GPU.

 

That's exactly one of the benchmark runs done with the old 15.4 drivers.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling

> Dude, I gave you a graph plus the source for my quoted benchmark. I also NEVER said that the Fury X uses less power than either the Titan X or the 980 Ti. I said that HardwareCanucks found that in their system, with their sample, the Fury X pulls only 8 W more out of the wall than the Titan X, and 15 W more than the 980 Ti.
>
> You went ahead and acted like it's a tragedy and the Fury X is terrible! But then again, I never expected you to act otherwise, since you have an nVidia avatar and your posts further indicated that you're a class-A fanboy. The meme was simply to depict how you act and what I think of your behaviour. No judgement on your personality; I'm sure you're a cool dude.

http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,10.html

My posts above show it pulls a lot more than the 8 watts you claim. In fact, most graphs side with my claims, while only one graph sides with yours. Don't call me a fanboy for stating facts. Bottom line: it draws a lot more power than a Titan X, a 980, or a 980 Ti. I listed multiple sources that prove me correct. I'll ask you again: can you produce more than one graph to support your claim, or are you going to remain immature? This is definitely not 8 to 15 watts.

post-236231-0-81475800-1435216631.png

post-236231-0-63774800-1435216762_thumb.

MainRig- CPU: 4790k, RAM: 32gb 2400mhz, MOBO: Maximus Formula VII, COOLING: Full EK cooling, GPU: Titan X SLI, PSU: 1200w evga , STORAGE: 250gb SSD, 4TB hybrid CASE: 760T, EXTRAS: Sleeved cables

SecondRig- CPU:4690K, RAM: 16gb 1600mhz, MOBO: Maximus Gene VII, COOLING: H105, GPU: 970ftw, PSU: EVG650W,  STORAGE: 250gb SSD, 3TB, CASE: 540air 

Steam: pizzatime6 Plus two other pc rigs and a craptop.

> I guess the point is that the Fury X is still drawing more power, and yet it's not blowing away the 980 Ti or Titan X.
>
> If you're going to draw more power, then you'd better beat the competition by at least 10 fps in every test, and since it has actually lost a few benchmarks, that says a lot.
>
> I'm not even remotely interested in the water-cooled version, honestly... looks like a hassle. I like to plug in and go.
>
> So if the nice and cool GPU of the Fury X loses some benchies... then the Fury will certainly lose almost every test against the 980 Ti. The difference is that the card is cheaper and more geared towards the 980. I bet it uses less power as well, with fewer Stream Processors.
>
> People still want this card though... it's out of stock everywhere. I mean, it's close to Titan X performance for $649 or $679 depending on where you buy it. It's team red's top card for now... it is what it is.

 

Part of the (still) high power consumption is definitely the 28 nm manufacturing process, which is most likely down to AMD still struggling to gain traction. For now, I think the Fury X is the best AMD can deliver in their current situation. If they had the same traction as nVidia and were on the same process node, it would get much more interesting.

 

The fact of the matter is that many benchmark results show the Fury X in close proximity to, and sometimes even beating, Team Green, especially at higher resolutions. You also have to consider that the drivers are most likely not 100% optimized and overclocking is still locked right now. The results should improve (same as with nVidia) once DirectX 12 is out, the drivers are optimized for the games, and overclocking is enabled; AMD thought ahead and went with water cooling, which should, in theory, let people overclock the card further than on air. Until that all comes together, a 100% certain answer as to whether the Fury X is really a disappointment, which I personally think it isn't considering its hardware and price point, is close to impossible.

phanteks enthoo pro | intel i5 4690k | noctua nh-d14 | msi z97 gaming 5 | 16gb crucial ballistix tactical | msi gtx970 4G OC  | adata sp900

> http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,10.html
>
> My posts above show it pulls a lot more than the 8 watts you claim. In fact, most graphs side with my claims, while only one graph sides with yours. Don't call me a fanboy for stating facts. Bottom line: it draws a lot more power than a Titan X, a 980, or a 980 Ti. I listed multiple sources that prove me correct. I'll ask you again: can you produce more than one graph to support your claim, or are you going to remain immature? This is definitely not 8 to 15 watts.

 

Why do I have to prove you wrong about something I didn't even disagree with you on? I urge you to quote me where I said the Fury X draws less power than either of the GM200 cards, please!

 

I quoted HardwareCanucks because I trust their opinions and their results!

 

Not to mention, I don't see how I'm the immature one when you're acting so incredibly defensive and calling me immature for posting a meme. It's internet culture, "buddy", and if you can't handle a simple meme, then I don't think I've acted immaturely; I've hit the nail on the head ;)

 

Anyway, this is the last you'll hear from me on this, as I have neither the time nor the nerves to argue over the internet about such an over-discussed topic.

phanteks enthoo pro | intel i5 4690k | noctua nh-d14 | msi z97 gaming 5 | 16gb crucial ballistix tactical | msi gtx970 4G OC  | adata sp900

> Dude, I gave you a graph plus the source for my quoted benchmark. I also NEVER said that the Fury X uses less power than either the Titan X or the 980 Ti. I said that HardwareCanucks found that in their system, with their sample, the Fury X pulls only 8 W more out of the wall than the Titan X, and 15 W more than the 980 Ti.
>
> You went ahead and acted like it's a tragedy and the Fury X is terrible! But then again, I never expected you to act otherwise, since you have an nVidia avatar and your posts further indicated that you're a class-A fanboy. The meme was simply to depict how you act and what I think of your behaviour. No judgement on your personality; I'm sure you're a cool dude.
>
> Why do I have to prove you wrong about something I didn't even disagree with you on? I urge you to quote me where I said the Fury X draws less power than either of the GM200 cards, please!
>
> I quoted HardwareCanucks because I trust their opinions and their results!
>
> Not to mention, I don't see how I'm the immature one when you're acting so incredibly defensive and calling me immature for posting a meme. It's internet culture, "buddy", and if you can't handle a simple meme, then I don't think I've acted immaturely; I've hit the nail on the head ;)

You are clearly not worth my time. I'm tired of trying to explain this to you. The fact that you can't see your own immaturity proves my point. "I know I can't support my claim, so I'll just call him a fanboy and make a meme to mock someone for arguing their side." Keep thinking it only pulls 8 more watts, though, when all my posts disprove you and your only source.

MainRig- CPU: 4790k, RAM: 32gb 2400mhz, MOBO: Maximus Formula VII, COOLING: Full EK cooling, GPU: Titan X SLI, PSU: 1200w evga , STORAGE: 250gb SSD, 4TB hybrid CASE: 760T, EXTRAS: Sleeved cables

SecondRig- CPU:4690K, RAM: 16gb 1600mhz, MOBO: Maximus Gene VII, COOLING: H105, GPU: 970ftw, PSU: EVG650W,  STORAGE: 250gb SSD, 3TB, CASE: 540air 

Steam: pizzatime6 Plus two other pc rigs and a craptop.

> I guess the point is that the Fury X is still drawing more power, and yet it's not blowing away the 980 Ti or Titan X.
>
> If you're going to draw more power, then you'd better beat the competition by at least 10 fps in every test, and since it has actually lost a few benchmarks, that says a lot.
>
> I'm not even remotely interested in the water-cooled version, honestly... looks like a hassle. I like to plug in and go.
>
> So if the nice and cool GPU of the Fury X loses some benchies... then the Fury will certainly lose almost every test against the 980 Ti. The difference is that the card is cheaper and more geared towards the 980. I bet it uses less power as well, with fewer Stream Processors.
>
> People still want this card though... it's out of stock everywhere. I mean, it's close to Titan X performance for $649 or $679 depending on where you buy it. It's team red's top card for now... it is what it is.

 

Hmm. Well, the Fury is powering a pump and a fan, where the other cards just have a fan. I wouldn't judge power on the basis you're thinking of. 10 extra FPS is a lot to ask for, unless we're talking 720p.

> You are clearly not worth my time. I'm tired of trying to explain this to you. The fact that you can't see your own immaturity proves my point. "I know I can't support my claim, so I'll just call him a fanboy and make a meme to mock someone for arguing their side." Keep thinking it only pulls 8 more watts, though, when all my posts disprove you and your only source.

 

I didn't make the meme; a quick Google search turns up plenty of nVidia fanboyism memes. But please, tell me how I'm immature.

 

Furthermore, and again something you don't seem to address: WHY do I have to prove my point when it didn't even argue against yours? I'm asking you again to quote me saying that the GM200 cards pull more watts than the Fury X, or rather where I said the Fury X pulls less. I don't and never have disagreed with you on that point, so there's nothing to prove other than my source for the 8 W and 15 W claim, which I provided with the graph and the video it was taken from.

 

You then proceeded to quote various review sites to prove that the Titan X and the 980 Ti draw less power, in most cases by more than 8 W and 15 W respectively, which was pointless in itself since (and I'm repeating myself) I NEVER DISAGREED WITH YOU ON THAT POINT. Every review site has different results, showing the GM200 cards further from or closer to the Fury X's power consumption, but always below it, the same as HardwareCanucks' result.

 

You then went on to say that HardwareCanucks, and I quote:

> I trust a site that has no bias in its results. Tom's Hardware doesn't sell GPUs, so their results are more credible than someone's who sells the product.

isn't a credible source, implying they have a bias and want to sell you the product, trying to discredit my source, which is just as viable as Tom's Hardware, AnandTech, PCPer, and so on, because they have a set testing methodology, all run on the same system, just as Tom's Hardware does. Not to mention that HardwareCanucks doesn't sell the product; they have no online shop.

 

So this behaviour led me to believe that you don't want to be disproven (not that that was ever my intention), and therefore I figured that YOU have a certain bias towards nVidia, hence calling out your fanboy-ish behaviour and the meme. You went on a rant about how you posted sources to disprove me on something I didn't disagree with you on, continued to rant about my immature behaviour, and tried to pin on me a stance against nVidia, or rather for AMD, which I never had.

phanteks enthoo pro | intel i5 4690k | noctua nh-d14 | msi z97 gaming 5 | 16gb crucial ballistix tactical | msi gtx970 4G OC  | adata sp900

Power consumption must be one of the parameters least worth arguing over. >.> The difference between the 980 Ti and Fury X probably amounts to a few tens of dollars at most over 3-4 years, if you even keep the card that long. It will certainly be a tiny fraction of the cost of your PC, and a minuscule fraction of your electricity bill. Better to look at the experience you'll get from the card instead of sweating over a dollar on your monthly electricity bill.
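As a back-of-the-envelope check on that claim, here is a small sketch; every input (a 60 W gap, 4 hours of gaming per day, $0.12/kWh) is an illustrative assumption, not a measured figure:

```python
# Rough lifetime cost of a GPU power-draw gap.
# All inputs below are illustrative assumptions, not measured values.
watts_extra = 60        # assumed extra draw under load (W)
hours_per_day = 4       # assumed daily gaming time
price_per_kwh = 0.12    # assumed electricity price (USD/kWh)
years = 3

kwh_extra = watts_extra / 1000 * hours_per_day * 365 * years
cost_extra = kwh_extra * price_per_kwh
print(f"{kwh_extra:.0f} kWh extra -> ${cost_extra:.2f} over {years} years")
```

With these assumptions the gap works out to roughly $30 over three years, consistent with the "few tens of dollars" estimate above.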

 

 

> People still want this card though... it's out of stock everywhere. I mean, it's close to Titan X performance for $649 or $679 depending on where you buy it. It's team red's top card for now... it is what it is.

I guess people are thankful to it for knocking $350 off the price of this tier of video card. Unless they've found something that actually needs all that extra VRAM over the 6GB mark, Titan X owners must be kicking themselves.

> No citation needed, just because you disagree.
>
> CUDA has been a standard across nVidia for the longest time now, making it easy to fix for any generation; not so much on the AMD side when a new problem pops up.

Yes, you do actually need a citation for claims like that because it could just be made up bullshit. I am not even saying I agree or disagree with your statement, I am just saying I want you to prove that your statement is correct.

If you have evidence that supports your claim then I can't disagree, because it would be like arguing that the sky isn't blue. Right now though, without ever having seen the sky, I would like you to prove it with a source.

If you want to play the "no citation needed" card then I might as well say the exact opposite. That Nvidia requires more careful programming than AMD.

Actually they are, but that does not mean the temperatures are acceptable by any stretch; they are capable of enduring the heat, though.

 

You are really starting to strike me as an AMD fanboy. :P

 

The modules used on the Titan X are SK Hynix H5GCQ4H24MFR-R2C. Unless they are made for the automotive industry (where modules can handle up to 105°C), these are only rated to 95°C. So it's safe to say the card is running either above that limit or close to it. Either way, this is a simple issue that a backplate with heat-transfer pads could easily have solved. But I guess a backplate is too expensive for a $1000 graphics card :rolleyes:

 

As for the Fury X, the little copper heat pipe is a really nice design. But if the MOSFETs are really running at 100+°C, the copper connection is either defective or underperforming. Maybe running the cold water past the MOSFETs first would be better, since GPU temps are not an issue on this card; or ramp up the pump speed. It's a shame either way; I hope it will be fixed in the future.

I will say, though, that MOSFETs are rated for 125°C for the normal crappy ones, up to 175°C for high-end ones (most graphics cards use the normal ones).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

Had to change my profile picture simply out of spite.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling

Not bad... I was really hoping it would stay consistently above the 980 Ti, but it at least closes the gap.

 

Also, I noticed on some review sites like TechPowerUp that some benchmarks paint a whole different picture of how the Fury X does... way worse than others show it. It's really hard to judge a card based on some of these sites, since they seem to be biased in some regards.

 

One of the benchmarks I've looked at showed The Witcher 3 running the same on a 295X2 as on my Sapphire 290 Tri-X OC at the same 1080p settings, and I know that cannot be right. I was really surprised that a lot of better AMD cards were on par with or worse than my setup on that particular site.

> One of the benchmarks I've looked at showed The Witcher 3 running the same on a 295X2 as on my Sapphire 290 Tri-X OC at the same 1080p settings, and I know that cannot be right. I was really surprised that a lot of better AMD cards were on par with or worse than my setup on that particular site.

You can't really compare your results to the results on a review site unless you know exactly the settings they use as well as where they tested it.

Which site are you talking about specifically and how much do their results differ from yours?

> Either way, this is a simple issue that a backplate with heat-transfer pads could easily have solved. But I guess a backplate is too expensive for a $1000 graphics card :rolleyes:

 

Nvidia said several times that the backplate was removed for SLI support as the plate would block the other card and would cause overheating.

This isn't an issue on the 980 because it outputs far less heat than the 980 Ti and Titan X.

RTX2070OC 

> You can't really compare your results to the results on a review site unless you know exactly the settings they use as well as where they tested it.
>
> Which site are you talking about specifically and how much do their results differ from yours?

 

Let's not forget most review sites also run open-air benches, while we common folk use a full chassis, where airflow isn't nearly as good.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 

> Nvidia said several times that the backplate was removed for SLI support, as the plate would block the other card and cause overheating.
>
> This isn't an issue on the 980 because it outputs far less heat than the 980 Ti and Titan X.

 

They could easily have made a half-length plate covering just the VRAM and not the fan intake area, so I call BS on that one. Or they could have made a better cooler.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

> You can't really compare your results to the results on a review site unless you know exactly the settings they use as well as where they tested it.
>
> Which site are you talking about specifically and how much do their results differ from yours?

It was on TechPowerUp, in their Fury X benches. While it said they used all max settings except HairWorks in The Witcher 3, it showed the 295X2 averaging 45 FPS, while on my system I routinely see just over 50 FPS, and as low as 42 FPS outdoors. I have vsync on, and it will not go over 60 FPS indoors or in caves, but I've seen it in the 85s without vsync.

 

I have a near-identical setup with a 4790K CPU and 16 GB of Vengeance RAM at 1866 MHz.

 

Like I said, it's hard to judge the benchmarks they put out there, especially when that one seems to show the 295X2 performing the same as my card, which is an R9 290 OC.

 

The reason I used The Witcher 3 as an example is that I currently play that game and have been testing my card in it. I was surprised to see the 295X2 at the same average FPS in their benchmark as my system. I used Fraps to measure it; my average was 45.8 last time I checked.

> It was on TechPowerUp, in their Fury X benches. While it said they used all max settings except HairWorks in The Witcher 3, it showed the 295X2 averaging 45 FPS, while on my system I routinely see just over 50 FPS, and as low as 42 FPS outdoors. I have vsync on, and it will not go over 60 FPS indoors or in caves, but I've seen it in the 85s without vsync.
>
> I have a near-identical setup with a 4790K CPU and 16 GB of Vengeance RAM at 1866 MHz.
>
> Like I said, it's hard to judge the benchmarks they put out there, especially when that one seems to show the 295X2 performing the same as my card, which is an R9 290 OC.
>
> The reason I used The Witcher 3 as an example is that I currently play that game and have been testing my card in it. I was surprised to see the 295X2 at the same average FPS in their benchmark as my system. I used Fraps to measure it; my average was 45.8 last time I checked.

The 295X2 was only running one GPU, as AMD hadn't released the CrossFire driver when they tested.

> http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-5.html
>
> "While Nvidia reports the GeForce Titan X’s TDP at 250W, it never actually reaches this figure under normal circumstances. In fact, not one of the usual stress test applications pushes the card past 247W."
>
> Tom's Hardware doesn't seem to agree with you, buddy. Last I checked, Tom's Hardware doesn't lie. As for NCIX, I don't trust any review made by a company that wants to sell me a product. That's why all my information comes from sites that don't sell the products they review.

 

They are just talking about the average power draw, though, not the maximum power pulled at any one instant. It isn't really fair to compare the average power use of the Titan X to the max power use of the Fury X. At most, the Titan X spikes to 325 watts during a stress test, as shown on the oscilloscope trace in the Tom's Hardware article.

 

http://media.bestofmicro.com/O/R/484299/gallery/04-Torture-05-All-Rails_r_600x450.png

 

During a stress test the Fury X did average 347 watts of draw, which is quite a bit higher than the Titan X, but stress-test power consumption is a useless figure anyway, since it doesn't represent a real gaming load.

 

Under an actual gaming load the Titan X used an average of 224 watts of power, spiking up to 400 watts, as shown here: http://media.bestofmicro.com/O/M/484294/gallery/02-Direct3D-05-All-Rails_r_600x450.png

 

Compare that to the Fury X, which used an average of 220 watts, spiking up to 450 watts, during a gaming load, again shown here: http://media.bestofmicro.com/J/A/506134/gallery/11-Gaming-All-Rails_r_600x450.png

 

 

 

So they use the same amount of power in normal use.
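The average-versus-spike distinction above can be illustrated with a tiny sketch. The sample trace is made up for illustration (real reviews log thousands of oscilloscope readings per second):

```python
# Average draw and peak draw tell different stories: a handful of
# brief spikes barely moves the mean but dominates the maximum.
# These readings are hypothetical, not from any review.
samples_w = [210, 215, 400, 212, 218, 208, 450, 214]

avg_w = sum(samples_w) / len(samples_w)
peak_w = max(samples_w)
print(f"average {avg_w:.0f} W, peak {peak_w} W")
```

This is why a card can average around 220 W in games while momentarily spiking past 400 W, and why PSU headroom matters more for the spikes than for the average.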
