
JayzTwoCents' video on the Fury X

Sauron

I don't believe Jay.

r srs?

 

he's one of the best reviewers online and his 3dmark fire strike scores are on point. 


I'm just saying that I wouldn't trust a graph with data like that.

With the exception of the 290X, for which I gave a plausible but not definite reason, all the temps line up with every other review. TechReport is top tier when it comes to reliable reviews. I can assure you with my left testicle that there is no tomfoolery going on.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


r srs?

 

he's one of the best reviewers online and his 3dmark fire strike scores are on point.

 

Nope, I completely disagree. I'll quote notional on it:

Jay is mediocre when it comes to benches. We are talking about the guy who plays BF4 at 75 FOV and did a full bench run in The Witcher 3 with vsync on (and used the results in his video). He is OK, and I do watch a lot of his stuff, but he is an admitted green team person. He tries to justify this with "numbers don't lie", but really, we all know that is not true, as you can just use certain settings and games that favour one vendor over the other. In his defence, it does seem like he doesn't use GameWorks, which is commendable.

 

But in this review he uses only massively overclocked Nvidia cards against a ~50MHz OC'd Fury X. He states this and then does a quick run of factory 980 Tis, though we don't know what speed they run at, and compares that to the 50MHz OC'd Fury X, using that tiny OC against it. Not exactly fair.

 

I do get the critique of the small OC of the card, since AMD themselves have stated this is an overclocker's dream card. Let's see once the voltage regulation is unlocked, which we might see in software soon. As long as the MOSFETs can handle it, it should be good all around. The critique about not being able to OC the HBM is really dumb though. NOTHING comes even close to the HBM the Fury X has, no matter how much OC the GDDR5 VRAM gets, so what's the point? It's a new tech that might not have the necessary headroom or stability for OC, as the manufacturing is completely new. At least PCPer stated this was a complete non-issue.

 

There is a reason why 980 Tis are clocked the way they are: it is what Nvidia can guarantee, so not all of them will be able to OC to 1500, or even 1300MHz. But at least he admits his bias, so just bear that in mind.

-------

Current Rig

-------


r srs?

 

he's one of the best reviewers online and his 3dmark fire strike scores are on point. 

 

Cognitive dissonance is a bitch. His benchmarks upset my emotional stability so he must be wrong. :rolleyes:

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


With the exception of the 290X, for which I gave a plausible but not definite reason, all the temps line up with every other review. TechReport is top tier when it comes to reliable reviews. I can assure you with my left testicle that there is no tomfoolery going on.

Yeah, found the article. TechReport is good, but that graph, without a source, isn't.


Not to mention the 980 beat the Fury X in a few benchies even at 1440p.  AMD....ummm.... what is going on? Titan killer.... no.... not even 980 killer.

 

It's a disappointment and dat coil whine....YIKES. I was really hoping this would be a great card but dang.... I don't know. I don't see it being the better option at any price close to the 980 unless you use 4k. If you use 1080p this card is literally pointless.


It's not that simple. Part of what makes the Fury X an awesome piece of tech is how compact it is and how the VRMs are so close to the GPU itself that a single 10cm heatpipe is enough to keep them nice and cool. A 980 Ti would need pretty much a full waterblock to get adequate cooling, and those are expensive (as shown by some models already on the market; there's a reason the EVGA one is a hybrid as opposed to full water).

 

AMD came up with a solution to have a standard AIO cool the MOSFETs, so why can't Nvidia do the same? Or they could use a hybrid. Either way, a stock card is always comparable to another stock card. Obviously in this case there are pros and cons to each solution, so pointing them out is of course fair.

 

5:41 -> 980 Ti Reference at 1402MHz

5:57 in Jay's video -> Reference 980 Ti OC beating the Fury X in Metro Last Light.

 

I wonder if those 980 Tis shipped to Jay are cherry-picked or not. If all 980 Tis could handle those speeds, why not just make it the stock clock?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Don't care if it was a joke or not. The card fell out of the car and landed on the ground. Of course it shouldn't have gone defective from that, but we don't know what was wrong with it.

The box was already on the ground.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


When the 980 Ti was released: "Wait until the Fury X gets released. It's going to murder it."

After the Fury X release: "Wait for drivers. Wait for DX12. It's going to murder it."

I swear, this shit is turning into PlayStation.

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


When the 980 Ti was released: "Wait until the Fury X gets released. It's going to murder it."

After the Fury X release: "Wait for drivers. Wait for DX12. It's going to murder it."

I swear, this shit is turning into PlayStation.

DX12 is not going to make Fury more powerful... It will give similar results to both sides.

 

And just remember:

Hype, Exclusively on PlayStation 4.


I wonder if those 980 Tis shipped to Jay are cherry-picked or not. If all 980 Tis could handle those speeds, why not just make it the stock clock?

power consumption 


DX12 is not going to make Fury more powerful... It will give similar results to both sides.

 

Wanna bet?

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


The box was already on the ground.

 

You sure about that? Either way it doesn't really matter; we don't know what went wrong with the card. But hopefully these cards mean AMD will have mastered HBM/interposer tech by the time this tech goes mainstream.

 

DX12 is not going to make Fury more powerful... It will give similar results to both sides.

 

And just remember:

Hype, Exclusively on PlayStation 4.

 

Actually it might, as Nvidia has multi-threaded DX11 drivers but AMD does not. Both will have multi-threaded DX12 drivers, so AMD might get an overall boost on all cards.

 

power consumption 

 

Might be it, but then the argument of low power consumption and TDP is less usable. You can't have both at the same time, after all.



Digital Storm's CFX vs SLI benchmarks had Fury X CFX beating or matching Titan X SLI, which is why I'm still here waiting for the Fury X2 and for launch prices to go away.

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 


Actually it might, as Nvidia has multi-threaded DX11 drivers but AMD does not. Both will have multi-threaded DX12 drivers, so AMD might get an overall boost on all cards.

Hmm, AFAIK D3D11 had no method for multiple draw threads. I know D3D12 adds support for asynchronous shaders and multiple draw threads, among other things.
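To make "multiple draw threads" concrete, here's a rough, minimal C++ sketch (my own illustration, not from Jay's video or any actual driver) of the D3D12 pattern: each worker thread records its own command list, and one thread submits them all in a single call. The thread count and the empty "record draws here" section are placeholders, and error handling is omitted.

// Rough sketch only: parallel command-list recording in D3D12.
// Assumes the Windows 10 SDK; link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const int kWorkers = 4; // arbitrary, just for the sketch
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(kWorkers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kWorkers);
    std::vector<std::thread>                       workers;

    for (int i = 0; i < kWorkers; ++i)
    {
        // One allocator + command list per thread, so no locking is needed.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&, i] {
            // ... each thread would set state and record its draws here ...
            lists[i]->Close(); // finish recording on the worker thread
        });
    }
    for (auto& w : workers) w.join();

    // Submission is still one cheap call from a single thread.
    std::vector<ID3D12CommandList*> submit;
    for (auto& l : lists) submit.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());
    return 0;
}

That's basically what the "multi-threaded drivers" discussion comes down to: how much of the recording work can be spread across CPU cores instead of piling up on one.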


Hmm, AFAIK D3D11 had no method for multiple draw threads. I know D3D12 adds support for asynchronous shaders and multiple draw threads, among other things.

 

Indeed, but the driver itself and its overhead also count. On Nvidia, D3D11 still loads up core 0/1, but the driver itself does not.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Indeed, but the driver itself and its overhead also count. On Nvidia, D3D11 still loads up core 0/1, but the driver itself does not.

Yeah, I expect to see an increase in FPS if D3D12 is properly implemented. From reading its current spec, it seems to have a bunch of improvements in memory management that might let HBM pull ahead.
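Since that came up, here's a minimal sketch of the kind of explicit memory management D3D12 exposes (again my own illustration, not anything AMD has shown): the application reserves a heap itself and places buffers into it at chosen offsets, rather than the driver deciding behind the scenes. The 64 MB / 16 MB sizes are arbitrary and error handling is omitted.

// Minimal sketch of explicit D3D12 memory management: the app creates a heap
// and sub-allocates a buffer into it at an explicit offset.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // One GPU-local heap, reserved up front by the application.
    D3D12_HEAP_DESC heapDesc = {};
    heapDesc.SizeInBytes     = 64ull * 1024 * 1024;
    heapDesc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
    heapDesc.Alignment       = D3D12_DEFAULT_RESOURCE_PLACEMENT_ALIGNMENT;
    heapDesc.Flags           = D3D12_HEAP_FLAG_ALLOW_ONLY_BUFFERS;
    ComPtr<ID3D12Heap> heap;
    device->CreateHeap(&heapDesc, IID_PPV_ARGS(&heap));

    // A 16 MB buffer placed at offset 0; further resources could be packed
    // into the same heap at later offsets without extra driver allocations.
    D3D12_RESOURCE_DESC bufDesc = {};
    bufDesc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    bufDesc.Width            = 16ull * 1024 * 1024;
    bufDesc.Height           = 1;
    bufDesc.DepthOrArraySize = 1;
    bufDesc.MipLevels        = 1;
    bufDesc.Format           = DXGI_FORMAT_UNKNOWN;
    bufDesc.SampleDesc.Count = 1;
    bufDesc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;
    ComPtr<ID3D12Resource> buffer;
    device->CreatePlacedResource(heap.Get(), /*HeapOffset*/ 0, &bufDesc,
                                 D3D12_RESOURCE_STATE_COMMON, nullptr,
                                 IID_PPV_ARGS(&buffer));
    return 0;
}

Whether that actually lets HBM pull ahead will of course depend on how drivers and games end up using it.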


Yeah, I expect to see an increase in FPS if D3D12 is properly implemented. From reading its current spec, it seems to have a bunch of improvements in memory management that might let HBM pull ahead.

 

What I'm really curious about is whether AMD's multi-threaded DX12 drivers will also manage DX11 games across multiple threads (as in, the driver not running on one core, and not on the same core as the D3D11 main thread).



As someone who has lost multiple cards to heat problems, I must disagree. I'm glad it hasn't happened to you as often, but thermals are very important. For that reason alone I think the Fury X is at least as good a deal and, for most people not getting into custom water cooling loops, better in the long run.

I heard somewhere on the internet that Fiji's maximum operating temperature is somewhere in the 75-80°C range due to the HBM and the pump.

 

It's sad to see the AMD fanboys in denial. 

 

I OC all my cards and so I really liked seeing his OC video comparison. Stock numbers mean nothing to me. 

 

I think you guys complaining about drivers or voltage are in total denial. Fury is based on the same GCN architecture as the 290X, so the drivers will already support it well, as it's NOT a new architecture. As for voltage, how many 290Xs do you see hitting 1400 or 1500MHz? 1150, maybe 1200, is about right for the max a Fury will ever hit. So it'll never get the same performance as a 980 Ti.

 

Perhaps a better question you guys should ask is why AMD didn't release a Titan X killer. Answer: because their architecture isn't good enough.

Driver-side HBM efficiency is pretty important. It's not based on the same GCN architecture as the 290X; it's Tonga's bigger, better brother. Of course their architecture isn't good enough to be an all-rounder, but neither is Nvidia's, just an FYI. I'd pick up this card any day over a Titan X for compute alone.

http://www.hardocp.com/image.html?image=MTQzNTEwODU5MTlTMEhPT1prR0FfMV82X2wuZ2lm - it is in fact different from Tonga.

Computing enthusiast. 
I used to be able to input a cheat code; now I've got to input a credit card - Total Biscuit
 


I would consider this to be mini-news; if it isn't, please feel free to move the topic, and delete it if by chance it is a repost.

 

 

The reasons behind me posting this are the following:

I was a bit disappointed with LTT's Fury X video because their card (through no fault of their own) was DOA. Furthermore, Jay is amongst my top 3 sources for benchmarks and PC hardware reviews in general; he knows what he is doing and I tend to trust him more than most websites. He is also one of the few who consistently provides overclocked performance numbers.

 

What I see is a bit of a letdown, in my opinion. While the card is an engineering feat and pushes a new and cool technology, it falls short against its obvious competitor, the 980 Ti. At stock speeds it holds its own, but its downfall is ultimately overclocking performance. I don't think anyone who is thinking of purchasing a card of this power and price is going to leave it at stock speeds if at all possible, and the 980 Ti has already shown itself to be extremely capable (on average) on that front. Overclocking is not a perfect science, but it is a pretty safe bet that most Fury Xs will not go far beyond what Jay could obtain, whereas it's pretty common to see 980 Tis over 1500MHz. Ultimately I feel it should have been priced lower, but with the water cooling I'm not sure that would have been possible. For the time being, it's not going to make its way alongside the 980 Ti in my signature.

If you watch his tech talks, they may have since released another driver (beyond the one he got, which was already better than what the first wave of testers had) which may show improvements. The Fury X MAY be better than it has shown to be. You have to remember that the 980 Ti is based on the Titan, which means the drivers already had better support early on, and on top of that Fiji is a brand new architecture, which means there is much to be done in terms of getting all the performance out of a Fury X.

 

Also, the most exciting thing Jay said is that he can no longer recommend the 970, because the 390 (not the X) is showing far better numbers at the same price.


When the 980 Ti was released: "Wait until the Fury X gets released. It's going to murder it."

After the Fury X release: "Wait for drivers. Wait for DX12. It's going to murder it."

I swear, this shit is turning into PlayStation.

That's all AMD fanboys have left: "Wait for drivers!", "Wait for DX12!". Nonsense; the Fury was an overhyped (albeit still nice) card, and those are all the excuses that are left.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


What I'm really curious about is whether AMD's multi-threaded DX12 drivers will also manage DX11 games across multiple threads (as in, the driver not running on one core, and not on the same core as the D3D11 main thread).

It is worth noting that driver overhead will not be directly fixed by DX12. The CPU overhead inherent in the API has been reduced going from D3D11 to D3D12, and the API also allows lower-level programming, which potentially means the drivers need less case handling and thus carry less overhead.


Nobody on a reference 980 Ti will ever get to 1400 or 1500MHz either. So yeah, for the 0.001% of extreme overclockers with massive custom water loops like Jay, Nvidia is still the way to go; I'd agree with that.

Jay does this on air, dude. Watch his videos. He always states how the card is cooled and whether it's modified. It is a LOT of work to hook up a graphics card to a custom loop; Jay couldn't produce as many videos as he does if he did that.


It is worth noting that driver overhead will not be directly fixed by DX12. The CPU overhead inherent in the API has been reduced going from D3D11 to D3D12, and the API also allows lower-level programming, which potentially means the drivers need less case handling and thus carry less overhead.

 

Yeah, but none of that is useful in DX11 games. I'm sure the free Windows 10 upgrade means that all gamers will have DX12, meaning much faster and broader industry adoption, but there will still be a lot of DX11 games left to play. So if AMD's DX12 drivers multi-thread themselves onto cores other than the D3D11 main thread, then we might see DX11 performance increase in certain CPU-limited games as well.




