AMD Fury X Far Cry 4 game performance from AMD

ahhming

And along with that, AMD will validate my purchase of a Fury X. Seriously, I am happy that AMD has been able to pull this off against Nvidia. With all the odds against them in funding, they managed to do it. Now we will have to wait and see what the response from Nvidia is.

They have no retaliation. They just launched the TITAN X a few months ago, and the GTX 980 Ti literally just came out. The only step they have left is a "TITAN X2" (dual-GM200 card), while AMD has already shown off a GTX 560-sized card rocking two Fiji chips. AMD has far better multi-GPU drivers as well. The soup is too thick for Nvidia to swim in anymore if these numbers are validated across the board.


Now if only they could make good drivers, I might want one in the future for my second PC...

MainRig - CPU: 4790K, RAM: 32GB 2400MHz, MOBO: Maximus Formula VII, COOLING: Full EK cooling, GPU: Titan X SLI, PSU: 1200W EVGA, STORAGE: 250GB SSD, 4TB hybrid, CASE: 760T, EXTRAS: Sleeved cables

SecondRig - CPU: 4690K, RAM: 16GB 1600MHz, MOBO: Maximus Gene VII, COOLING: H105, GPU: 970 FTW, PSU: EVGA 650W, STORAGE: 250GB SSD, 3TB, CASE: 540 Air

Steam: pizzatime6. Plus two other PC rigs and a craptop.


I'm gonna go with "those benchmarks are a load of bullshit" on this one.

 

I wonder how much power it's gonna draw; I'm guessing you're gonna need at least a 1000W PSU to run two in CrossFire.

Why is this so unrealistic? According to the E3 presentation, the memory takes far less power and there's less of it, so there's more power budget for the GPU. I also read that the GPU uses a smaller manufacturing process, 20nm versus Maxwell's 28nm. Also, two Titan Xs can run on about 600 watts.


I'm gonna go with "those benchmarks are a load of bullshit" on this one.

 

I wonder how much power it's gonna draw; I'm guessing you're gonna need at least a 1000W PSU to run two in CrossFire.

A single Fury X draws 275W of power. So you can CrossFire two with a quality 750W PSU...
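A quick back-of-the-envelope check of that claim, as a Python sketch. The 275W figure is the Fury X's rated board power; the CPU and rest-of-system wattages are assumptions for a typical gaming build, not measured numbers:

```python
# Rough PSU sizing for two Fury X cards in CrossFire.
# Assumed figures: 275 W rated board power per card, ~100 W for a
# typical quad-core gaming CPU, ~50 W for drives, fans and motherboard.

def psu_load_fraction(psu_watts, gpu_watts=275, gpu_count=2,
                      cpu_watts=100, rest_watts=50):
    """Return (estimated system draw in watts, fraction of PSU capacity)."""
    total = gpu_watts * gpu_count + cpu_watts + rest_watts
    return total, total / psu_watts

total, frac = psu_load_fraction(750)
print(f"Estimated draw: {total} W ({frac:.0%} of a 750 W unit)")
```

By this estimate a quality 750W unit runs at roughly 93% capacity under worst-case draw, so it works, but there's little headroom for overclocking; an 850W+ unit would be more comfortable.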


This looks very nice, but let's focus on the small star next to the "Ultra*" settings. There's something there: we don't know what Ultra means in this case, or what level of AA was used. The TechPowerUp benchmarks used as reference are, as they always say, run with 4xMSAA if the game has the option, yes, even at 4K. So this might not be the perfect perspective.


Why is this so unrealistic? According to the E3 presentation, the memory takes far less power and there's less of it, so there's more power budget for the GPU. I also read that the GPU uses a smaller manufacturing process, 20nm versus Maxwell's 28nm. Also, two Titan Xs can run on about 600 watts.

 

I was making a jab at AMD for how they tend to have pretty damn high power usage on their processors and GPUs :P

 

But don't forget the Titan X has one 6-pin and one 8-pin power connector, whereas the Fury X has two 8-pins.

 

Unless it's a multi-GPU card, I don't see how exactly those numbers could be legit.

Specs: CPU - Intel i7 8700K @ 5GHz | GPU - Gigabyte GTX 970 G1 Gaming | Motherboard - ASUS Strix Z370-G WIFI AC | RAM - XPG Gammix DDR4-3000MHz 32GB (2x16GB) | Main Drive - Samsung 850 Evo 500GB M.2 | Other Drives - 7TB/3 Drives | CPU Cooler - Corsair H100i Pro | Case - Fractal Design Define C Mini TG | Power Supply - EVGA G3 850W


Far Cry 4 did favor better memory setups, and the R9 Fury X was likely to rock UHD, but that's insane. There has to be some targeted driver optimization at work here. Even half that performance boost over the 980 Ti would be great.

 

Absolutely amazing if this is accurate across most games. Really hope the Beijing crew aren't bullshitting.


So a two-year wait isn't entirely wasted, huh... OMG, show us how the Nano performs :wub:

 

The Nano is supposed to perform similarly to a 290X while drawing roughly 175W of power.

 

I was making a jab at AMD for how they tend to have pretty damn high power usage on their processors and GPUs :P

 

But don't forget the Titan X has one 6-pin and one 8-pin power connector, whereas the Fury X has two 8-pins.

 

Unless it's a multi-GPU card, I don't see how exactly those numbers could be legit.

 

You can't base power consumption purely on the power connectors. AMD is notorious for overbuilding the power delivery on their reference cards, so the additional power can easily be used to overclock. 
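The connector arithmetic behind that point can be sketched quickly. Assuming the PCIe spec limits (75W from the x16 slot, 75W per 6-pin, 150W per 8-pin auxiliary connector), the connectors only set a ceiling, not the actual draw:

```python
# Maximum power deliverable under PCIe spec limits, per connector layout.
# Spec figures assumed: 75 W from the x16 slot, 75 W per 6-pin,
# 150 W per 8-pin auxiliary connector.

def max_spec_power(six_pins=0, eight_pins=0, slot_watts=75):
    """Ceiling in watts for a card with the given auxiliary connectors."""
    return slot_watts + 75 * six_pins + 150 * eight_pins

print("Titan X (6+8 pin):", max_spec_power(1, 1), "W")  # 300 W ceiling
print("Fury X (8+8 pin): ", max_spec_power(0, 2), "W")  # 375 W ceiling
```

So two 8-pins give a 375W ceiling against the card's rated 275W draw; the 100W difference is overclocking headroom, not consumption.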


So if these numbers are valid AMD's $649 card stomps GM200 into the ground with 1/3 of the memory?

 

[attached image]

Well, apparently you need more VRAM if you're using GDDR5, due to its relatively low bandwidth. HBM might have solved quite a few problems for AMD.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Well, apparently you need more VRAM if you're using GDDR5, due to its relatively low bandwidth. HBM might have solved quite a few problems for AMD.

 

Not true, not true at all. HBM1 is simply limited to 4GB, which might be problematic. People have argued that it will be less of an issue because swapping memory can be done faster; that's it.
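Some rough numbers on the "faster swapping" argument, using assumed peak-bandwidth figures (512 GB/s for Fury X's HBM, 336 GB/s for Titan X's GDDR5, and about 15.75 GB/s for a PCIe 3.0 x16 link, which is what an actual swap from system RAM has to cross):

```python
# Back-of-the-envelope: time to move a full frame buffer's worth of data
# at each interface's assumed peak bandwidth.

def full_sweep_ms(capacity_gb, bandwidth_gbs):
    """Milliseconds to move capacity_gb at bandwidth_gbs."""
    return capacity_gb / bandwidth_gbs * 1000

print(f"HBM, 4 GB:      {full_sweep_ms(4, 512):.1f} ms")
print(f"GDDR5, 12 GB:   {full_sweep_ms(12, 336):.1f} ms")
print(f"PCIe 3.0 x16, 4 GB: {full_sweep_ms(4, 15.75):.1f} ms")
```

Sweeping the whole 4GB at HBM speed takes under 8 ms, but refilling it over PCIe takes roughly a quarter of a second, which is why running out of capacity can still cause stutter no matter how fast the VRAM itself is.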


Not true, not true at all. HBM1 is simply limited to 4GB, which might be problematic. People have argued that it will be less of an issue because swapping memory can be done faster; that's it.

WAN show.



They have no retaliation. They just launched the TITAN X a few months ago, and the GTX 980 Ti literally just came out. The only step they have left is a "TITAN X2" (dual-GM200 card), while AMD has already shown off a GTX 560-sized card rocking two Fiji chips. AMD has far better multi-GPU drivers as well. The soup is too thick for Nvidia to swim in anymore if these numbers are validated across the board.

What I sort of meant is either hardware next year (Pascal) or driver updates. There's the potential scenario that Nvidia has been holding back on the driver side until AMD made their move. However, I hope this isn't the case.


I wonder how far down the line the 2nd gen is. I'm only settling for dual Fiji or nothing.

ROG X570-F Strix AMD R9 5900X | EK Elite 360 | EVGA 3080 FTW3 Ultra | G.Skill Trident Z Neo 64gb | Samsung 980 PRO 
ROG Strix XG349C Corsair 4000 | Bose C5 | ROG Swift PG279Q

Logitech G810 Orion Sennheiser HD 518 |  Logitech 502 Hero

 


Not true, not true at all. HBM1 is simply limited to 4GB, which might be problematic. People have argued that it will be less of an issue because swapping memory can be done faster; that's it.

Apart from the faster memory swapping, AMD has made a bold claim that current-gen GPUs and games are unintelligent and wasteful in terms of what they store in VRAM, and they claim this will be alleviated with a more intelligent driver for the Fury. Once again, the proof will be in the independent reviews...

 

[Macri] expressed confidence in AMD's ability to work around this capacity constraint. In fact, he said that current GPUs aren't terribly efficient with their memory capacity simply because GDDR5's architecture required ever-larger memory capacities in order to extract more bandwidth. As a result, AMD "never bothered to put a single engineer on using frame buffer memory better," because memory capacities kept growing. Essentially, that capacity was free, while engineers were not. Macri classified the utilization of memory capacity in current Radeon operation as "exceedingly poor" and said the "amount of data that gets touched sitting in there is embarrassing."

With HBM, he said, "we threw a couple of engineers at that problem," which will be addressed solely via the operating system and Radeon driver software. "We're not asking anybody to change their games."

http://techreport.co...ory-explained/2

 

We will have to look closely at the independent reviews to see if Fury hits a VRAM bottleneck and stutters in modern games at 4K. If it does, that's bad news for AMD. If not, they're good...


[attached image]

The R7 370 is a 1024SP card???? It says it is.

Computing enthusiast. 
I used to be able to input a cheat code; now I've got to input a credit card - Total Biscuit
 


So a two-year wait isn't entirely wasted, huh... OMG, show us how the Nano performs :wub:

On AMD's Twitch channel I was asking one of them, and the Nano is trying to hit 5K so it can handle most games at 4K with 60 FPS.


I was making a jab at AMD for how they tend to have pretty damn high power usage on their processors and GPUs :P

 

But don't forget the Titan X has one 6-pin and one 8-pin power connector, whereas the Fury X has two 8-pins.

 

Unless it's a multi-GPU card, I don't see how exactly those numbers could be legit.

Do you even AMD bro?

 

Seriously though, the only reason the Fury X uses 2x 8-pin is because it's an overclocking enthusiast card.

 

Far Cry 4 did favor better memory setups, and the R9 Fury X was likely to rock UHD, but that's insane. There has to be some targeted driver optimization at work here. Even half that performance boost over the 980 Ti would be great.

 

Absolutely amazing if this is accurate across most games. Really hope the Beijing crew aren't bullshitting.

If these numbers hold true, then something tells me these slides might be valid, just using the R9 390X branding from before AMD decided on a custom moniker.

 

[attached image: AMD-Radeon-R9-390X_Performance.jpg]

 

 

What I sort of meant is either hardware next year (Pascal) or driver updates. There's the potential scenario that Nvidia has been holding back on the driver side until AMD made their move. However, I hope this isn't the case.

Actually, Nvidia might be deeper in the pit than you would think. They've already optimized their drivers for a threaded topology, which means that once DirectX 12 rolls around, AMD hardware is going to see a bigger performance improvement than Nvidia's. After Pascal we will see Arctic Islands, where I predict a new Fiji-class GPU packing 6000+ SPUs.


A single Fury X draws 275W of power. So you can CrossFire two with a quality 750W PSU...

Cpu

- snip-


This looks very nice, but let's focus on the small star next to the "Ultra*" settings. There's something there: we don't know what Ultra means in this case, or what level of AA was used. The TechPowerUp benchmarks used as reference are, as they always say, run with 4xMSAA if the game has the option, yes, even at 4K. So this might not be the perfect perspective.

 

I noticed that too. I can only assume it says something like "all GameWorks BS off". 4xMSAA at 4K is just a stupid waste of everyone's time.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


[attached image]

The R7 370 is a 1024SP card???? It says it is.

It's an overclocked 265. I think the benchmarks would be higher if it were a 270X.


I was making a jab at AMD for how they tend to have pretty damn high power usage on their processors and GPUs :P

 

But don't forget the Titan X has one 6-pin and one 8-pin power connector, whereas the Fury X has two 8-pins.

 

Unless it's a multi-GPU card, I don't see how exactly those numbers could be legit.

 

At 1.5-2x the performance per watt of a 290X, it's not really that difficult to believe it uses 275 watts, especially given HBM's low power use.

 

Also don't forget that the Fury X has to power a water-cooling pump and a 120mm fan through those connectors. Probably a LOT of overhead for OC as well.

 

With 4096 stream processors, I can.
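The perf-per-watt claim can be sanity-checked with some quick arithmetic. Assuming a 290X draws about 275W in games and that Fiji's 4096 SPs give roughly 1.5x a 290X's performance (vs. its 2816 SPs), the implied board power at each end of AMD's claimed efficiency range:

```python
# Implied Fiji board power from AMD's claimed 1.5-2x performance per watt.
# Assumed baseline: R9 290X at ~275 W typical gaming draw, with Fiji
# delivering ~1.5x its performance (4096 vs 2816 stream processors).

def implied_power(perf_ratio, perf_per_watt_ratio, base_watts=275):
    """Watts needed to hit perf_ratio at the given efficiency ratio."""
    return base_watts * perf_ratio / perf_per_watt_ratio

for ppw in (1.5, 2.0):
    print(f"{ppw}x perf/W -> {implied_power(1.5, ppw):.0f} W")
```

Even at the low end of the claim, 1.5x performance at 1.5x perf/W lands exactly on 275W, so the rated figure is at least internally consistent.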



It really depends on what they were doing in FC4 during the benchmark. It could be meaningless.

 

Anyways, it's AMD, so it uses as much power as a small village and melts aluminum cases.

I'm glad that Maxwell is so efficient it can run on AAA batteries and at ambient temps.


Cpu

What about it? If you're referring to power consumption, then there's plenty left over to push the rest of the system.

 

[attached image: power-comsumption.jpg]

 

An SS-750KM3 would do the job nicely.

