
AMD Fury X Far Cry 4 game performance from AMD

ahhming

when you go from 16x to 8x you don't see any drop? oh yes you do - even when going from PCIe gen3 to gen2

Prove it...

http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review/5#.VYExZ_mqhGE

It's a bit old, but still relevant.

 

You feel the 1-2 FPS difference???

MARS_PROJECT V2 --- RYZEN RIG

Spoiler

 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 


there is but one tiny issue with that theory

HBM only improves bandwidth between the GPU and its VRAM, not between the system and the video card

did the PCIe change? no! so, in the grand scheme of the system as a whole, HBM is rather irrelevant

a total game changer would be when we have HBM for system RAM and a CPU to support it
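to put some rough numbers on that point, here's a quick back-of-envelope sketch in Python using the publicly documented specs (PCIe 2.0: 5 GT/s with 8b/10b encoding; PCIe 3.0: 8 GT/s with 128b/130b; Fury X HBM: 4096-bit bus at 1 Gbps per pin):

```python
# Peak one-direction bandwidth figures, from published specs.
def pcie_bw_gbs(gt_per_s, encoding, lanes=16):
    """Theoretical PCIe bandwidth in GB/s for one direction."""
    return gt_per_s * encoding / 8 * lanes

gen2 = pcie_bw_gbs(5.0, 8 / 10)     # PCIe 2.0: 5 GT/s, 8b/10b encoding
gen3 = pcie_bw_gbs(8.0, 128 / 130)  # PCIe 3.0: 8 GT/s, 128b/130b encoding
hbm1 = 4096 * 1.0 / 8               # Fury X HBM: 4096-bit bus, 1 Gbps/pin

print(f"PCIe 2.0 x16: {gen2:.1f} GB/s")   # 8.0
print(f"PCIe 3.0 x16: {gen3:.2f} GB/s")   # 15.75
print(f"HBM1 (Fury X): {hbm1:.0f} GB/s")  # 512
```

whatever the card's local memory does, the link to the rest of the system stays the same ~8-16 GB/s.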

Why are you on a quest against progress? It's not just mild progress, but huge progress.

You were that guy that said GDDR3 was just fine when AMD brought GDDR5, am I right?


I was making a jab at AMD for how they tend to have pretty damn high power usage on their processors and GPUs :P

 

But don't forget the Titan X has one 6-pin and one 8-pin power connector, whereas the Fury X has two 8-pins.

 

Unless it's a multi-gpu card, I don't see how exactly those numbers could be legit.

Don't read too much into the whole power connector thing... You could power a 290X on a single 6-pin if the PSU and GPU were built the right way. It's not practical because it's pretty expensive to do - just saying.

Core i7 4820K  |  NH-D14 | Rampage IV Extreme | Asus R9 280X DC2T | 8GB G.Skill TridentX | 120GB Samsung 840 | NZXT H440  |  Be quiet! Dark Power Pro 10 650W


You feel the 1-2 FPS difference???

and there it is, you proved it for me - don't look just at the averages, look at the min and max too ;)


I doubt this is true, mostly because it's coming directly from AMD.

I'll believe it when I see benchmarks from a reputable source.

The same would go if it came from Nvidia.

Would you like some salt with that salt sir? :P

CPU : i5-8600k , Motherboard: Aorus Z370 Ultra Gaming , RAM: G.skill Ripjaws 16GB 3200mhz ,GPU : Gigabyte 1070 G1 ,Case: NZXT Noctis 450 ,Storage : Seagate 1TB HDD , Seagate Barracuda 2TB, Samsung 860 EVO 500GB , KINGSTON SHFS37A/240G HYPERX FURY 240GB  , PSU : Corsair RM 750X , Display(s) : LG Flatron W2243S , Dell U2715H , Cooling: Coolermaster Hyper 212x Evo, Keyboard: Coolermaster Rapid-i , Drevo Tyrfing Black , Coolermaster Masterkeys Pro S, Mouse : Logitech G502 Proteus Spectrum , Windows 10 


I noticed that too. I can only assume it says something like all the GameWorks bs is off. 4xMSAA at 4k is just a stupid waste of everyone's time.

I didn't say that 4xMSAA is needed at 4k, but that the Techpowerup example uses it. And we don't know the setup of the AMD Far Cry test. If they used FXAA, it is absolutely possible to have a disparity of 50 avg vs 35 avg, even though under the same conditions they would actually be extremely close.

What I'm saying is, before everyone goes crazy over these results, let's see some reviews and benchmarking.

If these results are true, then with that price it would almost be too good to be true.


and there it is, you proved it for me - don't look just at the averages, look at the min and max too ;)

There's very little difference regardless of the slot you place a card in (1-3 FPS). Linus has done at least three videos about this.

 

 

 


There's very little difference regardless of the slot you place a card in (1-3 FPS). Linus has done at least three videos about this.

 

 

 

Last 2 are the same.

Almost got it......

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


There's very little difference regardless of the slot you place a card in (1-3 FPS). Linus has done at least three videos about this.

the problem is not that the difference is little, the problem is that there is a difference

and as you go up in resolution that PCIe bandwidth matters even more

just compare TW3 on PCIe gen3 vs gen2 and see how objects have pop-in on gen2 - extremely noticeable

---

here is one more example on why HBM is kinda irrelevant:

XB1 vs PS4 - XB1 uses DDR3 while PS4 uses GDDR5, both as unified RAM

and yet, there is very little difference between them
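for reference, the raw gap between the two consoles' unified memory pools is big on paper (published specs; this ignores the XB1's small 32 MB ESRAM side pool):

```python
# Peak unified-RAM bandwidth of each console, from published specs.
ps4_gddr5 = 256 / 8 * 5.5   # PS4: 256-bit bus * 5.5 Gbps per pin -> GB/s
xb1_ddr3 = 256 / 8 * 2.133  # XB1: 256-bit bus * 2133 MT/s -> GB/s

print(f"PS4 GDDR5: {ps4_gddr5:.0f} GB/s")  # 176
print(f"XB1 DDR3:  {xb1_ddr3:.1f} GB/s")   # 68.3
```

over 2.5x the bandwidth on paper, yet the games end up very close, which is the point.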


in the conference at e3 earlier today they called 4k 4096 by 2160

And?? That's 4K for you.

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


Yes and that's the fact that bothers me. I always thought 4GB VRAM is barely enough for 4k? Or does it depend on the game and if the game actually uses that much VRAM or not?

i think people try to simplify things because it's easier to understand that way. The card and the drivers are still huge factors even at 4k. The best example is the 290x 4gb edition vs the 8gb edition: going to 8gb you barely get any actual difference and sometimes you even lose fps. The same goes for the 980ti vs the titan x, and because the 980ti can actually be overclocked it performs better when you go for it.

 

Now we still need to wait for benchmarks. But if the fury x can be overclocked like the 970 (even with all its faults it's still a great card) it's gonna be the best value on the market.

fx-8350 @4,4Ghz/sapphire r9 fury/2x 8gb Kingstone ddr3 @2030Mhz


I seriously hope this is true, as Far Cry 4 doesn't have a built in benchmark from what I know. 

I really hope the Fury X smashes the Titan X in all gaming tests. Some proper competition and decent prices are what we need. As a current NVIDIA owner I welcome AMD giving them a damn good thrashing.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


I seriously hope this is true, as Far Cry 4 doesn't have a built in benchmark from what I know. 

I really hope the Fury X smashes the Titan X in all gaming tests. Some proper competition and decent prices are what we need. As a current NVIDIA owner I welcome AMD giving them a damn good thrashing.

Not enough people think this way; too many stick fanatically to their preferred brand. Some just love to hate AMD, others are naive, thinking that things will get better if AMD dies.

 

Personally, I want to see Crysis 3, GTA V, AC Unity, Total War and COH2 benchmarks. The more the merrier, but these five would allow me to gauge its performance enough to start drawing conclusions.

This is LTT. One cannot force "style over substance" values & agenda on people that actually aren't afraid to pop the lid off their electronic devices, which happens to be the most common denominator of this community. Rather than take shots at this community in every post, why not seek out like-minded individuals elsewhere?


I seriously hope this is true, as Far Cry 4 doesn't have a built in benchmark from what I know. 

I really hope the Fury X smashes the Titan X in all gaming tests. Some proper competition and decent prices are what we need. As a current NVIDIA owner I welcome AMD giving them a damn good thrashing.

Well, this is a GameWorks game, with the Titan probably enabling all of that while the AMD card wouldn't dare... so I'm assuming the gap isn't really that great.

Computing enthusiast. 
I use to be able to input a cheat code now I've got to input a credit card - Total Biscuit
 


Well, this is a GameWorks game, with the Titan probably enabling all of that while the AMD card wouldn't dare... so I'm assuming the gap isn't really that great.

 

I always assume Ultra means Ultra. Not Ultra* where * = features not enabled.

 

That's why I'm really hoping it can beat the Titan X, hell even if it just matches it 1-1 it means it's better than the 980Ti at the same price point.

I'm hoping lots of reviewers actually do test games with Gameworks on. I really want to see how it performs truly at Ultra.


 


Not enough people think this way; too many stick fanatically to their preferred brand. Some just love to hate AMD, others are naive, thinking that things will get better if AMD dies.

 

Personally, I want to see Crysis 3, GTA V, AC Unity, Total War and COH2 benchmarks. The more the merrier, but these five would allow me to gauge its performance enough to start drawing conclusions.

 

I want a 4K Crysis 3 benchmark, just because it's the closest you're going to get to knowing what Star Citizen is going to run like.

 

 

I always assume Ultra means Ultra. Not Ultra* where * = features not enabled.

 
The less idiotic reviewers have finally started ignoring MSAA, though. Or at the very least only used one step of MSAA. Because frankly, if you use MSAA at 4K you're not the sharpest knife in the drawer.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


 

The less idiotic reviewers have finally started ignoring MSAA, though. Or at the very least only used one step of MSAA. Because frankly, if you use MSAA at 4K you're not the sharpest knife in the drawer.

 

 

Not the sharpest knife? More like a ladle.


The less idiotic reviewers have finally started ignoring MSAA, though. Or at the very least only used one step of MSAA. Because frankly, if you use MSAA at 4K you're not the sharpest knife in the drawer.

 

 

Hey now! Nothing wrong with 16x MSAA at 4K! How else will we know if the cards can get over 15 fps?!  :P


 


So much performance per square inch on that card x.x

 

Looks like the 980 Ti has a contender in my upcoming mITX box. I planned on going 12" x 12" x 2.6", but if I could get my hands on this bad boy, I might be able to shave a few inches off to get a 12x8x2.6 box going. Super stoked to see this card is not falling behind like the rest of them.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


and there it is, you proved it for me - don't look just at the averages, look at the min and max too ;)

I asked you specifically to prove that I, or you for that matter, with our human eyes can feel the difference.

I was completely aware of the fact that a difference exists.

But can you feel it? I am sure you can't.

 

And also... I was 99.99% positive that you might say I proved it for you... although I didn't.

You said that you can feel the difference. I said that you can't, and believe me, you can't feel a 1-3 FPS difference, not even in the minimums.
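for what it's worth, the min/avg/max numbers everyone keeps arguing about are trivial to compute from a frame-time log. A sketch with made-up, purely hypothetical frame times, just to show how one long frame drags the minimum down without moving the average much:

```python
import statistics

def fps_stats(frame_times_ms):
    """Convert per-frame render times (ms) to (min, avg, max) FPS."""
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), statistics.mean(fps), max(fps)

# Hypothetical traces of the same scene in an x16 and an x8 slot.
x16 = [16.5, 16.7, 16.4, 17.0, 16.6]
x8 = [16.6, 16.9, 16.5, 17.9, 16.7]  # one slightly longer frame

for label, trace in (("x16", x16), ("x8", x8)):
    lo, avg, hi = fps_stats(trace)
    print(f"{label}: min {lo:.1f} / avg {avg:.1f} / max {hi:.1f} FPS")
```

in this made-up example the averages differ by about 1 FPS but the minimums by about 3, which is why people say to look past the averages.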


 


So pretty much the Fury was the only good thing they released?

 

lol isn't that enough? you have 4 products; it's like Nvidia releasing the Maxwell lineup: 970, 980, 980 Ti, and Titan

you have:

1- Full Fiji, the XT, known as Fury X (Titan X perf+), 50% more performance than its previous flagship with the same 275 W TDP

2- Fury Pro, known as Fury (980 - 980 Ti perf)

3- Fury LE, known as Nano (970-980 perf), a 6-inch card with a 175 W TDP

4- Dual Fiji GPU (World's Fastest Graphics Card)

and on top of that they rebrand low-to-mid-end GPUs with DOUBLE the memory

and you are not happy!!! seriously WTF!


lol isn't that enough? you have 4 products; it's like Nvidia releasing the Maxwell lineup: 970, 980, 980 Ti, and Titan

you have:

1- Full Fiji, the XT, known as Fury X (Titan X perf+)

2- Fury Pro, known as Fury (980 - 980 Ti perf)

3- Fury LE, known as Nano (970-980 perf), a 6-inch card with a 175 W TDP

4- Dual Fiji GPU (World's Fastest Graphics Card)

and on top of that they rebrand low-to-mid-end GPUs with DOUBLE the memory

and you are not happy!!! seriously WTF!

 

I'm not happy with their decision to rebrand. Yes. Also, just recently learned about nano and dual.

 

 

And before the record breaks; I know nvidia rebrands too.

|  The United Empire of Earth Wants You | The Stormborn (ongoing build; 90% done)  |  Skyrim Mods Recommendations  LTT Blue Forum Theme! | Learning Russian! Blog |
|"They got a war on drugs so the police can bother me.”Tupac Shakur  | "Half of writing history is hiding the truth"Captain Malcolm Reynolds | "Museums are racist."Michelle Obama | "Slap a word like "racist" or "nazi" on it and you'll have an army at your back."MSM Logic | "A new command I give you: love one another. As I have loved you, so you must love one another"Jesus Christ | "I love the Union and the Constitution, but I would rather leave the Union with the Constitution than remain in the Union without it."Jefferson Davis |


I'm not happy with their decision to rebrand. Yes. Also, just recently learned about nano and dual.

 

 

And before the record breaks; I know nvidia rebrands too.

 

I understand why they rebranded; I would have done the same in their position. I don't see the need to spend money on 2 or 3 extra architectures to fill low-to-mid-end slots in the lineup; that's like a 0.5 to 1 billion R&D investment in more 28nm designs, when I'm gonna start producing 16nm FinFET tech in 6 months and I don't have any money to spare to start with.

I release the last 28nm gen I have, double its memory, add a slight overclock, and hang on for 6 months. The only thing I don't like in this is the pricing of the 300 series; they should have cut more on them.

PS: the 370 shouldn't have been in the lineup though. I mean, the 290 and 285 were never rebranded... OK, the 260 is still ok, but the 270 should have been thrown out of the lineup like the 280.


I'm not happy with their decision to rebrand. Yes. Also, just recently learned about nano and dual.

 

 

And before the record breaks; I know nvidia rebrands too.

Not as badly though, and any rebrands get bumped down a notch (680 > 770).



....

as I posted here, yes you can:

just compare TW3 on PCIe gen3 vs gen2 and see how objects have pop-in on gen2 - extremely noticeable

