AMD Fury X Far Cry 4 game performance from AMD

ahhming

So, AMD could have most of the market regarding value/price-to-performance... well maybe (tech reviewer benches in wait)  Only the $300 price point eludes them, that pesky 970.

 

Anything below a 750Ti is being taken by IrisPro, and the 750Ti is not the best deal in its price range.  How off am I?

I'm gonna go with those benchmarks are a load of bullshit on this one.

I wonder how much power it's gonna draw, I'm guessing that you're gonna need at least a 1000W PSU to have two in crossfire.

--Note to those who don't understand things--

The whole 1000W-PSU-for-two-Fury-Xs-in-CF thing was a jab at AMD, as their cards tend to be on the higher power consumption side of things.

you mean that's something that fanboys say

how long before people say omg it's too hot because it's 20W higher than Nvidia and start making comments about it burning down their house

as I posted here yes you can

Wow, you really do know nothing about technology, do you?

"I can notice the difference with pop-in!" (somehow, look at all this proof I am showing against the mountain of counter evidence refuting my claim)

 

Also, since consoles have unified memory, they do not have to go through the PCIe bus, and NO, the PS4 and XB1 DO NOT, ANYWHERE NEAR, "perform nearly the same"

Also, DDR3 is very similar performance-wise to GDDR5, so the point about the consoles being "not very different" (which they very much are) again means nothing.

If you knew what you were talking about again you would know this. DDR2 similar to GDDR3, DDR3 similar to GDDR5.

as I posted here yes you can

Posted what? A statement no one can verify? Where are your PCIE 3.0 and PCIE 2.0 platforms? Where is a video or any research data to back up your statement? You just pulled that out of your ass and are acting like your statement sticks just because it can't be verified.

 

The other poster, the one with the cock avatar totally exposed your PCIE 3 nonsense and instead of conceding like you should have, you keep on talking BS.

This is LTT. One cannot force "style over substance" values & agenda on people that actually aren't afraid to pop the lid off their electronic devices, which happens to be the most common denominator of this community. Rather than take shots at this community in every post, why not seek out like-minded individuals elsewhere?

...

and that was exactly my point

because HBM does not increase PCIe bandwidth

and also HBM is only relevant in GPU-VRAM communication

case and point the DDR3 vs GDDR5 make very little difference in real world performance

Posted what? A statement no one can verify? Where are your PCIE 3.0 and PCIE 2.0 platforms? Where is a video or any research data to back up your statement? You just pulled that out of your ass and are acting like your statement sticks just because it can't be verified.

The other poster, the one with the cock avatar totally exposed your PCIE 3 nonsense and instead of conceding like you should have, you keep on talking BS.

I'm sorry, do I look like I work for LTT?!?!

and yes, I have a PCIe 2.x platform and I can attest to the occasional asset pop-in in TW3

care to disprove me? be my guest

So, AMD could have most of the market regarding value/price-to-performance... well maybe (tech reviewer benches in wait)  Only the $300 price point eludes them, that pesky 970.

 

Anything below a 750Ti is being taken by IrisPro, and the 750Ti is not the best deal in its price range.  How off am I?

 

Yep, they're getting greedy with the 390X; they could have priced it close to the 970 and it would fare reasonably well, maybe even win. It seriously baffles the fucking mind that they think people will be willing to pay $450 for it, supreme fucking idiocy on their part.

-------

Current Rig

-------

case and point

 

You have no case, you have no point. And the idiom is "case in point".

 

 

Yep, they're getting greedy with the 390X; they could have priced it close to the 970 and it would fare reasonably well, maybe even win. It seriously baffles the fucking mind that they think people will be willing to pay $450 for it, supreme fucking idiocy on their part.

 

 

They're just fishing for alpha fanboy money; it'll probably take a price cut down the road.

 

However, GDDR5 is expensive, so kitting it out with 8GB was a mistake to begin with.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.

Yes I will. I have a PCIE 3 platform and I also get pop in.

What's funny is that you are clueless as to why that occurs.

You are so bent on knocking Radeon cards, you stoop so low with that agenda.


You have no case, you have no point. And the idiom is "case in point".

yes it is, because a unified RAM architecture gets rid of the need to make system-RAM-to-VRAM swaps

thus, you have a very concrete DDR3 vs GDDR5 comparison

 

if you fail to understand that ... not my problem

Yep, they're getting greedy with the 390X; they could have priced it close to the 970 and it would fare reasonably well, maybe even win. It seriously baffles the fucking mind that they think people will be willing to pay $450 for it, supreme fucking idiocy on their part.

 

I guess they have it there to battle with the 980.  The problem is the price point for the 980 is kind of bad for what it is after the Ti release.  ($450-$475 980?)

 

Should be moderately interesting to see this play out.

snip

 

My bad, I forgot; you're ignored now, don't worry about quoting me, good luck on your crusade


I guess they have it there to battle with the 980.  The problem is the price point for the 980 is kind of bad for what it is after the Ti release.  ($450 980?)

 

Should be moderately interesting to see this play out.

 

If that is the case, why price the Fury air version at $550? That's right in the 980's ($500s) ballpark, so it's not that. Again, they're just trying to get away with scamming customers on the 390X, which will of course backfire; they will cut the price quickly, so it's only an early-adopter trap.


If that is the case, why price the Fury air version at $550? That's right in the 980's ($500s) ballpark, so it's not that. Again, they're just trying to get away with scamming customers on the 390X, which will of course backfire; they will cut the price quickly, so it's only an early-adopter trap.

 

F*ck, you're right.  :mellow:

lol, isn't that enough? You have 4 products; it's like Nvidia releasing the Maxwell lineup: 970, 980, 980 Ti, and Titan

you have:

1- Full Fiji, the XT, known as Fury X (Titan X perf+): 50% more performance than its previous flagship at the same 275 W TDP

2- Fury Pro, known as Fury (980 - 980 Ti perf)

3- Fury LE, known as Nano (970-980 perf), a 6-inch card with a 175 W TDP

4- Dual Fiji GPU (World's Fastest Graphics Card)

and on top of that they rebrand low-to-mid-end GPUs with DOUBLE the memory

and you are not happy!!! seriously WTF!

The Fury and the Fury X are the exact same card, except the X is water cooled: $550 and $650 respectively.

I'm sorry, do I look like I work for LTT?!?!

and yes, I have a PCIe 2.x platform and I can attest to the occasional asset pop-in in TW3

care to disprove me? be my guest

Easy, I get pop in on a PCIE 3 platform.

So you admit to not having access to both PCIE 2 & 3 platforms, no videos, no links to articles, no research, yet you jumped at the opportunity to spread your baseless claim.

What was that thought process like? "Hurr durr, I get pop in on PCIE 2.0, I'm going to ignore all other possible explanations, my PCIE is too low"


One thing I never do is believe benchmarks in a release or reveal event. 

I will happily wait for sites to get their hands on the card and do their own tests.

Command Center:

Case: Corsair 900D; PSU: Corsair AX1200i; Mobo: ASUS Rampage IV Black Edition; CPU: i7-3970x; CPU Cooler: Corsair H100i; GPU: 2x ASUS DCII GTX780Ti OC; RAM: Corsair Dominator Platinum 64GB (8x8) 2133MHz CL9; Speaker: Logitech Z2300; HDD 1: Samsung 840 EVO 500GB; HDD 2: 2x Samsung 540 EVO 500GB (Raid 0); HDD 3: 2x Seagate Barracuda 3TB (Raid 0); Monitor 1: LG 42" LED TV; Monitor 2: BenQ XL2420TE; Headphones 1: Denon AH-D7000; Headphones 2: Audio-Technica AD1000PRM; Headphones 3: Sennheiser Momentum Over-Ear; Headset: Steelseries Siberia Elite; Keyboard: Corsair Strafe RBG; Mouse: Steelseries Rival 300; Other: Macbook Pro 15 Retina (Mid-2014), PlayStation 4, Nexus 7 32GB (2014), iPhone 6 64GB, Samsung Galaxy S6 64GB
yes it is, because a unified RAM architecture gets rid of the need to make system-RAM-to-VRAM swaps

thus, you have a very concrete DDR3 vs GDDR5 comparison

 

if you fail to understand that ... not my problem

 

The performance difference between the XB1 and PS4, which DOES exist (regardless of your stated points with no proof), is due to the faster GPU on the PS4, NOT MEMORY. It is FAR from a concrete DDR3 vs GDDR5 comparison.

 

Did you actually not read my post AT ALL? Or did you just notice the fact that I also said there was no PCIe communication and immediately assume everything else I said also agreed with you 100%?

 

Specifically the part stating that DDR3 is SIMILAR PERFORMANCE-WISE TO GDDR5, THEREFORE there won't be much performance difference. DDR3 simply has much lower latency, but also much lower bandwidth.

HBM is not similar performance-wise to DDR3 and GDDR5.

 

THEREFORE there WILL be a performance difference.
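For reference, the raw peak numbers behind the console comparison are easy to work out from the published specs (a quick arithmetic sketch of theoretical peak bandwidth only, not a performance claim; the clocks and bus widths are the commonly cited figures for each console):

```python
def peak_bandwidth_gbs(transfer_rate_mtps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth: transfers/s times bytes per transfer."""
    return transfer_rate_mtps * 1e6 * (bus_width_bits / 8) / 1e9

# Xbox One: DDR3-2133 on a 256-bit bus
xb1 = peak_bandwidth_gbs(2133, 256)   # ~68.3 GB/s
# PS4: GDDR5 at an effective 5500 MT/s on a 256-bit bus
ps4 = peak_bandwidth_gbs(5500, 256)   # ~176 GB/s
print(f"XB1: {xb1:.1f} GB/s, PS4: {ps4:.1f} GB/s")
```

Peak bandwidth alone doesn't settle the latency-versus-bandwidth argument, but it does show the raw gap the two sides are arguing about.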

Easy, I get pop in on a PCIE 3 platform.

So you admit to not having access to both PCIE 2 & 3 platforms, no videos, no links to articles, no research yet you jumped at the opportunity to spread your baseless claim.

What was that thought process like? "Hurr durr, I get pop in on PCIE 2.0, I'm going to ignore all other possible explanations, my PCIE is too low"

if I'd known in advance I'd be having this discussion, I'd have prepared recorded footage

but demanding that I pop out evidence at the snap of a finger... wow!

 

no one even bothered testing for this and bam! ... I'm instantly wrong  :blink:

if I'd known in advance I'd be having this discussion, I'd have prepared recorded footage

but demanding that I pop out evidence at the snap of a finger... wow!

Well, before you have any proof, don't make claims that are refuted by every other benchmark on the internet and just say "well I had some personal experience so I MUST be right and everyone else MUST be wrong!"

Well, before you have any proof, don't make claims that are refuted by every other benchmark on the internet and just say "well I had some personal experience so I MUST be right and everyone else MUST be wrong!"

what other benchmarks on the internet? you mean the ones that show there is a difference in PCIe gen 3 vs gen 2

or the ones that show exactly what I say: assets pop in; but wait... there aren't any, so suddenly I'm at fault

 

yeah ... I think we're done here

and that was exactly my point

because HBM does not increase PCIe bandwidth

and also HBM is only relevant in GPU-VRAM communication

case and point the DDR3 vs GDDR5 make very little difference in real world performance

You really have no clue as to how graphics cards and their components work, do you?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

You really have no clue as to how graphics cards and their components work, do you?

enlighten me master

what other benchmarks on the internet? you mean the ones that show there is a difference in PCIe gen 3 vs gen 2

or the ones that show exactly what I say: assets pop in; but wait... there aren't any, so suddenly I'm at fault

 

yeah ... I think we're done here

 

Are you... actually... this... dumb

 

Literally every benchmark you can find on PCI-E bandwidth testing in games shows almost NO difference in FPS; only once you get down to crazy low amounts like PCI-E 1.1 x8 / PCI-E 2.0 x4 does it even make a minuscule difference.

 

There have already been links to videos of similar things in this thread, but here.

 

https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/1.html

http://www.tomshardware.com/reviews/graphics-performance-myths-debunked,3739-3.html

https://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/

 

inb4 he argues that some of the charts prove his point slightly while completely ignoring the margin of error.

 

EDIT: PCI-E bandwidth ONLY MATTERS for GPGPU purposes, in which it is incredibly important, as swapping large amounts of data multiple times in a single frame will show PCI-E's weakness. This has almost NO impact on games, however, as VRAM is not swapped anywhere near as much as in a GPGPU environment.
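The games-versus-GPGPU distinction can be made concrete with back-of-the-envelope numbers (link rates are the standard PCI-E per-lane figures; the 60 fps frame budget is an illustrative assumption):

```python
# Per-direction peak throughput of an x16 link, in GB/s.
PCIE2_X16 = 16 * 0.5                   # 500 MB/s per lane after 8b/10b encoding
PCIE3_X16 = 16 * 8 * (128 / 130) / 8   # 8 GT/s per lane, 128b/130b encoding

def per_frame_budget_mb(link_gbs: float, fps: int = 60) -> float:
    """Data that can cross the bus per rendered frame, in MB."""
    return link_gbs * 1000 / fps

# Even "slow" PCI-E 2.0 x16 can move ~133 MB every single frame at 60 fps,
# far more than a game typically streams mid-frame; a GPGPU job that
# shuttles whole buffers back and forth each iteration is another story.
print(per_frame_budget_mb(PCIE2_X16))  # ~133.3
print(per_frame_budget_mb(PCIE3_X16))  # ~262.6
```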

I really do hope this card performs as they claim.  If the price is right then that will make all the lower end stuff even cheaper again.  I fear however that all the rebrands might have damaged AMD's image too much already for this to happen.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

enlighten me master

1. GDDR5 has significantly higher bandwidth than DDR3, even though GDDR5 is based on DDR3, which is why the lower-performing DDR3 is used on cheaper graphics cards with weak GPUs.

2. PCIe 2.0 has been proven time and again (even by Linus himself) to have enough bandwidth for any modern graphics card; only a few very rare and specific situations not related at all to gaming need more bandwidth than PCIe Gen 2.0 supplies, mainly tasks using co-processing cards. And I speak from experience when I say that playing games on PCIe Gen 2.0 and PCIe Gen 3.0 makes no discernible difference (i5 4440, GTX 970, Asus H87M-Pro set to PCIe Gen 3.0 x16 and Gen 2.0 x16 respectively).

3. Pop-in is related to the game engine itself, normally caused by a bug or poor optimization.

Edit: BTW, you're an idiot for using The Witcher 3 as an example: http://www.polygon.com/2015/5/26/8659397/graphics-complaints-arise-following-new-witcher-3-patch-for-pc
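To put numbers on point 1, here is the bandwidth arithmetic for discrete cards, assuming a typical 64-bit DDR3-1600 budget card and a 7 GT/s, 256-bit GDDR5 card (illustrative figures, roughly GTX 970-class):

```python
# peak GB/s = effective MT/s * bus width in bytes / 1000
ddr3  = 1600 * 8  / 1000   # 64-bit DDR3-1600 budget card: 12.8 GB/s
gddr5 = 7000 * 32 / 1000   # 256-bit 7 GT/s GDDR5 card: 224 GB/s
print(f"GDDR5 advantage: {gddr5 / ddr3:.1f}x")  # 17.5x
```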

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL
