
AMD Fury X Far Cry 4 game performance from AMD

ahhming

2 versions

they are not just any 2 versions of "4K"

there is 4K (4096x2160) and there is UHD (3840x2160) - this is why "the internet" should get with the programme and stop mislabeling stuff
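
To put numbers on the difference, here's a minimal sketch (plain Python arithmetic; the resolutions are the two standards quoted above):

```python
# Quick comparison of the two resolutions commonly called "4K" (just arithmetic).

resolutions = {
    "DCI 4K": (4096, 2160),  # the cinema standard the "4K" name comes from
    "UHD":    (3840, 2160),  # what consumer "4K" displays actually use
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels, aspect ratio {w / h:.3f}:1")

# DCI 4K renders ~6.7% more pixels than UHD, so "4K" benchmark numbers
# aren't comparable unless the exact resolution is stated.
```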

Horace Mann:

Ignorance breeds monsters to fill up the vacancies of the soul that are unoccupied by the verities of knowledge.


No - we got 4 good things: 4 new models with HBM based on Fiji

 

Fury

Fury X

R9 Nano

Dual Fury

no we didn't - Fury and Fury X are essentially the same, Dual Fury is... well, dual Fury, and then we had the Nano

"if nothing is impossible, try slamming a revolving door....." - unknown

my new rig bob https://uk.pcpartpicker.com/b/sGRG3C#cx710255

Kumaresh - "Judging whether something is alive by it's capability to live is one of the most idiotic arguments I've ever seen." - jan 2017


...Okay, now show me 5K with Pascal and Arctic Islands.  :>

AMD demonstrated the Fury X running two games at 5K during their E3 press conference: one at 45 FPS and the other at 60 FPS.


I'll let you in on a little secret that might twist your nips. AMD is just as good as Nvidia at power consumption now.

 

....

C'mon man. I returned my 970 because of coil whine and bought a used R9 290X for 2/3 the price of a 970. Just jabbing at people that try to prove how superior Maxwell is.

 

There's not much AMD/NV can do on 28nm, yet people act like Maxwell is a generational jump in GPUs.


 

There's not much AMD/NV can do on 28nm, yet people act like Maxwell is a generational jump in GPUs.

970 and 980 were disappointing performance-wise. But the 980 Ti was a good performance jump IMO...


People have argued that it will be less of an issue because swapping memory can be done faster - that's it.

there is but one tiny issue with that theory

HBM only improves bandwidth between the GPU and VRAM, not between the system and the video card

did the PCIe change? no! did system RAM change? no! so, in the grand scheme of the system as a whole, HBM is rather irrelevant

a total game changer would be when we get HBM for system RAM and a CPU to support it
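
To put rough numbers on this (a back-of-the-envelope sketch; the figures are nominal peak rates from the published specs, and real-world throughput is lower):

```python
# Nominal peak bandwidth comparison: on-card memory vs. the PCIe link
# (back-of-the-envelope only; sustained real-world throughput is lower).

def pcie3_gbps(lanes):
    # PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~0.985 GB/s per lane
    return 8 * lanes * (128 / 130) / 8

FURY_X_HBM     = 512.0  # GB/s: 4096-bit HBM1 at 1 Gbps per pin
GTX980TI_GDDR5 = 336.0  # GB/s: 384-bit GDDR5 at 7 Gbps, for scale

print(f"PCIe 3.0 x16: {pcie3_gbps(16):6.2f} GB/s")
print(f"PCIe 3.0 x8:  {pcie3_gbps(8):6.2f} GB/s")
print(f"980 Ti GDDR5: {GTX980TI_GDDR5:6.2f} GB/s")
print(f"Fury X HBM:   {FURY_X_HBM:6.2f} GB/s")
# The on-card memory is ~20-30x faster than the bus, which is the point above:
# HBM widens the GPU<->VRAM pipe, while anything crossing PCIe stays slow.
```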

---

are those benchmarks true? they probably are

but there should be benchmarks with games that truly have hi-res assets for UHD resolutions, and up


no we didn't - Fury and Fury X are essentially the same

Are you sure clocks will be the same on the Fury and Fury X?


Are you sure clocks will be the same on the Fury and Fury X?

[Image: AMD's full specification slide]

 

The Fury X will be clocked at 1050 MHz; however, there is a dual-BIOS switch which will increase the clock.

The air-cooled Fury's clocks have not been revealed yet.

 

[Image: AMD Radeon Fury X]


One benchmark done by AMD themselves. Let's all trust the results!

"It's a taxi, it has a FARE METER."


970 and 980 were disappointing performance-wise. But the 980 Ti was a good performance jump IMO...

 

The interesting thing, if the Fury really is close to this level of performance (which seems too good to be true - these look cherry-picked, and my guess is real-world results will land around the Titan X or a bit below), is how long before they cut the 980 Ti price, or whether they do it at all.

-------

Current Rig

-------



I doubt this is true, mostly because it's coming directly from AMD.

I'll believe it when I see benchmarks from a reputable source.

The same would apply if it was from Nvidia.

They're not going to be too far wrong, though. And that could be overclocked.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


But isn't 4GB VRAM a bit short for 4K?

 

I will probably buy 2 of those, but would really like around 8GB VRAM on them.

Well tbh I probably won't even need those 4GB as I am looking at 1440p screens right now. However I would like to have the possibility to upgrade to a 4K screen in maybe a year or so :)

My beast (PC):

  • ASUS R.O.G. Rampage IV Black Edition, X79, LGA2011, AC4
  • 2x ASUS Fury X
  • Intel Core i7 4930K BOX (LGA 2011, 3.40GHz) OC @4.4GHz
  • Corsair H100i, CPU Cooler (240mm)
  • Samsung SSD 830 Series (512GB)
  • Samsung SSD 850 Pro 1TB
  • Western Digital 512GB HDD
  • TridentX 4x 8GB 2400MHz
  • Corsair Obsidian 900D


But isn't 4GB VRAM a bit short for 4K?

 

I will probably buy 2 of those, but would really like around 8GB VRAM on them.

Well tbh I probably won't even need those 4GB as I am looking at 1440p screens right now. However I would like to have the possibility to upgrade to a 4K screen in maybe a year or so :)

not enough VRAM even though it performs better than the Titan X - 6.5/10, IGN

fx-8350 @4,4Ghz/sapphire r9 fury/2x 8gb Kingstone ddr3 @2030Mhz


not enough VRAM even though it performs better than the Titan X - 6.5/10, IGN

Yes, and that's the fact that bothers me. I always thought 4GB VRAM was barely enough for 4K? Or does it depend on the game and whether it actually uses that much VRAM or not?
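
For what it's worth, the raw framebuffers at 4K are tiny; it's a game's textures and render targets that fill VRAM, which is why it depends on the game. A quick sketch of the arithmetic:

```python
# Rough framebuffer math at UHD "4K" (textures and render targets, which
# vary per game and per settings preset, are what actually fill 4 GB).

def buffer_mib(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 2**20

W, H = 3840, 2160
color = buffer_mib(W, H, 4)  # 32-bit RGBA color buffer
depth = buffer_mib(W, H, 4)  # 32-bit depth/stencil buffer

print(f"One color buffer:      {color:5.1f} MiB")                # ~31.6 MiB
print(f"Triple-buffer + depth: {3 * color + depth:5.1f} MiB")    # ~126.6 MiB
# Well under 200 MiB of the 4096 MiB total - the question is whether a given
# game's assets at 4K-quality settings overflow what's left.
```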

My beast (PC):

  • ASUS R.O.G. Rampage IV Black Edition, X79, LGA2011, AC4
  • 2x ASUS Fury X
  • Intel Core i7 4930K BOX (LGA 2011, 3.40GHz) OC @4.4GHz
  • Corsair H100i, CPU Cooler (240mm)
  • Samsung SSD 830 Series (512GB)
  • Samsung SSD 850 Pro 1TB
  • Western Digital 512GB HDD
  • TridentX 4x 8GB 2400MHz
  • Corsair Obsidian 900D


Does FC4 have a built-in benchmark? OP's comparison graphs are otherwise meaningless.

Case: Corsair 4000D Airflow; Motherboard: MSI Z490 Gaming Edge; CPU: i7 10700K @ 5.1GHz; Cooler: Noctua NH-D15S Chromax; RAM: Corsair LPX DDR4 32GB 3200MHz; Graphics Card: Asus RTX 3080 TUF; Power: EVGA SuperNova 750G2; Storage: 2 x Seagate Barracuda 1TB; Crucial M500 240GB & MX100 512GB; Keyboard: Logitech G710+; Mouse: Logitech G502; Headphones / Amp: HiFiMan Sundara Mayflower Objective 2; Monitor: Asus VG27AQ


It's just a matter of time until we know enough to pass judgement. It could take as few as five game benchmarks, but I expect it to be a little more complicated than that.

Elsewhere I have seen blind fanboys knocking the Fury with empty talk, and that's always a good sign; I enjoy it when fanboys cry into their pillows because the competitor they hate may have better products.

 

There are enough unbiased professional sites that will dissect the GPU, run a full benchmark suite and publish unadulterated observations.

Do take note: if the underdog delivers a great product, that's when fanboys operate at their highest clocks under full load - like FurMark for haters.

This is LTT. One cannot force "style over substance" values & agenda on people that actually aren't afraid to pop the lid off their electronic devices, which happens to be the most common denominator of this community. Rather than take shots at this community in every post, why not seek out like-minded individuals elsewhere?


So AMD managed to pull another Titan killer out of their asses. And if it's cheaper than the Titan X.....

It is. The Fury X is $650.


If this is true and not AMD telling us it's Ultra while it's not, I'll be damned. Glad they can finally push back against Nvidia; they needed this. We need this competition. Bravo.

The ability to google properly is a skill of its own. 


In the conference at E3 earlier today they called 4K 4096 by 2160.

That is technically what 4K resolution actually is. The 4K that we usually talk about is UHD.


Cpu: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro Wifi  – RAM: 4 x 16 GB G. Skill Trident Z @ 3200mhz- GPU: ASUS  Strix Geforce GTX 1080ti– Case: Phankteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1 TB PSU: EVGA 1000P2– Display(s): ASUS PB238Q, AOC 4k, Korean 1440p 144hz Monitor - Cooling: NH-U12S, 2 gentle typhoons and 3 noiseblocker eloops – Keyboard: Corsair K95 Platinum RGB Mouse: G502 Rgb & G Pro Wireless– Sound: Logitech z623 & AKG K240


That is technically what 4K resolution actually is. The 4K that we usually talk about is UHD.

That's why I still like to call out the full resolution, or at least use the Y-axis designation from the interlaced vs. progressive days, aka 2160p.

This is LTT. One cannot force "style over substance" values & agenda on people that actually aren't afraid to pop the lid off their electronic devices, which happens to be the most common denominator of this community. Rather than take shots at this community in every post, why not seek out like-minded individuals elsewhere?


there is but one tiny issue with that theory

HBM only improves bandwidth between the GPU and VRAM, not between the system and the video card

did the PCIe change? no! so, in the grand scheme of the system as a whole, HBM is rather irrelevant

 

PCIe was never saturated by any card. Not even PCIe 3.0 using a dual GPU over x8 lanes. The GPU -> VRAM link (the internal bandwidth of the card itself) was always the bottleneck.
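
One way to size the "swapping can be done faster" idea from earlier in the thread: even at nominal peak PCIe rates (a sketch; real DMA transfers are slower than these theoretical figures), refilling the whole 4 GB pool costs many frames:

```python
# Time to refill the Fury X's 4 GB of VRAM over the bus at nominal peak rates
# (a sketch; actual transfer throughput is lower than the theoretical peak).

PCIE3_X16 = 15.75           # GB/s: 8 GT/s * 16 lanes * 128b/130b encoding
PCIE3_X8  = PCIE3_X16 / 2
VRAM_GB   = 4.0             # the Fury X's HBM capacity

for label, bw in (("x16", PCIE3_X16), ("x8", PCIE3_X8)):
    print(f"PCIe 3.0 {label}: {VRAM_GB / bw * 1000:6.1f} ms for a full 4 GB refill")
# ~254 ms vs ~508 ms - many frames either way. What matters in practice is
# the per-frame streaming volume, not a one-shot swap of the whole pool.
```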

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


That's why I still like to call out the full resolution, or at least use the Y-axis designation from the interlaced vs. progressive days, aka 2160p.

 

Yeah, I've always hated talking about resolution because it gets so murky in situations like this. I try to specifically write down the full resolution myself so I don't confuse people.


Cpu: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro Wifi  – RAM: 4 x 16 GB G. Skill Trident Z @ 3200mhz- GPU: ASUS  Strix Geforce GTX 1080ti– Case: Phankteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1 TB PSU: EVGA 1000P2– Display(s): ASUS PB238Q, AOC 4k, Korean 1440p 144hz Monitor - Cooling: NH-U12S, 2 gentle typhoons and 3 noiseblocker eloops – Keyboard: Corsair K95 Platinum RGB Mouse: G502 Rgb & G Pro Wireless– Sound: Logitech z623 & AKG K240


PCIe was never saturated by any card. Not even PCIe 3.0 using a dual GPU over x8 lanes. The GPU -> VRAM link (the internal bandwidth of the card itself) was always the bottleneck.

When you go from x16 to x8 you don't see any drop? Oh yes you do - even when going from PCIe gen 3 to gen 2.
