
AMD Confirms Delay of R9 Fury X2/Gemini to 2016 for Better Alignment with VR Market

HKZeroFive

What do they seem to be leaning towards in terms of the cluster replacement, if you can share? I am sure the testing is still early, but I am very curious.

A combination of KNL-exclusive nodes for scale-up workloads and some Skylake-EP/EX (iGPU vs. FPGA-accelerated not yet decided) with Arctic Islands for scale-out, due to the near 1.3 TFLOPS DP compute advantage in real-world performance. For anyone who saw the Pascal DP article earlier this month, I was correct: Pascal is a 1/3 DP vs. SP performance architecture. It's 12.6 vs. 4.2 TFLOPS for Pascal theoretically, with 11.3 and 3.97 being about the best real-world performance (RWP) achieved under OpenCL 2.0; it's a little better under CUDA. Arctic Islands is 11.8 vs. 5.9 theoretically and comes out to 10.95 and 5.75 in RWP. KNL lives up to its theoretical numbers a bit better and is a better accelerator for AI training, but in raw compute the Xeon Phi version is already beaten handily. The socketed package form has no competitor anywhere on the market.
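The ratios those TFLOPS figures imply can be checked in a few lines. All numbers below are the poster's claims above, not official specs:

```python
# The quoted figures: (SP theoretical, DP theoretical, SP real-world,
# DP real-world), all in TFLOPS. These are in-thread claims, not verified specs.
figures = {
    "Pascal":         (12.6, 4.2, 11.3, 3.97),
    "Arctic Islands": (11.8, 5.9, 10.95, 5.75),
}

for name, (sp_t, dp_t, sp_r, dp_r) in figures.items():
    print(f"{name}: DP is 1/{sp_t / dp_t:.1f} of SP, "
          f"SP efficiency {sp_r / sp_t:.0%}, DP efficiency {dp_r / dp_t:.0%}")
```

This reproduces the claimed 1/3 DP:SP ratio for Pascal and a 1/2 ratio for Arctic Islands, with both hitting roughly 90-97% of theoretical in the quoted "RWP" numbers.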

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


"to 2016 for Better Alignment with VR Market"

 

Any clues on when, and what, this VR market is?

I mean, aside from the routine fucking talk from teams everywhere, how's that VR launch going to roll out? Like, what goggles and games am I expecting to see?


"to 2016 for Better Alignment with VR Market"

Any clues on when, and what, this VR market is?

I mean, aside from the routine fucking talk from teams everywhere, how's that VR launch going to roll out? Like, what goggles and games am I expecting to see?

Mostly the Oculus 2.0, but I think VR won't really take off until 2018; in 2016 it will still be too expensive.



Pascal is not around the corner. More like Q4 2016.

The Fury X2 will still be a beast after Pascal. After all, Pascal isn't some kind of magic GPU architecture that will be 200% faster than Maxwell.

No, it'll be summer, not Q4.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


Well for one, @Prysin for damn sure is happy with his 295x2.

It may not be the most practical card, or even the smartest card to buy, but look at where the Titan X beats the 295x2: it takes two of them, overclocked, to BARELY edge it out, by 0.2 fps, and that's within the margin of error.

Just look. People who would rather have two separate GPUs are going to go that route. But you have people like me who'd rather have one chunk of PCB.

Please ignore @Enderman

he is suffering from confirmation bias in quite some cases. Sure, anyone can find a single benchmark that proves their point. Finding more benchmarks that prove your point than disprove it, however, is more difficult.

 

people in general dont buy these cards.

 

BECAUSE THEY COST A FUCKING METRIC TON...

 

but you see, @Enderman

if you had seen an R9 295x2 for 575 USD, would you pass it up if you had THE BEST PSU ON THE MARKET AND A CASE THAT SUPPORTS IT?

 

I get Titan X/980 Ti performance for 980 price.

Even when CF doesn't work, I have set GPU1 to be overclocked, so it will always perform better than a stock R9 290X. Always.

 

Titan Z = 3000 USD at launch

R9 295x2 = 1500 USD at launch

Titan X = 999 USD at launch

980Ti/Fury X = 649 USD at launch

 

Most bought cards

R9 390/GTX 970 = 329 USD at launch

R9 380/GTX 960 = 229 USD at launch

 

 

The market for "Enthusiast" level GPUs, such as the 980 Ti and up, is just that: for enthusiasts. You do not buy a 980 Ti because you NEED it; you buy it because you CAN.

An R9 390/GTX 970 can power most games at high FPS with the right settings. You do not NEED a 980 Ti/Fury X to play a game. You WANT a 980 Ti/Fury X to play a game at ULTRA SETTINGS.


So... because it produces heat, there's no reason why you should own it?

You should tell that to JayzTwoCents with his SLI Titan X build (Skunkworks). It's watercooled, and spits all the CPU and GPU heat straight into his room. Guess he shouldn't do that setup, because "people don't want all that heat in their room."

 

1) You need to learn to read without making assumptions.

I never said people should not buy it because it produces heat... wtf... where did you get that from?

 

2) If you read with your eyes instead of your feet, you will notice that I'm talking about the REASONS why MOST PEOPLE buy other graphics cards.

And yes, it's a fact that the 295x2 is a niche product and does not sell very well.

 

3) Heat was ONE out of SEVERAL points I mentioned:

not because it produces heat, but because it produces UNNECESSARY heat.

 

4) Titan X SLI performs better and produces less heat, and since J2C is not limited to one GPU he went with the better option.

Make sense?

 

5) Why have twice as much heat as other options?

 

6) Why am I even arguing with an AMD fanboy? Of course someone like you won't understand why most people don't buy dual-GPU cards. I'll just stop arguing with a brick wall now.

 

Dual-GPU cards typically don't sell well for a variety of reasons, chief among them heat and power.

Granted, the 295x2 might be one of the coolest-running dual-GPU cards, but eh.

Mom, Dad, Grandpa, Please stop fighting...

But on a serious note, Enderman, you still haven't made your point clear to me. What are you arguing, exactly? That the 295x2 is a bad card because of the heat? That dual GPUs in general are bad? And on another note, isn't Pascal only supposed to majorly improve compute performance? Or am I thinking about this wrong?


You've derailed this thread so far that even YOU don't remember what the topic was.

LOL

 

1) Titan X = 250 W TDP

290X = 290 W

Titan X SLI = less heat + more performance than the 295x2

 

 

2) the ENTIRE POINT of this is that dual-GPU cards sell very little because of their niche market, and delaying the Fury X2 is hurting sales, because most people will be looking forward to the next AMD/NVIDIA GPU generation rather than "ooh, most powerful graphics card in the world, for $1500, for the next year, hurr durr"

 

So by waiting an extra 6 months for "better alignment with VR," they are going to sell even fewer than they already would by releasing it now.

Or maybe they're just using that as an excuse instead of saying that it's not ready yet.

You aren't actually showing yourself to be much more than just a fanboy yourself, you know that, right?


Hmm, I really thought they'd release it by the end of the year. Though I wonder how they'll price it, given their new lineup is not so far off and all.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


You aren't actually showing yourself to be much more than just a fanboy yourself, you know that, right?

He doesn't know the math and he doesn't know his facts.

 

http://anandtech.com/show/7930/the-amd-radeon-r9-295x2-review

The R9 290X is supposed to be 250 W TDP, albeit real-world power draw IS higher (however, the electricity going in is NOT equivalent to the HEAT going out).

The R9 295x2 is 500 W TDP because its GPUs are cherry-picked to have lower voltage than normal 290Xs...

 

It's sort of like the R9 Nano: much lower power usage due to cherry-picked GPUs.
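The TDP arithmetic being argued over in this thread is easy to lay side by side. All wattages below are the figures quoted by the two posters, not independently verified, and TDP is not the same as measured power draw:

```python
# Figures as quoted in-thread.
titan_x_tdp = 250       # W, Titan X
r9_290x_quoted = 290    # W, the number used earlier in the thread
r9_290x_spec = 250      # W, the spec figure cited with the AnandTech link
r9_295x2_tdp = 500      # W, board TDP of the binned dual-Hawaii card

print("Titan X SLI:        ", 2 * titan_x_tdp, "W")
print("2x 290X (quoted):   ", 2 * r9_290x_quoted, "W")
print("2x 290X (spec):     ", 2 * r9_290x_spec, "W")
print("R9 295x2 board TDP: ", r9_295x2_tdp, "W")
# On the spec numbers, Titan X SLI and the 295x2 share the same 500 W
# thermal budget; the "less heat" claim only holds against the 290 W figure.
```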


Pascal is Q2 2016. The HPC clients are already being sampled. Miami already has KNL, Arctic Islands, and Pascal samples, since we're looking at replacing our entire cluster.

<s>

And actually, Pascal is 1000% the performance of Maxwell. CEO Jen-Hsun Huang already said 10x performance gains.

</s>

1000% of barely anything = "finally decent"


Please ignore @Enderman

he is suffering from confirmation bias in quite some cases. Sure, anyone can find a single benchmark that proves their point. Finding more benchmarks that prove your point than disprove it, however, is more difficult.

I have never seen a 295x2 for $600...

I have seen them for $800+

and I wouldn't buy one used

 

also, they are just two 290Xs in Crossfire

so a lot of stuff that doesn't perform well in SLI/Crossfire will perform horribly or not at all, because IIRC you can't just disable Crossfire on a dual-GPU card like you can on two separate GPUs

 

and I use ShadowPlay, so personally I wouldn't buy a last-gen GPU just because it gets a few fps higher performance

maybe you didn't know this, but not everyone is an "fps per dollar" gaming fanatic

 

either way, my point is that dual-GPU cards are a niche market, few people buy them, and even fewer will buy the Fury X2 if it's released so late

 

Mom, Dad, Grandpa, Please stop fighting...

But on a serious note, Enderman, you still haven't made your point clear to me. What are you arguing exactly? That the 295x2 is a bad card because of the heat? That dual gpu's in general are bad? And on another note, isn't pascal only supposed to improve compute performance majorly? Or am I thinking about this wrong?

Did you not read the OP???

AMD is hurting sales by delaying the Fury X2 because of "better alignment with VR."

Like, what kind of stupid excuse is that?

I think they're just using that as an excuse instead of saying "it's not ready" and lowering their stock value even more.

 

Not that many people buy dual-GPU cards in the first place; by releasing it closer to the next generations, they are losing that already-small niche customer base.

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project


Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


I'm thinking AMD should maybe have scrapped the 2x Fury card, or just released it when it's ready. By the time this launches (nearly a year after the Fiji launch), Arctic Islands and Pascal won't be too far behind. I don't see how waiting to launch it alongside VR products will help it sell more. Not a good marketing strategy, IMO.

My Systems:

Main - Work + Gaming:


Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:


FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:


SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:


MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:


Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


Honestly I think the issue is two fold:

 

  1. AMD has scrapped TSMC for Samsung/GloFo for Arctic Islands, apparently due to poor yields and quality from TSMC. This might also mean there aren't enough chips to actually make this product in high enough numbers.
  2. They might be postponing this product to await the court's judgment in the Asetek vs. Cooler Master case. Releasing a product just to see it pulled again is a bad idea.

Either way, it seems like Arctic Islands cards won't be made until Q2, so expect a Q3 2016 release. I highly doubt NVIDIA will be out sooner with Pascal. Maybe their Quadro cards will see an earlier launch, but that is irrelevant for gaming or for anyone wanting to spend less than $5,000 on a card.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I have never seen a 295x2 for $600...

I have seen them for $800+

and I wouldn't buy one used

 

also, they are just two 290Xs in Crossfire

so a lot of stuff that doesn't perform well in SLI/Crossfire will perform horribly or not at all, because IIRC you can't just disable Crossfire on a dual-GPU card like you can on two separate GPUs

 

and I use ShadowPlay, so personally I wouldn't buy a last-gen GPU just because it gets a few fps higher performance

maybe you didn't know this, but not everyone is an "fps per dollar" gaming fanatic

 

either way, my point is that dual-GPU cards are a niche market, few people buy them, and even fewer will buy the Fury X2 if it's released so late

 

Did you not read the OP???

AMD is hurting sales by delaying the Fury X2 because of "better alignment with VR."

Like, what kind of stupid excuse is that?

I think they're just using that as an excuse instead of saying "it's not ready" and lowering their stock value even more.

 

Not that many people buy dual-GPU cards in the first place; by releasing it closer to the next generations, they are losing that already-small niche customer base.

 

Again, mate... I own this card. I've been using AMD for a while...

And I can happily inform you that on most of your points,

 

YOU ARE WRONG!!

 

Well, either you do not live in Norway or you haven't looked for the best sales.

Mine was brand new out of the box. The box still had its original, untampered seals.

 

They are two slightly overclocked 290Xs.

The reference 290X is a 975 MHz core clock with a 1 GHz boost. The 295x2 runs at a 1018 MHz "core" clock... it doesn't really "boost"; it runs anywhere between the 300 MHz idle clock and the 1018 MHz max.

CF can be disabled on dual-GPU cards in the Crimson drivers, game by game if you want. It was a bit more cumbersome to disable pre-Crimson, BUT IT WAS POSSIBLE.

 

ShadowPlay is a fair argument. I have not used it myself, but every video/review I've seen of it indicates that it is simply better than what AMD has, especially on the streaming side.

 

 

In terms of SLI/CF issues:

My HD 7950s, using the CF bridge, had stuttering. Not a whole lot, but they had it. My 295x2, which uses a PLX chip as interconnect, still has some microstutters, but much unlike the HD 7950s, my current issues are game by game, not across the board.

 

Some games will have more microstutter than others. It seems this is mostly tied to the post-processing effects used in some titles.

I recently found out, by modding Skyrim, that different implementations of Depth of Field and adaptive lighting can cause flicker/stutter on AMD (these are known issues), whilst other similar effects that just have a few minor tweaks, mostly timing-wise, will NOT cause stutters or flickers.

 

This indicates to me that SOME, but by far not all, stuttering issues with GCN 1.1 and GCN 1.2 cards (these cards use the PCIe bus for CF rather than a bridge interconnect) are mostly caused by timing/execution issues in the game code rather than by drivers.

 

I say SOME because far from all games have this issue, and saying AMD drivers are fine and without any fault would be a blatant lie.
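Microstutter of the kind described here is usually quantified from a frametime log rather than from average FPS. A minimal sketch (the frametime values are made up for illustration, not measured):

```python
# Toy frametime log in milliseconds; every fourth frame is a "stutter" frame.
frametimes = [16.7, 16.9, 16.5, 33.1, 16.8, 16.6, 17.0, 32.8, 16.7, 16.9]

avg_fps = 1000 * len(frametimes) / sum(frametimes)   # mean frame rate
worst_ms = max(frametimes)                           # longest single frame

print(f"average: {avg_fps:.0f} fps, worst frame: {worst_ms:.1f} ms")
# A healthy-looking average alongside isolated long frames is exactly
# what CF/SLI microstutter looks like in a frametime trace.
```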


"to 2016 for Better Alignment with VR Market"

 

Any clues on when, and what, this VR market is?

I mean, aside from the routine fucking talk from teams everywhere, how's that VR launch going to roll out? Like, what goggles and games am I expecting to see?

It's clear Lisa Su can't run the company for shit. She is helping AMD go bankrupt. "Better alignment with VR"...?? The fuck??

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


They should just scrap it at this point.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


 

it's just two Fury Xs

 

Don't its specs match something like two Fury Nanos/normal Furys?

It'll be extremely highly binned Fury Xs, since its rumored TDP is 375 W and the Fury Nano TDP is less than half that (yes, I know: rumors, WCCFTech, and TDP != power consumption).

This said, the Gemini GPU will have to be either Nanos, or use a 140 mm+ radiator, since the Fury X runs at ~60°C and two of them would probably be too much for that small a radiator.

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD



Don't its specs match something like two Fury Nanos/normal Furys?

It'll be extremely highly binned Fury Xs, since its rumored TDP is 375 W and the Fury Nano TDP is less than half that (yes, I know: rumors, WCCFTech, and TDP != power consumption).

This said, the Gemini GPU will have to be either Nanos, or use a 140 mm+ radiator, since the Fury X runs at ~60°C and two of them would probably be too much for that small a radiator.

yes, this was mentioned already later on in the discussion



Doesn't matter what anyone says. It's releasing way too late. They may sell some units but they won't sell a lot at all.


Doesn't matter what anyone says. It's releasing way too late. They may sell some units but they won't sell a lot at all.

This.

By the time it's released, NVIDIA's lineup will be on the horizon and Arctic Islands won't be too far off.


I wonder if this has something to do with AMD not having good enough CPUs to pair with the powerful dual GPU and promote it.

FX-8350 @ 4.4 GHz / Sapphire R9 Fury / 2x 8GB Kingston DDR3 @ 2030 MHz


This said, the Gemini GPU will have to be either Nanos, or use a 140 mm+ radiator, since the Fury X runs at ~60°C and two of them would probably be too much for that small a radiator.

Well, as a rad heats up, the delta with the outside air increases, making the rad more efficient.

It might be that while a Fury X hits 60°C, two won't hit 120°C.

The 295x2 runs under 75°C, remember, and that's 500 W.
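The point about the delta can be sketched with a simple steady-state model: coolant temperature ≈ ambient + P/(U·A), where U·A is the radiator's overall heat-transfer capability. All numbers below are illustrative assumptions, not measurements:

```python
ambient = 25.0          # deg C, assumed room temperature
single_power = 275.0    # W, assumed heat load of one Fiji GPU
single_temp = 60.0      # deg C, the Fury X figure quoted above

# Back out the stock radiator's capability from the single-GPU case.
UA = single_power / (single_temp - ambient)   # W per deg C

# Two GPUs on a radiator with, say, 1.6x the capability:
dual_temp = ambient + 2 * single_power / (1.6 * UA)
print(f"{dual_temp:.1f} C")   # nowhere near double the single-GPU temperature
```

Because only the delta above ambient scales with power, doubling the heat load gets nowhere near doubling the absolute temperature, which is the poster's point.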

Intel i5-3570K/ Gigabyte GTX 1080/ Asus PA248Q/ Sony MDR-7506/MSI Z77A-G45/ NHD-14/Samsung 840 EVO 256GB+ Seagate Barracuda 3TB/ 16GB HyperX Blue 1600MHZ/  750w PSU/ Corsiar Carbide 500R

 


I wonder if this has something to do with AMD not having good enough CPUs to pair with the powerful dual GPU and promote it.

 

I don't think so. This is supposed to be the fastest video card ever, and there wouldn't be any sense in trying to promote it with a new CPU. I'm just wondering why they're delaying it, because I don't buy the VR market alignment excuse.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


Pascal is not around the corner. More like Q4 2016.

The Fury X2 will still be a beast after Pascal. After all, Pascal isn't some kind of magic GPU architecture that will be 200% faster than Maxwell.

It will be 200% faster; it has to be. After all, 17 billion transistors vs. 8 billion in the 980 Ti, with HBM2 on top of that and other architecture improvements on 16 nm, it must be a huge leap, or else it will be a fail.

We need it for 4K and VR, so they had better not hold back on any performance.

Coming from old, outdated 28 nm stuff to 16nmFF, I have very high expectations.
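The transistor-count reasoning can be made explicit, with the caveat that transistor budget rarely translates linearly into frame rates (figures as quoted: 17 billion vs. 8 billion):

```python
gm200_transistors = 8e9    # 980 Ti / Titan X die, as quoted above
gp100_transistors = 17e9   # big Pascal, as quoted above

ratio = gp100_transistors / gm200_transistors
print(f"{ratio:.3f}x the transistor budget")
# ~2.1x transistors is only an upper bound on naive scaling: clocks, memory
# bandwidth (HBM2), and how many transistors go to FP64/interconnect rather
# than graphics decide how much of it reaches games.
```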

