
AMD Radeon Fury X 3DMark performance

BonSie

Not a flame or troll question. Just seeking an honest answer: why didn't Nvidia use HBM on the Titan X & 980 Ti? Did Nvidia not see enough of a performance increase from it? Are they letting AMD practically "beta" test it before HBM2 becomes standard?

Like I said, I don't know. I'm sure Nvidia has the technology to implement HBM1, maybe specifically for their highest-end cards. Just wondering why they chose to forgo HBM1 and wait for Pascal.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Could it be... the ghost of ATI showing itself?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Not a flame or troll question. Just seeking an honest answer: why didn't Nvidia use HBM on the Titan X & 980 Ti? Did Nvidia not see enough of a performance increase from it? Are they letting AMD practically "beta" test it before HBM2 becomes standard?

Like I said, I don't know. I'm sure Nvidia has the technology to implement HBM1, maybe specifically for their highest-end cards. Just wondering why they chose to forgo HBM1 and wait for Pascal.

Nvidia saw the trend of games using more VRAM and decided to capitalize on it, not to mention that the cost of HBM and interposers would eat into their profit margins as they currently stand.

 

Also, planned obsolescence. Do the minimum to ensure your new products will sell and compete well, without doing so much that it becomes difficult to differentiate your next product line. Monopolistic competition 101.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Not a flame or troll question. Just seeking an honest answer: why didn't Nvidia use HBM on the Titan X & 980 Ti? Did Nvidia not see enough of a performance increase from it? Are they letting AMD practically "beta" test it before HBM2 becomes standard?

Like I said, I don't know. I'm sure Nvidia has the technology to implement HBM1, maybe specifically for their highest-end cards. Just wondering why they chose to forgo HBM1 and wait for Pascal.

I think the problem was that if Nvidia used it they'd only be able to fit 4GB onto the GPU. It's better for them to wait until HBM2 comes out so they can use 8GB.
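
Rough back-of-the-envelope math on that 4GB ceiling, using the commonly quoted per-stack figures for HBM1 and HBM2 (approximate numbers, just for illustration):

# Napkin math on HBM capacity/bandwidth, using approximate public per-stack specs
def stack_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # bandwidth per stack = bus width (bits) * data rate (Gb/s per pin) / 8 bits per byte
    return bus_width_bits * data_rate_gbps / 8

# HBM1 (as on Fury X): 4 stacks, each 1 GB with a 1024-bit bus at roughly 1 Gb/s per pin
hbm1_capacity_gb = 4 * 1                                # 4 GB total, the ceiling in question
hbm1_bandwidth = 4 * stack_bandwidth_gb_s(1024, 1.0)    # 512 GB/s

# HBM2 spec: up to 8 GB per stack at roughly 2 Gb/s per pin,
# so the same 4 stacks could give 32 GB and about 1 TB/s
hbm2_capacity_gb = 4 * 8
hbm2_bandwidth = 4 * stack_bandwidth_gb_s(1024, 2.0)

print(f"HBM1: {hbm1_capacity_gb} GB at {hbm1_bandwidth:.0f} GB/s")
print(f"HBM2: {hbm2_capacity_gb} GB at {hbm2_bandwidth:.0f} GB/s")

So with first-generation HBM there was simply no way to put 6GB or 12GB on a flagship card, which is presumably a big part of why Nvidia sat it out.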

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Not a flame or troll question. Just seeking an honest answer: why didn't Nvidia use HBM on the Titan X & 980 Ti? Did Nvidia not see enough of a performance increase from it? Are they letting AMD practically "beta" test it before HBM2 becomes standard?

Like I said, I don't know. I'm sure Nvidia has the technology to implement HBM1, maybe specifically for their highest-end cards. Just wondering why they chose to forgo HBM1 and wait for Pascal.

 

Because http://electroiq.com/blog/2013/12/amd-and-hynix-announce-joint-development-of-hbm-memory-stacks/

 

Hynix is likely licensing out HBM 2.0 to whoever will pay for it (i.e. Nvidia), but HBM 1.0 was probably limited to AMD's use only, as per whatever agreement AMD had with Hynix to co-develop HBM. Nvidia had no involvement in the development of HBM, and as such I doubt they ever had a choice to implement HBM 1.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


I hope this $999 AMD price point rumour is not true.

 

I'm not too fussed on benchmarks just yet; it's the price that's at stake here.

 

I might pay £550ish for a new card if it's top of the range (stating the obvious).

 

I will not pay £999 (insulting conversion to sterling) for any card! 

 

It is ridiculous whether the price is considered reasonable or not; to be viable at that price point it would need to make a brew and put the washer on spin as well, i.e. it is not reasonable.

 

I am a self made millionaire and a man who knows what he wants, only one of these is true.


Because http://electroiq.com/blog/2013/12/amd-and-hynix-announce-joint-development-of-hbm-memory-stacks/

 

Hynix is likely licensing out HBM 2.0 to whoever will pay for it (i.e. Nvidia), but HBM 1.0 was probably limited to AMD's use only, as per whatever agreement AMD had with Hynix to co-develop HBM. Nvidia had no involvement in the development of HBM, and as such I doubt they ever had a choice to implement HBM 1.

To be fair, that is only speculation.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Thank you guys for the responses. :) I just wasn't sure, but your responses make sense.

You learn a lot every day in here. :)

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Thank you guys for the responses. :) I just wasn't sure, but your responses make sense.

You learn a lot every day in here. :)

Another thing to keep in mind: buy based on performance and overall reviews. That way you won't get stuck fanboying for a company such as AMD when they're making bad decisions. shots fired

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Ayy a 980Ti that is bound to use more power and be louder than a 980Ti, what a steal.


Ayy a 980Ti that is bound to use more power and be louder than a 980Ti, what a steal.

wat

 

OT: I literally made this exact prediction on the performance of the Fury... I'm a hacker

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD

Spoiler

sex hahaha


Another thing to keep in mind: buy based on performance and overall reviews. That way you won't get stuck fanboying for a company such as AMD when they're making bad decisions. shots fired

These next couple of years will determine the course of semiconductor history. If by some miracle AMD makes Zen so obnoxiously competitive that they start beating Intel in the HPC space for more than a single generation, maybe we can keep all three companies afloat. If AMD goes down in the short or medium term and the IP is split between Nvidia and Intel, with Nvidia getting x86_64 (forcing Intel into a cross-license) and Intel getting the ATI IP, we get strong competition back on both the CPU and GPU fronts: Intel's cheaper foundries and expansive research resources against Nvidia's wealth of GPU experience, and Intel's experience, resources, and cheaper foundries against Nvidia's innovation in CPU design applied to x86.

 

If AMD holds on too long and Intel unseats Nvidia from HPC, then Nvidia will be bought out or merged, and then not only will AMD go up in smoke, but so will IBM, Qualcomm, and VIA. The long-term course will be decided in the next 5 years.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Not a flame or troll question. Just seeking an honest answer: why didn't Nvidia use HBM on the Titan X & 980 Ti? Did Nvidia not see enough of a performance increase from it? Are they letting AMD practically "beta" test it before HBM2 becomes standard?

Like I said, I don't know. I'm sure Nvidia has the technology to implement HBM1, maybe specifically for their highest-end cards. Just wondering why they chose to forgo HBM1 and wait for Pascal.

I don't think there was much incentive for nVidia to compete on HBM1. They already hold the vast majority of dGPU market share and thus don't really need to be at the forefront of innovation to drive sales. Plus there's the risk of early adoption with new tech and the innate issues that can potentially come with it. They can allow AMD to "beta" test the technology (as you pointed out) and, should it perform well, they won't lose too much in terms of revenue, but they can improve on whatever shortcomings it may have in its current iteration ready for Pascal.

 

nVidia maintains a lot of its success through aggressive marketing, and refining existing technology. AMD are almost forced to rely on early adoption of new technology as a selling point. I just hope that AMD can claw back some market share so both companies can start pushing innovation.


These next couple of years will determine the course of semiconductor history. If by some miracle AMD makes Zen so obnoxiously competitive that they start beating Intel in the HPC space for more than a single generation, maybe we can keep all three companies afloat. If AMD goes down in the short or medium term and the IP is split between Nvidia and Intel, with Nvidia getting x86_64 (forcing Intel into a cross-license) and Intel getting the ATI IP, we get strong competition back on both the CPU and GPU fronts: Intel's cheaper foundries and expansive research resources against Nvidia's wealth of GPU experience, and Intel's experience, resources, and cheaper foundries against Nvidia's innovation in CPU design applied to x86.

 

If AMD holds on too long and Intel unseats Nvidia from HPC, then Nvidia will be bought out or merged, and then not only will AMD go up in smoke, but so will IBM, Qualcomm, and VIA. The long-term course will be decided in the next 5 years.

I'll say this straight up: if Intel ever gets hold of the ATI IP, Nvidia is really, really going to have some tough times competing. Just look at how far Intel's iGPUs have come, and they started from scratch.

Edit: I'll put it into perspective: the iGPU in my i5 4440 is far better than my GeForce 6200.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I don't think there was much incentive for nVidia to compete on HBM1. They already hold the vast majority of dGPU market share and thus don't really need to be at the forefront of innovation to drive sales. Plus there's the risk of early adoption with new tech and the innate issues that can potentially come with it. They can allow AMD to "beta" test the technology (as you pointed out) and, should it perform well, they won't lose too much in terms of revenue, but they can improve on whatever shortcomings it may have in its current iteration ready for Pascal.

 

nVidia maintains a lot of its success through aggressive marketing, and refining existing technology. AMD are almost forced to rely on early adoption of new technology as a selling point. I just hope that AMD can claw back some market share so both companies can start pushing innovation.

What's odd to me is that Nvidia hasn't jumped on Hynix's 8GHz memory yet. I know internal bandwidth isn't an issue, but still... I guess Hynix developed it for nothing unless Matrox or the AIBs do something with the 980 Ti to make use of 8GHz VRAM.
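
For context, napkin math on what 8GHz-effective GDDR5 would actually buy the 980 Ti versus stock clocks and versus Fury X's HBM (my numbers from the usual bandwidth formula, nothing official):

# Memory bandwidth = bus width (bits) * effective data rate (Gb/s per pin) / 8 bits per byte
def gddr5_bandwidth_gb_s(bus_width_bits, effective_gbps):
    return bus_width_bits * effective_gbps / 8

stock_980ti = gddr5_bandwidth_gb_s(384, 7.0)   # 336 GB/s with the stock 7GHz-effective GDDR5
hynix_8ghz = gddr5_bandwidth_gb_s(384, 8.0)    # 384 GB/s if an AIB fitted the 8GHz chips
fury_x_hbm1 = 4 * 1024 * 1.0 / 8               # 512 GB/s from four 1024-bit HBM1 stacks

print(stock_980ti, hynix_8ghz, fury_x_hbm1)    # 336.0 384.0 512.0

That's only about a 14% bump in raw bandwidth, which fits with the point that bandwidth isn't really what's holding the 980 Ti back.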

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


wat

 

OT: I literally made this exact prediction on the performance of the Fury... I'm a hacker

 

I wonder if he meant it as a comparison, i.e. AMD's "980 Ti" competitor. Or something.

 

Maybe.


What's odd to me is that Nvidia hasn't jumped on Hynix's 8GHz memory yet. I know internal bandwidth isn't an issue, but still... I guess Hynix developed it for nothing unless Matrox or the AIBs do something with the 980 Ti to make use of 8GHz VRAM.

 

That is interesting. Many were speculating Nvidia would use it for GM200. http://www.kitguru.net/components/graphic-cards/anton-shilov/sk-hynix-begins-to-mass-produce-8ghz-gddr5-memory/

 

Maybe cost was a factor, and also power consumption.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Its success depends on how much it costs. If they can undercut Nvidia by a lot, then maybe that means lower prices from Nvidia in the long run too (don't get me wrong, I like Nvidia and even bought a 980 at release, but damn, they are expensive).

My PC: MSI X99s SLI PLUS///Intel i7 5820k///Corsair H100i///Crucial DDR4 12GB///EVGA Supernova Gold 750w G2///ASUS GTX 1080 Strix///Phanteks Enthoo Luxe///Intel 730 240GB SSD///WD Blue 1TB///Intel 6250 WiFi
Current Peripherals: Sennheiser HD598///Corsair K70 LUX///Logitech MX Master///Razer Destructor 2///Saitek X52///Acer X34 Predator

 


Its success depends on how much it costs. If they can undercut Nvidia by a lot, then maybe that means lower prices from Nvidia in the long run too (don't get me wrong, I like Nvidia and even bought a 980 at release, but damn, they are expensive).

 

As unfortunate as it is for the consumer, AMD actually needs to make some profits now.

They've been posting in the red for far too long.

 

lol...posting in the red...

hue...amd...red...posting..

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


Did you mean just FX, or AMD in general? If it was AMD in general, then I would point out that the A88X chipset for the FM2+ socket is PCIe 3.0.

I might be wrong... isn't that limited to APUs?

 

A88X on the FM2+ platform will support PCIe Gen 3 slots, but only when paired with a Kaveri processor.

^ http://www.gamersnexus.net/guides/1442-amd-apu-chipset-comparison-a88-a85-a78

I'll say this straight up: if Intel ever gets hold of the ATI IP, Nvidia is really, really going to have some tough times competing. Just look at how far Intel's iGPUs have come, and they started from scratch.

Intel will require a couple of years to get back to doing dGPUs even if it gets hold of all of ATI's IP along with Raja Koduri and Mark Papermaster. The Xeon Phi accelerators are fundamentally different, but have the benefit of running native x86_64 code. While I'm sure Intel could use the same memory controllers between the Xeon Phi and potential dGPUs, there'd be a fair bit of adjusting to do over a couple of years. Nvidia's main concern right now is HPC. With Knights Landing, Intel can potentially jump up to 96 TFlops per 3U or 4U rack versus the 28 they have between the flagship Haswell E7 Xeon and up to 8 flagship Knights Corner Xeon Phi, which honestly is less dense than IBM's Power 8 systems with Tesla K40 by about 30%.

 

If Intel suddenly takes the density crown away from IBM and Oracle SPARC, the biggest users of Tesla accelerators will suddenly be unable to compete, and Intel will only push the Xeon Phi harder, because with a CUDA license and the ability to run CUDA code, Intel could swipe up all of Nvidia's customers for literally pennies per chip sold vs. the $4,000 for each Tesla K40. Intel is a vicious, cold, calculating business. Every time a claim has been made that it would never compete at X, it has been proven wrong with time. They knocked IBM, Texas Instruments, and MOS Technologies out of the server world once, needing only 15 years to go from being a no-name to being king. If Nvidia thinks it can ride it out against Intel long-term on its position alone, Nvidia has forgotten exactly what IBM lost to in the first place.

 

From servers, to home PCs, to microcontrollers, to routers, to phones, and now to accelerators and GPUs, Intel has pushed and pushed, and industry experts have scoffed again and again at its chances, only for Intel to prove indomitable given enough time.

 

ARM thought Intel couldn't produce a sub-5-watt x86_64 SoC with remotely competitive performance at 22nm FinFET vs. 28nm HDL; well, that idea was proven wrong too. Sure, Intel isn't the leader, but how long will it take? 5 more years?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That performance difference is purely down to latency, not bandwidth. The bus isn't the bottleneck. Sure, it may be a slight detriment, but it's not the main issue, and it's way down the list of things holding back gaming performance. Now, if you go to a compute workload and toss a Tesla K40 on PCIe 2.0 vs 3.0, the difference is obscene.
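
Quick numbers on the bus itself, per direction on a x16 slot (standard PCIe line rates and encoding, napkin math):

# Per-direction PCIe x16 bandwidth = transfer rate (GT/s) * encoding efficiency * lanes / 8 bits per byte
def pcie_x16_gb_s(gt_per_s, payload_bits, total_bits, lanes=16):
    return gt_per_s * (payload_bits / total_bits) * lanes / 8

pcie2 = pcie_x16_gb_s(5.0, 8, 10)      # 8b/10b encoding, about 8 GB/s
pcie3 = pcie_x16_gb_s(8.0, 128, 130)   # 128b/130b encoding, about 15.75 GB/s

# A compute job streaming, say, 4 GB per batch spends roughly 0.5 s on the copy over
# PCIe 2.0 versus roughly 0.25 s over PCIe 3.0, while a game mostly keeps its working
# set resident in VRAM and barely notices the bus.
print(f"PCIe 2.0 x16: {pcie2:.2f} GB/s, PCIe 3.0 x16: {pcie3:.2f} GB/s")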

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That doesn't exactly say anything, just "Generic VGA"; it could be anything.

It's just nonsense.

 

Like I said, the card has not been officially released, nor even presented, so it's not very likely that somebody already has one.

Except the people at Sapphire, and those are under strict rules not to leak anything.

You know more than one developer was already posting pictures of their units on Twitter... and that was 1-2 weeks ago.

