
NVIDIA Could Capitalize on AMD GCN Not Supporting Direct3D 12_1

BiG StroOnZ

I'm just glad that stupid chart has been properly and publicly debunked. It should be obvious that AMD rebrands don't somehow support all these features, especially when it seems like Nvidia is launching brand-new cards every couple of months.

 

I don't know why people posted it in the first place; the sheer fact that it lists GCN1 as fully DX12 compatible should have raised alarm bells regarding its accuracy. The only people who posted it were either trolling or don't know what they are talking about. I don't mean this to sound insulting, but there are really no other conclusions to draw.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I'm assuming that this is as meaningful as the way AMD capitalized on GCN cards supporting DX 11.2 while Nvidia's Kepler cards only supported DX 11.1?

As in, not really at all?

i7 not perfectly stable at 4.4.. #firstworldproblems


Wait, so are the 3xx-series considered the same GCN iteration that won't support 12.1?

 

That depends on what the 3xx series is.  Rebrands won't, new cards probably will.

 

These features are incredibly minor, though. The only game that used DX11.1 was Battlefield 4; most used DX11.0.

 

Gameworks bullshit is gameworks bullshit though.

4K // R5 3600 // RTX2080Ti


That depends on what the 3xx series is.  Rebrands won't, new cards probably will.

 

These features are incredibly minor, though. The only game that used DX11.1 was Battlefield 4; most used DX11.0.

 

Gameworks bullshit is gameworks bullshit though.

 

Gameworks is far from bullshit.


 

I don't know the answer myself, but can you tell me what % of the total GPU market share the AIB market holds? Talk about a fraction of a fraction.

 

What's your point? NVIDIA has shipped more discrete graphics cards, plain and simple. The dGPU market being smaller than it once was shouldn't take away from that fact; if anything, it only adds to the fact that AMD has even less of an impact on that specific market. So what if most people use on-die graphics, an iGPU, or an APU? That's not the argument you originally tried to make, which was about who has sold more discrete graphics cards... stop backtracking.


I don't know the answer myself, but can you tell me what % of the total GPU market share the AIB market holds? Talk about a fraction of a fraction.

 

It's somewhere between 15% and 25%. I seem to recall the figure 17%, but don't quote me; I couldn't be arsed to google it right now.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


What's your point? NVIDIA has shipped more discrete graphics cards, plain and simple. The dGPU market being smaller than it once was shouldn't take away from that fact; if anything, it only adds to the fact that AMD has even less of an impact on that specific market. So what if most people use on-die graphics, an iGPU, or an APU? That's not the argument you originally tried to make, which was about who has sold more discrete graphics cards... stop backtracking.

Explaining my point better: you claim devs will use X features because Nvidia has a majority market share in GPUs. I point out that Nvidia doesn't have a majority of total GPU sales (I didn't specify what type of GPU), disputing your original claim. I didn't even mention the fact that the cards supporting the feature you claim they would exploit are a fraction of Nvidia's total shipped dGPUs, which is itself a fraction of total GPUs.

Explaining my point better: you claim devs will use X features because Nvidia has a majority market share in GPUs. I point out that Nvidia doesn't have a majority of total GPU sales (I didn't specify what type of GPU), disputing your original claim. I didn't even mention the fact that the cards supporting the feature you claim they would exploit are a fraction of Nvidia's total shipped dGPUs, which is itself a fraction of total GPUs.

Do developers make games for iGPUs or dGPUs? Especially now that iGPUs are gaining the same level of support that dGPUs have. I don't know what your beef is.

Nvidia has the lion's share of the GPU sales that matter. Developers will focus on the majority, not the minority. You can't have that argument both ways.

Gameworks is far from bullshit.

You're playing with fire.

Never mind that Gameworks exists because developers want more without doing the work, and AMD can't afford to match what Gameworks offers.

AMD runs a nice charity but it's long overdue that they start running their business as a business.


You're playing with fire.

Never mind that Gameworks exists because developers want more without doing the work, and AMD can't afford to match what Gameworks offers.

AMD runs a nice charity but it's long overdue that they start running their business as a business.

 

I like to believe it's more that consumers are stupidly impatient, and that's what causes it.

 

GTA 5's final release date, case in point.

 

I'm not saying this to be arrogant or anything, but I wish everyone had my kind of patience for video games. You get a way better product by doing so.


AMD has been in the dark for over 2 years now; I truly believe they have to be working on something that can match Nvidia right now. I mean, that's a lot of R&D days and no products.

 


AMD 5000 Series Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce RTX 3080 Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


Explaining my point better: you claim devs will use X features because Nvidia has a majority market share in GPUs. I point out that Nvidia doesn't have a majority of total GPU sales (I didn't specify what type of GPU), disputing your original claim. I didn't even mention the fact that the cards supporting the feature you claim they would exploit are a fraction of Nvidia's total shipped dGPUs, which is itself a fraction of total GPUs.

 

Unfortunately, for your argument to be relevant you have to understand the market. The majority of Intel iGPU solutions (and thus the majority of the market) are currently in corporate/education machines or domestic email-and-web-browsing machines, which were never intended for and will never be used for games. The iGPU market share is almost completely disconnected from what drives gaming software, and almost completely disconnected from Gameworks or any game-related middleware.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Do developers make games for iGPUs or dGPUs? Especially now that iGPUs are gaining the same level of support that dGPUs have. I don't know what your beef is.

Nvidia has the lion's share of the GPU sales that matter. Developers will focus on the majority, not the minority. You can't have that argument both ways.

You're playing with fire.

Never mind that Gameworks exists because developers want more without doing the work, and AMD can't afford to match what Gameworks offers.

AMD runs a nice charity but it's long overdue that they start running their business as a business.

You know what the incredible thing is? There are plenty of open-source physics engines built around OpenMP that would take little effort to revise for GPU, and hell, now that GCC supports OpenMP offloading, maybe we can finally dispose of the Microsoft Visual C/C++ compiler for all parts of the code base except those directly responsible for DirectX. I can't believe software and game builders have let Microsoft walk all over them while being so far behind its competition. GCC, Clang, and ICC all support the full C++14 gamut, with C++17 already in the works. Visual Studio still doesn't CORRECTLY support a lot of function inlining, or the use of "auto" in lambda functions, whether inlined into STL algorithm calls or used as inner calls of a larger custom function. It still doesn't support OpenMP beyond the ancient 2.0 spec, or OpenACC; its OpenCL 2.0 compliance is shaky at best; and it doesn't have support for Intel's Cilk Plus or the C++ Boost libraries either.

 

Seriously, it's like game devs aren't remotely trying to push Microsoft or do the work that needs to be done to do it right by ALL of their customers.
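
To make the compiler gap concrete, here's a minimal sketch of the two things being called out above: a C++14 "auto" lambda inlined into an STL algorithm call, and an OpenMP 4.0 target-offload loop of the kind GCC can now compile. The SAXPY kernel is illustrative, assuming a GCC build with -fopenmp; with no offload target configured, the pragma simply degrades to running on the host CPU, which is what makes it a low-risk porting path.

// Minimal sketch: C++14 generic lambda + OpenMP 4.0 target offload.
// Illustrative only; compile with: g++ -std=c++14 -fopenmp saxpy.cpp
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int n = 1 << 20;
    std::vector<float> x(n, 1.0f), y(n, 2.0f);
    const float a = 3.0f;

    // C++14 "auto" in a lambda, inlined into an STL algorithm call --
    // exactly the pattern the post says Visual Studio mishandled.
    std::transform(x.begin(), x.end(), x.begin(),
                   [](auto v) { return v * 2.0f; });

    float* px = x.data();
    float* py = y.data();

    // OpenMP 4.0 offload: map both arrays to the device, run the
    // SAXPY loop there, and copy y back to the host afterwards.
    #pragma omp target teams distribute parallel for \
            map(to: px[0:n]) map(tofrom: py[0:n])
    for (int i = 0; i < n; ++i)
        py[i] += a * px[i];

    std::printf("y[0] = %f\n", py[0]); // expect 2 + 3*2 = 8
    return 0;
}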

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Unfortunately, for your argument to be relevant you have to understand the market. The majority of Intel iGPU solutions (and thus the majority of the market) are currently in corporate/education machines or domestic email-and-web-browsing machines, which were never intended for and will never be used for games. The iGPU market share is almost completely disconnected from what drives gaming software, and almost completely disconnected from Gameworks or any game-related middleware.

That's being a bit disingenuous. Somewhere here on LTT is a post with a game studio video where the dev claims most gaming systems have an iGPU in addition to a dGPU, the second most common machine category is just an iGPU, and the least common is just a dGPU, and that data is the basis for supporting multi-adapter GPU resource pooling in their DX12 games. They demonstrated with a 7850K and a 290X to start, but said Intel iGPUs wouldn't be excluded.
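
For anyone curious what that discovery step looks like in practice, here's a minimal sketch (my own, not the studio's code) of enumerating every adapter a DX12 multi-adapter renderer could consider pooling, using the standard DXGI factory API; skipping the software rasterizer is my assumption about a sensible filter.

// Minimal sketch: list every GPU (iGPU and dGPU alike) that an
// explicit multi-adapter DX12 renderer could consider pooling.
// Windows-only; link against dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // Skip the software rasterizer (assumed filter); everything
        // else -- Intel iGPU, AMD or Nvidia dGPU -- is a candidate.
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;

        std::wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i,
                     desc.Description,
                     static_cast<unsigned long long>(
                         desc.DedicatedVideoMemory / (1024 * 1024)));
    }
    return 0;
}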

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That's being a bit disingenuous. Somewhere here on LTT is a post with a game studio video where the dev claims most gaming systems have an iGPU in addition to a dGPU....

Out of curiosity, any idea if he took into account enterprise machines? As the majority come with discrete graphics (albeit usually for additional output support).

PSU Tier List | CoC

Gaming Build | FreeNAS Server


i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core


FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


Seriously, it's like game devs aren't remotely trying to push Microsoft or do the work that needs to be done to do it right by ALL of their customers.

That's because game devs build a product to make money from a controlled market. If they can't control the market, they can't make any money.

 

That's being a bit disingenuous. Somewhere here on LTT is a post with a game studio video where the dev claims most gaming systems have an iGPU in addition to a dGPU, the second most common machine category is just an iGPU, and the least common is just a dGPU, and that data is the basis for supporting multi-adapter GPU resource pooling in their DX12 games. They demonstrated with a 7850K and a 290X to start, but said Intel iGPUs wouldn't be excluded.

 

Of course the majority of all gaming PCs have an iGPU, but it's not used and can't really be used right now. Building software (DX12) to leverage that iGPU's existence is smart, but it's not the reason for the iGPU's existence in the first place, and iGPUs are certainly not the reason Gameworks or any other current middleware exists. The DX12 multi-GPU platform is as much about using more AIBs as it is about leveraging an otherwise dormant iGPU solution.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Out of curiosity, any idea if he took into account enterprise machines? As the majority come with discrete graphics (albeit usually for additional output support).

Don't know why he would, given he specifically claimed gaming systems, which means there's probably a reporting function buried in the code telling the studio who's got what playing their games. Hell, I'd do that so I could better focus optimization and driver development.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That's because game devs build a product to make money from a controlled market. If they can't control the market, they can't make any money.

 

 

Of course the majority of all gaming PCs have an iGPU, but it's not used and can't really be used right now. Building software (DX12) to leverage that iGPU's existence is smart, but it's not the reason for the iGPU's existence in the first place, and iGPUs are certainly not the reason Gameworks or any other current middleware exists. The DX12 multi-GPU platform is as much about using more AIBs as it is about leveraging an otherwise dormant iGPU solution.

I think you and I are arguing different things or going in totally different directions here. I'm not saying iGPUs were built for gaming. They were initially designed for office users who don't need a dGPU, eating up Nvidia's and AMD's low-end dGPU sales. Beyond that, Intel's iGPU development is primarily focused on compute, though any gaming performance they get is a great bonus that further undermines Nvidia. AMD's iGPU development, meanwhile, has been a bat out of hell with no clear direction.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


OK, another perspective then: based on the total GPU share, what % of that minority share is split between the other divisions, like GeForce, Tesla, maybe Tegra?

Let's say for argument's sake that it is 50/50 excluding Tegra; how is that divided amongst the 6-, 7-, and 9-series GPUs?

Nvidia is not the majority of the GPU market, and the 9-series isn't the majority of all shipping dGPUs.

Hopefully that was a little clearer, as I am notoriously bad at getting my point across.
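
To put that fraction-of-a-fraction arithmetic in concrete terms, here's a quick back-of-the-envelope calculation. Every number is illustrative, not a sourced figure: the 17% AIB share echoes the earlier guess in this thread, and the other two splits are made up for the sake of the example.

// Back-of-the-envelope sketch of the "fraction of a fraction" point.
// All three shares below are illustrative, not sourced market data.
#include <cstdio>

int main() {
    const double aib_share_of_all_gpus = 0.17; // AIB (dGPU) share of total GPUs
    const double nvidia_share_of_aib   = 0.75; // Nvidia's share of dGPU shipments
    const double series9_share_of_nv   = 0.30; // hypothetical 9-series share

    const double feature_capable = aib_share_of_all_gpus
                                 * nvidia_share_of_aib
                                 * series9_share_of_nv;

    // Prints roughly 3.8% -- a small slice of the total install base.
    std::printf("Share of total GPU install base: %.1f%%\n",
                feature_capable * 100.0);
    return 0;
}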


Out of curiosity, any idea if he took into account enterprise machines? As the majority come with discrete graphics (albeit usually for additional output support).

 

Probably wouldn't matter; we are talking about very context-specific hardware. Because there are big overlaps in hardware market-share data, it is important to remember that of the ~16% of PCs with an NV AIB, probably 80% of them have a dormant Intel iGPU. That adds to the Intel percentage, but not to the rationale of developers who develop with certain GPU capabilities in mind.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Nvidia already has 75% market share (in terms of dGPUs only) because of fanboys (and admittedly me; the timing has never been right to buy an AMD/ATi card... ever), so they're already capitalizing. In fact, AMD will be gone in 2 years, and all the fanboys had better get used to $2600 entry-level cards from Nvidia.

 

And if Nvidia ever comes under antitrust scrutiny for monopoly, the retarded US pencil pushers will treat Intel as a legitimate competitor (because they probably wouldn't classify iGPUs as a separate market from dGPUs), and thus Nvidia would face no consequences.

This is why Nvidia is powering on aggressively with groundbreaking GPUs, not giving two fucks about AMD, while Intel are scared shitless and holding back. Intel could design a 12-core unlocked consumer chip with no iGPU for $300 right now, but if they did, AMD would be six feet under and Intel, not Nvidia, would be the ones in deep shit over monopoly allegations.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


 get used to $2600 entry-level cards from Nvidia.

 

I assume you are exaggerating or being sarcastic. There are economic limitations outside of competition that dictate the maximum and minimum a product will sell for.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Amazing. No one cared that GTX 200 cards didn't support DX10.1, or that Kepler didn't support DX11.2. In the words of the great Glenwig, "Whatever Nvidia is winning at suddenly becomes the most important thing."

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Amazing. No one cared that GTX 200 cards didn't support DX10.1, or that Kepler didn't support DX11.2. In the words of the great Glenwig, "Whatever Nvidia is winning at suddenly becomes the most important thing."

Didn't Ubisoft also remove the DX10.1 render path from Assassin's Creed, allegedly at the behest of Nvidia, due to the lack of support in their own hardware?

