This guide is meant to help you understand the nuances of Direct3D and its versioning and compatibility scheme. Confusingly, it isn't all about a single number; a few separate things interact, and that can make buying video cards annoying.

Direct3D? Isn't it DirectX?

This is being pedantic, because I know some people like to be pedantic. Direct3D is the actual 3D acceleration API that most people are referring to when they talk about the DirectX API in general. DirectX is really a family of APIs for game development. Components of the modern DirectX family include (but are not limited to):

- Direct2D
- DirectCompute
- XAudio2
- XInput (the successor to DirectInput, which was deprecated)

But since Direct3D is the only one most people care about, DirectX versions tend to follow Direct3D's.

What does a GPU have to do in order to be compatible with a version of Direct3D?

Microsoft has a list of features that the system must support. Microsoft doesn't care how a feature is implemented, only that it is implemented. For example, one of the minimum requirements for Direct3D 9.0c is that the GPU must support Shader Model 3.0. I say "system" because some GPUs lack hardware support for certain features, yet through their drivers they can still provide them in software. Intel's GMA 900 series is infamous for this: it supports Direct3D 9.0b despite lacking a hardware transform & lighting engine.

Compatibility with a Direct3D version is down to a checklist

Prior to Windows Vista, and thus Direct3D 10, compatibility was checked by testing whether the GPU was capable of certain individual features. These were cataloged by way of so-called capability bits, or "cap bits." This is similar to the extended info and feature bits you get when querying a processor for its CPUID information. The problem with cap bits is that they left software developers wrangling a grab bag of features that a GPU may or may not have supported, despite it being "DirectX (some version)" compatible.
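In cap-bits terms, a game had to probe each capability individually and branch on the result. Here is a minimal Python sketch of that model; the cap names and GPU masks are invented for illustration (the real API reports capabilities through structures like D3DCAPS9):

```python
# Simplified sketch of pre-Direct3D-10 "cap bits": each capability is one
# flag, and a GPU advertises an arbitrary subset of them. Illustrative
# only; names and masks here are hypothetical.

CAP_TEXTURE_NONPOW2 = 1 << 0   # non-power-of-two textures
CAP_HW_TNL          = 1 << 1   # hardware transform & lighting
CAP_SM2             = 1 << 2   # Shader Model 2.0
CAP_SM3             = 1 << 3   # Shader Model 3.0

def supports(gpu_caps: int, required: int) -> bool:
    """True if every required cap bit is set in the GPU's cap mask."""
    return gpu_caps & required == required

# Two hypothetical "DirectX 9" GPUs with different feature sets:
gma900   = CAP_TEXTURE_NONPOW2 | CAP_SM2           # no hardware T&L, no SM3
geforce6 = CAP_TEXTURE_NONPOW2 | CAP_HW_TNL | CAP_SM2 | CAP_SM3

# The game has to branch per feature, per GPU:
assert supports(geforce6, CAP_SM3)
assert not supports(gma900, CAP_HW_TNL | CAP_SM3)
```

Every new feature meant another branch like this, which is exactly the combinatorial mess feature levels were later designed to replace.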
This was especially a problem in DirectX 9, where there were three versions of it and the GPUs that technically supported it all had different feature sets.

Cap bits were done away with in Direct3D 10. Instead, the feature level was introduced. For a GPU to be considered one that supports Direct3D 10, it had to support every feature in a given feature level, or it wasn't compatible with the API at all. A higher feature level also had to include everything from the levels below it. When Direct3D 10 launched, there were initially four feature levels: 9_1, 9_2, 9_3, and 10_0. This greatly simplified what a developer had to, or could, support. For example, if they wanted to build a Direct3D 10 game against Feature Level 10_0 (FL10_0), they could use any of the features it provided, plus those of the lower levels, and as long as the GPU was FL10_0 compliant, it would work.

What do Feature Levels imply?

Here's the confusing part about Direct3D's feature levels: in order to support newer versions of the API while keeping compatibility with older hardware, an API version can include feature levels numbered lower than the version itself.

For example, Direct3D 12 not only has two of its own (FL12_0 and FL12_1), it also has FL11_0 and FL11_1, which came from Direct3D 11. This means any GPU that supports FL11_0 can run games using the Direct3D 12 API, but only with the features described in FL11_0. And even if a developer targets FL12_0 or FL12_1, any GPU capable of those levels must also support the FL11_0 and FL11_1 features, so the game can rely on those as well. In other words, Direct3D 12's foundation is the FL11_0 feature set, and it builds up from there.

Note, however, that just because a GPU's hardware satisfies a feature level shared by multiple versions of Direct3D doesn't mean the GPU automatically supports all of those versions. The drivers still have a say in it.
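The guarantee that each level is a strict superset of the ones below it is what makes this model simple: a single comparison replaces per-feature probing. A Python sketch of that idea, using real feature-level names but with the actual feature contents omitted:

```python
# Sketch of the feature-level model: levels are totally ordered, and a
# GPU at a given level is guaranteed everything from every lower level.

FEATURE_LEVELS = ["9_1", "9_2", "9_3", "10_0", "10_1", "11_0", "11_1"]

def meets(gpu_level: str, required_level: str) -> bool:
    """A GPU meets a requirement iff its level is at or above it."""
    return FEATURE_LEVELS.index(gpu_level) >= FEATURE_LEVELS.index(required_level)

# A game targeting FL10_0 runs on any FL10_0-or-better GPU, with no
# per-feature checks:
assert meets("11_0", "10_0")
assert not meets("9_3", "10_0")
```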
The GeForce 400 series, for example, is FL11_0 compliant, but it's only good for Direct3D 11 and lower due to the lack of driver support for Direct3D 12.

Do Feature Levels mean X GPU is better than Y GPU?

Not necessarily. NVIDIA's Maxwell is FL12_1 compliant, whereas AMD's GCN 3, the GPU architecture competing against it, is only FL12_0. Amusingly, Intel's HD 500 series is also FL12_1 compliant, and it supports even more features than NVIDIA's Maxwell.

There's also the problem that the developer still has to target a particular feature level. So if a game targets Direct3D 12 but only makes use of features up to FL12_0, then buying an FL12_1 compliant card doesn't do much.

Also note that a GPU can support more than what a feature level requires, but those extras probably won't get used. Multiple GPUs had their own versions of tessellation support before Direct3D 11 made it a requirement, but it almost never got used in any game because the developers knew the feature would only reach a subset of cards.

One more thing...

I mentioned before that Microsoft doesn't care how a feature is implemented, only that it is implemented. I think this was a source of contention in the whole "asynchronous compute" debacle. Microsoft publishes a table of the feature levels Direct3D 12 supports and what a GPU has to implement in order to be compliant. Notice that the only mention of "compute" just says the GPU needs to support it; neither the table nor the page that discusses compute contains the word "asynchronous" anywhere.

In this instance, Direct3D 12 exposes multiple queues, but how the hardware services them is up to the manufacturer. Multiple queues are the feature; AMD's Asynchronous Compute and NVIDIA's Dynamic Load Balancing are implementation details. It would be bad for Microsoft to say "in order to support this feature, you must do it this way" when there are a dozen other ways to do it.
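The queue situation can be pictured like this: the API contract only says "you can submit work to separate queues"; whether they actually execute concurrently is the driver's business. A toy Python sketch, with all names invented, showing one perfectly contract-compliant implementation that just drains the queues one after another:

```python
# Toy model of Direct3D-12-style queues: the application sees separate
# graphics and compute queues. How the hardware drains them (truly in
# parallel, or serialized as below) is an implementation detail the API
# does not specify. All names here are hypothetical.

class Queue:
    def __init__(self, name):
        self.name = name
        self.pending = []

    def submit(self, work):
        self.pending.append(work)

def drain_serialized(queues):
    """One valid implementation: run each queue's work back to back.
    An "async compute" style implementation would interleave the queues
    instead; either way, the API's "multiple queues" feature is met."""
    done = []
    for q in queues:
        done.extend((q.name, w) for w in q.pending)
        q.pending.clear()
    return done

gfx, compute = Queue("graphics"), Queue("compute")
gfx.submit("draw_scene")
compute.submit("particle_sim")

assert drain_serialized([gfx, compute]) == [
    ("graphics", "draw_scene"),
    ("compute", "particle_sim"),
]
```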
Bonus Thing: Direct3D versions and their Feature Levels

- Direct3D 10: 9_1, 9_2, 9_3, 10_0, 10_1 (implies any Direct3D 9 GPU is compatible with Direct3D 10)
- Direct3D 11: 9_1, 9_2, 9_3, 10_0, 10_1, 11_0, 11_1 (implies any Direct3D 9 or 10 GPU is compatible with Direct3D 11)
- Direct3D 12: 11_0, 11_1, 12_0, 12_1 (implies any Direct3D 11 GPU is compatible with Direct3D 12)
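In practice, an application hands the runtime a list of feature levels it can work with, best first, and takes the highest one the device supports; this mirrors the feature-level array passed to D3D11CreateDevice. A Python sketch of that negotiation (the device's maximum level is a hypothetical input here):

```python
# Sketch of feature-level negotiation: the app lists the levels it can
# use, in descending order of preference, and gets back the best one the
# device can satisfy, or None if there is no overlap.

LEVEL_ORDER = ["9_1", "9_2", "9_3", "10_0", "10_1", "11_0", "11_1", "12_0", "12_1"]

def negotiate(requested, device_max):
    """Return the first requested level at or below the device's maximum."""
    cap = LEVEL_ORDER.index(device_max)
    for level in requested:            # requested is ordered best-first
        if LEVEL_ORDER.index(level) <= cap:
            return level
    return None

# A Direct3D 12 title that prefers FL12_1 but runs down to FL11_0, on a
# hypothetical FL12_0-class GPU, settles on FL12_0:
assert negotiate(["12_1", "12_0", "11_1", "11_0"], "12_0") == "12_0"
```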