AMD: No Such Thing As 'Full Support' For DX12 Today

HKZeroFive

Fuck it, I'm on the Vulkan hype train.

I don't think you understand how it works. If a particular feature is not supported by DirectX, it most likely isn't supported by OpenGL either, assuming they share the same or a similar feature. Usually, and I said usually, when a graphics chip manufacturer skips a feature it is because:

-> They were able to implement an equivalent that was cheaper to implement, though not as good, and the driver handles the API conversion.

-> They don't see it being used on the target graphics cards (i.e. not game-related for a gaming graphics card, despite what the feature's name might suggest).

-> The chip was in development at the same time as the DirectX/OpenGL version, so it is missing features.

An example of this was Nvidia's DirectX 10-ready cards. Aside from the first generation, if I recall correctly, Nvidia didn't advertise DirectX 10.1 the way AMD did, yet everything gaming-related in DirectX 10.1 worked. Nvidia implemented the few DirectX 10.1 features relevant to gaming, but since the cards didn't support all (or even most) of the spec, they couldn't pass DirectX 10.1 certification.


You could take Robert's statement a number of ways, or even put words in his mouth. The million-dollar question is: which DX12 features provide a performance boost?

Robert's post is actually a challenge to Nvidia to come out and say which DX12 features Maxwell 2.0 does not natively support, so that going forward the discussion is based on facts.

PR teams (or tech reviewers, for that matter) from one or both sides are guilty of claiming their latest architecture is fully DX12-ready, and it's nice to see someone officially spell out that such claims are (deliberate) lies.

 

That's not the only question, though: some features might provide a performance boost, but I'm also interested in features that improve visual fidelity even at a cost in performance. If all I wanted was performance, I could get by with low settings and crap textures and run everything at 144 fps, but that's hardly worth it if it looks like a game released in 2008.

-------

Current Rig

-------


I don't think you understand how it works. If a particular feature is not supported by DirectX -snip-

I meant it more or less as "Vulkan is a bit less confusing than DX12 is right now", rather than "lol I've lost hope in AMD and Nvidia".

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


For Ashes of the Singularity, yes. This feature may not be fully implemented in other games. Don't forget, both AMD and NVIDIA are missing features on their cards. For example, Fury X doesn't have Rasterizer Ordered Views or Conservative Rasterization.

 

neither of these things are relevant because they can be achieved elsewhere without dx12...

Abigail: Intel Core i7-4790k @ 4.5GHz 1.170v / EVGA Nvidia GeForce GTX 980 Ti Classified  / ASRock Z97 Extreme6 / Corsair H110i GT / 4x4Gb G.Skill Ares 1866MHz @ CAS9 / Samsung 840 EVO 250Gb SSD / Seagate Barracuda 1TB 7200RPM / NZXT H440 Blue / EVGA SuperNOVA 750w G2

Peripherals: BenQ XL2411z 24" 144hz 1080p / ASUS VG248QE 24" 144Hz 1080p / Corsair Vengeance K70 RGB / Logitech G502 / Sennheiser HD650 / Schiit Audio Modi 2 / Magni 2 / Blue Yeti Blackout

neither of these things are relevant because they can be achieved elsewhere without dx12...

True, but you get the gist of what I'm saying. The point is, and I quote myself, 'both AMD and NVIDIA are missing features on their cards'.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.

Spoiler

CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


I don't think you understand how it works. If a particular feature is not supported by DirectX -snip-

Actually, it was 3 generations. The 8000, 9000 and GTX 200 series didn't support DX 10.1. It wasn't until the 300M mobile series that they supported it.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


True, but you get the gist of what I'm saying. The point is, and I quote myself, 'both AMD and NVIDIA are missing features on their cards'.

this is true

that being said nvidia is worse off with dx12 even while having a higher feature support level than amd

 

unfortunate but that's how nvidia planned things. too bad


True, but you get the gist of what I'm saying. The point is, and I quote myself, 'both AMD and NVIDIA are missing features on their cards'.

 

Those two features mentioned by Robert are DX12_1 features, not part of the original 12_0 spec, something that should be noted accordingly.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


this is true

that being said nvidia is worse off with dx12 even while having a higher feature support level than amd

 

unfortunate but that's how nvidia planned things. too bad

Those two features mentioned by Robert are DX12_1 features, not part of the original 12_0 spec, something that should be noted accordingly.

As time goes on, we'll see which games perform better on which card. It's just going to take a while.


The only big advantage that AMD has is that they also have their own API, Vulkan, in their back pocket.

Nvidia, meanwhile, doesn't have an API of their own.

That's why AMD has a nice future in gaming with their GPUs.

Because Vulkan is open source and platform-independent.

Vulkan does not belong to AMD. I know it's an evolution of Mantle, but at this point it's probably a different animal, and Nvidia was also heavily involved in its design.

It's not 'AMD's API' anymore.

 


With all due respect, and as of now... the quoted statement is WRONG.

None of the currently available benchmarks shows anything near "in favor of NVIDIA".

Some NVIDIA fanboys are dying for something that shows an advantage for NVIDIA in a DX12 benchmark; once such benchmarks exist, the above statement will become true.

Don't blame me mate, I'm just trying to be fully objective.

Actually, there is a DX12 demo that uses conservative rasterization to do real-time ray tracing and smoke simulation without dropping fps.

While the benchmark is from Nvidia themselves, I doubt it would run well on AMD if the feature is missing.

We don't know which features game developers will use; some devs might favor conservative raster for lighting and particle effects, while others might use async compute to boost performance in games with a lot of AI or physics going on.

https://www.youtube.com/watch?v=rd4DE-RG_jw

 

 

neither of these things are relevant because they can be achieved elsewhere without dx12...

 

No, it can't be done otherwise; those effects are on a completely different level, as you can see in the video above.

RTX2070OC 


So basically, I have a useless or below-par DirectX 12 GPU that will stutter and fall short in upcoming DirectX 12 titles. This is fantastic news; I'm overwhelmed with happiness.

Test ideas by experiment and observation; build on those ideas that pass the test, reject the ones that fail; follow the evidence wherever it leads and question everything.


None of the currently available benchmarks shows anything near "in favor of NVIDIA".

 

You mean none of the single currently available benchmark? That's not a very objective way to look at it.


OK, so when will we get GPUs that have full DX12 support? That's when I'll upgrade...

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


 

On an API that AMD literally wrote the code for.

 

Cool, I didn't know AMD wrote the code for DX11...

 

I mean, GET REAL OR GTFO

 

The 280X, which used to be the HD 7970, still beats the 960, a GPU in the same price range...

The 270, which used to be the HD 7850, still beats the 750 Ti.

The 290X still competes with the 970 (although it's slightly slower)...

 

Then we rebrand these again

370 (based on a heavily cut-down 270X) > 750 Ti

270X >/= 950 (need benchmarks)

380 (based on the weaker 285) > 960

390 > 970 (marginally)

390X < 980 (but the 390X is far cheaper)

Fury > 980

Fury X < 980 Ti

295X2 > Titan X, 980 Ti

 

how old are these GPUs by now?

370 (Pitcairn) - 3.5 years

380 (Tonga) - 1 year

390 (Hawaii remodel) - 2.5 years

390X (Hawaii XT remodel) - 2.5 years

Fury - 2 months

Fury X - 3 months

295x2 (Hawaii) - 2 years

 

 

How old are the Maxwell 2 cards in the 900 series?

750Ti - 2 years

950 - 14 days

960 - 1 year

970 - 1 year

980 - 1 year

980Ti - 3 months

TitanX - 9 months(?)

 

NVIDIA HAS ACCOMPLISHED THE EXTRAORDINARY FEAT OF BEING BEATEN BY REBRANDS THAT ARE TWICE THEIR AGE ON AVERAGE...
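Taking the post's own (approximate) age figures at face value, the "twice their age on average" line can be sanity-checked with a few lines of Python; this is just arithmetic on the numbers above, not independent release-date data:

```python
# Sanity check of the "rebrands twice their age on average" claim,
# using the age figures from the post itself, converted to years.
amd_ages = [3.5, 1.0, 2.5, 2.5, 2/12, 3/12, 2.0]        # 370, 380, 390, 390X, Fury, Fury X, 295X2
nvidia_ages = [2.0, 14/365, 1.0, 1.0, 1.0, 3/12, 9/12]  # 750 Ti, 950, 960, 970, 980, 980 Ti, Titan X

amd_mean = sum(amd_ages) / len(amd_ages)
nv_mean = sum(nvidia_ages) / len(nvidia_ages)

# Prints roughly: AMD mean 1.70 y, Nvidia mean 0.86 y, ratio ~1.97
print(f"AMD mean age: {amd_mean:.2f} y, "
      f"Nvidia mean age: {nv_mean:.2f} y, "
      f"ratio: {amd_mean / nv_mean:.2f}")
```

On these numbers the ratio comes out near 2, so the claim at least matches the list it was drawn from.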

 

At which point does Nvidia make the best products?

 

Their products can't even last a year before "last year's AMD" is rebranded and beats them... hell, the rebrands even beat them while overclocked in many if not most cases.


OK, so when will we get GPUs that have full DX12 support? That's when I'll upgrade...

Most likely NVIDIA's Pascal and AMD's Arctic Islands. If not Pascal, which I heavily doubt, then definitely Volta.


Don't blame me mate, I'm just trying to be fully objective

"Some NVIDIA fanboys are dying for something to show an advantage for NVIDIA"

I'm just trying to be fully objective

 



Cool, I didn't know AMD wrote the code for DX11... -snip-

Two mistakes:

The R9 270 and 270X are the 7870 and the 7870 GHz Edition.

R7 370 = R7 265 = 7850


OK, so when will we get GPUs that have full DX12 support? That's when I'll upgrade...

 

GCN 1.2 is 100% DX12_0 compliant and was initially fully DX12 compliant. Once 12_1 was carved out, GCN 1.2 was no longer fully DX12 compliant.

It's likely there will be 12_2 or even 12_3 iterations, which would mean a theoretical graphics card that is fully 12_0 and 12_1 compliant would still not be fully DX12 compliant.

It boils down to word-play, because having full DX12 support depends on supporting every current feature and on no newer DX12 features being released. In Nvidia's case, Maxwell supports the two newest 12_1 features but does not support all of the 12_0 features. Totally bass-ackwards.


Actually, there is a DX12 demo that uses conservative rasterization to do real-time ray tracing and smoke simulation without dropping fps. -snip-

-snip-

No it can't be done other wise those are on a complete different level as you can see in the video above.

PhysX, m8, it can do things like that, and that's OLD SCHOOL as hell! Look at the Batman demos with Nvidia PhysX: always smoke.

PhysX failed because it could not do enough (and because it was Nvidia-only, but whatever :P). To keep it simple, the ray-tracing stuff can do much the same as PhysX did, and PhysX failed, so this might go the same route, just sayin'.

And DX11.2 is also barely used; everything uses DX11.

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


2 mistakes.

R9 270 and 270x are 7870 and 7870 GHz Ediition.

R7 370 = R7 265 = 7850

Wait, I thought it was:

R7 370 = R7 270 = 7850


At which point does Nvidia make the best products?

 

At the point where they're turning out chips that are only 60% of the die size of AMD's (the R9 380 is 366 mm², the GTX 960 is 228 mm²) and selling them for more money. And owning 80% of the market share that way, by appealing to OEMs and most consumers with their efficient and continuously updated product lines.

Economics 101, which you clearly don't have much understanding of.

The fact that they're newer doesn't mean they have to be inherently that much faster. You only have to be faster than your competition and market your products correctly. Building smaller chips also gives them far more margin on each wafer. What's the point of making chips that are the same size and stomp the AMD cards, but don't actually generate more revenue in a saturated market (and carry less margin, since you can't ask 60% more for the cards)? You'd only make it harder on yourself in terms of the longevity of your business.

 

A lesson AMD just doesn't seem to understand. 
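The wafer-margin point can be made concrete with the standard first-order dies-per-wafer estimate (wafer area over die area, minus an edge-loss correction). This is a rough sketch that ignores yield and scribe lines, both of which would widen the gap further for the larger die:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order gross-die estimate: wafer area / die area,
    minus a correction for partial dies lost at the wafer edge."""
    radius = wafer_diameter_mm / 2
    dies = (math.pi * radius**2) / die_area_mm2 \
           - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)
    return math.floor(dies)

print(gross_dies_per_wafer(228))  # GTX 960 (GM206, 228 mm^2): 265 candidates
print(gross_dies_per_wafer(366))  # R9 380 (Tonga, 366 mm^2): 158 candidates
```

At the same wafer cost, that is roughly 68% more sellable chip candidates per wafer for the smaller die, before any yield difference is counted.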


At the point where they're turning out chips that are only 60% of the die size of AMD's -snip-

 

And you failed Marketing 101.

Just because you've got a better product doesn't mean a damn thing if you don't sell it.

Nvidia markets their GPUs as superior, while in reality they are not.

Die size can have many explanations beyond the pure GPU core; auxiliary functions done on-die also take up die area.

AMD has a few extra cores on their GCN 1.1 and 1.2 chips to run audio via TrueAudio, so that alone would increase the size a tidbit.

Nvidia does some of their auxiliary functions on an extra chip on the PCB (PhysX), unless they've changed that structure... haven't looked into it for a while.

