AMD: No Such Thing As 'Full Support' For DX12 Today

HKZeroFive

So I guess the lesson here is that everyone is a liar. When AMD & Nvidia both said their current cards would support DX12, they both flat-out lied to us (or at least were frugal with the truth).

Yes they did. nVidia hoped DX12 would be released in time for Pascal, while AMD already had a head start from designing the XB1's Durango GPU with hardware features that weren't specific to DX11 at the time. To put it bluntly, AMD knew what was coming; nVidia sort of didn't.

GCN 1.0 has nowhere near the same level of DX12 feature support as the XB1's Durango (GCN 1.1).


Hey, let's look at facts, shall we?

This rehashed, rebranded old technology is STILL competing with Nvidia's latest, baddest and most powerful architecture to date... what does that tell you?

 

On an API that AMD literally wrote the code for.

 

No... AMD said that they won't be supporting DX12.1 features, long before DX12 came out... because consoles won't use them...

 

They also said they supported DX12, which they're now saying no one supports fully.

 

That is not a lie. Both brands support DX12.

 

Partially, the important word here is partially.

 

Um, yeah, in price to performance. Perhaps it is you who only picks the details you want to see and not the whole truth.

 

I'm sorry, but that horse left the stable a long time ago. The price difference between AMD's & Nvidia's competing products has narrowed by a huge chunk in recent years. There's no more than a £20 to £30 difference (depending on brand) between equivalent AMD & Nvidia cards.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Maxwell supports async compute; that isn't even the question.

The question is how it's implemented, because some voices are saying async shaders can't run in the rendering pipeline at the same time as regular ones.

Yeah it does, though not on a hardware level ;) At least that's what I have been reading lately.

MacBook Pro 15' 2018 (Pretty much the only system I use)
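For context: in Direct3D 12, "async compute" is expressed through command queues. An application creates a dedicated compute queue alongside the graphics queue, and whether the two actually overlap is left entirely to the hardware and driver. A minimal sketch of the queue setup (names are illustrative, error handling omitted):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create a graphics (direct) queue and a separate compute queue.
// Submitting compute work to the second queue *permits* the GPU to
// schedule it alongside graphics work; the API does not guarantee
// concurrency, which is why "supports async compute" is ambiguous.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // draw + compute + copy
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```

A driver can accept work on a compute queue (and so "support async compute" at the API level) while still serializing it internally, which is exactly the hardware-vs-software distinction being argued above.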


Hey, let's look at facts, shall we?

This rehashed, rebranded old technology is STILL competing with Nvidia's latest, baddest and most powerful architecture to date... what does that tell you?

Maxwell was never a performance jump; it was only an efficiency jump compared to Kepler. Remember how everyone bitched that performance was only marginally better in the 900 series over the 700 series, yet power draw went way down?

 

Partially, the important word here is partially.

"Flat out lied" =/= partially



Yeah it does, though not on a hardware level ;) At least that's what I have been reading lately.

That can't be! It either does or it doesn't - if it doesn't, nVidia lied... again

"Flat out lied" =/= partially

Frugal with the truth =/= Partially



Frugal with the truth =/= Partially

So you're agreeing with me that you lied about your own statement...?

You people need a few books on thinking thrown in your direction.



That can't be! It either does or it doesn't - if it doesn't, nVidia lied... again

It seems like they did indeed xD

https://www.techpowerup.com/215663/lack-of-async-compute-on-maxwell-makes-amd-gcn-better-prepared-for-directx-12.html



So you're agreeing with me that you lied about your own statement...?

You people need a few books on thinking thrown in your direction.

 

Huh?

 

So I guess the lesson here is that everyone is a liar. When AMD & Nvidia both said their current cards would support DX12, they both flat-out lied to us (or at least were frugal with the truth).

My statement was very clear; don't put words in my mouth.



Huh?

 

My statement was very clear; don't put words in my mouth.

Hey, you're the one that agreed with me.

 

That is not a lie. Both brands support DX12.

Partially, the important word here is partially.

Frugal with the truth =/= Partially



This makes me wonder... why did Ark: Survival Evolved delay the DX12 patch? By the way, Ark is a GameWorks game, so it should run well... but let's see here...

 

 

 

what a joke... 

AMD Rig - (Upgraded): FX 8320 @ 4.8 Ghz, Corsair H100i GTX, ROG Crosshair V Formula, 16 GB 1866 Mhz Ram, Msi R9 280x Gaming 3G @ 1150 Mhz, Samsung 850 Evo 250 GB, Win 10 Home

(My first Intel + Nvidia experience  - recently bought ) : MSI GT72S Dominator Pro G ( i7 6820HK, 16 GB RAM, 980M SLI, GSync, 1080p , 2x128 GB SSD + 1TB HDD... FeelsGoodMan


On an API that AMD literally wrote the code for.

The Oxide developer literally said it himself: although they have a contract because they want to promote DX12 games, NVIDIA worked with them more over the summer on developing the game for DX12. The problem is not the programming; it's the hardware, which NVIDIA simply does not have at the moment.

They also said they supported DX12, which they're now saying no one supports fully.

There is a difference between 'fully supporting' and 'supporting'. AMD never claimed to fully support it; otherwise they would be in a shitshow for contradicting what they said.

I'm sorry, but that horse left the stable a long time ago. The price difference between AMD's & Nvidia's competing products has narrowed by a huge chunk in recent years. There's no more than a £20 to £30 difference (depending on brand) between equivalent AMD & Nvidia cards.

I think what he meant to say was that this boost in performance created a greater price-to-performance ratio. A $300 card performing very close to a $650 card is bound to cause a lot of controversy.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


Heh heh, AMD clarifying shit before creating unnecessary hype for themselves again.

 


i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


This makes me wonder... why did Ark: Survival Evolved delay the DX12 patch? By the way, Ark is a GameWorks game, so it should run well... but let's see here...

 

-snip-

 

what a joke... 

Ark is one of the most horrendously unoptimised games of recent times. I'm not sure this makes for a valid point.

Aftermarket 980Ti >= Fury X >= Reference 980Ti > Fury > 980 > 390X > 390 >= 970 > 380X > 380 >= 960 > 950 >= 370 > 750Ti = 360

"The Orange Box" || CPU: i5 4690k || RAM: Kingston Hyper X Fury 16GB || Case: Aerocool DS200 (Orange) || Cooler: Cryorig R1 Ultimate || Storage: Kingston SSDNow V300 240GB + WD Black 1TB || PSU: Corsair RM750 || Mobo: ASUS Z97-A || GPU: EVGA GTX 970 FTW+

"Unnamed Form Factor Switch" || CPU: i7 6700K || RAM: Kingston HyperX Fury 16GB || Case: Phanteks Enthoo Evolv Mini ITX (White) || Cooler: Cryorig R1 Ultimate (Green Cover) || Storage: Samsung 850 Evo 1TB || PSU: XFX XTR 550W || Mobo: ASUS Z170I Pro Gaming || GPU: EVGA GTX 970 FTW+


Ark is one of the most horrendously unoptimised games of recent times. I'm not sure this makes for a valid point.

It's valid... because it's an Nvidia-sponsored game that's getting a DX12 patch soon... a full game, to really see if the AotS benchmark was right or wrong about NV performance...



Both Nvidia and AMD will support DX12.

Why? Because DX12 is provided by MS, and it's integrated into Windows 10.

Both companies basically have no choice but to support it.

The only big advantage AMD has is that they also have their own API, Vulkan, in their back pocket,

whereas Nvidia doesn't have an API of their own.

 

That's why AMD has a nice future in gaming with their GPUs:

Vulkan is open source and platform-independent.

Gaming on Linux is slowly starting to become a thing, and if AMD has their Vulkan API ready

and it gets adopted by many games, they will make a big bang.

 

Because let's be fair, who wants to install that piece of malware called Windows 10?


Heh heh, AMD clarifying shit before creating unnecessary hype for themselves again.

 

Exactly. They are fully aware that, with the massive backlash against Nvidia's async issues, it's simply better to come clean as well, so these limitations don't blow up in their face (and out of proportion) at a later date. Well played, AMD. Well played.  ^_^

Cheers,

Linus


You could take Robert's statement a number of ways, or even put words in his mouth. The million-dollar question is: which DX12 features provide a performance boost?

Robert's post is actually a challenge to Nvidia to come out and say which DX12 features Maxwell 2.0 does not natively support, so that going forward the discussion is based on facts.

PR teams (or tech reviewers, for that matter) from one or both sides are guilty of saying their latest architecture is fully DX12-ready, and it's nice to see someone officially spell out that such claims are (deliberate) lies.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR
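For concreteness, "fully DX12-ready" has a testable meaning: Direct3D 12 lets an application query the device's maximum feature level at runtime, and a card can happily create a DX12 device while only reporting feature level 11_0 or 12_0. A hedged sketch of that query (the helper name is illustrative):

```cpp
#include <cstdlib>   // _countof
#include <d3d12.h>

// Ask the driver for the highest feature level it supports among the
// candidates we care about. A perfectly functional "DX12 card" may
// well return 11_0 or 11_1 here.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    const D3D_FEATURE_LEVEL candidates[] = {
        D3D_FEATURE_LEVEL_12_1,
        D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
    };

    D3D12_FEATURE_DATA_FEATURE_LEVELS query = {};
    query.NumFeatureLevels        = _countof(candidates);
    query.pFeatureLevelsRequested = candidates;

    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_FEATURE_LEVELS, &query, sizeof(query))))
        return query.MaxSupportedFeatureLevel;

    return D3D_FEATURE_LEVEL_11_0; // conservative fallback
}
```

On this reading, "full support" would mean reporting 12_1 here, while plain "support" only means a DX12 device could be created at all.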

 


Exactly. They are fully aware that, with the massive backlash against Nvidia's async issues, it's simply better to come clean as well, so these limitations don't blow up in their face (and out of proportion) at a later date. Well played, AMD. Well played.  ^_^

Nah, we all already knew that no AMD cards currently "fully support" DX12 a few months ago.

 

http://linustechtips.com/main/topic/380211-updated-not-all-gcn-cards-are-dx12-feature-level-compatible/

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Both Nvidia and AMD will support DX12.

Why? Because DX12 is provided by MS, and it's integrated into Windows 10.

Both companies basically have no choice but to support it.

The only big advantage AMD has is that they also have their own API, Vulkan, in their back pocket,

whereas Nvidia doesn't have an API of their own.

 

That's why AMD has a nice future in gaming with their GPUs:

Vulkan is open source and platform-independent.

Gaming on Linux is slowly starting to become a thing, and if AMD has their Vulkan API ready

and it gets adopted by many games, they will make a big bang.

 

Because let's be fair, who wants to install that piece of malware called Windows 10?

To be honest, I think a lot of the Mantle code was implemented into DX12 as well... DX12 was in development for years, but it was kind of stuck at some point... then suddenly, after the Mantle API came out, it got finished... Is that a coincidence, or am I being too optimistic?



To me, it sounds like silly excuses to push sales.

Of course current GPUs don't support DirectX 12 100%; they were in development while DirectX 12 was still in the works.

But the next-gen cards will be. Simple as that.

But the skipped features, are they related to gaming? That is the question.


Well, AFAIK there isn't a fully DX11-compliant card either, but the missing features aren't important.

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


Well, AFAIK there isn't a fully DX11-compliant card either, but the missing features aren't important.

Apparently the async feature is important, since it makes AMD GPUs perform better and NV perform worse... hell, even NV on their own website advertises async as a key component of DX12...



To me, it sounds like silly excuses to push sales.

Of course current GPUs don't support DirectX 12 100%; they were in development while DirectX 12 was still in the works.

But the next-gen cards will be. Simple as that.

But the skipped features, are they related to gaming? That is the question.

 

Most likely related to gaming, going by Nvidia's presentations.

 

The two features that Robert Hallock mentions, "Rasterizer Ordered Views" and "Conservative Rasterization", are both DX12_1 features, which means GCN 1.2 is fully 12_0 feature-level compliant... something people are overlooking.

 

 

The slide below confirms this:

[Slide: DX12 feature-level support across GPU architectures]
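Those two capabilities are also directly queryable at runtime. A sketch of the check (the function is illustrative; per the post above, a 12_0-class GCN part would be expected to report both as absent):

```cpp
#include <cstdio>
#include <d3d12.h>

// Query the two DX12_1 capabilities named above. A feature level 12_0
// device can come back negative on both and still be a valid DX12 GPU.
void PrintDx121Caps(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
        return;

    const bool rovs = options.ROVsSupported != FALSE;
    const bool conservative =
        options.ConservativeRasterizationTier !=
        D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;

    std::printf("Rasterizer Ordered Views:   %s\n", rovs ? "yes" : "no");
    std::printf("Conservative Rasterization: %s\n", conservative ? "yes" : "no");
}
```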


 

