AMD: No Such Thing As 'Full Support' For DX12 Today

HKZeroFive

Who said it's for gaming? Async Compute isn't for gaming; Async Shaders are.

 

Sure, like Windows doesn't drive your GPU; that's what a driver is for.

 

Remove one and the other is pretty worthless.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600MHz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144Hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Because standards take years to create and years longer to implement. HDMI 1.4 had barely come out before we started hearing about HDMI 2.0 features and release dates. How long has Thunderbolt 2 been out, and now we're expecting Thunderbolt 3?

 

Not to mention, most of the graphics companies never (EDIT: fully) implement the _X versions of the spec; it was the same for DX10 and DX11. They just support them in the next major revision (assuming the cards can).

Yeah, but a monitor interface is kind of different from software like DX and GPUs, so why not try for a better implementation? How would 2016 GPUs be supported?

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


AMD buying ATI was deemed a good thing when it happened, but IMO it added too much stock to the pot. When they lost their focus and diversified, they also lost their edge.

Who on Earth thought it was a good idea? Financial analysts certainly didn't. Hedge funds didn't. Everyone with a brain saw AMD was getting duped into buying a valueless company whose debt-to-revenue ratio was stupidly high. AMD should have merged with Nvidia, and Ruiz should have taken the position as Huang's #2. Intel and IBM were the only tech companies in the U.S. who should have bought ATI. Both could have thrown money at the debts and restructured ATI in a single year.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Who said it's for gaming? Async Compute isn't for gaming; Async Shaders are.

Both are for gaming when you do GPU-based physics calculations. That said, AMD converts the DX kernels to compute kernels anyway and just massages the end data for final processing by the ROPs.
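For anyone curious, this is roughly what "async compute" means at the API level in D3D12: you create a separate compute queue next to the usual direct (graphics) queue, and work submitted to it can overlap with rendering. A minimal sketch, assuming you already have a valid ID3D12Device* (the function and parameter names are just for illustration):

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: create a graphics queue plus an independent compute queue.
// Assumes 'device' is an already-created ID3D12Device*.
HRESULT CreateQueues(ID3D12Device* device,
                     ID3D12CommandQueue** gfxQueue,
                     ID3D12CommandQueue** computeQueue)
{
    // Direct queue: accepts graphics, compute, and copy command lists.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    HRESULT hr = device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(gfxQueue));
    if (FAILED(hr)) return hr;

    // Compute-only queue: command lists submitted here can execute
    // concurrently with the direct queue on GPUs that support async
    // compute (e.g. GPU physics overlapping the render workload).
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    return device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(computeQueue));
}
```

The API accepts both queues on any DX12 device; whether the work actually runs concurrently or gets serialized by the driver depends on the hardware, which is the whole argument in this thread. Cross-queue synchronization is done with ID3D12Fence.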

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Who on Earth thought it was a good idea? Financial analysts certainly didn't. Hedge funds didn't. Everyone with a brain saw AMD was getting duped into buying a valueless company whose debt-to-revenue ratio was stupidly high. AMD should have merged with Nvidia, and Ruiz should have taken the position as Huang's #2. Intel and IBM were the only tech companies in the U.S. who should have bought ATI. Both could have thrown money at the debts and restructured ATI in a single year.

While I agree with your statement that it was a poor decision, AMD merging with NVIDIA would mean no competition, and therefore higher prices. I'd much rather keep them as two separate companies, even though the financial situations of AMD and NVIDIA are wildly different at the moment.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


While I agree with your statement that it was a poor decision, AMD merging with NVIDIA would mean no competition, and therefore higher prices. I'd much rather keep them as two separate companies, even though the financial situations of AMD and NVIDIA are wildly different at the moment.

I mean, before the ATI buyout, Ruiz's original plan was to merge with Nvidia instead of buying ATI. He just couldn't stomach letting Huang be the CEO of the merged entity. In that event, AMD would have had a huge cash injection and the stronger GPU market share to jumpstart its APU designs. ATI would have likely ended up in Intel's hands shortly thereafter, and we'd have strong competition on both fronts.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I mean, before the ATI buyout, Ruiz's original plan was to merge with Nvidia instead of buying ATI. He just couldn't stomach letting Huang be the CEO of the merged entity. In that event, AMD would have had a huge cash injection and the stronger GPU market share to jumpstart its APU designs. ATI would have likely ended up in Intel's hands shortly thereafter, and we'd have strong competition on both fronts.

Yep, that I can agree with.

Although I cannot envisage Intel making formidable GPUs at all :D

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


I just really hope Vulkan succeeds.  :*(

|  The United Empire of Earth Wants You | The Stormborn (ongoing build; 90% done)  |  Skyrim Mods Recommendations  LTT Blue Forum Theme! | Learning Russian! Blog |
|"They got a war on drugs so the police can bother me.”Tupac Shakur  | "Half of writing history is hiding the truth"Captain Malcolm Reynolds | "Museums are racist."Michelle Obama | "Slap a word like "racist" or "nazi" on it and you'll have an army at your back."MSM Logic | "A new command I give you: love one another. As I have loved you, so you must love one another"Jesus Christ | "I love the Union and the Constitution, but I would rather leave the Union with the Constitution than remain in the Union without it."Jefferson Davis |


Ruiz's original plan was to merge with Nvidia instead of buying ATI. He just couldn't stomach letting Huang be the CEO of the merged entity.

Seriously? It was the ego of one CEO that started AMD's demise? Or were there concerns that Huang wouldn't be up to the task?


Yep, that I can agree with.

Although I cannot envisage Intel making formidable GPUs at all :D

 

Give it another 5-10 years and discrete GPUs will probably be for professional work only, and far too inefficient for gaming (especially on latency). Even then I wonder, because a unified architecture is often superior once every other limit is reached.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Give it another 5-10 years and discrete GPUs will probably be for professional work only, and far too inefficient for gaming (especially on latency). Even then I wonder, because a unified architecture is often superior once every other limit is reached.

Only time will tell...

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


I mean, before the ATI buyout, Ruiz's original plan was to merge with Nvidia instead of buying ATI. He just couldn't stomach letting Huang be the CEO of the merged entity. In that event, AMD would have had a huge cash injection and the stronger GPU market share to jumpstart its APU designs. ATI would have likely ended up in Intel's hands shortly thereafter, and we'd have strong competition on both fronts.

Speculation, though...

Things could just as easily have gone south with Nvidia too... I mean, history shows us AMD hasn't been THE best at managing its money or expenses... and its marketing (and thus its ability to make money at all) has been abysmal since way before ATI was acquired.


No. They have lower-level APIs, yes. They can afford this because they know the hardware doesn't change (besides drive space).

The Wii U and PS4 use OpenGL, usually a modified version with these lower-level APIs, so you can't easily port a game from console to PC just by recompiling it for Windows.

The Xbox One supports DirectX. We don't know if it has special lower-level APIs inside (it probably does). DirectX 12 might benefit the Xbox One a bit, letting it achieve slightly better graphics and bringing it closer to the PS4 in well-coded games.

 

The Xbox One uses a heavily customized version of DX11, dubbed DX11.X, which includes features from DX12, including asynchronous compute. Phil Spencer made it pretty clear that DX12 won't benefit the Xbox One, and my understanding is that the reason is that DX11.X already includes the features that are needed (async compute). A good article about the history, explaining AMD's long game, is here.
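As an aside for the PC side of this: "supports DX12" is slippery because the runtime reports a feature level separately from the API version. A rough sketch of how an app would check (assuming an already-created ID3D12Device*; the helper name is made up):

```cpp
#include <windows.h>
#include <d3d12.h>

// Returns the highest feature level the device reports, out of the ones
// we ask about. A DX12-capable driver can still report only 11_0 here.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &levels, sizeof(levels));
    return levels.MaxSupportedFeatureLevel;
}
```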

Rock On!


Seriously? It was the ego of one CEO that started AMD's demise? Or were there concerns that Huang wouldn't be up to the task?

CEOs usually don't have that kind of power; the board of directors has to agree to this kind of thing.

It was probably more that Huang, being the wannabe mob boss he is, was the problem.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Alright guys, I stumbled upon this expert analysis of the state of Nvidia's and AMD's DX12 support. Never in all my life have I come across a speaker so measured and even-handed in this discussion, never once veering into bouts of hyperbole and expletives to get his point across. This is my favorite analysis all year.

 

http://vocaroo.com/i/s0moE1u9QsCA

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Alright guys, I stumbled upon this expert analysis of the state of Nvidia's and AMD's DX12 support. Never in all my life have I come across a speaker so measured and even-handed in this discussion, never once veering into bouts of hyperbole and expletives to get his point across. This is my favorite analysis all year.

 

http://vocaroo.com/i/s0moE1u9QsCA

 

He is nothing but a gentleman. I laughed out loud at his comment about AMD spending 6 years preparing a sucker punch for Nvidia's junk.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Alright guys, I stumbled upon this expert analysis of the state of Nvidia's and AMD's DX12 support. Never in all my life have I come across a speaker so measured and even-handed in this discussion, never once veering into bouts of hyperbole and expletives to get his point across. This is my favorite analysis all year.

 

http://vocaroo.com/i/s0moE1u9QsCA

Lots of hearsay, but still pretty funny.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Alright guys, I stumbled upon this expert analysis of the state of Nvidia's and AMD's DX12 support. Never in all my life have I come across a speaker so measured and even-handed in this discussion, never once veering into bouts of hyperbole and expletives to get his point across. This is my favorite analysis all year.

 

http://vocaroo.com/i/s0moE1u9QsCA

Every single time he said "ne-vidia" I died a little inside.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


It seems like someone is on a full-time job, trying desperately to change the information in the NVIDIA 900-series articles on Wikipedia in an attempt to completely remove these latest developments about DirectX 12 support on the 900 series:

 

IP's talkpage

IP's article edits

Geforce 900 Series

Geforce 900 Series' history

NV PR team too stronk... check this out ..... http://www.legitreviews.com/geforce-gtx-980-ti-dx12-feature-level-and-tier-details_164782... the first pic... DX12 API consists of - Low Overhead, More Control, Async Compute... straight from the horse's mouth....
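That picture is about feature levels and tiers, and those are queryable too. A minimal sketch (again assuming a valid ID3D12Device*; the function name is illustrative) of pulling the optional-feature tiers these "full DX12 support" arguments hinge on:

```cpp
#include <cstdio>
#include <windows.h>
#include <d3d12.h>

// Prints a few of the optional-feature tiers a D3D12 device reports.
void PrintDx12Tiers(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts))))
    {
        std::printf("Resource binding tier:      %d\n",
                    static_cast<int>(opts.ResourceBindingTier));
        std::printf("Tiled resources tier:       %d\n",
                    static_cast<int>(opts.TiledResourcesTier));
        std::printf("Conservative raster tier:   %d\n",
                    static_cast<int>(opts.ConservativeRasterizationTier));
        std::printf("ROVs supported:             %d\n",
                    static_cast<int>(opts.ROVsSupported));
    }
}
```

Two cards can both claim "DX12 support" and print different tiers here, which is exactly why the Wikipedia edit war above is so silly.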

AMD Rig (Upgraded): FX 8320 @ 4.8 GHz, Corsair H100i GTX, ROG Crosshair V Formula, 16 GB 1866 MHz RAM, MSI R9 280X Gaming 3G @ 1150 MHz, Samsung 850 Evo 250 GB, Win 10 Home

(My first Intel + Nvidia experience - recently bought): MSI GT72S Dominator Pro G (i7 6820HK, 16 GB RAM, 980M SLI, G-Sync, 1080p, 2x128 GB SSD + 1TB HDD)... FeelsGoodMan


Yep, that I can agree with.

Although I cannot envisage Intel making formidable GPUs at all :D

Intel's problem is its lack of IP while AMD and Nvidia hold the lion's share and are both vicious patent trolls.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Speculation, though...

Things could just as easily have gone south with Nvidia too... I mean, history shows us AMD hasn't been THE best at managing its money or expenses... and its marketing (and thus its ability to make money at all) has been abysmal since way before ATI was acquired.

It's not speculation; that was Ruiz's plan. Nvidia had the monetary position to draw more investors to AMD, and it would have managed the finances. AMD's IP and talent would have been kept; the management would have been stripped out.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

