DX12, what was promised and what we got .. a rainbow shitting unicorn

Seriously, so many people are missing a major point of DX12: it's meant to use more threads, making a weak CPU less of a bottleneck. How is a 6700K at 4.7GHz going to show any difference? The only game that had specific optimization is DOOM, and that's Vulkan.

Slowly... In the hollows of the trees, In the shadow of the leaves, In the space between the waves, In the whispers of the wind, In the bottom of the well, In the darkness of the eaves...

Slowly places that had been silent for who knows how long... Stopped being Silent.

4 minutes ago, MageTank said:

-

you should be comparing with BioWare when it was doing good work

2,000 people doesn't translate into a better product; the more you fragment the development, the worse it gets

4 minutes ago, Liltrekkie said:

-

just to show how out of touch with reality you are

AMD = DX12? funny! nVidia is in the same shithole 

 

you can't even have a debate without insulting -_-

Just now, zMeul said:

you should be comparing with BioWare when it was doing good work

2,000 people doesn't translate into a better product; the more you fragment the development, the worse it gets

How exactly would I do that? My time machine broke years ago. I also can't find their team sizes for ME:3 or ME:2. 

 

Another fun fact: Skyrim was developed by a team of roughly 100 people. Except everything broken in Skyrim isn't a glitch, it's simply a feature. Those horse physics...

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 

Just now, zMeul said:

just to show how out of touch with reality you are

AMD = DX12? funny! nVidia is in the same shithole 

 

you can't even have a debate without insulting -_-

Since when did I say that AMD = DX12?

 

You're putting words in my mouth.

 

Seriously. Learn to read.

Do you even fanboy bro?

I get frustrated every time I read posts on DX12. DirectX 12 was never designed to lower GPU usage. Not once. Not ever. A change of graphics API in no way changes the amount of compute power the GPU has.

 

How do people realistically expect game developers doing last-minute ports of their render pipeline from DX11 to DX12 to be anywhere near as well optimized as the DX11 drivers that Nvidia and AMD have been tuning for many, many years now?

 

The big benefit of DX12 and Vulkan is *NOT* on the GPU side. It never has been. The big benefit of DX12 and Vulkan is on the CPU side.

 

Testing a game at 1440p on an i7-6700k and a 480 and saying it has the same performance under DX11/DX12 would be like testing a DX11 game at 4k on a 480 and complaining that two different CPUs give you the same frame rate... It's missing the whole point of the measurement.

 

Let's see some ARM games tested. Mobile is a huge use case that benefits from Vulkan. Let's see some low clock high core CPUs tested. Let's see something tested that might realistically in any real world ever see benefit from DX12.

Correct me if I'm wrong, but wouldn't it make more sense to test in a CPU-bound rather than a GPU-bound situation?

Why not test it at 1080p with an i3? Wouldn't that show the improvement in overhead better?

If you want to reply back to me or someone else USE THE QUOTE BUTTON!                                                      
Pascal laptops guide

6 minutes ago, Sniperfox47 said:

-

really? OK ..

here, directly from MS' camp:

 

if MS's own camp can't do better, why should we expect greatness from others?

 

---

 

two: the issue is with DX11 vs DX12 API draw calls, not CPU performance. When you hit the draw-call limit you are introducing a CPU bottleneck and GPU usage will drop; and it's not dependent on the resolution

the way you understand what DX12/Vulkan do is quite wrong

 

you get frustrated? imagine me trying to explain the actual problem to people

The problem with so-called "low level" APIs is that they were meant to lower CPU overhead. (Sidebar: they're not low level, and they don't give you "direct" access to GPU resources. The API represents data and commands that are easily reusable and translatable into what the GPU wants; i.e., you get a better-abstracted representation of the GPU, not "oh, I can go ahead and poke GPU memory address 0xDEADBEEF and access it directly.") If you take your current rig, run a game, and the CPU is not constantly taxed to 100% in DX11/OGL, you're going to get hardly any benefit from DX12 or Vulkan.

1 hour ago, MageTank said:

Funny. I said the exact same thing, and got taken completely out of context in another zMeul thread

 

What did you expect, greatness?

 

Well, DOOM taught us that when you implement a game properly with one of the new APIs, it gives a significant boost in performance. We just need the same done on the DX12 side to see how well that actually performs, but I don't think that's going to happen anytime soon, the way devs are botching it.

CPU: Intel i7 7700K | GPU: ROG Strix GTX 1080Ti | PSU: Seasonic X-1250 (faulty) | Memory: Corsair Vengeance RGB 3200Mhz 16GB | OS Drive: Western Digital Black NVMe 250GB | Game Drive(s): Samsung 970 Evo 500GB, Hitachi 7K3000 3TB 3.5" | Motherboard: Gigabyte Z270x Gaming 7 | Case: Fractal Design Define S (No Window and modded front Panel) | Monitor(s): Dell S2716DG G-Sync 144Hz, Acer R240HY 60Hz (Dead) | Keyboard: G.SKILL RIPJAWS KM780R MX | Mouse: Steelseries Sensei 310 (Striked out parts are sold or dead, awaiting zen2 parts)

This video was just posted today by Oxide Games at Capsaicin 2017:

 

COMPUTER: Mobile Battlestation  |  CPU: INTEL I7-8700k |  Motherboard: Asus z370-i Strix Gaming  | GPU: EVGA GTX 1080 FTW ACX 3.0 | Cooler: Scythe Big Shuriken 2 Rev. b |  PSU: Corsair SF600 | HDD: Samsung 860 evo 1tb

 

2 hours ago, Fetzie said:

It took years for DX11 to see any noticeable application in games, it wouldn't surprise me if it took that long for DX12 too. DX9 was the standard API for a couple of years still after DX11 was released.

The problem with that is that we are already seeing gains from Vulkan. DX12 has failed to meet its efficiency targets, so Vulkan is the superior API.

3 hours ago, zMeul said:

forgot to add Ashes

 

you mean the "lazy devs" that developed Ashes and sucked so much AMD cock!?

how can it be 9_9

I would write some response about how Oxide Games implemented asynchronous compute because it interested them, but I don't know why I bother now; you've blocked me anyway. xD

Pixelbook Go i5 Pixel 4 XL

5 minutes ago, Citadelen said:

I would write some response about how Oxide Games implemented asynchronous compute because it interested them

and how's that Async helping .. yeah

33 minutes ago, XenosTech said:

Well, DOOM taught us that when you implement a game properly with one of the new APIs, it gives a significant boost in performance. We just need the same done on the DX12 side to see how well that actually performs, but I don't think that's going to happen anytime soon, the way devs are botching it.

Well, Ashes of the Singularity gave one to AMD.

@zMeul Your title offends my ponies ;/

 

DX12 got a rough implementation, but its purpose of increasing multi-threading in games is noble. Vulkan has definitely done a better job so far; hopefully it'll see more use from this year on.

Personal Desktop":

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)
Just now, Princess Cadence said:

@zMeul Your title offends my ponies ;/

 

DX12 got a rough implementation, but its purpose of increasing multi-threading in games is noble. Vulkan has definitely done a better job so far; hopefully it'll see more use from this year on.

Vulkan did a better job compared to what!? OpenGL?

mate, OpenGL is worse than DX11

 

and if you wanna compare DX11 to Vulkan ....

 

4 minutes ago, M.Yurizaki said:

Well, Ashes of the Singularity gave one to AMD.

Still a DX11 game wrapped in DX12

Just now, zMeul said:

Vulkan did a better job compared to what!? OpenGL?

In the one example we seem to have, Vulkan on DOOM provided a boost for AMD and NVIDIA. The best example I can find for DX12 only provided AMD with a boost, but then again this is without the 378.92 update.

 

Although that really doesn't say anything about the API's ability to do a "better job."

Just now, M.Yurizaki said:

In the one example we seem to have, Vulkan on DOOM provided a boost for AMD and NVIDIA. The best example I can find for DX12 only provided AMD with a boost, but then again this is without the 378.92 update.

 

Although that really doesn't say anything about the API's ability to do a "better job."

Vulkan is better than OpenGL, hands down; and that's what's to be compared

OpenGL has quite a "few" (cough) issues

Just now, zMeul said:

and how's that Async helping .. yeah

http://www.anandtech.com/show/10067/ashes-of-the-singularity-revisited-beta/5

I don't really know what's going on with HardOCP's results; none of the GPUs they tested are in the benches below, but the Fury X gains 50% in DX12 vs DX11. The 480 doesn't gain as much, though, due to the way GCN works: it has fewer shaders and therefore utilises them better, with fewer bubbles. Nvidia cards either lose performance or gain very little because their compute pipeline is very efficient, so it scales with cores much better than AMD's cards; that's one of the main reasons the Fury X was so disappointing considering its core count.

 

There's also more to async than just extra performance; for example, AMD has tools that allow the GPU to process audio asynchronously to the graphics through TrueAudio 2.

Spoiler: Ashes of the Singularity DX11 vs DX12 benchmark charts (attached images)

 


10 minutes ago, zMeul said:

Vulkan did a better job compared to what!? OpenGL?

mate, OpenGL is worse than DX11

 

and if you wanna compare DX11 to Vulkan ....

 

 

I don't know what you want. DX11 has been in use for longer; it is as optimized as it can be, but it lacks new features that are interesting for putting hardware to its full use, and progress must be made. When implementing a brand-new technology that should change the paradigm of how software uses hardware, it is understandable that it won't come out shining perfect from day one; adaptation and optimization afterwards are part of the process.

 

I think it is wrong to want to compare the raw performance of the APIs like that, because they function very differently, and DX12 and Vulkan are pioneers of a new way this technology works. Eventually they will be mainstream and give better performance, but meanwhile there's the transition period. As I have said, and as the video below shows better, Vulkan did perform better on it; it just lacks enough titles using it to speed up the process.

 

I still use DX11 for the most part but in my opinion you're looking at this wrong.

 

 

 

Personal Desktop":

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)
Just now, bobhays said:

I get that he's said a lot of stuff like that, but I prefer to take each comment at face value, and that one was not aimed at AMD. For what it's worth, he's right about the Ashes thing though: it was basically a tech demo turned game, sponsored by AMD (add foul language = zMeul's post)

 

so you are a formalist. Well, I disagree: we can't discount the author in the interpretation of the message, and the foul language completely changes the meaning

@samcool55 @Sakkura @FAQBytes @Sniperfox47 and the rest who think it's a CPU problem, not an API problem

 

note: sorry in advance for the artifacting; nVidia still hasn't fixed their HEVC encoding on the 10 series

 

CPU: i5 6500

RAM: 16GB

Video: GTX1070 G1

OS: W10 x64 latest drivers

res: 1080p


 

also some in-game benchmark results:

DX12

mountain peak avg: 254.54 fps

syria avg: 172.21 fps

 

DX11

mountain peak avg: 262 fps

syria avg: 179.16 fps

 

haven't included geothermal valley since I get some weirdness with DX12: objects render only when the camera is at a certain distance from them

 

5 minutes ago, bobhays said:

I get that he's said a lot of stuff like that, but I prefer to take each comment at face value, and that one was not aimed at AMD. For what it's worth, he's right about the Ashes thing though: it was basically a tech demo turned game, sponsored by AMD (add foul language = zMeul's post)

And to add to this, DX12 and Vulkan need different optimizations depending on what vendor you choose to "optimize."

 

Which makes me wonder: maybe zMeul has a point about OGL. It started to suck so much that using the Vulkan render path is simply better, while DX11 was just good enough that DX12 doesn't really do much for performance.

 

And while you could say "There was that time Valve ran CS:GO (or CS:S) on OpenGL vs DirectX on Windows and OGL kicked its butt", Source is still a DX9 engine.
