First DirectX 12 game benchmarked (Update 2: more benchmarks)

Isn't it fairly obvious why? 

Nvidia had DX11 wired down to an obscene degree; AMD did not. It shows in the performance: Nvidia simply spent more time working with DX11, and DX11 drivers are now more or less mature for both companies, though AMD couldn't be happier to leave DX11 performance behind.

 

Nvidia is caught at the gate on DX12. Who knows what optimization could still yield. Their increase from DX11 to DX12 isn't as apparent, partly because GCN has hardware that lets it better exploit DX12 and partly because AMD's DX11 implementation was such rubbish that the increases here look godly.

 

I'm indifferent on this; the closest thing to a consumer card I have is the 6770M sitting in my old laptop. If these results were mirrored in FirePro/Quadro I might have to consider switching, but they aren't.

 

Let's also not get ahead of ourselves: DX12 isn't going mainstream for at least another two or so quarters. Developers aren't just going to jump on the wagon, and many will indeed keep shipping DX11-based games for whatever reason. AMD can't escape the fact that in those scenarios their cards will suffer unless they go back and give DX11 the same TLC they've clearly given DX12.

 

Side note: I keep reading here that Mantle is the base of DX12 and that Mantle inspired DX12. Really? Can I read up on that somewhere? It would be interesting to see some solid proof of what inspired what. I'm all for giving credit where credit is due; I just have a hard time believing that MS scrapped whatever DX12 was originally going to be just to push DX12 as it is now out within 18 months by basing it on what AMD did, because that sounds ridiculous, especially given the lead times associated with developing such software...

These results from Ars Technica are far more mind-blowing:

 

 

I mean, the 290X achieves almost the same average FPS as a 980 Ti. That said, one game is still too small a sample size to draw any meaningful conclusions. We shall see, though.

Full article: http://arstechnica.co.uk/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/

The benchmarks should be taken with a huge grain of salt: it's an AMD-partnered studio that already made a Mantle benchmark, and DX12 doesn't magically increase performance; it only gives developers the option to write more low-level code.

This means that if the developer only worked on a low-level implementation for AMD and not for Nvidia, you won't see much performance benefit on Nvidia's side.

We'll really see how DX12 runs when vendor-neutral games like Star Citizen get it.
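To illustrate the "more low-level code" point: in DX11 the driver hides state tracking and submission behind an immediate context, while in DX12 the application records and submits command lists itself. A minimal sketch of the contrast, assuming the device, resources, and pipeline state were created elsewhere; the function and variable names here are illustrative placeholders, not code from the benchmark:

```cpp
#include <d3d11.h>
#include <d3d12.h>

// DX11: the driver tracks state and decides when work reaches the
// GPU, which is exactly where vendor-specific "driver heroics" live.
void DrawD3D11(ID3D11DeviceContext* ctx, ID3D11Buffer* vb,
               UINT stride, UINT indexCount)
{
    UINT offset = 0;
    ctx->IASetVertexBuffers(0, 1, &vb, &stride, &offset);
    ctx->DrawIndexed(indexCount, 0, 0);
}

// DX12: the application records the command list itself (potentially
// on any thread) and submits it explicitly; the driver does far less
// per draw call. Index buffer, root signature, etc. assumed already set.
void DrawD3D12(ID3D12CommandQueue* queue,
               ID3D12CommandAllocator* alloc,
               ID3D12GraphicsCommandList* cmd,
               ID3D12PipelineState* pso,
               const D3D12_VERTEX_BUFFER_VIEW& vbv,
               UINT indexCount)
{
    alloc->Reset();
    cmd->Reset(alloc, pso);
    cmd->IASetVertexBuffers(0, 1, &vbv);
    cmd->DrawIndexedInstanced(indexCount, 1, 0, 0, 0);
    cmd->Close();
    ID3D12CommandList* lists[] = { cmd };
    queue->ExecuteCommandLists(1, lists);
}
```

The performance only improves if a developer actually restructures their engine around this model, which is why a thin DX12 port can easily run no faster than DX11.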

RTX2070OC 


Side note: I keep reading here that Mantle is the base of DX12 and that Mantle inspired DX12. Really? Can I read up on that somewhere? It would be interesting to see some solid proof of what inspired what. I'm all for giving credit where credit is due; I just have a hard time believing that MS scrapped whatever DX12 was originally going to be just to push DX12 as it is now out within 18 months by basing it on what AMD did, because that sounds ridiculous, especially given the lead times associated with developing such software...

 

I think that conclusion gets drawn because AMD is the hardware vendor for the Xbox, so the two companies work closely together, and because AMD and Mantle "pushed" MS in a certain direction. But the actual involvement, and how much Mantle tech was implemented, is completely unknown and could, for the sake of argument, be none whatsoever. Mantle was, however, heavily incorporated into OpenGL Next, to the point of it being renamed Vulkan.

 

The benchmarks should be taken with a huge grain of salt: it's an AMD-partnered studio that already made a Mantle benchmark, and DX12 doesn't magically increase performance; it only gives developers the option to write more low-level code.

This means that if the developer only worked on a low-level implementation for AMD and not for Nvidia, you won't see much performance benefit on Nvidia's side.

We'll really see how DX12 runs when vendor-neutral games like Star Citizen get it.

 

Keep in mind that NVidia, like AMD, Microsoft, and Intel, has access to the newest build of this game, and has had it for over a year. Also note that Oxide Games actually implemented an NVidia-specific shader, made by NVidia, for optimization on NVidia hardware. Oxide Games' involvement with AMD in the Star Swarm demo had more to do with tinkering with low-level APIs than with being vendor-biased.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Based on the benchmarks, I'm somewhat under the impression that AMD and Microsoft worked closely together on DX12.

Desktop

Y4M1-II: AMD Ryzen 9-5900X | Asrock RX 6900XT Phantom Gaming D | Gigabyte RTX 4060 low profile | 64GB G.Skill Ripjaws V | 2TB Samsung 980 Pro + 4TB 870 EVO + 4TB SanDisk Ultra 3D + 8TB WD Black + 4TB WD Black HDD | Lian Li O11 Dynamic XL-X | Antec ST1000 1000W 80+ Titanium | MSI Optix MAG342CQR | BenQ EW3270U | Kubuntu

-------------------------------

Mobile devices

Kuroneko: Lenovo ThinkPad X1 Yoga 4th (Intel i7-10510U | 16GB RAM | 1TB SSD)


While the performance gap between 11 and 12 is great and all, my only hope is that people will understand WHY this is the case.  There are many misconceptions about DX12, and the massive performance gains, albeit impressive, will only fuel their fire.

 

Nonetheless, I can't wait to see this become more widespread, considering how it's faring in a "real-world" scenario.

[witty signature]


Side note: I keep reading here that Mantle is the base of DX12 and that Mantle inspired DX12. Really? Can I read up on that somewhere? It would be interesting to see some solid proof of what inspired what. I'm all for giving credit where credit is due; I just have a hard time believing that MS scrapped whatever DX12 was originally going to be just to push DX12 as it is now out within 18 months by basing it on what AMD did, because that sounds ridiculous, especially given the lead times associated with developing such software...

 

In the early days of Mantle, Richard Huddy claimed that Microsoft asked AMD to bring Mantle features to DX12. Even the guidelines for the APIs (the documentation) have a lot in common (you can find them at the Beyond3D forums). No one said MS scrapped anything; they just added pretty much the Mantle feature set to what they already had. Mantle was more than an API; it was a proof of concept made with developers.


AAAAAAAAAAAaaaaaaaaaaaaand no one noticed how the i3 is outperforming both the 8370 and the 6300, even though DX12 is supposed to be better at multithreaded workloads. Meanwhile the i7 DOUBLES the FPS of the i3. HOLY CRAP. I would like to see an i5 mixed in, for comparison.


AAAAAAAAAAAaaaaaaaaaaaaand no one noticed how the i3 is outperforming both the 8370 and the 6300, even though DX12 is supposed to be better at multithreaded workloads. Meanwhile the i7 DOUBLES the FPS of the i3. HOLY CRAP. I would like to see an i5 mixed in, for comparison.

I did, even pointed it out to a few idiots in another thread.
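For what it's worth, the reason DX12 is expected to favor more cores is that command lists can be recorded on multiple threads, where DX11's immediate context forced a single submission thread. A minimal sketch of the idea, with hypothetical helper names, assuming one allocator and command list per worker were created elsewhere; if the per-thread work still dominates, a fast dual-core i3 can absolutely beat a slow eight-core chip:

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>

// Hypothetical per-thread recording job: each worker owns its own
// allocator and command list, so no locking is needed while recording.
void RecordChunk(ID3D12GraphicsCommandList* cmd,
                 ID3D12CommandAllocator* alloc,
                 ID3D12PipelineState* pso,
                 int firstDraw, int drawCount)
{
    alloc->Reset();
    cmd->Reset(alloc, pso);
    // ... record drawCount draw calls starting at firstDraw ...
    cmd->Close();
}

// Record a frame across N worker threads, then submit once.
void RecordFrameMultithreaded(std::vector<ID3D12GraphicsCommandList*>& cmds,
                              std::vector<ID3D12CommandAllocator*>& allocs,
                              ID3D12PipelineState* pso,
                              ID3D12CommandQueue* queue,
                              int drawsPerThread)
{
    std::vector<std::thread> workers;
    for (size_t i = 0; i < cmds.size(); ++i)
        workers.emplace_back(RecordChunk, cmds[i], allocs[i], pso,
                             int(i) * drawsPerThread, drawsPerThread);
    for (auto& w : workers)
        w.join();

    // Submission itself is still a single serialized point on the queue.
    std::vector<ID3D12CommandList*> lists(cmds.begin(), cmds.end());
    queue->ExecuteCommandLists(static_cast<UINT>(lists.size()), lists.data());
}
```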

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Aaaaaaa I'm selling my 970 for a modded 8GB 290X.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

Aaaaaaa I'm selling my 970 for a modded 8GB 290X.

Lol I was hoping for a reaction like that


Lol I was hoping for a reaction like that

It's true though; it's the same damn GPU as what you get with the 290X. And with a custom BIOS you can flash the 8GB version to a 390X and make use of the driver improvements (better tessellation, lower power consumption, etc.).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

Now I'm starting to wonder what gains iGPUs will get from DX12. It would be interesting to add them to the compilation of charts :)


It's true though; it's the same damn GPU as what you get with the 290X. And with a custom BIOS you can flash the 8GB version to a 390X and make use of the driver improvements (better tessellation, lower power consumption, etc.).

The tessellation improvement was already ported to the 290X driver, and the lower power consumption wasn't down to drivers either.

 

What helps is the higher base clock speed, higher memory clock, and tighter memory timings.


Wow, Nvidia's DX12 drivers had better not be this shit for long.


sooo 

dx 11 = go for nvidia

dx 12 = go for amd

If you want to have higher performance in benchmarks right now, then sure.

Let's wait until at least one game comes out  ^_^

Laptop: Acer V3-772G  CPU: i5 4200M GPU: GT 750M SSD: Crucial MX100 256GB
DesktopCPU: R7 1700x GPU: RTX 2080 SSDSamsung 860 Evo 1TB 


 

Aaaaaaa I'm selling my 970 for a modded 8GB 290X.

It's true though; it's the same damn GPU as what you get with the 290X. And with a custom BIOS you can flash the 8GB version to a 390X and make use of the driver improvements (better tessellation, lower power consumption, etc.).

 

...hmmm, I'm not sure if you are saying these things to have a go at AMD or not. A GPU that was released almost two years ago suddenly looking relatively competitive is a poor reflection on Nvidia for not being ready for DX12. The anger or frustration about the 390X being a slight revision is a separate issue.

Someone who bought a 290X or 390X is seeing great value in the new benchmarks, though it will be interesting to see what happens with actual games.


sooo 

dx 11 = go for nvidia

dx 12 = go for amd

No, because one benchmark isn't the be-all and end-all; it just tells us AMD has had more time to make good low-level-API drivers, which we already knew. In 1-2 years, when more DX12 titles have been released, Nvidia may very well have fixed everything that is wrong (because let's be honest, losing FPS means it's not working right, and Nvidia has been known for shit drivers recently).

 

Remember: one benchmark doesn't settle an argument.


Nice, another reason against the Windows 10 upgrade :P

i9 11900k - NH-D15S - ASUS Z-590-F - 64GB 2400Mhz - 1080ti SC - 970evo 1TB - 960evo 250GB - 850evo 250GB - WDblack 1TB - WDblue 3TB - HX850i - 27GN850-B - PB278Q - VX229 - HP P224 - HP P224 - HannsG HT231 - 450D                                                         

No, because one benchmark isn't the be-all and end-all; it just tells us AMD has had more time to make good low-level-API drivers, which we already knew. In 1-2 years, when more DX12 titles have been released, Nvidia may very well have fixed everything that is wrong (because let's be honest, losing FPS means it's not working right, and Nvidia has been known for shit drivers recently).

Remember: one benchmark doesn't settle an argument.

You're right about one thing: a benchmark isn't the be-all and end-all. This is a drawcall-intensive bench, and it does its job very well indeed.

 

The rest is just bullshit. NVIDIA has claimed they have been working on DX12 for years, after AMD claimed that Microsoft asked them to bring Mantle features to DX12. I particularly liked what their PR guy said, something along the lines of: "we were present at every DX12 demo". Yes, you read that right: NVIDIA has been paying a fortune at conferences to be on stage alongside DX12 (trust me, that isn't cheap)... and after all that? It's just smoke and mirrors. PR stunts everywhere.

 

NVIDIA was on the stages where the price of admission is high, while AMD, Frostbite, and Oxide (yes, the Mantle developers) were backstage giving the technical presentations to developers on both DX12 and Vulkan (back then still OpenGL Next)... not even Microsoft had an engineer talking about the low-level features of their own API when it was first shown.

 

The big NVIDIA media show, lol... and you guys fall for it because they were on a stage running Forza, an Xbox One game, at 1080p 60 fps... with a fucking Titan, lol... how ridiculous that was, but the fanboys didn't care xD

 

Now they've gone on the offensive, claiming it's BUGS in the game engine, which thankfully was debunked by the developers.

 

So just cut the crap, stop making excuses, open your eyes, and try to at least search for some information.

 

Or at least MAKE AN EFFORT TO LOOK BEYOND THE OBVIOUS PR.


No, because one benchmark isn't the be-all and end-all; it just tells us AMD has had more time to make good low-level-API drivers, which we already knew. In 1-2 years, when more DX12 titles have been released, Nvidia may very well have fixed everything that is wrong (because let's be honest, losing FPS means it's not working right, and Nvidia has been known for shit drivers recently).

Remember: one benchmark doesn't settle an argument.

I remember when a single benchmark showing AMD's poor performance in a game was taken to mean that AMD wasn't doing anything to optimize its drivers for that game... This benchmark further argues against that particular PR madness, given that this isn't necessarily a major game either. It also shows the effort AMD is making. If Nvidia didn't have time to do the same as AMD, that also says something about how much effort each company has been putting into the newest technologies.

In my opinion, Nvidia just has the PR on their side. Nvidia could probably easily allow FreeSync on their cards, yet they don't. Here, as you said, AMD probably spent more time on the low-level-API drivers, which pretty much means they actually focused on them. Despite AMD's lack of market share and resources, they still seem to be accomplishing things, which is the main reason I'm more interested in AMD currently. Nvidia used to be more interesting, but that was back when they were behind in the market.


The underlying differences in architecture and technology are at play here as well. AMD GPUs have been designed with DX12-style features in mind for a long time now: new shader techniques, different API resource-allocation systems, better threading, better use of VRAM, more direct access to hardware resources, etc.

 

There is a lot less that can be done in the driver for DX12, except in cases where the dev does not make full use of DX12 features. DX11 saw Nvidia with a hefty lead because their architecture was better suited to DX11 and they were willing to invest massive amounts of money in tailoring their drivers around game developers' bad code. AMD never had the funds for that and were constantly pushing for a more robust back end mating code with GPU horsepower. Now that DX12/Vulkan are bringing that to the table, you are seeing AMD GPUs flex their muscles as they are used more effectively. I don't doubt Nvidia will be able to make up some ground, but I'm not sure how close to peak performance they will get when so much of the DX12 API is not handled natively in hardware.
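One concrete example of work that moved out of the driver: resource hazard tracking. In DX11 the driver inserts synchronization behind the scenes; in DX12 the application declares state transitions itself. A minimal sketch, assuming the texture and command list were created elsewhere (the function name is illustrative, not from any shipped engine):

```cpp
#include <d3d12.h>

// Transition a texture from render-target to shader-resource state.
// In DX11 the driver inferred this; in DX12 the app must declare it.
void TransitionToShaderResource(ID3D12GraphicsCommandList* cmd,
                                ID3D12Resource* tex)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type  = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Flags = D3D12_RESOURCE_BARRIER_FLAG_NONE;
    barrier.Transition.pResource   = tex;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmd->ResourceBarrier(1, &barrier);
}
```

Get a barrier wrong and you get corruption or stalls that no driver will paper over, which cuts both ways for the vendors.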


Heyyo,

 

Only on AMD, and that's because their DX11 drivers had a large CPU overhead. Their DX11 drivers were bad, therefore they scale well, since the number of draw calls in DX12 will be significantly higher. CPU utilization is fine.

 

I see no problem, because I know what it is I'm looking at.

Yep.

This proves that AMD's drivers indeed have a lot more overhead than NVIDIA's in DirectX 11. An AMD R9 280 is a faster GPU than the GTX 750 Ti... unless the CPU is a bottleneck. In that case, the more efficient NVIDIA drivers and Maxwell are just as good at a much lower cost... well, at least in GTA V.

As for DirectX 12 and Ashes of the Singularity? Meh, this time it looks like NVIDIA put out some poop drivers, instead of AMD like we've usually seen.

If you go back to that AnandTech article with the Star Swarm DirectX 12 preview, NVIDIA scaled properly there, and you could see the AMD driver bottleneck in DirectX 11... where for some reason even a GTX 750 Ti got a higher framerate. It could also be a serious AMD DirectX 11 driver bug in Star Swarm.

http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm

So I dunno; I say let's wait and see what NVIDIA does, working with Oxide Games, to fix their driver's performance issues with Ashes of the Singularity.

This kind of reminds me of id Software's id Tech 5 engine game, RAGE. ATi put out the wrong driver and everyone with an ATi GPU was raging (pun not intended... but left in, heh) at id Software until John Carmack said "they put out the wrong driver." Afterwards, proper drivers were released and all was well again. History repeats itself meow with Ashes of the Singularity, this time it's NVIDIA's fault.

Oxide Games also said it nicely here:

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/

 

So what is going on then? Our analysis indicates that any D3D12 problems are quite mundane. New API, new drivers. Some optimizations that the drivers are doing in DX11 just aren’t working in DX12 yet. Oxide believes it has identified some of the issues with MSAA and is working to implement workarounds on our code. This in no way affects the validity of a DX12 to DX12 test, as the same exact workload gets sent to everyone’s GPUs. This type of optimization is just the nature of brand new APIs with immature drivers.

Also... dammit! I knew DirectX 12 is single-GPU only right meow... I couldn't get my SLI setup to work with the leaked Unreal Engine 4.9 DirectX 12 Elemental demo...

 

SLI and CrossFire

Another question we get is concerning SLI and CrossFire situations. D3D12 allows us to have explicit support for Multi-GPU. This allows our engine to drive it. While we have some prototypes of this working, it isn’t yet ready to distribute for public review. There may still be OS level change that needs to happen, and we are still working with Microsoft to figure it out. However, we do expect it to be ready for public review before our game goes into BETA. Our current expectation is that scaling should be close to optimal, with minimum frame variance.

So that about sums it up. We hope that gamers everywhere will find our test useful, and help us make PC gaming better than ever.

I didn't bother spending $50 on the Ashes of the Singularity founder's pack, and I'm glad I didn't meow. I'm not that big into RTS games anymore anyways... plus, I can't even test SLI in it? Meh. I'll just wait for future benchmarks and reviews once it goes into beta, I guess.

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case

