
DirectX 12 Is A Lie? In A Good Way?

Icelander

I can't say I believe this. That would mean the tools made for monitoring processor usage are also lying, tools that have no stake in gaming. I do know some games only use one core because the devs are lazy. Then there are games that are highly complex yet run very smoothly, like BF4, because the devs are not lazy. It's not going to be the difference between 8 and 60 fps.

Ryzen 3700X - EVGA RTX 2080 Super - MSI X570 Gaming Edge - G.Skill Ripjaws 3600MHz RAM - EVGA SuperNova G3 750W - 500GB 970 Evo - 250GB Samsung 850 Evo - 250GB Samsung 840 Evo - 4TB WD Blue - NZXT H500 - ROG Swift PG348Q


In what way have we been lied to about DirectX 11's use of CPU cores for gaming? It's pretty common knowledge that anything more than 4 makes a marginal difference, and that IPC per core is the factor that actually matters. This is the feature of DX12 and Mantle that's being lauded: its use of the CPU.

 

I don't think many games only use one core anymore, but certainly lots don't get much benefit from having more than two.

 

Also nuhvidia.


I can't say I believe this. That would mean the tools made for monitoring processor usage are also lying, tools that have no stake in gaming. I do know some games only use one core because the devs are lazy. Then there are games that are highly complex yet run very smoothly, like BF4, because the devs are not lazy. It's not going to be the difference between 8 and 60 fps.

Coding a game to work with DX11 so it spreads the workload across cores is not easy, to say the least. Getting the majority of the performance you can with minimal effort is going to be the golden path for developers, because it's easy and follows the "it works" philosophy. Why spend weeks on end trying to band-aid your code when you can spend that time optimizing what you know and have?

 

DX12 is supposed to make this much easier and to not interfere with the rendering process as much as previous iterations have.

 

Quite excited to see games start releasing with native DX12 support and to see where the performance gains lie across various system configurations.
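As an illustration of the model being lauded here, a sketch with plain Python threads (a toy analogy, not actual D3D12 API code): DX12 lets each thread record its own command list independently, with a single queue submission executing everything at the end.

```python
import threading

# Sketch of DX12-style multithreaded recording (plain Python threads,
# not real D3D12 calls): each thread fills its own "command list"
# independently, and one submission executes them all at the end,
# analogous to ExecuteCommandLists on a D3D12 command queue.
def record_commands(thread_id, n_draws, out):
    out[thread_id] = [f"draw(object_{thread_id}_{i})" for i in range(n_draws)]

lists = {}
threads = [threading.Thread(target=record_commands, args=(t, 4, lists))
           for t in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Single submission of everything recorded in parallel.
submitted = [cmd for tid in sorted(lists) for cmd in lists[tid]]
print(len(submitted))  # 16 draws, recorded by 4 threads concurrently
```

The point of the sketch is that no thread waits on a shared context while recording; contention only appears at the final submission step.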

Always trying to find reason.


Coding a game to work with DX11 so it spreads the workload across cores is not easy, to say the least. Getting the majority of the performance you can with minimal effort is going to be the golden path for developers, because it's easy and follows the "it works" philosophy. Why spend weeks on end trying to band-aid your code when you can spend that time optimizing what you know and have?

 

DX12 is supposed to make this much easier and to not interfere with the rendering process as much as previous iterations have.

 

Quite excited to see games start releasing with native DX12 support and see where the performance gains lie across various system configurations.

Did I say it was easy? No, but companies manage to do it anyway. If you are a big triple-A company, there is no reason you can't make a multithreaded game instead of shoving shit out the door for the sake of money.



I can't say I believe this. That would mean the tools made for monitoring processor usage are also lying, tools that have no stake in gaming. I do know some games only use one core because the devs are lazy. Then there are games that are highly complex yet run very smoothly, like BF4, because the devs are not lazy. It's not going to be the difference between 8 and 60 fps.

He is talking specifically about the CPU talking to the GPU, which runs on one core. That isn't the only process happening in a game, so we can't determine it from the load on the cores.


Did I say it was easy? No, but companies manage to do it anyway. If you are a big triple-A company, there is no reason you can't make a multithreaded game instead of shoving shit out the door for the sake of money.

A lot of the effort is taken out when the game's engine is being developed to run on DirectX. After that, when companies develop games, they do have the ability to get into the gritty areas of development, but that means messing with a lot of code that quite frankly shouldn't be messed with, unless they feel like running it through weeks of patching to fix what they broke. The reason Battlefield runs well on multi-core systems is that the Frostbite engine was developed to work that way.

 

One reason I like UE4 is that they're building DX12 support in from the very core, meaning developers have less work to do to get the product out the door in properly working order.



If you're concerned about the poor performance of DX11, then you needn't worry about what is being said in that. DX11 does not have good multi-core performance, and this is not news.

Plus, there is only one program released so far, and it has very little to do with "graphics". And yes, there is an improvement of up to 10 or more times in some areas. Draw calls. Try not to get tired of hearing those two words after seeing the Futuremark reviews.


DX12 is not a bad thing. Thank you, AMD, and your API formerly known as Mantle, for pushing the envelope and kicking MS in their collective a**es; that basically made MS start fixing Direct3D.


He is talking specifically about the CPU talking to the GPU, which runs on one core. That isn't the only process happening in a game, so we can't determine it from the load on the cores.

Wouldn't that just be the draw calls? I don't think anyone was ever lied to about that. We all know draw calls are very limited on DX11. Was it around 15,000, compared to something like 150,000 for consoles? That doesn't mean some games won't work right, because it's obvious that many games do.
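Where a figure like ~15,000 comes from is just budget arithmetic; a back-of-envelope sketch with illustrative, assumed numbers (the ~1 microsecond per-draw cost is an assumption for the example, not a measurement):

```python
# Back-of-envelope sketch with assumed, not measured, numbers: if each
# DX11 draw call costs the submitting CPU thread about 1 microsecond of
# driver overhead, the 60 fps frame budget caps single-threaded draws.
frame_budget_s = 1 / 60            # ~16.7 ms of CPU time per frame
cost_per_draw_s = 1e-6             # assumption: ~1 us of CPU work per draw
max_draws = int(frame_budget_s / cost_per_draw_s)
print(max_draws)                   # ~16,666, the same ballpark as the
                                   # ~15,000 figure quoted for DX11
```

Cut the per-draw cost (or spread it across cores, as DX12 aims to) and the ceiling rises proportionally.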



It's still in progress.

If you know how it works, or what method it will use to deliver better performance, you can prove that.

Looking for opinions from others who don't completely understand the subject doesn't matter, whether you agree or not, because what you seek is merely a performance boost.

 

It's like saying DirectX 7/8/9/10/11 is a lie.

We won't know until someone or some company applies it in their work.

Crytek took that shot with Crysis, being the first to use DX10, and made lots of contributions for developers, whether we like it or not.

 

The most common stupid misinterpretation is that this new tech will make "old" apps work better, which it won't.

You even see it in the Brad Wardell interview (if you read that article): the new DirectX won't affect old DirectX products if they don't use it.


Nah, most games are GPU-limited anyway; it will help, but not that much. It could have a much bigger impact on what devs can do, so: better quality at the same or slightly better performance.


Needed to add one last comment.

And for those who have not really read too deeply into D3D or Mantle, or better yet have not experienced poor frame times and frame hitching: when DX12 comes down to the frames per second of Crysis or game "X" (yes, I know Crysis is D3D10), I can see the differences not being as overwhelming as we might have hoped, but the overall quality will be better. Heck, FC4 runs above 60 fps at 1080p but feels like 18 or 20 most of the time.


Stopped watching the video after I heard him pronounce Nvidia.

Asus M5A99X Evo - AMD FX-8350 - 16GB Corsair Vengeance 1866MHz - Corsair 120mm Quiet Edition Fans - BenQ XL2411Z - EVGA GTX 980 Superclocked - Fractal Design Define R4 - Corsair H100i - 2TB 7200rpm HDD - Samsung 840 Evo 120GB - Corsair RM750W PSU - Logitech G502 Proteus Core - Corsair K70 RGB MX Red - Audio-Technica M50x + Modmic 4.0 - LG 23EA63V x2


Spinthat Spinthat Spinthat Spinthat


This guy is a moron also.

Only one core?

The speed of a single core hasn't changed in years? Hmm, what about IPC and architecture?

There are so many, SO MANY games that use 4 cores; I have seen this so many times personally.

Otherwise everyone would be gaming on Celerons, lol.

There's a good reason why the Pentium K has been phased out recently.

Bunch of nonsense.


There is no reason to call the fella in the video a moron or any other name, as I think many people share his view of these issues, and it varies with the OS. For example, some people have parked cores and only one "working" core, while others have all cores working. Yes, you can change that manually, but some people just don't know about it; to me that is an OS issue more than a "games only use one core" problem.

 

Another thing: you can set affinity for programs on multi-core CPUs.

 

These are manual tweaks.
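The affinity tweak mentioned above can also be done programmatically; a minimal sketch using Python's Linux-only scheduling API (on Windows the equivalent is Task Manager's "Set affinity" dialog or `start /affinity`):

```python
import os

# What "Set affinity" in Task Manager does, done programmatically.
# os.sched_setaffinity exists only on Linux, so guard for portability.
if hasattr(os, "sched_setaffinity"):
    available = sorted(os.sched_getaffinity(0))   # cores we may run on
    target = set(available[:2])                   # pin to the first 1-2 cores
    os.sched_setaffinity(0, target)               # 0 = current process
    assert os.sched_getaffinity(0) == target
    os.sched_setaffinity(0, set(available))       # restore original mask
```

Pinning like this is occasionally used as a workaround for games that misbehave on many-core CPUs, but as the post says, it's a manual tweak, not a fix for single-threaded engines.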

 

To me, the core of this whole "performance" issue with gaming is mainly controlled by the OS, as every program runs on it, and that may be a reason why Microsoft made DX12 exclusive to Win10. Then again, it could be that they want you on Win10 for other reasons regarding "intelligence gathering".

 

We will have to just wait and see.

A water-cooled mid-tier gaming PC.


No 

 

This is stupid xD

 

Obviously not the case, ladies and gentlemen. I need you to go look at multicore benchmarks versus single-core benchmarks and tell me what you see.


I can't say I believe this. That would mean the tools made for monitoring processor usage are also lying, tools that have no stake in gaming. I do know some games only use one core because the devs are lazy. Then there are games that are highly complex yet run very smoothly, like BF4, because the devs are not lazy. It's not going to be the difference between 8 and 60 fps.

 

What if he means literally at the same time? And I'm not tech-savvy or anything, but what if core 1 talks to your GPU, core 2 talks to the GPU 10 milliseconds later, core 3 15 ms after core 1, etc., versus all cores talking to it at the exact same time? In the first example they're not talking to the GPU at the same time, but to us humans it's indistinguishable from being simultaneous.

 

Also nuhvidia.

I always say it like it's N-vidia, En-vidia.


What if he means literally at the same time? And I'm not tech-savvy or anything, but what if core 1 talks to your GPU, core 2 talks to the GPU 10 milliseconds later, core 3 15 ms after core 1, etc., versus all cores talking to it at the exact same time? In the first example they're not talking to the GPU at the same time, but to us humans it's indistinguishable from being simultaneous.

 

I always say it like it's N-vidia, En-vidia.

I think it's more like core 1 being the middleman, telling all the others what to do and bottlenecking performance.
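That middleman model can be sketched with plain threads (a toy analogy, not real API code): workers prepare draws, but everything funnels through a single submission thread, the way DX11's immediate context serializes GPU submission.

```python
import queue
import threading

# Toy model of the DX11-style "middleman": worker threads prepare draw
# work, but only one thread actually submits to the (pretend) GPU, so
# submission serializes behind it no matter how many cores you have.
gpu_queue = queue.Queue()
submitted = []

def worker(wid, n_draws):
    for i in range(n_draws):
        gpu_queue.put((wid, i))            # hand work to the middleman

def middleman(expected):
    for _ in range(expected):
        submitted.append(gpu_queue.get())  # the only GPU-facing path

workers = [threading.Thread(target=worker, args=(w, 5)) for w in range(4)]
sub = threading.Thread(target=middleman, args=(20,))
sub.start()
for w in workers:
    w.start()
for w in workers:
    w.join()
sub.join()
print(len(submitted))  # 20: every draw passed through one thread
```

However fast the workers are, the middleman's throughput caps the frame rate; removing that single funnel is exactly what DX12's per-thread command lists are meant to do.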



Stopped watching the video after I heard him pronounce Nvidia.

Reminds me of when I went to get my 2nd 680 from some guy. I mentioned I had this exact card already and he replied with "oh, you're sleeing them", at which point I just paused, thinking "wut... what in the hell is sleeing?"

System Specs

CPU: Ryzen 5 5600X | Mobo: Gigabyte B550i Aorus Pro AX | RAM: HyperX Fury 3600 64GB | GPU: Nvidia FE 4090 | Storage: WD Blk SN750 NVMe - 1TB, Samsung 860 Evo - 1TB, WD Blk - 6TB/5TB, WD Red - 10TB | PSU: Corsair AX860 | Cooling: AMD Wraith Stealth | Displays: 55" Samsung 4K Q80R, 24" BenQ XL2420TE/XL2411Z & Asus VG248QE | Kb: K70 RGB Blue | Mouse: Logitech G903 | Case: Fractal Torrent RGB | Extra: HTC Vive, Fanatec CSR/Shifters/CSR Elite Pedals w/ Rennsport stand, Thrustmaster Warthog HOTAS, TrackIR 5, ARCTIC Z3 Pro Triple Monitor Arm | OS: Win 10 Pro 64-bit


Heyyo,

I wouldn't say only one core talks to the GPU in DirectX 11 and down... the problem with multithreading game engines right meow? Difficulty. Epic even said that with Unreal Engine 3 they only multithreaded a few operations, stuff like sound, animations and a handful of other things, due to difficulty and development time.

The difference between a benchmark and a game? Benchmarks are built to multithread everything, since that's all they're built for. A benchmark isn't built to deliver a full game; it's just built for synthetic testing of your PC. That's why you can get some impressive visuals and decent framerates in 3DMark, yet you load up Final Fantasy XIII and it runs like ass. :P

Read this to see kind of what I mean by the difficulty of multithreading games with current 3D APIs:

http://www.cinemablend.com/games/Unreal-Engine-3-Gets-Multithreaded-Support-Devs-Have-No-More-PS3-Excuses-19029.html

So yeah, DirectX 12 promises a lot... but in the end, what really matters is whether they made the multithreading process easier/dummy-proof in DirectX 12 for game engines... There's still a chance of seeing the same crippling issues we've seen in the past, where development costs override the development improvements.

The same can currently be said about game engines and the lack of optimized multi-GPU support, as we saw with Dead Rising 3's lack of AMD CrossFire and NVIDIA SLI support... same with Wolfenstein: The New Order.

Look at Elder Scrolls Online as a perfect example! A single-threaded game engine, and it didn't get proper multi-GPU support until the game started failing hard, hahaha...

 

DX12 is not a bad thing. Thank you, AMD, and your API formerly known as Mantle, for pushing the envelope and kicking MS in their collective a**es; that basically made MS start fixing Direct3D.

So far? It has all been rumors that Mantle was truly the cause of DirectX 12... I haven't seen a report of Microsoft saying "this is in direct response to Mantle." For all we know, DirectX 12 has been in development for quite some time, and they only bothered announcing it after AMD pushed out Mantle (which is in beta and, by the sounds of it, won't see a 1.0 release).

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case


Coding a game to work with DX11 so it spreads the workload across cores is not easy, to say the least. Getting the majority of the performance you can with minimal effort is going to be the golden path for developers, because it's easy and follows the "it works" philosophy. Why spend weeks on end trying to band-aid your code when you can spend that time optimizing what you know and have?

 

DX12 is supposed to make this much easier and to not interfere with the rendering process as much as previous iterations have.

 

Quite excited to see games start releasing with native DX12 support and to see where the performance gains lie across various system configurations.

May be wishful thinking, but I hope DX12 means devs will be able to optimize games as well as BF4 easily. If every DX12 game is like BF4, my FX will last me a lot longer before needing to upgrade :)

Nude Fist 1: i5-4590-ASRock h97 Anniversary-16gb Samsung 1333mhz-MSI GTX 970-Corsair 300r-Seagate HDD(s)-EVGA SuperNOVA 750b2

Name comes from an anagrammed sticker for "TUF Inside" (a sticker that came with my original ASUS motherboard)


May be wishful thinking, but I hope DX12 means devs will be able to optimize games as well as BF4 easily. If every DX12 game is like BF4, my FX will last me a lot longer before needing to upgrade :)

I'm doubtful of that. Mantle was being pushed by DICE, so it was developed alongside Frostbite 3; that's why it came at a later release. DX12 would require an entire recoding of the engine and game to get it to work, an undertaking I doubt they're going to bother with.



I'm doubtful of that. Mantle was being pushed by DICE, so it was developed alongside Frostbite 3; that's why it came at a later release. DX12 would require an entire recoding of the engine and game to get it to work, an undertaking I doubt they're going to bother with.

I think you misunderstood what I was saying. I'm hoping newer games with DX12 are as well optimized as BF4 was, not that they should add DX12 to BF4.


