DX12 Already Used In Just Cause 3 PC - G-Buffer, Conservative Rasterization (supported by Maxwell only)

Mr_Troll

DX12 Already Used In Just Cause 3 PC; G-Buffer, Conservative Rasterization & more
 

It seems that Just Cause 3 on PC already uses certain DX12 features to improve the game's performance and visual quality.

 


 

 

While Just Cause 3 hasn't been presented as a DirectX 12 title, the game was developed with the use of certain DX12 features. Avalanche Studios has talked about possible DX12 support for their engine in the past, but it seems that Just Cause 3 was developed to make use of DX12 features to improve performance and visual quality on PC. During the Game Developers Conference 2016, a programming session will be organized by Avalanche Studios and Intel to cover the changes made by Avalanche for Just Cause 3. The session will be presented by Intel's graphics software engineer, Antoine Cohade.

 

Just Cause 3 is Avalanche Studios' latest addition to the mind-blowing (and lots-of-other-stuff-blowing) open-world action-adventure series Just Cause. Published by Square Enix, the game was released on December 1, 2015 for Microsoft Windows, PlayStation 4 and Xbox One. During development, Intel and Avalanche worked closely together to optimize the game for Iris graphics, and also to make use of DX12 features to improve performance even further and bring additional visual quality to the game.

 

 

DX12 features ‘exclusive’ to PC version of Just Cause 3

 

It appears that Avalanche has adjusted their Avalanche engine to support certain DX12 features, which are exclusive to the PC version of the game. Some of these features include Order Independent Transparency and G-Buffer blending. From the session description:

"This session will cover the changes Avalanche made to their engine to match DX12's pipeline, the implementation process and best practices from DX12. We'll also showcase and discuss the PC exclusive features enabled thanks to DX12, such as Order Independent Transparency and G-buffer blending using Raster Ordered Views, and light assignment for clustered shading using Conservative Rasterization."

Just Cause 3 was released back in December 2015 and suffered from some serious performance issues on both the PS4 and Xbox One. It's uncertain whether these DX12 features are also available on the Xbox One, or if they are truly exclusive to the PC.
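For reference, both of those capabilities (Rasterizer Ordered Views and conservative rasterization) are exposed as optional caps in D3D12 that a game can query at startup. Here's a minimal sketch of that check, using the standard D3D12 capability query; it's just an illustration of the API, not Avalanche's actual code:

```cpp
// Minimal sketch of the standard D3D12 capability query (not Avalanche's code).
// Build with MSVC: cl /EHsc check_dx12.cpp d3d12.lib
#include <d3d12.h>
#include <cstdio>

int main()
{
    ID3D12Device* device = nullptr;
    // Create a device on the default adapter at the baseline 11_0 feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &options, sizeof(options))))
    {
        // ROVs are what the order-independent transparency and G-buffer
        // blending mentioned in the session description build on.
        std::printf("Rasterizer Ordered Views: %s\n",
                    options.ROVsSupported ? "yes" : "no");

        // Conservative rasterization (tier 1 or higher) is what the clustered
        // shading light assignment uses.
        std::printf("Conservative rasterization tier: %d\n",
                    static_cast<int>(options.ConservativeRasterizationTier));
    }

    device->Release();
    return 0;
}
```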

 

So they used DX 12_1 features that are not supported on AMD. Clever move, Nvidia. I was wondering why JC3 ran worse on AMD graphics cards.

 

Source: http://schedule.gdconf.com/session/using-new-directx-features-in-practice-just-cause-3-case-study-presented-by-intel

http://wccftech.com/dx12-3-pc-gbuffer-conservative-rasterization/

Intel Core i7 7800X @ 5.0 GHz with 1.305 volts (really good chip), Mesh OC @ 3.3 GHz, Fractal Design Celsius S36, Asrock X299 Killer SLI/ac, 16 GB Adata XPG Z1 OCed to 3600 MHz, Aorus RX 580 XTR 8G, Samsung 950 evo, Win 10 Home - loving it :D

Had a Ryzen before ... but  a bad bios flash killed it :(

MSI GT72S Dominator Pro G - i7 6820HK, 980m SLI, Gsync, 1080p, 16 GB RAM, 2x128 GB SSD + 1TB HDD, Win 10 home

 


Ah, that should explain why Kepler users weren't too happy on launch.

 

@op, you might wanna edit your post.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Why do I not care?

CONSOLE KILLER: Pentium III 700mhz . 512MB RAM . 3DFX VOODOO 3 SLi

 

 


Not bad. Still doesn't run very well on my 970 :(

Gaming HTPC:

R5 5600X - Cryorig C7 - Asus ROG B350-i - EVGA RTX2060KO - 16gb G.Skill Ripjaws V 3333mhz - Corsair SF450 - 500gb 960 EVO - LianLi TU100B


Desktop PC:
R9 3900X - Peerless Assassin 120 SE - Asus Prime X570 Pro - Powercolor 7900XT - 32gb LPX 3200mhz - Corsair SF750 Platinum - 1TB WD SN850X - CoolerMaster NR200 White - Gigabyte M27Q-SA - Corsair K70 Rapidfire - Logitech MX518 Legendary - HyperXCloud Alpha wireless


Boss-NAS [Build Log]:
R5 2400G - Noctua NH-D14 - Asus Prime X370-Pro - 16gb G.Skill Aegis 3000mhz - Seasonic Focus Platinum 550W - Fractal Design R5 - 
250gb 970 Evo (OS) - 2x500gb 860 Evo (Raid0) - 6x4TB WD Red (RaidZ2)

Synology-NAS:
DS920+
2x4TB Ironwolf - 1x18TB Seagate Exos X20

 

Audio Gear:

Hifiman HE-400i - Kennerton Magister - Beyerdynamic DT880 250Ohm - AKG K7XX - Fostex TH-X00 - O2 Amp/DAC Combo - 
Klipsch RP280F - Klipsch RP160M - Klipsch RP440C - Yamaha RX-V479

 

Reviews and Stuff:

GTX 780 DCU2 // 8600GTS // Hifiman HE-400i // Kennerton Magister
Folding all the Proteins! // Boincerino

Useful Links:
Do you need an AMP/DAC? // Recommended Audio Gear // PSU Tier List 


So they used DX 12_1 features that are not supported on AMD. Clever move, Nvidia. I was wondering why JC3 ran worse on AMD graphics cards.

 

Nvidia? It says in the very articles that YOU linked that Intel was working with Avalanche to get these features working, mate.

 

Do you guys deliberately look for something to blame <graphics vendor> for, even when it's not there? Yes, yes you do.

 

God, man. You guys who put the blame on the graphics vendors for all of this shit are amazing.


So that's why Maxwell is the only one that can run this pile of buggy crap. Meh. Curiously, no async shaders.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


If it uses certain DX12 features, does that mean it's running on DX12 or DX11?

 

@Kloaked it's not new that Nvidia makes better drivers; it's been going on for at least 2 years now.


If it uses certain DX12 features, does that mean it's running on DX12 or DX11?

 

@Kloaked it's not new that Nvidia makes better drivers; it's been going on for at least 2 years now.

 

This has nothing to do with drivers in this instance. Go read the articles that the OP himself linked.


Wait, why the fuck are we blaming Nvidia again? It was the developers who put the features in. Why are we not blaming the developers at all?

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


Wait, why the fuck are we blaming Nvidia again? It was the developers who put the features in. Why are we not blaming the developers at all?

I am not updated on the tech of this particular game.

But judging by the info given there is no reason to 'blame' anybody. A developer supported a feature from a graphics API... Good.


This has nothing to do with drivers in this instance. Go read the articles that the OP himself linked.

I get it now; seems like the Just Cause devs are biased towards Nvidia. Really bad marketing strategy, in my opinion.


AMD is the company that doesn't have a card that supports the DX12 feature set; that isn't Nvidia's fault. The alternative is that developers remove the performance improvements until AMD catches up. I don't think that is a better solution; it's much worse. The game ought to still run well, and if it doesn't, that is the fault of the developers. It's their job to make a game that works on a range of hardware.


Nvidia? It says in the very articles that YOU linked that Intel was working with Avalanche to get these features working, mate.

So Intel is involved in bringing AMD to their knees. Interesting.

 

lol, joke, I don't want to start a flamewar. Clearly, all graphics vendors don't support all DX12 features... yet. And I don't expect them to support them this year either. Not on a hardware level, anyway.

 

But it's nice they are pushing DX12 forward. DX9 to DX11 was slow.


Why blame Nvidia? It clearly said "Intel". Please read the source before pointing your finger at someone.


AMD is the company that doesn't have a card that supports the DX12 feature set; that isn't Nvidia's fault. The alternative is that developers remove the performance improvements until AMD catches up. I don't think that is a better solution; it's much worse. The game ought to still run well, and if it doesn't, that is the fault of the developers. It's their job to make a game that works on a range of hardware.

Yet when that benchmark/game/whatever the hell was the first to run asynchronous shaders while only AMD had support for it, hurr durr, let's write off Nvidia as being lazy with its DX12 implementation.

Somehow OP pulled it out of his ass to blame Nvidia with this.

So much bullshit from these simpletons.


So that's why Maxwell is the only one that can run this pile of buggy crap. Meh. Curiously, no async shaders.

Or maybe because AMD has terribad communication, even with dev teams.

I remember the GameWorks implementation on WT.

A dev came to the forum and explained why they chose to work with Nvidia and use GW.

Because each time they tried to discuss with AMD, they got answers months later, or no answers at all.

Meanwhile, Nvidia will send a dev to your office for free to help you work on and optimise features.

I wish i could oc my body, during winter overheating would be great.


Why blame Nvidia? It clearly said "Intel". Please read the source before pointing your finger at someone.

Why blame anybody?

 


AMD is the company that doesn't have a card that supports the DX12 feature set; that isn't Nvidia's fault. The alternative is that developers remove the performance improvements until AMD catches up. I don't think that is a better solution; it's much worse. The game ought to still run well, and if it doesn't, that is the fault of the developers. It's their job to make a game that works on a range of hardware.

Yet AMD is the company that bothered with async shaders, so their cards should perform better in DX12.

USEFUL LINKS:

PSU Tier List F@H stats


I get it now; seems like the Just Cause devs are biased towards Nvidia. Really bad marketing strategy, in my opinion.

Oh right, so when Ashes of the Singularity was biased towards AMD, it was OK?

 

Also, Intel... bloody Intel is working with the developer to optimize the game for their Iris Pro iGPUs.

come on!


Why do I not care?

Maybe because JC3 was pretty much a console optimized game, which is why PC users weren't really happy with the performance of the game when they ran it on their rig? I dunno, just guessing on this one.


AMD is the company that doesn't have a card that supports the DX12 feature set; that isn't Nvidia's fault. The alternative is that developers remove the performance improvements until AMD catches up. I don't think that is a better solution; it's much worse. The game ought to still run well, and if it doesn't, that is the fault of the developers. It's their job to make a game that works on a range of hardware.

You've been smoking some of the good stuff I see.

I wish they would not abandon the previous gen so fast; I am sure my 780 will still run fine.

 

 


AMD is the company that doesn't have a card that supports the DX12 feature set; that isn't Nvidia's fault. The alternative is that developers remove the performance improvements until AMD catches up. I don't think that is a better solution; it's much worse. The game ought to still run well, and if it doesn't, that is the fault of the developers. It's their job to make a game that works on a range of hardware.

 

They do though, just not at 12_1. 

 

Though, wouldn't it be possible to have an option to turn those features off? There are still more GCN and pre-Maxwell GPUs out in the market. Wouldn't be fair to the rest of them if only Skylake and Maxwell supported those features.
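In principle, yes: an engine can query the highest supported feature level at startup and pick a code path from that. Here's a rough sketch of what that detection might look like (purely hypothetical, not JC3's actual code; the TransparencyPath switch is just an illustrative name):

```cpp
// Hypothetical fallback sketch (not JC3's actual settings code): query the
// highest supported D3D feature level and only enable the 12_1-only paths
// (ROV transparency, conservative-raster light binning) when it's there.
// Build with MSVC: cl /EHsc feature_level.cpp d3d12.lib
#include <d3d12.h>
#include <cstdio>

// Hypothetical engine-side switch between transparency implementations.
enum class TransparencyPath { SortedBlend, RasterOrderedViews };

int main()
{
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = static_cast<UINT>(sizeof(requested) / sizeof(requested[0]));
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &levels, sizeof(levels));

    // Feature level 12_1 guarantees ROVs and conservative rasterization tier 1;
    // older hardware (GCN, Kepler, pre-Skylake iGPUs) would fall back to a
    // conventional sorted-blend transparency path instead.
    const TransparencyPath path =
        (levels.MaxSupportedFeatureLevel >= D3D_FEATURE_LEVEL_12_1)
            ? TransparencyPath::RasterOrderedViews
            : TransparencyPath::SortedBlend;

    std::printf("Max feature level: 0x%X, transparency path: %s\n",
                static_cast<unsigned>(levels.MaxSupportedFeatureLevel),
                path == TransparencyPath::RasterOrderedViews ? "ROV" : "sorted");

    device->Release();
    return 0;
}
```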

 


 

http://www.neogaf.com/forum/showthread.php?t=1148535 The first few posts from launch have quite a lot of unhappy GTX 770 users.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Maybe because JC3 was pretty much a console optimized game, which is why PC users weren't really happy with the performance of the game when they ran it on their rig? I dunno, just guessing on this one.

And consoles are powered by AMD, so that doesn't explain why it ran like ass on AMD-based systems.

If it was made for consoles, AMD would have zero issues.


Double standards are a bitch, aren't they?

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650

