Stunning Real-Time DirectX 12 Demo from Square Enix

qwertywarrior

Yeah... if it takes 4 Titan Xs to run that, then there's no way in hell we're seeing that in games for a long, long time. A pointless DX12 demo, I'd call it.

 

If the PS4 can't render it, then it won't be ported.

 

They need to go back to building the game for PC and porting it to consoles, not what we PC gamers have been dealing with since the PS3 era. That era is dead.


Meh, not impressed with cookie-cutter marketing crap. You're not going to see that in games for at least 5 more years.


Awesome! 



[Image: UgR9bfY.jpg]

Now I'm wondering how big of a bullshot this is, considering how anemic those next-gen systems are.

Let's just hope that FFXV will be properly released on PC.

 

There are enough visual imperfections in this scene (apart from the JPEG compression...) to bring it in line with what is possible in an in-engine cutscene from SE. What is selling the image so well is the lighting solution, which is one of the big drawcards of the engine SE has been developing.


Nice.

 

Just asking: what would prevent this from running on DirectX 11? Because I don't think the graphical effects themselves are derived from DX12.

 

Is it that the scene is so complex, with so many polygons, that no CPU can feed a GPU enough draw calls to keep it running on a legacy API? To the point where multi-GPU scaling is redundant until the API bottleneck is alleviated by using DX12 / Vulkan / Mantle?
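
For illustration, here's a minimal C++ sketch of the D3D12 side of that argument: command lists get recorded on worker threads and submitted to the queue in one batch, instead of funnelling every draw through a single immediate context the way D3D11 does. Device/queue creation, pipeline state, resource barriers and fencing are all omitted, and RecordSceneChunk is a hypothetical helper standing in for the actual per-chunk draw recording; this is a sketch of the idea, not a working renderer.

#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: the real version would set the root signature,
// bind resources and issue the draw calls for this slice of the scene.
void RecordSceneChunk(ID3D12GraphicsCommandList* cl, int chunk)
{
    (void)cl; (void)chunk;
}

void SubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue, int threadCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(threadCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(threadCount);
    std::vector<std::thread>                       workers;

    for (int i = 0; i < threadCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each thread records its own chunk of the scene independently;
        // in D3D11 this work would all go through one immediate context.
        workers.emplace_back([&, i] {
            RecordSceneChunk(lists[i].Get(), i);
            lists[i]->Close();
        });
    }
    for (auto& t : workers) t.join();

    // One batched submission to the GPU queue.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}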



Regarding the DirectX 11 question above: it has mostly to do with lighting.

There's a great post on it here:

http://www.gamespot.com/forums/system-wars-314159282/sw-loves-dx12-threads-some-xb1-cloud-secret-sauce--31792314/

Read the whole thing, or scroll down to where he says "seriously".



It was rendered at 4K on four Titan Xs; assuming 100% scaling, it could run at 1080p with just one Titan X, since 4K has four times the pixels of 1080p.
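
As a rough back-of-the-envelope check (a sketch that assumes the four-Titan-X figure quoted earlier in the thread, and ignores costs that don't scale linearly with resolution):

#include <cstdio>

int main()
{
    // Pixel counts for the two resolutions being compared.
    const long long uhd = 3840LL * 2160;  // 4K UHD: 8,294,400 pixels
    const long long fhd = 1920LL * 1080;  // 1080p:  2,073,600 pixels
    // Four GPUs rendering 4K with perfect scaling do roughly the same
    // per-GPU pixel work as one GPU rendering 1080p.
    std::printf("4K / 1080p pixel ratio: %.1fx\n",
                static_cast<double>(uhd) / static_cast<double>(fhd));  // prints 4.0x
    return 0;
}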

