
Final Fantasy XV Windows Edition Benchmark is a total mess and unrepresentative of the final product **Update with Square Enix's response**

For just about everyone saying that NVIDIA is preventing game devs from optimizing their GameWorks tech for AMD hardware: they're not. The licensing deal is that if devs do optimize GameWorks for AMD hardware, it must not negatively affect NVIDIA hardware performance. They are, however, not allowed to redistribute the code.

 

This means that game devs can optimize GameWorks for AMD hardware. There will still be a performance deficit due to hardware differences between NVIDIA and AMD, even if you do an immaculate job optimizing. So you can get the performance pretty damn close (GameWorks on or off) if you ask me. Do game devs want to put in that kind of time and energy? Certainly not all of them.

 

What I'm trying to get at here is that optimizing GameWorks for AMD hardware is doable. NVIDIA will let you (as a developer) optimize their technology for AMD hardware, as long as:

  1. It does not negatively impact performance on NVIDIA hardware.
  2. You don't redistribute it. That's a breach of their licensing terms.
Quote

Are game developers precluded from optimizing source code provided by Nvidia through the GameWorks program for non-Nvidia hardware?
No. Our agreements with developers don't prevent them from working with any other IHVs to optimize their game or any GameWorks features. GameWorks is middleware, and like any other middleware we offer developers source licensing. We provide source code, under license, to developers who request it. Licensees just can't redistribute our source code to anyone who does not have a license.

If a developer requests source code for an Nvidia GameWorks feature, under license, and is then provided with source code, is that developer then free to edit that code as they see fit to optimize it for IHVs other than Nvidia, assuming they don't redistribute it?
Yes, as long as it does not lower performance on NVIDIA GPUs.

Quote

This is an important point to touch on. NVIDIA does not enforce any limitation, contractual or otherwise, on game developers' ability to work with AMD to optimize their games. However, because game developers are dealing with NVIDIA's intellectual property, it does exercise control over all GameWorks features and will always have the final say with regards to what can and cannot be done with any of the code it owns.

https://wccftech.com/fallout-4-nvidia-gameworks/



it's an ugly benchmark and it demands way too many resources imo


 


33 minutes ago, ARikozuM said:

Nothing stops AMD from sending support of their own. Attacking Nvidia for sponsoring and sending support to a developer is short-sighted, especially given the delayed driver issues AMD always seems to have when a Gameworks title is released.

Do you know what black box means? It means they don't have access to the code. (Seems they changed that policy; good for them.)

My point is very simple: AMD doesn't do things just to reduce performance on Nvidia hardware, and they have open-sourced most of their drivers and video effects. The same can't be said for Nvidia.

(Not saying AMD couldn't be a bit better, but that will probably improve as their R&D budget increases.)


1 hour ago, cj09beira said:

partly because it's a black box for both the game devs and AMD, which makes it harder to work with, plus it loves to use too much tessellation

It's not a black box for developers. Developers can purchase the right to access the source code and monkey with it, if they'd like. I don't know if it's true or not, but several years ago I heard $30k thrown around as the price for that. Not really that expensive for a AAA game. I'd imagine the price is waived if Nvidia is brought on board to directly help with development, but maybe not.

 

52 minutes ago, JurunceNK said:

For just about everyone saying that NVIDIA is preventing game devs from optimizing their GameWorks tech for AMD hardware: they're not. The licensing deal is that if devs do optimize GameWorks for AMD hardware, it must not negatively affect NVIDIA hardware performance. They are, however, not allowed to redistribute the code.

 

This means that game devs can optimize GameWorks for AMD hardware. There will still be a performance deficit due to hardware differences between NVIDIA and AMD, even if you do an immaculate job optimizing. So you can get the performance pretty damn close (GameWorks on or off) if you ask me. Do game devs want to put in that kind of time and energy? Certainly not all of them.

 

What I'm trying to get at here is that optimizing GameWorks for AMD hardware is doable. NVIDIA will let you (as a developer) optimize their technology for AMD hardware, as long as:

  1. It does not negatively impact performance on NVIDIA hardware.
  2. You don't redistribute it. That's a breach of their licensing terms.

https://wccftech.com/fallout-4-nvidia-gameworks/

The problem with that agreement is that it does not allow AMD access to the source code. That means developers are on their own to optimize Gameworks features for AMD hardware. Nvidia will not sell AMD a license, which means developers are not allowed to ask AMD for help on Gameworks code. AMD can help with everything around it; AMD just cannot see or touch Gameworks itself without that license.


3 minutes ago, Derangel said:

The problem with that agreement is that it does not allow AMD access to the source code. That means developers are on their own to optimize Gameworks features for AMD hardware. Nvidia will not sell AMD a license, which means developers are not allowed to ask AMD for help on Gameworks code. AMD can help with everything around it; AMD just cannot see or touch Gameworks itself without that license.

That's exactly what I said, correct?

 

Only the developers are allowed to see and optimize the code for non-NVIDIA hardware. They cannot redistribute that code, as that would be a breach of the license.



2 minutes ago, Derangel said:

The problem with that agreement is that it does not allow AMD access to the source code. That means developers are on their own to optimize Gameworks features for AMD hardware. Nvidia will not sell AMD a license, which means developers are not allowed to ask AMD for help on Gameworks code.

I have an amazing idea, well not really. RGS, Radeon Game Studio. Become a game developer and make a GameWorks game ;).

 



1 hour ago, ARikozuM said:

Nothing stops AMD from sending support of their own. Attacking Nvidia for sponsoring and sending support to a developer is short-sighted, especially given the delayed driver issues AMD always seems to have when a Gameworks title is released.

And what would that support do? They do not have legal access to GameWorks or its implementation in the game. AMD is not granted source code access. Besides, this horribly unoptimized mess hits Nvidia users too: the graphical fidelity gained is nowhere near worth the extreme performance hit.

1 hour ago, JurunceNK said:

For just about everyone saying that NVIDIA is preventing game devs from optimizing their GameWorks tech for AMD hardware: they're not. The licensing deal is that if devs do optimize GameWorks for AMD hardware, it must not negatively affect NVIDIA hardware performance. They are, however, not allowed to redistribute the code.

 

This means that game devs can optimize GameWorks for AMD hardware. There will still be a performance deficit due to hardware differences between NVIDIA and AMD, even if you do an immaculate job optimizing. So you can get the performance pretty damn close (GameWorks on or off) if you ask me. Do game devs want to put in that kind of time and energy? Certainly not all of them.

 

What I'm trying to get at here is that optimizing GameWorks for AMD hardware is doable. NVIDIA will let you (as a developer) optimize their technology for AMD hardware, as long as:

  1. It does not negatively impact performance on NVIDIA hardware.
  2. You don't redistribute it. That's a breach of their licensing terms.

https://wccftech.com/fallout-4-nvidia-gameworks/

Well, the entire point of using GameWorks is that it's a finished product you can just implement. We have no idea how easy the code is to change or optimize if there is next to no documentation. Furthermore, Nvidia (at least historically) required a much more expensive license to be allowed to actually edit the code.

 

Can they optimize? They can't get help from AMD, and they may not necessarily have any idea how this middleware works. After all, no GameWorks game has ever had the GW effects optimized for AMD. I wonder why!

 

All VisualFX GameWorks effects, save for some smoke thing no one uses, are based on tessellation. And the tessellation multiplier is usually around 64x, which is just insane.
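
To put rough numbers on why 64x is insane: for a triangle patch, the amount of generated geometry grows with roughly the square of the tessellation factor. A back-of-the-envelope sketch (the n² relation is the idealized uniform-factor case; exact counts depend on the partitioning mode):

```cpp
#include <cstdio>
#include <initializer_list>

// Idealized triangle count for a triangle patch where every edge and
// inside tessellation factor equals n: splitting each edge into n
// segments yields roughly n^2 small triangles.
static long trianglesPerPatch(long n) { return n * n; }

int main() {
    for (long n : {8L, 16L, 64L})
        std::printf("factor %2ldx -> ~%4ld triangles per patch (%2ldx the 8x workload)\n",
                    n, trianglesPerPatch(n), trianglesPerPatch(n) / trianglesPerPatch(8));
    // factor  8x -> ~  64 triangles per patch ( 1x the 8x workload)
    // factor 16x -> ~ 256 triangles per patch ( 4x the 8x workload)
    // factor 64x -> ~4096 triangles per patch (64x the 8x workload)
}
```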

 

If they want grass, fur, and hair to look good, they should just use TressFX. It's easy to optimize on any hardware, has little performance impact, and developers can easily add features to it, as seen in Rise of the Tomb Raider.



17 minutes ago, Notional said:

Well, the entire point of using GameWorks is that it's a finished product you can just implement. We have no idea how easy the code is to change or optimize if there is next to no documentation. Furthermore, Nvidia (at least historically) required a much more expensive license to be allowed to actually edit the code.

 

Can they optimize? They can't get help from AMD, and they may not necessarily have any idea how this middleware works. After all, no GameWorks game has ever had the GW effects optimized for AMD. I wonder why!

 

All VisualFX GameWorks effects, save for some smoke thing no one uses, are based on tessellation. And the tessellation multiplier is usually around 64x, which is just insane.

 

If they want grass, fur, and hair to look good, they should just use TressFX. It's easy to optimize on any hardware, has little performance impact, and developers can easily add features to it, as seen in Rise of the Tomb Raider.

There is documentation for HairWorks, at least, to help people implement it. The performance section helps with tuning the effect until the developer is happy with it: http://docs.nvidia.com/gameworks/content/artisttools/hairworks/index.html

 

An even better solution to all of this is if developers make their own effects similar to GameWorks and GPUOpen, and make them proprietary. Would that be better, or worse?



7 minutes ago, JurunceNK said:

There is documentation for HairWorks, at least, to help people implement it. The performance section helps with tuning the effect until the developer is happy with it: http://docs.nvidia.com/gameworks/content/artisttools/hairworks/index.html

 

An even better solution to all of this is if developers make their own effects similar to GameWorks and GPUOpen, and make them proprietary. Would that be better, or worse?

Depends on the skill of the developer. It is a lot cheaper to license stuff than it is to build engines, and a lot of third-party engines are much better at being adapted to multiple genres and needs than developer-created ones (example: Frostbite is apparently a complete piece of shit if your game isn't an FPS), so it can make a lot more sense for a publisher to mass-license third-party APIs.

Ubisoft has a unique approach to that problem. A lot of their games use in-house engines, but the engines are very modular: they're designed to easily allow elements to be swapped in and out. If a game calls for climbing mechanics, for example, the developer can pull the animation engine from Anvil Next, apply it to their own game, and modify it as needed. This saves the developer both the time and money needed to create a brand new animation engine. These modular elements are also shared company-wide, so updates made to one modular part can be used by other studios within the company. It's a fairly ingenious design, but the downside is that it ends up making a lot of their games feel exactly the same.
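
Purely as an illustration of that modular pattern (all names below are invented for the sketch, not Ubisoft's actual code): the idea boils down to subsystems living behind swappable interfaces, so a studio can drop a shared, already-tuned module into its own game without touching the rest of the engine.

```cpp
#include <cstdio>
#include <memory>

// Each subsystem sits behind an interface; any module implementing it
// can be slotted into the engine.
struct IAnimationModule {
    virtual ~IAnimationModule() = default;
    virtual void Update(float dt) = 0;
};

// A shared, company-wide module another team already built and tuned.
struct ClimbingAnimationModule : IAnimationModule {
    void Update(float dt) override { std::printf("climbing anim step %.4f\n", dt); }
};

// A game-specific replacement, swapped in when the design calls for it.
struct VehicleAnimationModule : IAnimationModule {
    void Update(float dt) override { std::printf("vehicle anim step %.4f\n", dt); }
};

struct Engine {
    std::unique_ptr<IAnimationModule> animation; // swappable slot
    void Tick(float dt) { if (animation) animation->Update(dt); }
};

int main() {
    Engine engine;
    engine.animation = std::make_unique<ClimbingAnimationModule>();
    engine.Tick(1.0f / 60.0f);
    // Another title reuses the same engine skeleton with a different module.
    engine.animation = std::make_unique<VehicleAnimationModule>();
    engine.Tick(1.0f / 60.0f);
}
```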

 

3 minutes ago, raphidy said:

Japanese PC port. Oh gawd. I want Durante to fix this...Again. The hero we don't deserve...

Not a port this time around; they've built the PC version from the ground up. This weirdness aside, the benchmark does deliver good performance. Even with the rendering weirdness, I'm surprised how well my 1080 does with all the Gameworks effects at both 1440p and 4K.


16 minutes ago, JurunceNK said:

There is documentation for HairWorks, at least, to help people implement it. The performance section helps with tuning the effect until the developer is happy with it: http://docs.nvidia.com/gameworks/content/artisttools/hairworks/index.html

 

An even better solution to all of this is if developers make their own effects similar to GameWorks and GPUOpen, and make them proprietary. Would that be better, or worse?

Of course there is documentation on how to implement it; that is the entire point of middleware. It doesn't explain, however, how to actually change the code to add effects, and it certainly doesn't change the fact that these effects lean very heavily on tessellation, which is insanely taxing on everything. AMD's TressFX uses compute instead, which is much more efficient, even on Nvidia hardware.
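
For a rough sense of why the compute route behaves so much better (illustrative numbers only, not TressFX's actual parameters): a compute-based hair sim does a fixed amount of work per strand vertex per frame, while tessellation-based effects amplify every input patch by roughly the square of the factor.

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    // Compute path: fixed, predictable workload per frame.
    // (Strand/vertex counts are made up for illustration.)
    const long strands = 20000, vertsPerStrand = 16, groupSize = 64;
    const long threads = strands * vertsPerStrand;
    std::printf("compute sim: %ld threads = %ld groups of %ld, same every frame\n",
                threads, (threads + groupSize - 1) / groupSize, groupSize);

    // Tessellation path: geometry explodes as the factor is cranked up.
    const long patches = 10000;
    for (long factor : {8L, 64L})
        std::printf("tessellation %2ldx: ~%ld generated triangles\n",
                    factor, patches * factor * factor);
}
```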

 

Well, EA's Frostbite engine is really good because everything is in-house. Then again, the way Rise of the Tomb Raider and Deus Ex: Mankind Divided took TressFX, added features to it, and made it run really well on both AMD and Nvidia is probably the best approach.



23 minutes ago, JurunceNK said:

An even better solution to all of this is if developers make their own effects similar to GameWorks and GPUOpen, and make them proprietary. Would that be better, or worse?

I'd say, in a way, better: at least game development studios actively compete with each other and try to one-up each other every time. I personally don't like the hardware manufacturer dictating what graphical technologies should be used. I'd much prefer a bit more give and take, with developers asking for features and GPU designers implementing them alongside their own technology that may, or may not, complement those requirements.

 

GameWorks pushes too far into the realm of end-to-end control for my liking; not even DirectX puts as much technical restriction on techniques as GameWorks does.

 

Now, I'm not saying GameWorks forces you to do it the GameWorks way with no altering, optimizing, large fundamental changes, or non-GameWorks functions, but it's fair to say that if a game is using GameWorks, it is highly aligned to the GameWorks methodology.


2 minutes ago, Notional said:

Of course there is documentation on how to implement it; that is the entire point of middleware. It doesn't explain, however, how to actually change the code to add effects, and it certainly doesn't change the fact that these effects lean very heavily on tessellation, which is insanely taxing on everything. AMD's TressFX uses compute instead, which is much more efficient, even on Nvidia hardware.

 

Well, EA's Frostbite engine is really good because everything is in-house. Then again, the way Rise of the Tomb Raider and Deus Ex: Mankind Divided took TressFX, added features to it, and made it run really well on both AMD and Nvidia is probably the best approach.

I wouldn't call Frostbite good. Outside of DICE, no EA studio has anything good to say about it. BioWare and Visceral hated it.


1 minute ago, Derangel said:

I wouldn't call Frostbite good. Outside of DICE, no EA studio has anything good to say about it. BioWare and Visceral hated it.

Yeah, the engine has to actually support what it needs to do. I don't blame DICE for that, but rather how useless EA is in general.



2 hours ago, Derangel said:

Not a port this time around; they've built the PC version from the ground up. This weirdness aside, the benchmark does deliver good performance. Even with the rendering weirdness, I'm surprised how well my 1080 does with all the Gameworks effects at both 1440p and 4K.

Well, the benchmark also runs fine for me. Their engine looks PC-compatible this time around and capable of running the game with an unlocked framerate. I'm sure they'll reuse whatever they can from their console code; they can't just rewrite everything just for PCMR. I'm still bracing for a PC mess like Watch Dogs was at release. This is basically their first release on PC with their new engine.


1 hour ago, raphidy said:

Well, the benchmark also runs fine for me. Their engine looks PC-compatible this time around and capable of running the game with an unlocked framerate. I'm sure they'll reuse whatever they can from their console code; they can't just rewrite everything just for PCMR. I'm still bracing for a PC mess like Watch Dogs was at release. This is basically their first release on PC with their new engine.

240 is the cap. Not as good as fully unlocked, but not terrible either.

 

I believe Luminous was designed to work on the PC from the outset. This version of the game is really interesting: it's the first time SE has put this kind of effort into bringing over a console title, and it's also a passion project for the game's director. It seems like Tabata spearheaded the effort to make it more than just a port.


I really don't understand what the issue with GameWorks is....

 

Does GameWorks run poorly on your hardware? Cool. Turn it off. Problem solved.

 

I don't see why someone else having something you don't have is an issue if it doesn't affect your own experience in any way.

For example: If they didn't include HairWorks, the game would still look exactly the same as if you just turned it off.


2 hours ago, -BirdiE- said:

I really don't understand what the issue with GameWorks is....

 

Does GameWorks run poorly on your hardware? Cool. Turn it off. Problem solved.

 

I don't see why someone else having something you don't have is an issue if it doesn't affect your own experience in any way.

For example: If they didn't include HairWorks, the game would still look exactly the same as if you just turned it off.

The biggest problems with Gameworks are twofold:

 

1. AMD is not allowed to look at the code, so they cannot assist in optimization, leaving everything up to the developer. This is a bad approach: game code is a mess as it is, and developers are not good at optimization. AMD and Nvidia drivers are so insanely complex and big these days because they have to work around the already messy code in video games, and without being able to look at Gameworks code, AMD cannot create drivers that fix its performance problems.

 

2. Gameworks features sometimes get implemented in ways that serve no purpose other than killing performance. 16x tessellation is useless; 6x or 8x would have the same visual effect at much less cost. Rendering Gameworks effects on elements the player cannot see is also a pointless waste of resources (not a problem unique to this benchmark). Gameworks is implemented so poorly much of the time that even on Nvidia hardware it's not remotely worth the performance cost.
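
On point 2, the mitigation AMD eventually shipped in its driver control panel is a tessellation override that caps whatever factor an application requests. A minimal sketch of the idea (hypothetical code, not actual driver source):

```cpp
#include <algorithm>
#include <cstdio>
#include <initializer_list>

// Hypothetical sketch of a driver-style tessellation override: the factor
// the game requests is clamped to a user-chosen cap (e.g. 8x) before it
// ever reaches the hardware.
float overrideTessFactor(float requested, float userCap) {
    return std::min(requested, userCap);
}

int main() {
    const float cap = 8.0f; // e.g. "Override application settings: 8x"
    for (float requested : {4.0f, 16.0f, 64.0f})
        std::printf("game asks for %2.0fx -> hardware runs %2.0fx\n",
                    requested, overrideTessFactor(requested, cap));
}
```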


6 minutes ago, Derangel said:

The biggest problems with Gameworks are twofold:

 

1. AMD is not allowed to look at the code, so they cannot assist in optimization, leaving everything up to the developer. This is a bad approach: game code is a mess as it is, and developers are not good at optimization. AMD and Nvidia drivers are so insanely complex and big these days because they have to work around the already messy code in video games, and without being able to look at Gameworks code, AMD cannot create drivers that fix its performance problems.

 

2. Gameworks features sometimes get implemented in ways that serve no purpose other than killing performance. 16x tessellation is useless; 6x or 8x would have the same visual effect at much less cost. Rendering Gameworks effects on elements the player cannot see is also a pointless waste of resources (not a problem unique to this benchmark). Gameworks is implemented so poorly much of the time that even on Nvidia hardware it's not remotely worth the performance cost.

Does turning off GameWorks features not solve this?


5 minutes ago, -BirdiE- said:

Does turning off GameWorks features not solve this?

So you'd rather stuff your head in the sand and pretend it's not a problem?


1 minute ago, Derangel said:

So you'd rather stuff your head in the sand and pretend it's not a problem?

If turning off the GameWorks features solves the problems.... I'm just not sure why Nvidia adding in optional features that affect you in no way is a problem.

Maybe I'm missing something, but you certainly haven't brought it up yet.


2 minutes ago, -BirdiE- said:

If turning off the GameWorks features solves the problems.... I'm just not sure why Nvidia adding in optional features that affect you in no way is a problem.

Maybe I'm missing something, but you certainly haven't brought it up yet.

No, turning them off IGNORES the problem. It doesn't solve jack shit. I brought up exactly why Gameworks is problematic. It's not my fault if you refuse to actually pay attention.


9 minutes ago, Derangel said:

No, turning them off IGNORES the problem. It doesn't solve jack shit. I brought up exactly why Gameworks is problematic. It's not my fault if you refuse to actually pay attention.

So you'd rather them just not include anything? Why? What difference does it make?

 

If turning off GameWorks grants me the same performance and graphical fidelity as not including it at all, then the presence of GameWorks in a game has no effect on my experience... That's not ignoring a problem; that's identifying that it's not a problem.

 

You seem to be implying that there's some bigger problem, but have yet to state what this problem is.


8 minutes ago, -BirdiE- said:

So you'd rather them just not include anything? Why? What difference does it make?

Why do you assume it's either/or? What I'd rather they do is actually address the problems and make Gameworks more viable.


18 minutes ago, Derangel said:

Why do you assume it's either/or? What I'd rather they do is actually address the problems and make Gameworks more viable.

I mean, ideally that would be great...

I think we all realize that in every situation it would be ideal for companies to make their products better...

But clearly right now Nvidia doesn't have the capability, or doesn't feel it would be worth the investment. As a consumer, it is your right to express your displeasure, but as a company it's their right to produce whatever product they want. 

 

Personally, I'd argue that you don't have much ground to stand on, seeing as there are no other companies offering a better solution and it's affecting you in no way... But that's merely my opinion, and you have just as much right to yours as I do to mine.

 

Additionally, I was more addressing the people who were upset it was included at all because it didn't run well on their AMD card, and who would rather see it cut out because they can't take advantage of it.

