Final Fantasy XV Windows Edition Benchmark is a total mess and unrepresentative of the final product **Update with Square Enix's response**

**Update with Square Enix's response to Gamers Nexus's concerns**

https://www.gamersnexus.net/news/3226-square-enix-addresses-ffxv-benchmark-concerns

 

Square Enix responded to Gamers Nexus's concerns about the accuracy of the benchmark and the issues they discovered in the utility. They are fully aware of these issues and have acknowledged them. Square Enix goes on to say that they locked the settings behind three presets for "simplicity's sake", while the final game will allow users to fine-tune their in-game graphics settings. Whether this is justifiable or not is up for debate.

Quote

ffxv-bench-fix-tweet.png

 

Gamers Nexus believes that the issues they discovered in the original post below are the result of oversight rather than malice, and attributes them to the tight development timeline for getting this benchmark out the door.

Quote

 

For now, it appears as if the issue of LOD scaling will be addressed for the final launch. These issues included rendering unseen objects (without GameWorks) and rendering incorrect LOD for HairWorks effects. As we stated in the original piece, we believed this to be an oversight, rather than malice, and attributed the issue to tight development timelines at Square Enix.

 

**Original post**

 

Steve Burke's TL;DR from the YouTube video: "It's a fucked up benchmark utility. We don't recommend it."

 

https://www.gamersnexus.net/game-bench/3224-ffxv-disingenuous-misleading-benchmark-tool-gameworks-tests

 

So Gamers Nexus has been testing this benchmark (they state they've spent 13+ hours figuring out how to benchmark hardware in this game), and they found that it is a complete mess, is unrepresentative of what the final product will be, and is misleading. To make matters worse, you can't fine-tune the graphics settings; you're stuck with three presets. The "High" preset turns on all but two GameWorks settings, those being ShadowLibs and VXAO. As for how to get around this, there's a guide out there that walks through configuring the .ini file to allow custom tuning of the graphics settings.
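
For the curious, below is a rough Python sketch of the kind of override that guide produces. The section name, key names, and file name are purely illustrative (the real benchmark buries its settings in a hex blob), so treat it as a sketch of the idea rather than the actual format:

import configparser

# Hypothetical sketch only: the benchmark's real settings live in a
# binary/hex blob and the actual key names are undocumented. This just
# shows the kind of per-feature toggle the community guide exposes.
config = configparser.ConfigParser()
config["GameWorks"] = {
    "HairWorks": "0",             # illustrative key names, not the real ones
    "TerrainTessellation": "1",
    "Turf": "1",
    "Flow": "1",
    "VXAO": "0",
    "ShadowLibs": "0",
}

with open("GraphicsConfig.ini", "w") as f:   # illustrative file name
    config.write(f)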

Quote

As we discovered after hours of testing the utility, the FFXV benchmark is disingenuous in its execution, rendering load-intensive objects outside the camera frustum and resulting in a lower reported performance metric. We accessed the hexadecimal graphics settings for manual GameWorks setting tuning, made easier by exposing .INI files via a DLL, then later entered noclip mode to dig into some performance anomalies. On our own, we’d discovered that HairWorks toggling (on/off) had performance impact in areas where no hair existed. The only reason this would happen, aside from anomalous bugs or improper use of HairWorks (also likely, and not mutually exclusive), would be if the single hair-endowed creature in the benchmark were drawn at all times.

 

... We're also able to confirm, by testing the default "High," "Standard," and "Low" settings, that the game's default GameWorks configuration is set to the following (High settings):

  • VXAO: Off
  • Shadow libs: Off
  • Flow: On
  • HairWorks: On
  • TerrainTessellation: On
  • Turf: On

 

Gamers Nexus found that the cause of these issues is that objects and characters outside the camera view (or at least outside a set radius of the player) are still being rendered, whether they're endowed with GameWorks effects or not. This is especially the case with the "buffalo" creatures (so far the only characters with HairWorks applied), which are rendered far away from the player with HairWorks still running. Mind you, HairWorks is fully functional there; it's genuinely being rendered.
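
To illustrate what proper culling would look like, here's a minimal, hypothetical Python sketch of a distance-based visibility check (names and numbers are made up, and a real engine would test against the camera frustum and occlusion, not just a radius). The point is simply that objects failing the check should never be submitted for rendering, HairWorks or otherwise:

import math

def should_render(obj_pos, cam_pos, draw_radius):
    # Skip anything beyond the draw radius; a real engine would also
    # test the camera frustum, occlusion, LOD, etc.
    return math.dist(obj_pos, cam_pos) <= draw_radius

camera = (0.0, 0.0, 0.0)
scene = [("buffalo_with_hairworks", (900.0, 0.0, 50.0)),
         ("roadside_rock", (10.0, 0.0, 5.0))]

for name, pos in scene:
    action = "draw" if should_render(pos, camera, draw_radius=200.0) else "cull"
    print(action, name)   # GN's findings suggest the benchmark draws both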

 

Disabling all of the GameWorks options produces performance uplifts massive enough to question the legitimacy of the benchmark. In the quote below, a GeForce GTX 1070 is used as the baseline.

Quote

Before even getting to the details and comparisons, this chart of relative performance scaling shows how the AMD GPU is affected unequally by GameWorks settings. NVidia’s GTX 1070 serves as baseline, marked at 100%. With stock 1080p/High settings, the Vega 56 card maintains 66.3% of the GTX 1070’s performance. When we completely disable GameWorks, something typically impossible while still maintaining equal graphics settings, we see that the Vega 56 card manages to maintain 90% of the GTX 1070’s performance – hiked from 66% baseline, that’s a massive gain, and illustrates uneven performance scaling. For perspective, nVidia cards scale with one another almost 1:1 on this same chart. This unequal impact isn't news, as nVidia creates the technology, optimizes for it, and has presumably had closer access to the title. AMD, meanwhile, is still working on its launch-day drivers (no FFXV drivers were released for this benchmark utility).

fps-gameworks-scaling-relative-v56-1070.png

Their results show that AMD's hardware (a Radeon RX Vega 56 was used in these tests) is at a major disadvantage relative to its nearest competitor, the GeForce GTX 1070, which is to be expected given that this game contains GameWorks technology. No surprise there. Turning off every GameWorks option brings the RX Vega 56 up to 90.1% of the GeForce GTX 1070's performance.
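
The relative-scaling math GN uses here is simple enough to sanity-check yourself. A quick Python sketch (the FPS values are placeholders I picked to reproduce the quoted percentages, not GN's raw data):

def relative_percent(card_fps, baseline_fps):
    # Express one card's average FPS as a percentage of the baseline card's.
    return card_fps / baseline_fps * 100.0

gtx1070_fps = 57.0                            # placeholder baseline (100%)
print(relative_percent(37.8, gtx1070_fps))    # Vega 56, stock High: ~66.3
print(relative_percent(51.3, gtx1070_fps))    # Vega 56, GameWorks off: ~90.0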

 

Want to know which setting hits these GPUs the hardest? That award goes to HairWorks. In the quote below, disabling HairWorks lets the RX Vega 56 gain a massive amount of performance. Doing the same on the GeForce GTX 1070 still gives a performance boost, though not as earth-shattering. Some hit is to be expected since the game has GameWorks technology baked in, but the implementation here deserves scrutiny.

Quote

As for what’s specifically causing the biggest performance hits, that award goes to HairWorks, the single disablement of which grants Vega a massive performance uplift (and the GTX 1070 a less impressive, but still large uplift). Turf is the next most responsible, and is what Square Enix is using for its grass technology.

 

As for tessellation, there's almost zero impact. AMD's geometry pipeline has improved drastically, despite its history of poor tessellation performance compared against NVIDIA's offerings.

Quote

For terrain tessellation, our initial hypothesis was completely wrong: There’s almost no impact. Despite AMD’s historical troubles with tessellation, it appears the newer geometry pipeline is coping well, and is busy getting smashed by geometry from hair, anyway. We've also noticed that terrain tessellation goes away as the benchmark progresses, later being replaced with more traditional normal maps. The impact of this setting could change at game launch, assuming Square Enix makes more use of terrain tessellation.

 

AMD still has work to do to get a driver out on or before the release of Final Fantasy XV Windows Edition. The game is over 100GB, while the benchmark is only 3.7GB.

 

Disabling both HairWorks and Turf Effects nearly doubles the framerate on AMD's hardware. That is insane. A gap is clearly expected, as NVIDIA owns GameWorks and has optimized for it, but this example shows a poor implementation by Square Enix compared to, say, The Witcher 3 from CD Projekt RED: here the same technologies hurt performance even on NVIDIA hardware for no apparent visual gain. Disabling HairWorks alone on AMD's hardware yields a performance gain of 37%, which again is partly expected. NVIDIA doesn't suffer as hard a performance impact, but it does show, and it is reproducible.

Quote

Before getting the FPS charts on screen, this is the settings configuration we’re using for these benchmarks: We’re toggling off GameWorks options under High, one at a time, while leaving other default options still enabled. Stock “High” configuration enables Flow, HairWorks, Turf, and Terrain Tessellation, with VXAO & Shadow Libraries disabled. We disable one at a time in testing. To that end, you should always be comparing baseline to the toggled option, giving a look at which option grants the greatest performance uplift relative to baseline.

 

More importantly, however, is the performance delta between the various settings. Using just the native 1080p/High settings, with our 90-second benchmark, we recorded 38FPS AVG across multiple runs. Disabling Turf and HairWorks nearly doubled our framerate, giving some perspective as to the perceived performance capabilities. We don’t encounter Flow very much in the benchmark (the fire at the end), so can ignore that, and terrain tessellation also has much less impact than we expected. Versus baseline, AMD is gaining an insane 37% performance by disabling HairWorks, but leaving all the other native GameWorks technologies enabled. NVidia, meanwhile, gains about 15% from the same conditions. Enabling VXAO and Shadow libraries, done under the “All GameWorks On” category, all devices take a further tumble, but these settings appear to be disabled by default.

fps-gameworks-scaling-vega-56.png

fps-gameworks-scaling-gtx-1070.png
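
If you want to check the per-toggle numbers yourself, the uplift in these charts is just each result measured against the same card's stock "High" baseline. A small Python sketch, with placeholder FPS values chosen to mirror the quoted 37% (Vega 56) and roughly 15% (GTX 1070) HairWorks-off gains:

def uplift_percent(fps_toggled, fps_baseline):
    # Gain from disabling one option, relative to the card's own baseline.
    return (fps_toggled - fps_baseline) / fps_baseline * 100.0

vega_base, vega_no_hairworks = 38.0, 52.1    # placeholder averages
gtx_base, gtx_no_hairworks = 57.0, 65.6      # placeholder averages

print(f"Vega 56, HairWorks off: +{uplift_percent(vega_no_hairworks, vega_base):.0f}%")
print(f"GTX 1070, HairWorks off: +{uplift_percent(gtx_no_hairworks, gtx_base):.0f}%")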

 

Not to mention, frametimes are kind of a mess as well, and BOTH AMD and NVIDIA suffer from those frametime issues. Even if you run the benchmark three times, it is still noticeable, even on the most powerful hardware like the GeForce GTX 1080 Ti I have. Want to know what's sad but funny at the same time? NVIDIA made these (admittedly) very cool GameWorks effects, yet they were implemented so poorly that NVIDIA's own hardware struggles too, even if the performance impact isn't as significant as it is on AMD. IMO the only visually impactful effect after watching the video (minus those buffalo thingies) is Turf Effects.

Quote

First off, note that we complained about frametimes on nVidia cards in our GPU benchmark, showing that the company had trouble processing its own GameWorks features without tripping over wide frame-to-frame intervals. The result was occasional stutter and more disparate frame creation time. We can finally illustrate this: At Baseline, our 0.1% lows were at 21FPS which, although better than AMD’s, aren’t really better in terms of AVG-to-low relative scaling. Disabling HairWorks specifically grants us a major uplift in frametime performance, but just a moderate one in AVG FPS. The average increases by 15%, with the lows increasing by more than 2x.

GameWorks seems to be hurting performance for no apparent gain, aside from some grass detail and, in one location, some hair detail. Just again as a reminder: Toggling settings like HairWorks has performance impact even when testing in areas bereft of said setting, even without using Ansel and just using the native benchmark (lest there be any doubt).

ffxv-frametimes-1440p-nvidia-variance.png
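
For anyone unfamiliar with 0.1% lows: they come from sorting a run's frametime log and averaging the slowest slice, which is why stutter tanks them while barely moving the average FPS. A small Python sketch with synthetic data (GN's actual capture and analysis pipeline is more involved):

def percentile_low_fps(frametimes_ms, fraction):
    # Average the slowest `fraction` of frames, then convert back to FPS.
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)

# Synthetic 1000-frame run: mostly 60 FPS frames plus a handful of stutters.
frames = [16.7] * 990 + [45.0] * 9 + [48.0]
print(f"1% low:   {percentile_low_fps(frames, 0.01):.1f} FPS")
print(f"0.1% low: {percentile_low_fps(frames, 0.001):.1f} FPS")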

 

To make matters worse, the results are aggregates, there's zero control over them whatsoever, and they should not be trusted until further notice.

Quote

The aggregate numbers are further poisoned by a lack of control, poisoned twice more by the ability to break the benchmark scoring by going off-rails, and one more time by the locked-down graphics settings. Had Square Enix granted the ability to tune these settings manually, we would have found out about this closer to launch. Instead, thanks to help from reddit users, we had to manually find all of these profound inadequacies in the FFXV benchmark, long after most users had already found numbers pertaining to purchase options.
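
This is also why run-to-run control matters: an aggregate score with no spread attached hides exactly the variance GN is describing. A tiny Python sketch, with made-up scores, of the kind of reporting the utility doesn't do:

import statistics

runs = [4120, 3980, 4890, 4410, 3875]   # hypothetical benchmark scores
mean = statistics.mean(runs)
spread = statistics.stdev(runs)
print(f"mean {mean:.0f}, stdev {spread:.0f} ({spread / mean:.1%} run-to-run)")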

 

All in all, this benchmark is not as well-assembled as we all hoped. The only way to get a definitive answer is to wait until the game actually gets released. Square Enix has around 30 days to fully optimize the game and greatly improve their culling so that objects and characters are not rendered outside the camera view, or at distances where the player won't see them unless they go looking. Remember guys, this is not solely the fault of NVIDIA for their GameWorks library (which can be implemented well, with almost no impact for HairWorks and only a slight deficit for newer features like VXAO), nor of AMD for not releasing a driver ahead of the game's launch. The blame rests on all three parties, but Square Enix has to take most of the flak for not assembling a proper benchmark that's representative of the final product.

 

I'll admit, it is a visually impressive game! I still plan on getting it, either at release or some time after, depending on what I decide.

 

AGAIN I WILL SAY THIS: Don't solely blame NVIDIA for their GameWorks libraries, because they're not responsible for programming the game. Don't solely blame AMD for not releasing a driver prior to the benchmark's release. And don't solely blame Square Enix either. Square Enix didn't do a good job building the benchmark, and AMD didn't get a driver out in time to help performance when it launched. NVIDIA didn't program this game; they only created and provided the GameWorks technologies to the developers (Square Enix), and developers are allowed to optimize and tweak them as they see fit, so long as they don't redistribute the source code, in accordance with the licensing terms.

 

This benchmark should not be used to gauge purchasing decisions if you plan on buying this game. It's unrealistic and not built well. If you want to know what the real performance is like on your hardware, wait for the game to be released and find out then, as reviewers will be testing the game once it's out.

 

Steve (no relation to Steve Burke) from Hardware Unboxed even canceled his 30-GPU battle for Final Fantasy XV Windows Edition after seeing those strange results himself. This quote comes from the comments section of the YouTube version of this news, which includes visual examples.

Quote
Nice work Steve! I decided to bail on the GPU testing yesterday and wait for the game to be released because I was seeing some strange results as well. Interesting stuff as always!

 

To finish it off, Gamers Nexus has been in touch with NVIDIA, who stated that they're working with Square Enix to polish the game ahead of its scheduled release. Plus, this benchmark isn't the game. If it turns out that it is, that's a whole other discussion we will get into at a later date.

 

I also ask that the moderators please keep an eye on this thread, and lock it if need be. I hope the initial post complies with the Tech News and Reviews Posting Guidelines and includes all required materials.

Edited by JurunceNK
Updated original thread with Square Enix's response. Kept original post below the OP.


Sooo....my 1050 ti on my laptop will run it just fine on medium is what you're saying? :D

 

20 minutes ago, leadeater said:

And the longest news post award goes to....

Yep. Gamers Nexus did write a pretty long article on this subject, so finding relevant quotes was a bit of a fishing trip.


The biggest issue by far is the rendering of all the crap past the view distance, which is clearly marked. Second, and only relevant for AMD users (like myself): no game-ready drivers are out yet, so results are likely highly unrepresentative of actual game performance once it's released.

 

(Joke has now been removed)


6 minutes ago, huilun02 said:

So this game originally had no Shitworks and was made for a 2016 console system with AMD hardware, and now has Shitworks shoehorned in for the PC port and runs like crap on AMD hardware? Who'd have figured?

That's not a response I expected for a civil discussion.

 

NVIDIA isn't responsible for programming and optimizing the game, that's Square Enix's responsibility. NVIDIA provides the middleware, and the developers can tweak and optimize it as they see fit, as long as a) it doesn't hurt the performance on NVIDIA hardware, and b) the developer doesn't redistribute it.


2 minutes ago, JurunceNK said:

That's not a response I expected for a civil discussion.

 

NVIDIA isn't responsible for programming and optimizing the game, that's Square Enix's responsibility. NVIDIA provides the middleware, and the developers can tweak and optimize it as they see fit, as long as a) it doesn't hurt the performance on NVIDIA hardware, and b) the developer doesn't redistribute it.

But man, it's way easier to blame Nvidia.


3 minutes ago, leadeater said:

 

@VegetableStu Hi.

 

(will edit later with proper comment, joke too hard to resist)

My phone doesn't like you at all, it wants bad stuff to happen for this...

 

 


1 minute ago, huilun02 said:

So are you saying this performance is as expected of a 2016 console game running on far superior PC hardware, and no one should be blamed for anything?

 

I'm saying it's not okay to lay the blame exclusively on NVIDIA. Did you intentionally ignore this part?

 

12 minutes ago, JurunceNK said:

NVIDIA isn't responsible for programming and optimizing the game, that's Square Enix's responsibility. NVIDIA provides the middleware, and the developers can tweak and optimize it as they see fit, as long as a) it doesn't hurt the performance on NVIDIA hardware, and b) the developer doesn't redistribute it.

 


If you have read every single word of this post, you are a legend

 

edit: a tldr would be cool

Edited by Shreyas1

 


Still waiting for the game to release... Still hoping that AMD doesn't delay their driver for three weeks... Still hoping that AMD users will disable Hairworks before complaining their AMDs off... Hoping Nvidia will release a Founder's Edition wrap for the in-game cars... 

 

Also, KUPO!

 

9 hours ago, leadeater said:

The biggest issue by far is the rendering of all the crap past the view distance,

Most of Bohemia Interactive's library (ARMA in particular) does the same thing. Helicopter goes over a mountain range and is now a pixel? Render it on the CPU until it leaves the map! It's no big deal! Just think of it as salad dressing on ice cream. 


1 minute ago, ARikozuM said:

Still waiting for the game to release... Still hoping that AMD doesn't delay their driver for three weeks... Still hoping that AMD users will disable Hairworks before complaining their AMDs off... Hoping Nvidia will release a Founder's Edition wrap for the in-game cars... 

 

Also, KUPO!

The final release will let users tune the game settings to accommodate their hardware and their performance target.

 

It's still messed up IMO to lock graphics tweaking behind three presets plus a file that needs something like Notepad++ to edit, since the settings aren't stored in a plain text file one can easily edit.


Just now, JurunceNK said:

The final release will let users tune the game settings to accommodate their hardware and their performance target.

This is the game they released for their 20th anniversary. Anyone expecting it to look terrible and not be a resource hog is fooling themselves. 

 

That said, the Steam server got rekt by the benchmark. My poor Xeon 5650 was not having a good time, scoring a measly 2000-2100 with a Titan X Maxwell.


8 minutes ago, ARikozuM said:

This is the game they released for their 20th anniversary. Anyone expecting it to look terrible and not be a resource hog is fooling themselves. 

Who is actually expecting this to look terrible? I don't see anyone saying those words or anything similar.

 

We do expect the game to be optimized - in other words, to not waste resources on stuff that doesn't improve the visual quality.


9 hours ago, huilun02 said:

So this game originally had no Shitworks and was made for a 2016 console system with AMD hardware, and now has Shitworks shoehorned in for the PC port and runs like crap on AMD hardware? Who'd have figured?

It's not really a port. SE built the PC version from the ground up and even built a new version of the engine for it.

 

Nvidia provided development support and is acting as a co-developer, so it makes sense that they would implement Gameworks. This performance issue is not Nvidia's fault though. Gameworks is always going to work better on Nvidia hardware; it is designed around features that Nvidia hardware is good at. It seems unfair to call it "Shitworks". The performance hits, even on Nvidia hardware, make sense. They are advanced features that do require power. Whether or not the effects are worth the performance hit is an entirely different discussion however.

 

8 hours ago, Energycore said:

Who is actually expecting this to look terrible? I don't see anyone saying those words or anything similar.

 

We do expect the game to be optimized - in other words, to not waste resources on stuff that doesn't improve the visual quality.

Even with the Hairworks stupidity it does seem like the benchmark runs decently well. The High preset seems like it might be enabling 4K textures or, at least, very high-quality ones, along with other interesting and advanced features. My 1080 pulls above 60fps at 1440p and sits around 30 at 4K with the High preset. Turning Gameworks off gives me about another 10fps on average at 4K. Considering all the rendering weirdness, that seems good to me. It will be interesting to see how the final game holds up, since this benchmark is pretty useless right now.


Just now, Derangel said:

Even with the Hairworks stupidity it does seem like the benchmark runs decently well. The High preset seems like it might be enabling 4K textures or, at least, very high-quality ones, along with other interesting and advanced features. My 1080 pulls above 60fps at 1440p and sits around 30 at 4K with the High preset. Turning Gameworks off gives me about another 10fps on average at 4K. Considering all the rendering weirdness, that seems good to me. It will be interesting to see how the final game holds up, since this benchmark is pretty useless right now.

Agreed that we need to look at real gameplay for a more realistic test, and those look like decent numbers. It is very annoying that you need a special editor to actually edit the settings and disable GameWorks.

 

What I wonder is whether AMD's "Optimized" tessellation preset helps mitigate this problem. You can even force tessellation off for all 3D applications in the driver settings.

 

Not that I'll play this game xD


4 minutes ago, Derangel said:

It's not really a port. SE built the PC version from the ground up and even built a new version of the engine for it.

It's still using Luminous Studio, though, which is a multi-platform engine anyway; whether it's good or not is an interesting debate. Switching to Unreal for this would have been a bridge too far, even though everything else they are developing uses it. I don't think we'll see anything use Luminous Studio ever again.

 

8 minutes ago, Derangel said:

The performance hits, even on Nvidia hardware, make sense. They are advanced features that do require power. Whether or not the effects are worth the performance hit is an entirely different discussion however.

What people take issue with is how the features were developed, right or wrong. The features in GameWorks are so highly optimized for Nvidia hardware, and implement techniques that are specifically weak on AMD GPUs, that they cause a disproportionate reduction in performance-per-visual-gain across hardware platforms.

 

Is it wrong for Nvidia to offer such a highly optimized development toolset? Nope, that's actually great. Is it wrong if it negatively affects AMD hardware? No, unless it's designed to. You can't call something well optimized if it's doing something that is unnecessary.

 

The balance to this is for AMD to offer a similar set of tools to developers, something I am rather torn about. I don't think it's a good idea for either company to offer such tools unless both are very easy to implement; a well-optimized game engine that is hardware agnostic and can make the most out of any GPU is much better in my opinion. However, that may currently be just unrealistic.


4 minutes ago, leadeater said:

The balance to this is for AMD to offer a similar set of tools to developers, something I am rather torn about. I don't think it's a good idea for either company to offer such tools unless both are very easy to implement; a well-optimized game engine that is hardware agnostic and can make the most out of any GPU is much better in my opinion. However, that may currently be just unrealistic.

They do, or did. I don't know what happened to their game development division; the last title I can recall is Alien: Isolation. They need to step their game up if they want games to make full use of Vega's (or whatever architecture they choose next) features. Nvidia's Gameworks division provides software and tech support to companies that will accept the sponsorship. This helps Nvidia maintain the best performance or visual quality that they can, something AMD used to do.


2 minutes ago, leadeater said:

It's still using Luminous Studio, though, which is a multi-platform engine anyway; whether it's good or not is an interesting debate. Switching to Unreal for this would have been a bridge too far, even though everything else they are developing uses it. I don't think we'll see anything use Luminous Studio ever again.

 

What people take issue with is how the features were developed, right or wrong. The features in GameWorks are so highly optimized for Nvidia hardware, and implement techniques that are specifically weak on AMD GPUs, that they cause a disproportionate reduction in performance-per-visual-gain across hardware platforms.

 

Is it wrong for Nvidia to offer such a highly optimized development toolset? Nope, that's actually great. Is it wrong if it negatively affects AMD hardware? No, unless it's designed to. You can't call something well optimized if it's doing something that is unnecessary.

 

The balance to this is for AMD to offer a similar set of tools to developers, something I am rather torn about. I don't think it's a good idea for either company to offer such tools unless both are very easy to implement; a well-optimized game engine that is hardware agnostic and can make the most out of any GPU is much better in my opinion. However, that may currently be just unrealistic.

Oh trust me, Square Enix will keep using Luminous Studio. It's still a fresh engine compared to their previous one, Crystal Tools. They'll use it until it hits its limits.

 

With regards to optimizing GameWorks for AMD hardware, the developer is free to do that as long as it does not negatively affect NVIDIA hardware. It is 100% possible. But does the developer want to put in that kind of work to optimize for AMD hardware as well as NVIDIA hardware? Not everyone will. They're also not allowed to redistribute the source code. And you may never get AMD's performance to match NVIDIA's under GameWorks scenarios, simply because GameWorks was written to take advantage of NVIDIA's hardware features first.

 

Quote

Are game developers precluded from optimizing source code provided by Nvidia through the GameWorks program for non Nvidia hardware ?
No.  Our agreements with developers don’t prevent them from working with any other IHVs to optimize their game or any GameWorks features. GameWorks is a middleware, and like any other middleware we offer developers a source licensing. We provide source code, under license, to developers who request it.  Licensees just can’t redistribute our source code to anyone who does not have a license.

If a developer requests source code for an Nvidia GameWorks feature, under license, and is then provided with source code, is that developer then free to edit that code as they see fit to optimize it for IHVs other than Nvidia ? assuming they don’t redistribute it.
Yes. As long as it does not lower performance on NVIDIA GPUs

https://wccftech.com/fallout-4-nvidia-gameworks/


7 hours ago, JurunceNK said:

Oh trust me, Square Enix will keep using Luminous Studio. It's still a fresh engine compared to their previous one, Crystal Tools. They'll use it until it hits its limits.

Kingdom Hearts 3 actually stopped using it and started over on Unreal, and FF7 Remake is also Unreal. I don't expect to see Luminous used again unless there is something Unreal can't do that they can't extend it to do.

 

7 hours ago, JurunceNK said:

With regards to optimizing GameWorks for AMD hardware, the developer is free to do that as long as it does not negatively affect NVIDIA hardware. It is 100% possible. But does the developer want to put in that kind of work to optimize for AMD hardware as well as NVIDIA hardware? Not everyone will. They're also not allowed to redistribute the source code. And you may never get AMD's performance to match NVIDIA's under GameWorks scenarios, simply because GameWorks was written to take advantage of NVIDIA's hardware features first.

What I mean is a dual-toolset game: GameWorks is only active on Nvidia hardware and [Insert Toolset Name Here] is only active on AMD hardware, rather than optimizing GameWorks for AMD.


9 minutes ago, leadeater said:

It's still using Luminous Studio, though, which is a multi-platform engine anyway; whether it's good or not is an interesting debate. Switching to Unreal for this would have been a bridge too far, even though everything else they are developing uses it. I don't think we'll see anything use Luminous Studio ever again.

 

What people take issue with is how the features were developed, right or wrong. The features in GameWorks are so highly optimized for Nvidia hardware, and implement techniques that are specifically weak on AMD GPUs, that they cause a disproportionate reduction in performance-per-visual-gain across hardware platforms.

 

Is it wrong for Nvidia to offer such a highly optimized development toolset? Nope, that's actually great. Is it wrong if it negatively affects AMD hardware? No, unless it's designed to. You can't call something well optimized if it's doing something that is unnecessary.

 

The balance to this is for AMD to offer a similar set of tools to developers, something I am rather torn about. I don't think it's a good idea for either company to offer such tools unless both are very easy to implement; a well-optimized game engine that is hardware agnostic and can make the most out of any GPU is much better in my opinion. However, that may currently be just unrealistic.

It is still Luminous, but an updated version of it they call Luminous Pro. I imagine it is still a mess like the older version, but maybe with some tweaks. I agree; it's pretty telling that they switched all internal projects to UE4 before FFXV released.

 

I can understand why they don't, even if it is a seriously dick move, but I do wish Nvidia would allow AMD to look at Gameworks code so that they can better optimize for it.

 

1 minute ago, ARikozuM said:

They do, or did. I don't know what happened to their game development division; the last title I can recall is Alien: Isolation. They need to step their game up if they want games to make full use of Vega's (or whatever architecture they choose next) features. Nvidia's Gameworks division provides software and tech support to companies that will accept the sponsorship. This helps Nvidia maintain the best performance or visual quality that they can, something AMD used to do.

 

RTG has been such a cluster since it was formed that it's no surprise things have ground to a halt in terms of developer relations. I hope that once things settle down a bit over there, they will resume programs that help developers.

 

Just now, JurunceNK said:

Oh trust me, Square Enix will keep using Luminous Studio. It's still a fresh engine compared to their previous one, Crystal Tools. They'll use it until it hits its limits.

 

With regards to optimizing GameWorks for AMD hardware, the developer is free to do that as long as it does not negatively affect NVIDIA hardware. It is 100% possible. But does the developer want to put in that kind of work to optimize for AMD hardware as well as NVIDIA hardware? Not everyone will. They're also not allowed to redistribute the source code. And you may never get AMD's performance to match NVIDIA's under GameWorks scenarios, simply because GameWorks was written to take advantage of NVIDIA's hardware features first.

 

https://wccftech.com/fallout-4-nvidia-gameworks/

Luminous is all but dead. KH3 was originally being developed on Luminous, but SE made the team switch to UE4 last year when they licensed the engine. While it was never officially confirmed, I'd be willing to bet the same thing happened with FF7R. SE moved every internal project off of Luminous prior to FFXV's release. The insane and troubled development schedules of both FFXV and Luminous led to both being a mess.


2 minutes ago, Derangel said:

While it was never officially confirmed, I'd be willing to bet the same thing happened with FF7R.

Pretty sure that is confirmed now.


5 minutes ago, Derangel said:

RTG has been such a cluster since it was formed that it's no surprise things have ground to a halt in terms of developer relations. I hope that once things settle down a bit over there, they will resume programs that help developers.

Yeah, I have yet to get answers to any of my technical issues with AMD's Radeon ProRender, both in Maya and in Blender. It's why I stopped using it and went back to Cycles.

 

I would definitely, definitely use NVIDIA Iray if I had the USD $299 to buy the plugin, simply because I would be able to get reliable technical support in a timely fashion.

Edited by JurunceNK


15 minutes ago, ARikozuM said:

They do, or did. I don't know what happened to their game development division; the last title I can recall is Alien: Isolation. They need to step their game up if they want games to make full use of Vega's (or whatever architecture they choose next) features. Nvidia's Gameworks division provides software and tech support to companies that will accept the sponsorship. This helps Nvidia maintain the best performance or visual quality that they can, something AMD used to do.

Wasn't AMD's offering just developer assistance, though, with no large toolset like GameWorks? Side note: it says something about how bad it is that I don't even know, lol.

