
Final Fantasy XV Windows Edition Benchmark is a total mess and unrepresentative of the final product **Update with Square Enix's response**

43 minutes ago, -BirdiE- said:

If turning off GameWorks grants me the same performance and graphical fidelity as not including it at all does

It doesn't; that's the issue. You can have it on and get the visual improvement at a performance cost that is vastly greater on AMD hardware, or you can turn it off and take a visual reduction. You can't have both, like you are saying.

 

You can say it's not worth the performance loss, but the setting on and off produce different output on the screen.

 

Not using GameWorks at all would mean a total change in development, so we have no idea what the game would look like without it. You can produce all the effects of GameWorks without using GameWorks, or implement only some of them, but that will never happen in a game that is already using GameWorks.

 

For example, if you turn GameWorks off you won't get TressFX for hair effects, but you might have gotten TressFX if GameWorks was never used, and that works equally well on all hardware.


9 minutes ago, leadeater said:

It doesn't; that's the issue. You can have it on and get the visual improvement at a performance cost that is vastly greater on AMD hardware, or you can turn it off and take a visual reduction. You can't have both, like you are saying.

 

You can say it's not worth the performance loss, but the setting on and off produce different output on the screen.

You're comparing the wrong things. Obviously by turning it off you get a reduction in graphical fidelity, but it's not a reduction compared to the developer not including GameWorks at all.

 

Let's use numerical values as representations...

 

Option A)

The developer doesn't incorporate GameWorks into their game. Their game has a graphics value of 8, and a performance value of 10.

 

Option B)

The developer incorporates GameWorks into their game. With GW turned on, the graphics value is 10, but the performance value for new Nvidia cards is 8, and for AMD cards it's 5. By turning off GameWorks, it's the same as not incorporating it at all, with a graphics value of 8 and a performance value of 10.

 

By including GameWorks you simply give people the OPTION to make the tradeoff. If they decide the tradeoff is not worth it, then they're no worse off than they would have been if GameWorks was not incorporated.

 

It would be illogical to say that consumers are BETTER off with GameWorks not included.


11 minutes ago, -BirdiE- said:

You're comparing the wrong things. Obviously by turning it off you get a reduction in graphical fidelity, but it's not a reduction compared to the developer not including GameWorks at all.

 

Let's use numerical values as representations...

 

Option A)

The developer doesn't incorporate GameWorks into their game. Their game has a graphics value of 8, and a performance value of 10.

 

Option B)

The developer incorporates GameWorks into their game. With GW turned on, the graphics value is 10, but the performance value for new Nvidia cards is 8, and for AMD cards it's 5. By turning off GameWorks, it's the same as not incorporating it at all, with a graphics value of 8 and a performance value of 10.

 

By including GameWorks you simply give people the OPTION to make the tradeoff. If they decide the tradeoff is not worth it, then they're no worse off than they would have been if GameWorks was not incorporated.

Sorry, see my edit; I didn't fully cover it before posting. tl;dr: you can't know how the game would look or perform if it had not used GameWorks in the first place, so it's impossible to say that turning it off gives the same net result. It's much more likely that, if no GameWorks were used, similar effects would be implemented by the game developer, so GameWorks off would look worse in this hypothetical example.

 

Edit:

I think you are still ignoring the point that GameWorks doesn't have to negatively impact AMD hardware as much as it does. Option C is to fix GameWorks, which is NOT hard for Nvidia but impossible for AMD.


5 minutes ago, leadeater said:

It's much more likely that, if no GameWorks were used, similar effects would be implemented by the game developer, so GameWorks off would look worse in this hypothetical example.

I don't know that that's true. The average developer using UE4 or Frostbite 2 is not going to alter the engine to render things differently... Unless they also created the engine... But then it would be built right into the engine, and not managed on a game-by-game basis.

 

At least those are my suspicions. But then again, I'm not super familiar with game development so there's always the possibility that I have no idea what I'm talking about.


18 minutes ago, leadeater said:

Edit:

I think you are still ignoring the point that GameWorks doesn't have to negatively impact AMD hardware as much as it does. Option C is to fix GameWorks, which is NOT hard for Nvidia but impossible for AMD.

I'm just not sure why it would make any sense for Nvidia to spend resources optimizing their software for their competition...


2 minutes ago, -BirdiE- said:

I don't know that that's true. The average developer using UE4 or Frostbite 2 is not going to alter the engine to render things differently... Unless they also created the engine... But then it would be built right into the engine, and not managed on a game-by-game basis.

 

At least those are my suspicions. But then again, I'm not super familiar with game development so there's always the possibility that I have no idea what I'm talking about.

There are games that use UE4 and TressFX; most of these are just libraries that you can choose to include to supplement existing tools. That's all GameWorks is: a bunch of libraries containing code for different effects, which makes it easier for developers to implement those effects without having to worry too deeply about the underlying code.

 

A game engine is just a framework; most of them use the very same lighting tools as each other, which is why games using different engines can look so similar.
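To put the "it's just a library" point in code terms, here is a rough sketch of how a renderer might call into an optional middleware effect. All class and function names here are hypothetical placeholders, not the actual GameWorks SDK:

```cpp
#include <memory>

struct Scene;         // engine-side scene data (placeholder)
struct FrameContext;  // per-frame render state (placeholder)

// Hypothetical vendor effect library the engine links against.
class VendorHairEffect {
public:
    void Initialize() { /* library allocates its own GPU resources */ }
    void Render(const Scene&, FrameContext&) { /* library draws the hair */ }
};

class Renderer {
public:
    explicit Renderer(bool enableVendorEffects) {
        if (enableVendorEffects) {
            vendorEffects_ = std::make_unique<VendorHairEffect>();
            vendorEffects_->Initialize();
        }
    }

    void DrawFrame(const Scene& scene, FrameContext& ctx) {
        DrawBasePass(scene, ctx);                // engine's own geometry/lighting
        if (vendorEffects_)                      // "GameWorks on"
            vendorEffects_->Render(scene, ctx);  // extra effect from the library
        // "GameWorks off" simply skips the call; the engine does not
        // recreate the effect some other way, which is the point above.
    }

private:
    void DrawBasePass(const Scene&, FrameContext&) { /* ... */ }
    std::unique_ptr<VendorHairEffect> vendorEffects_;
};
```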

 

Think of it like cooking shows: you can watch the host do everything in real time, or get "here's one I prepared earlier". The one prepared earlier might not actually taste the same as the one you just watched being prepared but not cooked on TV. Or, for a more extreme example, the picture of a burger at a fast food place vs what you actually get: same ingredients, but one took hours to prepare and photograph (no GameWorks at all) and the other... yeah, that one is GameWorks off ;).


10 minutes ago, -BirdiE- said:

I'm just not sure why it would make any sense for Nvidia to spend resources optimizing their software for their competition...

Because it's not hard; it's basically changing a default parameter. Not that there is evidence, but if it were true that Nvidia had crafted GameWorks to hinder AMD then that would be anti-competitive and could be illegal, so I think it's more a case of not giving a damn.

 

A lot of games also do not have an easy way to turn GameWorks off while keeping all the other settings at max, because everything is tied in with presets and the like, just like this FFXV benchmark. You can hack around that, but the default in-application options of High and Medium might not be a simple case of GameWorks on and off; other settings may get changed as well.
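To make the preset problem concrete, here is a minimal sketch with made-up setting names (not the FFXV benchmark's actual options) showing why High vs Medium is not a clean GameWorks on/off comparison:

```cpp
#include <string>

struct GraphicsSettings {
    int  shadowResolution;   // shadow map size in pixels
    int  textureQuality;     // 0 = low ... 2 = high
    bool vendorHairEffect;   // GameWorks-style add-on effects
    bool vendorTurfEffect;
    bool volumetricFog;
};

// A "preset" flips several unrelated settings at once, so any benchmark
// difference between presets mixes the GameWorks cost with everything else.
GraphicsSettings ApplyPreset(const std::string& preset) {
    if (preset == "High")
        return {4096, 2, true, true, true};    // vendor effects forced on
    // "Medium" also drops shadows, textures and fog along with the effects.
    return {2048, 1, false, false, false};
}
```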


21 minutes ago, leadeater said:

There are games that use UE4 and TressFX; most of these are just libraries that you can choose to include to supplement existing tools. That's all GameWorks is: a bunch of libraries containing code for different effects, which makes it easier for developers to implement those effects without having to worry too deeply about the underlying code.

If that's the case, I understand the frustration... As long as it's aimed at the developers for taking the "easy route" and not at Nvidia for simply making a tool available for people to use.

 

15 minutes ago, leadeater said:

Because it's not hard; it's basically changing a default parameter. Not that there is evidence, but if it were true that Nvidia had crafted GameWorks to hinder AMD then that would be anti-competitive and could be illegal, so I think it's more a case of not giving a damn.

Do you have any evidence to suggest it's not hard? (seriously asking). If I'm Nvidia, I can't think of a single reason I'd put forth any effort whatsoever to make it run well on AMD cards.

 

Even if Nvidia were to artificially hinder AMD cards with their tools, that wouldn't be illegal. Now, if Nvidia was artificially hindering AMD cards in GameWorks and ALSO paying developers to use it, that could be anti-competitive and illegal... But I don't feel like going down that speculative rabbit hole right now.

 

Plus, you never know... You'd expect that paying developers not to develop their games for other platforms would be illegal... But it's apparently not.


30 minutes ago, -BirdiE- said:

If that's the case, I understand the frustration... As long as it's aimed at the developers for taking the "easy route" and not at Nvidia for simply making a tool available for people to use.

It's kind of a bit of both. I don't exactly blame devs for not changing default values when they work fine on the test hardware they have and they don't have the time or willingness to figure out why a comparable AMD card is performing worse than it should.

 

Essentially, neither has it as a priority to do much about it because AMD's market share among gamers is so low; if that changes, not doing something about it is only going to hurt Nvidia's reputation more. Imagine if a game studio went with an AMD toolset that made Nvidia cards perform worse than they should; that's simply a game that will not sell very well.

 

Nvidia shouldn't get a free pass just because they have the highest market share.

 

The tool itself is fine and developers want it; there's no reason to stop using it, but also no reason not to fix known issues with it.

 

30 minutes ago, -BirdiE- said:

Do you have any evidence to suggest it's not hard? (seriously asking). If I'm Nvidia, I can't think of a single reason I'd put forth any effort whatsoever to make it run well on AMD cards.

It's just the amount of tessellation being used, and toning that down isn't hard. AMD's driver optimization for GameWorks games mostly consists of limiting the tessellation amount at the driver level, because that is known to work and there is little else they can do without seeing the GameWorks code.
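Conceptually, that driver-level limit amounts to clamping whatever tessellation factor the game or library asks for. A minimal sketch of the idea, as an illustration only and not AMD's actual driver code:

```cpp
#include <algorithm>

// Tessellation factor requested by the game or middleware for a patch;
// 64x is the kind of worst case that gets complained about.
float RequestedTessFactor() { return 64.0f; }

// The driver (or a sensible library default) can cap the factor: very high
// factors can generate triangles smaller than a pixel, adding cost for no
// visible benefit.
float EffectiveTessFactor(float driverCap) {
    return std::min(RequestedTessFactor(), driverCap);
}

// e.g. EffectiveTessFactor(16.0f) turns a 64x request into 16x, which is
// the kind of cap Radeon drivers expose as a maximum tessellation level.
```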

 

30 minutes ago, -BirdiE- said:

Even if Nvidia were to artificially hinder AMD cards with their tools, that wouldn't be illegal.

Actually it can be: https://en.wikipedia.org/wiki/Competition_law. Whether it actually is, or could be successfully argued in court, is another matter, but laws exist to stop companies being dicks to each other, though not completely.

 

Edit:

Creating a tool that hurts your competition and then using your market dominance to make that tool prevalent is most certainly in the realm of the illegal; game developers don't have to be in on it at all or get money from Nvidia. Not that I'm saying that's what's happening with GameWorks, though.


21 hours ago, Derangel said:

Depends on the skill of the developer. It is a lot cheaper to license stuff than it is to build engines. A lot of the third-party engines are much better at being adapted for multiple genres and needs than developer-created ones as well (example: Frostbite is apparently a complete piece of shit if your game isn't an FPS), so it can make a lot more sense for a publisher to mass-license 3rd-party APIs. Ubisoft has a unique approach to solve that problem. A lot of their games use in-house engines, but the engines are very modular. They're designed to easily allow elements to be swapped in and out. If a game calls for climbing mechanics, for example, the developer can pull the animation engine from Anvil Next, apply it to their own game, and modify it as needed. This saves the developer both the time and money needed to create a brand new animation engine. These modular elements are also shared company-wide, so updates made to one modular part can be used by other studios within the company. It's a fairly ingenious design, but the downside is that it ends up making a lot of their games feel exactly the same.

Which brings up another topic I'd like to talk about: a game engine of my own design. I've been working on a document outlining what this conceptual engine is, what features and toolsets it should have, and the scope and purpose of the engine. But given the discussion in this thread, I feel it would be going off-topic.


On 04/02/2018 at 6:44 AM, leadeater said:

And everyone was like "Now that AMD owns consoles things are going to massively improve, RIP Nvidia"... :ph34r:. Who would have guessed consoles and PCs are different.

Actually, AMD has succeeded in what they wanted to do with Consoles.

 

Consoles were part of AMD's way of making a better graphics API which takes advantage of their hardware. All AMD-based current gen consoles use the GCN architecture and are fairly good at GPU compute stuff*1. All are technically capable of supporting DirectX 12*2 and Vulkan*2.

 

It's also another way of making money for AMD. Nvidia refused to make console SoCs for the PS4 and Xbox One families.

 

Also, games which are optimized properly on console and are then ported to PC should run well on AMD hardware because of that optimization in the first place. If something runs badly on a current gen console then of course it's probably also gonna run badly on AMD PC hardware (do I need to give examples? Watch Dogs, Assassin's Creed, PUBG).

 

Many games which run badly on consoles, or run substantially better on Nvidia hardware on PC, are typically games targeting consoles with GameWorks and then ported to PC.

 

Yes, game developers can use GameWorks on an Xbox One (why, I don't even know). If that game is ported to PC as-is with GameWorks then of course AMD PC performance will suffer.

 

*1 - Some GPUs are substantially worse than others in this area

*2 - Feature support subject to Operating System API support.


9 hours ago, leadeater said:

Nvidia shouldn't get a free pass just because they have the highest market share.

I agree, but realistically what can the average person do other than mass-buy AMD cards (/joke)?

 

Nothing can be done for now unless AMD market share rises substantially.

9 hours ago, leadeater said:

The tool itself is fine and developers want it; there's no reason to stop using it, but also no reason not to fix known issues with it.

Exactly.

9 hours ago, leadeater said:

It's just the amount of tessellation being used, and toning that down isn't hard. AMD's driver optimization for GameWorks games mostly consists of limiting the tessellation amount at the driver level, because that is known to work and there is little else they can do without seeing the GameWorks code.

Actually, VEGA made progress on this front.

 

Prior to VEGA, a driver optimization was needed on a per-game basis to reduce the stupid amounts of tessellation in GameWorks games.

 

In the VEGA architecture, a specific feature (can't remember the exact name of it) only renders what the player can actually see in the game, and thus doesn't render the excessive tessellation if it's not visible to the player. With the feature enabled, VEGA GPUs render only what the player actually sees through their camera.
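The hardware details are beyond a forum post, but the idea being described (skip work on geometry the camera can't see) is the same one engines already apply in software with frustum culling. A toy CPU-side analogy only, not the actual Vega feature:

```cpp
#include <vector>

struct AABB   { float min[3], max[3]; };
struct Plane  { float n[3]; float d; };   // dot(n, p) + d >= 0 means "inside"
struct Camera { Plane frustum[6]; };

// Test the box corner most aligned with the plane normal.
bool Inside(const Plane& p, const AABB& box) {
    float x = p.n[0] >= 0 ? box.max[0] : box.min[0];
    float y = p.n[1] >= 0 ? box.max[1] : box.min[1];
    float z = p.n[2] >= 0 ? box.max[2] : box.min[2];
    return p.n[0] * x + p.n[1] * y + p.n[2] * z + p.d >= 0;
}

// Only objects that survive the frustum test are submitted for drawing,
// so anything outside the view is never shaded or tessellated at all.
template <typename Object>  // Object is assumed to expose an AABB member `bounds`
std::vector<const Object*> VisibleObjects(const Camera& cam,
                                          const std::vector<Object>& objects) {
    std::vector<const Object*> visible;
    for (const auto& obj : objects) {
        bool inView = true;
        for (const auto& plane : cam.frustum)
            if (!Inside(plane, obj.bounds)) { inView = false; break; }
        if (inView) visible.push_back(&obj);
    }
    return visible;
}
```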


46 minutes ago, AluminiumTech said:

In the VEGA architecture, a specific feature (can't remember the exact name of it) only renders what the player can actually see in the game, and thus doesn't render the excessive tessellation if it's not visible to the player. With the feature enabled, VEGA GPUs render only what the player actually sees through their camera.

I think that might be one of the features not working yet on Vega though? I don't remember, but there are a few things that were promised but not yet delivered. AMD drivers still limit tessellation for Vega too, although it can handle more.


54 minutes ago, AluminiumTech said:

Actually, AMD has succeeded in what they wanted to do with Consoles.

 

Consoles were part of AMD's way of making a better graphics API which takes advantage of their hardware. All AMD-based current gen consoles use the GCN architecture and are fairly good at GPU compute stuff*1. All are technically capable of supporting DirectX 12*2 and Vulkan*2.

 

It's also another way of making money for AMD. Nvidia refused to make console SoCs for the PS4 and Xbox One families.

 

Also, games which are optimized properly on console and are then ported to PC should run well on AMD hardware because of that optimization in the first place. If something runs badly on a current gen console then of course it's probably also gonna run badly on AMD PC hardware (do I need to give examples? Watch Dogs, Assassin's Creed, PUBG).

 

Many games which run badly on consoles, or run substantially better on Nvidia hardware on PC, are typically games targeting consoles with GameWorks and then ported to PC.

 

Yes, game developers can use GameWorks on an Xbox One (why, I don't even know). If that game is ported to PC as-is with GameWorks then of course AMD PC performance will suffer.

 

*1 - Some GPUs are substantially worse than others in this area

*2 - Feature support subject to Operating System API support.

I meant more in relation to the PC space, which hasn't gotten any better even though AMD is in the major consoles; some people thought that would make a difference, but realistically it hasn't.


Just now, leadeater said:

I meant more in relation to the PC space, which hasn't gotten any better even though AMD is in the major consoles; some people thought that would make a difference, but realistically it hasn't.

More DX12 and Vulkan games. Remember, AMD basically gave Mantle away for free to the Khronos Group and Microsoft; Vulkan and DX12 were based on Mantle to some extent.


1 minute ago, AluminiumTech said:

More DX12 and Vulkan games. Remember, AMD basically gave Mantle away for free to the Khronos Group and Microsoft; Vulkan and DX12 were based on Mantle to some extent.

DX12 was always going to come though; it had some late improvements to counter Mantle, but Mantle didn't trigger the development of DX12. New and better APIs aren't the issue, though: games still aren't getting any better optimized for AMD hardware, and the only improvements have come at the hardware level. If you look at the Titan V, which implements those missing hardware features, it tips right back to favoring Nvidia hardware again.


2 minutes ago, leadeater said:

DX12 was always going to come though; it had some late improvements to counter Mantle, but Mantle didn't trigger the development of DX12. New and better APIs aren't the issue, though: games still aren't getting any better optimized for AMD hardware, and the only improvements have come at the hardware level. If you look at the Titan V, which implements those missing hardware features, it tips right back to favoring Nvidia hardware again.

Titan V technically supports DX12 FL 12_1, but Volta (which is based on Pascal, which in turn is based on Maxwell) was designed with DX11 in mind.

 

It technically supports it but isn't optimized for it.


13 minutes ago, AluminiumTech said:

Titan V technically supports DX12 FL 12_1, but Volta (which is based on Pascal, which in turn is based on Maxwell) was designed with DX11 in mind.

 

It technically supports it but isn't optimized for it.

They are all just iterations of the architecture; GCN is no different. That's the thing: there is either hardware-level support for the DirectX features or there isn't. How well the architecture is designed for them is a bit hard to judge, and all we can look at is FPS and extrapolate from that, along with some basic understanding of the architecture (though the actual details are far more complicated than we ever see).

 

This is interesting though: Intel's Skylake iGPU and above put up a decent fight with GCN5 for the best DX12 support.

[Image: DX12 feature level support comparison chart]


Square Enix has responded to Gamers Nexus's concerns about the frametime issues, LOD issues, and other problems discovered in the benchmark utility. If you're skipping ahead without reading the OP (which I strongly advise against; I recommend you read my OP carefully), here's the Twitter post:

[Image: ffxv-bench-fix-tweet.png — Square Enix's tweet responding to the benchmark issues]


23 hours ago, leadeater said:

I think that might be one of the features not working yet on Vega though? I don't remember, but there are a few things that were promised but not yet delivered. AMD drivers still limit tessellation for Vega too, although it can handle more.

you are not wrong

