
[Updated 11/4/15] Fallout 4 will NOT feature Nvidia GameWorks, but Nvidia is still working with Bethesda on the game.

ChrisxIxCross

No, they don't. Nvidia pays them. After all the crap Nvidia has gotten for GameWorks, do you really think any developer in their right mind puts this stuff in just because Nvidia asks? No way in hell that is happening. Nvidia pays them a LOT of money to put this crap in.

You got a source for that? Not saying you're right or wrong, just need a source.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Let's start using the correct names, though. TressFX is only comparable to HairWorks, not GameWorks as a whole.

Let me first correct you: I said I hated both TressFX and HairWorks.

Now, I'm against TressFX and HairWorks because THEY SUCK.

Neither solution actually looks good, nor does either of them justify the resource cost. I still find it baffling that there are artists who willingly include it. It's not so much the looks - those are similar to the old, polygonal hair - it's the animations that are just plain unrealistic, always, unless you're making a game about those puppets with spaghetti hair. Because that's what HairWorks/TressFX looks like: spaghetti.

It'd be hilarious if it weren't so sad. Sure, the demo videos look great, but it's just "meh" during gameplay. The worst part is the cutscenes. It's like Cthulhu himself is sitting on the character's head, waving his tentacles while the character stands still. And I haven't even gotten to clipping yet.

 

WTF nothing posted. DERP!

 

Well, that is your subjective opinion. I personally like TressFX, and I thought it made Lara Croft look really good, which is important when you have a third-person view.

 

TressFX also looks really good in Deus Ex: Mankind Divided, especially compared to mesh hair:

 

As for performance, TressFX has a fairly small performance hit relative to the effect it produces:

TressFX 2/3 uses a master/slave strand design, which means a much smaller performance hit.
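To give a rough idea of why that design is cheap (this is just my own illustrative sketch, not actual TressFX code, and all the names are made up): only a small set of "master" guide strands gets full physics, while the many "slave" strands are reconstructed from their master plus a fixed offset, so simulation cost scales with the guide count rather than the total strand count.

// Illustrative sketch only - not TressFX source. Full physics runs on a few
// "master" guide strands; the many "slave" strands just copy their master's
// shape plus a fixed offset, which is nearly free.
#include <cstddef>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

struct Strand { std::vector<Vec3> verts; };

// Stand-in for the expensive part: integration, length constraints, collisions.
static void SimulateMaster(Strand& s, float dt) {
    const Vec3 gravityStep{0.0f, -9.8f * dt * dt, 0.0f};
    for (Vec3& v : s.verts) v = add(v, gravityStep);  // placeholder "physics"
}

static void SimulateHair(std::vector<Strand>& masters,
                         std::vector<Strand>& slaves,
                         const std::vector<std::size_t>& masterOf,  // master index per slave
                         const std::vector<Vec3>& slaveOffset,      // fixed offset per slave
                         float dt) {
    for (Strand& m : masters) SimulateMaster(m, dt);     // expensive, but few strands
    for (std::size_t i = 0; i < slaves.size(); ++i) {    // cheap, many strands
        const Strand& m = masters[masterOf[i]];
        for (std::size_t v = 0; v < slaves[i].verts.size(); ++v)
            slaves[i].verts[v] = add(m.verts[v], slaveOffset[i]);
    }
}

Only the first loop does real work; the second is a copy, which is why scaling up the strand count barely costs anything.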

 

Now, the reason why I criticize HairWorks but support TressFX can be summed up in this picture:

[image: tfx_tr_perf.png]

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


You got a source for that? Not saying you're right or wrong, just need a source.

That's how sponsorship works: Nvidia pays, devs advertise via Nvidia tech. They paid Crytek quite a bit for Crysis 2, IIRC.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


No, they don't. Nvidia pays them. After all the crap Nvidia has gotten for GameWorks, do you really think any developer in their right mind puts this stuff in just because Nvidia asks? No way in hell that is happening. Nvidia pays them a LOT of money to put this crap in.

Uhhhh yeah? Because they get to add free extras to the game without having to do the work themselves?

You realize that it's Nvidia making this, not the developers, right? The developers don't need to spend thousands of hours coding some smoke effects, because it's already made by Nvidia. That's why they add it to their game.

 

It saves time and money, and it makes the game look better for far less effort.

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project

Spoiler

Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


That's how sponsorship works: Nvidia pays, devs advertise via Nvidia tech. They paid Crytek quite a bit for Crysis 2, IIRC.

Again with the sources. I'm not seeing them. I'm not inclined to just believe anything.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Again with the sources. I'm not seeing them. I'm not inclined to just believe anything.

http://www.gamespot.com/forums/pc-mac-linux-society-1000004/crytekea-signs-exclusive-deal-with-nvidia-over-cry-27443807/

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


Thank you for giving a source. I appreciate those who can back up their claims instead of shouting unverifiable info from the rooftops. +1 rep given. I'll give it a more in-depth read later on when I get the chance.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


Thank you for giving a source. I appreciate those who can back up their claims instead of shouting unverifiable info from the rooftops. +1 rep given. I'll give it a more in-depth read later on when I get the chance.

There were a few articles back then, but that was what, four years ago? I've lost track of where they were posted. In any case, there was a giant tessellated ocean underneath the game world, and since AMD's 6000 series at the time was bad at tessellation, it was literally taking 50% off their performance. Nvidia only lost about 20% from it, since the 500 series was better at tessellation.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


It wouldn't be OK, because people would still complain about it performing worse on AMD hardware. Yes, I think they over-tessellated the hair in W3, and it had a pretty big impact on Nvidia GPUs as well. But The Witcher 3 got a patch that included the option to reduce tessellation to 16x, I believe, and the game is now playable even on my 970. I don't know whether it's Nvidia's fault that it was previously locked to 32x, or whether it was up to the devs, but you can now set it to "low".

It was natively locked to 64x, and only AMD users were able to manually override it via Catalyst. If they recently added a tessellation slider in-game, then good; last I checked, they only allowed the AA on HairWorks to be adjusted.

Anyway, this comes back to the point I've made repeatedly in this thread (not targeted at you). This is about much more than an Nvidia vs. AMD thing. This is about ensuring that applications use the GPU in the most efficient manner possible to create a visual payoff. That should be the objective when writing these effects, even in a GameWorks title: to serve the game. The objective should not be to create a pseudo-synthetic workload that helps highlight differences between architectures.

Nvidia users, too, should strive for the best possible performance, and by that I mean the best possible absolute performance - not something that makes Nvidia beat AMD, or makes Nvidia's new gen beat Nvidia's old gen, or encourages GPU upgrades. Inevitably, when you chase those goals you hurt Nvidia users too, giving them lower absolute performance than they could have had, just to create an FPS difference between Nvidia and AMD, or between Nvidia generations. In the end it's a lose-lose for both Nvidia and AMD users; it serves neither the game nor the consumers. It's very easy to bring even the most powerful GPU to its knees with bad software. Good software is about using resources efficiently.

We should not stand for things like...

Using more tessellation than is required to achieve a visual payoff:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Asking the GPU to render things which are not in the scene:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/3

i.e., lowering the framerate artificially with a fake workload.

This stuff may make Nvidia come out slightly ahead of AMD in benchmarks. Nvidia users may look at those results and feel good, not realizing how much absolute performance they have sacrificed in the process. The sad reality with the above workloads is that all GPUs are artificially gimped, so there is no point in celebrating the fact that some are more gimped than others.
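For illustration, here's the kind of cap I mean (a rough sketch of my own, not anything from HairWorks or CD Projekt; the names and numbers are assumptions): derive the tessellation factor from how big the edge actually is on screen, then clamp it to a sensible user cap instead of always running at the 64x hardware maximum.

#include <algorithm>

// Illustrative only: pick subdivision from screen coverage and clamp it,
// rather than always tessellating at the hardware maximum of 64x.
float TessFactorForEdge(float edgeLengthPixels,   // projected edge length on screen
                        float pixelsPerSegment,   // target triangle density, e.g. 8 px
                        float userCap)            // slider/driver cap, e.g. 16.0f
{
    float wanted = edgeLengthPixels / pixelsPerSegment;
    return std::clamp(wanted, 1.0f, std::min(userCap, 64.0f));
}

// Example: a strand edge covering 40 px with an 8 px target and a 16x cap
// gets a factor of 5 instead of 64 - roughly the same look, a fraction of the work.

The same reasoning applies to the "rendering things not in the scene" complaint: cull geometry (like an ocean plane hidden under the ground) before it ever reaches the tessellator, so the GPU only spends time on what actually produces a visual payoff.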


60 with

60 without

A 280 is more than enough to run it.

Please turn off V-Sync, or whatever else is limiting the FPS, and tell me the numbers.

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


It was natively locked to 64x, and only AMD users were able to manually override it via Catalyst. If they recently added a tessellation slider in-game, then good; last I checked, they only allowed the AA on HairWorks to be adjusted.

Anyway, this comes back to the point I've made repeatedly in this thread (not targeted at you). This is about much more than an Nvidia vs. AMD thing. This is about ensuring that applications use the GPU in the most efficient manner possible to create a visual payoff. That should be the objective when writing these effects, even in a GameWorks title: to serve the game. The objective should not be to create a pseudo-synthetic workload that helps highlight differences between architectures.

Nvidia users, too, should strive for the best possible performance, and by that I mean the best possible absolute performance - not something that makes Nvidia beat AMD, or makes Nvidia's new gen beat Nvidia's old gen, or encourages GPU upgrades. Inevitably, when you chase those goals you hurt Nvidia users too, giving them lower absolute performance than they could have had, just to create an FPS difference between Nvidia and AMD, or between Nvidia generations. In the end it's a lose-lose for both Nvidia and AMD users; it serves neither the game nor the consumers. It's very easy to bring even the most powerful GPU to its knees with bad software. Good software is about using resources efficiently.

We should not stand for things like...

Using more tessellation than is required to achieve a visual payoff:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Asking the GPU to render things which are not in the scene:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/3

i.e., lowering the framerate artificially with a fake workload.

This stuff may make Nvidia come out slightly ahead of AMD in benchmarks. Nvidia users may look at those results and feel good, not realizing how much absolute performance they have sacrificed in the process. The sad reality with the above workloads is that all GPUs are artificially gimped, so there is no point in celebrating the fact that some are more gimped than others.

YES! THANK YOU! That was the article about the tessellated ocean I was looking for.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


Please turn off V-Sync, or whatever else is limiting the FPS, and tell me the numbers.

I'd have to re-download it >.< - I don't keep games I don't replay on my PC.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


I'd have to re-download it >.< - I don't keep games I don't replay on my PC.

xD It's okay, man or woman. I just wanted you to know that TressFX has an impact on FPS. How big an impact I'm not sure, but I think it was 10-30 FPS for me.

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


xD It's okay, man or woman. I just wanted you to know that TressFX has an impact on FPS. How big an impact I'm not sure, but I think it was 10-30 FPS for me.

An impact is certain - it's more work - but it was the same, more or less, across vendors and brands.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


An impact is certain - it's more work - but it was the same, more or less, across vendors and brands.

You mean the same, as in it was 60 FPS, right?

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


You mean the same, as in it was 60 FPS, right?

I mean that both AMD and Nvidia users would lose a few FPS, but it was the same for both.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


GameWorks is their VR thing, right?

It's their black box.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


It's their black box.

 

 

Oh.

 

It's not a black box.

 

Every single time this is brought up, it always comes from the people who irrationally have a bone to pick with a company that makes the extra sauce that developers choose to use.

 

I posted a reply to one of the said irrational people and linked it in my signature. Even though I actually present proof, it's dismissed because it doesn't fit with what they want the truth to be.


It was natively locked to 64x, and only AMD users were able to manually override it via Catalyst. If they recently added a tessellation slider in-game, then good; last I checked, they only allowed the AA on HairWorks to be adjusted.

Anyway, this comes back to the point I've made repeatedly in this thread (not targeted at you). This is about much more than an Nvidia vs. AMD thing. This is about ensuring that applications use the GPU in the most efficient manner possible to create a visual payoff. That should be the objective when writing these effects, even in a GameWorks title: to serve the game. The objective should not be to create a pseudo-synthetic workload that helps highlight differences between architectures.

Nvidia users, too, should strive for the best possible performance, and by that I mean the best possible absolute performance - not something that makes Nvidia beat AMD, or makes Nvidia's new gen beat Nvidia's old gen, or encourages GPU upgrades. Inevitably, when you chase those goals you hurt Nvidia users too, giving them lower absolute performance than they could have had, just to create an FPS difference between Nvidia and AMD, or between Nvidia generations. In the end it's a lose-lose for both Nvidia and AMD users; it serves neither the game nor the consumers. It's very easy to bring even the most powerful GPU to its knees with bad software. Good software is about using resources efficiently.

We should not stand for things like...

Using more tessellation than is required to achieve a visual payoff:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Asking the GPU to render things which are not in the scene:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/3

i.e., lowering the framerate artificially with a fake workload.

This stuff may make Nvidia come out slightly ahead of AMD in benchmarks. Nvidia users may look at those results and feel good, not realizing how much absolute performance they have sacrificed in the process. The sad reality with the above workloads is that all GPUs are artificially gimped, so there is no point in celebrating the fact that some are more gimped than others.

 

I agree with pretty much everything, except I think the difference would be there anyway with tessellation. But if they had used less of it, the game would actually have been playable on AMD cards, and more than playable on Nvidia. It seems like they wanted to make the game unplayable on AMD (but playable on Nvidia), so they hurt performance on their own hardware as well. They did, however, make it possible to reduce tessellation in W3 now, so it's a step in the right direction.

i7 9700K @ 5 GHz, ASUS DUAL RTX 3070 (OC), Gigabyte Z390 Gaming SLI, 2x8 HyperX Predator 3200 MHz


It's not a black box.

 

Every single time this is brought up, it always comes from the people who irrationally have a bone to pick with a company that makes the extra sauce that developers choose to use.

 

I posted a reply to one of the said irrational people and linked it in my signature. Even though I actually present proof, it's dismissed because it doesn't fit with what they want the truth to be.

 

It's literally high level AAA game developers calling it a black box, not some random internet guy.

 

Since then, Nvidia has changed things with their GameWorks contracts, so you can buy full access to the code (but that will cost). Basic GameWorks access (like CDPR had for The Witcher 3) neither shows you the source code nor gives you access to change or optimize it.

 

As for your signature, the only GameWorks game you use in your example shows that GameWorks gimps AMD, so I'm not sure what it is you think you're proving.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


It's literally high level AAA game developers calling it a black box, not some random internet guy.

 

Since then, Nvidia has changed things with their GameWorks contracts, so you can buy full access to the code (but that will cost). Basic GameWorks access (like CDPR had for The Witcher 3) neither shows you the source code nor gives you access to change or optimize it.

 

As for your signature, the only GameWorks game you use in your example shows that GameWorks gimps AMD, so I'm not sure what it is you think you're proving.

 

Everything you say about this is a complete lie. You're so delusional. We've gone over this so many times I'm starting to think you're insane.


Everything you say about this is a complete lie. You're so delusional. We've gone over this so many times I'm starting to think you're insane.

 

What is a lie?

The fact that AAA game developers called GameWorks a black box? I can find the source if you want.

The fact that GameWorks games tend to run like crap on AMD? I can quote the list of GameWorks games again, if you want.

 

You can say what you want, but the empirical evidence simply speaks against you. And since you can't provide any sources to back up your claims, and instead resort to calling me a liar and insane, I think it's safe to say you have nothing.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


What is a lie?

The fact that AAA game developers called GameWorks a black box? I can find the source if you want.

The fact that GameWorks games tend to run like crap on AMD? I can quote the list of GameWorks games again, if you want.

 

You can say what you want, but the empirical evidence simply speaks against you. And since you can't provide any sources to back up your claims, and instead resort to calling me a liar and insane, I think it's safe to say you have nothing.

 

I've seen your source. I've also seen actual evidence to support that it's a lie, something I replied to you with ages ago, but you just dismissed it.

 

No one is denying that GameWorks runs like shit on AMD. That's not the point.


I've seen your source. I've also seen actual evidence to support that it's a lie, something I replied to you with ages ago, but you just dismissed it.

 

No one is denying that GameWorks runs like shit on AMD. That's not the point.

 

So the discussion you want to have is whether a GameWorks-branded game runs worse on AMD than on Nvidia even without GameWorks effects activated?

 

That is surely a more complicated matter, as few reviewers disable ALL GameWorks effects in their benchmarks (this includes HBAO and TXAA).

 

The Witcher 3 and Far Cry 4 do seem to work fairly well on AMD without GameWorks, but the rest all seem to run poorly on AMD even without these effects, like in your own example with Assassin's Creed Unity, where a 780 beats a 290X.

 

If you know of a GameWorks game that runs better on AMD than on Nvidia (or at least similarly), please do enlighten me.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


You act like every engine can be modified to look however they want without MASSIVE, EXTENDED WORK that will break with the next version. Unreal? Almost every single UE3 game had that plastic sheen on every surface. Source? I can tell a Source game from a mile away. CryEngine? Extreme shininess. Frostbite? Let's put it this way: in the first 5 seconds of the new NFS teaser trailer months and months ago, I knew from the way the water looked on the road that it was Frostbite, and thus it HAD to be an NFS announcement. It was.

You're proposing to stagnate the intended feel and look of games to suit the 3-4 big game engines out there, and to stifle game art for that. No thank you. The Witcher 3 has a fantastic engine, and nothing out there can make it look like REDEngine. GTA V? Its own engine. Nothing else would look right for how Rockstar wanted GTA V to look.

Umm, no. The more support engine companies get, through more customers and more money, the more often they can update said engines.

-------

Current Rig

-------

