
Well, it looks like HBAO+ is behind all that corruption in Gears of War.

3 hours ago, Citadelen said:

I just noticed it says Games of War instead of Gears of War in the title...

[attached screenshot: gow.thumb.png]


8 hours ago, Citadelen said:

Nvidia has lost what little respect it had left with me as a company. That they haven't got asynchronous compute working in their drivers six months later, but they do have DX12 Gameworks up and running, says a lot about their priorities.

I get the feeling you're copy/quoting that from a certain someone :333

 

But yeah, it's pretty sucky overall. Nvidia are probably going to try to rely on Gameworks to hamper AMD's performance in order to stay competitive.

System specs
  • Graphics card: Asus GTX 980 Ti (temp target: 60 °C, fan speed: slow as hell)
  • CPU: Intel 6700K @ 4.2 GHz
  • CPU heatsink: Thermalright Silver Arrow Extreme
  • Motherboard: Asus Maximus VIII Gene
  • RAM: 8GB of DDR4 @ 3000 MHz
  • Headphone source: O2 + ODAC
  • Mic input: Creative X-Fi Titanium HD
  • Case: Fractal Design Arc Midi R2
  • Boot drive: Samsung 840 Pro 128GB
  • Storage: Seagate SSHD 2TB
  • PSU: Be quiet! Dark Power Pro 550 W

Peripherals

  • Monitor: Asus ROG Swift PG278Q
  • Mouse: Razer DeathAdder Chroma (16.5 inch/360)
  • Mouse surface: Mionix Sargas 900
  • Tablet: Wacom Intuos Pen
  • Keyboard: Filco Majestouch Ninja, MX Brown, Ten Keyless 
  • Headphones: AKG K7xx
  • IEMs: BrainWavs S1

7 hours ago, SamStrecker said:

Why is an Nvidia technology running poorly on AMD a surprise?

It does run well on AMD. It runs OK on the 280 series, 290 series, 390 series, etc. It's just that something strange happens when you try to run it on a GCN 1.2 graphics card. Basically it's an oversight in the game's beta testing process: nobody tested it on a GCN 1.2 GPU.

 

7 hours ago, AlexGoesHigh said:

Nope, and this is why there's no PCPer or Anandtech article. Fraps, DXtory, OBS and every other recording tool doesn't work (except for MS's own version in the Xbox app, which is very barebones and by design limited to a certain amount of recording time), and it also blocks any kind of overlay unless the game's code allows it. In fact, unless they fundamentally change how the sandboxing works or add some sort of pass-through API, for any app that wants to touch a store app, the store app must contain code that whitelists that app and its functions.

Oh man, that really sucks. Most games don't include an in-game benchmark, so this would mean no more articles from Guru3D where they test 20 different graphics cards to see how they perform. And no adding the exe to Steam to use Steam features such as broadcasting, streaming, presence, screenshots, etc.

Even Nvidia and AMD would have less flexibility for their engineers to test games and find bottlenecks; they would have to rely entirely on the game studios shipping them development builds.


43 minutes ago, huilun02 said:

The Sabotageworks is real

How people even defend this ahole of a company is beyond me. Heck they even screw with their own 700 series customers.

Microsoft isn't any better with its exclusive and locking down nonsense.

Yes, Nvidia created it, but they aren't forcing companies to use it. The developers are the ones making the choice, so it's not fair to blame Nvidia. It's like blaming a gun company when someone else pulls the trigger... makes no sense...


9 hours ago, DocSwag said:

It also seems Gears of War implements PhysX. As most of us know, PhysX can make a game look a lot better. However, this comes at a performance penalty, especially for AMD graphics cards: whenever an AMD graphics card is found in the system, all the PhysX work is offloaded to the CPU, which can cause severe bottlenecking for the GPU and drastically reduce performance.

In past games, PhysX could always be disabled. However, with Gears of War it currently seems impossible to disable. There is no setting in the game, and when you try to edit the file that dictates whether or not PhysX is enabled, the setting is reverted once your PC connects to Microsoft's servers.

 

Just make basengine.ini read only after changing it? It's not that freaking hard lol.
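For what it's worth, here's a minimal C++17 sketch of that suggestion, just flipping the write bits off so the file becomes read-only. The file name/path here is only a placeholder, and as later replies point out, the UWP store apparently reverts such edits anyway, so treat this as illustrative only.

```cpp
// Sketch only: mark an edited config file read-only so it can't be silently
// overwritten. The path below is a placeholder, not the real install location.
#include <filesystem>
#include <iostream>

int main() {
    namespace fs = std::filesystem;
    const fs::path cfg = "baseengine.ini";  // hypothetical path to the edited config

    std::error_code ec;
    fs::permissions(cfg,
                    fs::perms::owner_write | fs::perms::group_write | fs::perms::others_write,
                    fs::perm_options::remove, ec);  // on Windows this sets the read-only attribute
    if (ec) {
        std::cerr << "Could not set read-only: " << ec.message() << '\n';
        return 1;
    }
    std::cout << "Marked " << cfg << " read-only\n";
}
```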

8 hours ago, QueenDemetria said:

This is why Microsoft needs to abandon gaming entirely, and let someone with more passion and understanding of their customers run a game service.

If you look up, you can read a stupid statement in its wild habitat.

6 hours ago, samcool55 said:

It's even worse than that. In some cases the 960 beats the 780... (not the Ti, but the 780 was still expensive.)

Can you imagine a next-gen card priced at 200 bucks beating the 980 and destroying the 970, currently the most popular GPU? It would be insane...

But with Nvidia I won't be surprised!...

With the architecture changes, it's likely it'll happen.

 


10 minutes ago, huilun02 said:

They were incentivized to use Gameworks. How about I put a gun at your disposal and then pay you to use it?

There is a giant hole in that logic... I can turn down the job...

Incentives aren't forcing anyone; it's still their free choice.


3 minutes ago, huilun02 said:

Except companies operate for the money, and if taking up the deal means a higher net return, they will do it.

Like so many devs have done before.

An unethical thing to do on both counts.

 

When it comes to morals, there is no way you can defend Nvidia.

So then every company has bad morals... AMD used Mantle, Nvidia uses Gameworks. You can't expect one to be "moral" when the other isn't.

It would never have been a big deal if the DEVELOPER had made the options... well, optional.


8 hours ago, Dabombinable said:

Just FYI, PhysX can be run on the CPU, just nowhere near as well as on any graphics or PhysX card.

 

Yeah sure, HBAO+ can be turned off, that is fine. PhysX cannot be turned off, that is not fine. We already know how big the driver overhead is on AMD. That's exploitation of a weakness.

The only other one I can think of is Project CARS, where PhysX cannot be turned off. This would be the second time Nvidia and a game dev have done this. Not a good sign.

 

 

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


16 minutes ago, huilun02 said:

AMD made Mantle and didn't stop anyone or Nvidia from using it. In fact they let other companies use it to make DX12 and Vulkan.

AMD made TressFX and doesn't deliberately screw with Nvidia cards' performance.

They pushed for Adaptive-Sync as a standard in DP and HDMI, and don't bar Nvidia from using it.

 

One does not simply generalize as an excuse to defend a scumbag company.

I just don't understand how you can define a company that creates proprietary software to incentivize its consumer base as a "scumbag"... Say hello to Apple, Microsoft, and every other tech giant out there. It was AMD's choice to make their software open source. Oh darn, there's that CHOICE word again.

Yes, a monopoly isn't a good thing, but on the other hand, making a better product with more incentives is what makes the world go round.


Guys, regardless of the usual problems where Gameworks is a bit slow (even on Nvidia), this particular case isn't an issue of Gameworks running slow on AMD. It runs fine in this game on most AMD GPUs. It's just a bug where it doesn't run correctly on GCN 1.2 graphics cards, which wasn't caught during the testing phase. The developer is to blame here.


1 hour ago, huilun02 said:

The Sabotageworks is real

How people even defend this ahole of a company is beyond me. Heck they even screw with their own 700 series customers.

Microsoft isn't any better with its exclusive and locking down nonsense.

The point of the argument was this lol, and it was answered before: it's not sabotage if you don't have to use it... makes a lot of sense, doesn't it? If you can't turn it off, then that's the developer's fault. Not that hard to comprehend...


People like to form opinions from clickbait articles.

I usually don't comment, but I just can't help myself here. No one anywhere is mentioning that the game is built on the old code, and that old code was infused with Nvidia tech back then. They probably did a hasty job of converting it to the new stuff, and in that haste they didn't update the function that swaps the HBAO+ implementation for normal SSAO on some AMD cards (which should happen on all of them). Why they didn't expose an option, I don't know, but you can do it automatically.
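For illustration only, the kind of automatic fallback being described is just a small vendor/architecture check in the renderer. This is a hypothetical sketch with made-up names (GpuInfo, chooseAmbientOcclusion), not the game's actual code:

```cpp
// Hypothetical engine-style sketch: choose an ambient-occlusion path per GPU,
// dropping from HBAO+ to plain SSAO where HBAO+ is known to misbehave.
#include <cstdio>

enum class AOTechnique { HBAOPlus, SSAO, Off };

struct GpuInfo {
    bool isAmd;
    bool isGcn12;            // e.g. the GCN 1.2 parts where the corruption appears
    bool hbaoPlusKnownGood;  // filled in from whatever detection the engine already does
};

AOTechnique chooseAmbientOcclusion(const GpuInfo& gpu, bool userWantsAO) {
    if (!userWantsAO)
        return AOTechnique::Off;       // an explicit menu toggle would hook in here
    if (gpu.isAmd && gpu.isGcn12)
        return AOTechnique::SSAO;      // dodge the GCN 1.2 corruption automatically
    return gpu.hbaoPlusKnownGood ? AOTechnique::HBAOPlus : AOTechnique::SSAO;
}

int main() {
    GpuInfo gcn12Card{true, true, false};  // e.g. an R9 380 / Fury class GPU
    std::printf("AO path: %d\n", static_cast<int>(chooseAmbientOcclusion(gcn12Card, true)));
}
```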

The second point is even more important. Unreal Engine 3, in the last days of its development, was leaning very heavily on PhysX as its physics engine, so much so that the new UE4 uses PhysX entirely for its physics. In fact, most of you on AMD hardware have already run games perfectly fine with PhysX in the back end.

Just really wanted to get that off my chest, and I probably will not answer further. Have fun discussing.


9 hours ago, samcool55 said:

It's even worse than that. In some cases the 960 beats the 780... (not the Ti, but the 780 was still expensive.)

Can you imagine a next-gen card priced at 200 bucks beating the 980 and destroying the 970, currently the most popular GPU? It would be insane...

But with Nvidia I won't be surprised!...

The 960 showed exactly the same performance as the original Titan in the Project CARS launch benchmarks.

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950 Pro 512GB // OCZ Vector 150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


The ugly truth is that Nvidia has become a scumbag company with Gameworks.

 

As for Microshaft, it's a mixed bag really. Half the shit they do is awesome, the other half is total bullshit. This was probably some error they left in the game though.

 

Also  I think I have created the spawn of hell:

 

Uplay + Gameworks 


11 hours ago, Prysin said:

No, guns are not needed. Just a brown paper bag full of Benjamins.

Now I can't say that that didn't happen.

 

11 hours ago, Humbug said:

It's on the dev. They're supposed to test the game properly before release.

It seems they didn't test any GCN 1.2 graphics cards.

But doesn't Microsoft own them? They're also the publisher, so the fault falls as much on Microsoft as on the devs. If an incorrectly made part is shipped from a factory to a customer, you blame QA just as much as the one who made it incorrectly, no?

 

 

3 hours ago, RagnarokDel said:

Just make basengine.ini read only after changing it? It's not that freaking hard lol.

-snip-

You don't like to read, do you?

 

8 hours ago, Valentyn said:

That's literally MS there. No one can access the game files or configs. The UWP system their store uses hides everything, and even if you manage to get access using a bootable Linux flash stick to explore the files, any changes you make are automatically reverted when the game is launched from the Windows Store.

 

This is how MS designed UWP, and anything that goes onto their store has to abide by it all.

Which also includes only running in windowed mode, no mGPU support, no API hooks, no modding, no overlays (so no performance reviews or benchmarks unless the game has a built-in one), and more.

Not just this post either, there's another, and this was also briefly mentioned in the OP.

You know what's easier than buying and building a brand new PC? Petty larceny!
If you're worried about getting caught, here's a trick: Only steal one part at a time. Plenty of people will call the cops because somebody stole their computer -- nobody calls the cops because they're "pretty sure the dirty-bathrobe guy from next door jacked my heat sink."


3 hours ago, RagnarokDel said:

 

With the architecture changes, it's likely it'll happen.

 

No, it shouldn't. Look at AMD: they use an old architecture and it keeps up very well, even with the newer Nvidia architectures.

I'm sorry, but if you release a new architecture and your rival can keep up with you by releasing a refreshed line-up, I don't know...

And it's something we only see on Nvidia cards; if you look at the equivalent AMD cards, everything is fine.

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


@DocSwag Please start removing formatting when you are pasting copied text. If on a PC, note the popup that asks you to remove formatting. Every single one of your threads has this issue.


7 hours ago, Pvt. 8Ball said:

I get the feeling you're copy/quoting that from a certain someone :333

 

But yeah, it's pretty sucky overall. Nvidia are probably going to try to rely on Gameworks to hamper AMD's performance in order to stay competitive.

Yes, I am, but it's true nonetheless.

Pixelbook Go i5 | Pixel 4 XL


10 hours ago, ivan134 said:

No mobile GPUs?

No multi-GPU support. UWP doesn't allow anything access to the game files or executables, so NV and AMD can't simply create an SLI or CrossFire profile. The game developer needs to actually build multi-card support directly into the game, and there's absolutely no way for it to be added later on through a patch, or by NV or AMD.
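For context, this is roughly what "the game has to do it itself" means under DX12: the engine enumerates the adapters through DXGI and then has to create devices and split the frame work across them explicitly. A minimal sketch using standard DXGI calls (error handling trimmed), not anything from the game itself:

```cpp
// Rough sketch: the engine can see every adapter, but under DX12 it has to create
// a device per adapter and divide the work itself -- there is no driver-side
// SLI/CrossFire profile to lean on.
#include <windows.h>
#include <dxgi.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %ls\n", i, desc.Description);
        // A multi-adapter renderer would create a D3D12 device on each suitable
        // adapter here and split frame work between the resulting devices explicitly.
        adapter->Release();
    }
    factory->Release();
}
```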

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


9 minutes ago, Valentyn said:

No multi-GPU support. UWP doesn't allow anything access to the game files or executables, so NV and AMD can't simply create an SLI or CrossFire profile. The game developer needs to actually build multi-card support directly into the game, and there's absolutely no way for it to be added later on through a patch, or by NV or AMD.

That. Is. Retarded. Typical MS.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


24 minutes ago, Dabombinable said:

That. Is. Retarded. Typical MS.

To be honest, even for MS this is a new level of retarded.

Usually they are just obnoxious, not straight-up sabotaging shit.

Although, I wonder what their next move will be... a monthly subscription to use D3DX12 or later?


12 hours ago, Fetzie said:

And the inability to disable PhysX?

Yes, because all other games that currently exist with Nvidia features in them allow you to disable those features. Again, this is not Nvidia's fault; the developers were too incompetent to add the ability to turn things on and off. The only thing Nvidia does is give the developer access to their toolkit; it's up to the developer how they use it.

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-D15

 

 

 


58 minutes ago, Valentyn said:

The game developer needs to actually build multi card support directly into the game, and then there's absolutely no way it can be added later on through a patch, or by NV or AMD.

I think the dev will be able to patch it in later; it's just that they have to get the patch approved by Microsoft?


This is gonna happen more often. It's Nvidia's fault for making it run like crap on pretty much everything, and Microsoft's/the devs' fault for using it (and taking money for using it). Nvidia is losing the performance advantage going into DX12 big time. They will do ANYTHING to make AMD look slow(er). They still hold 70% of the market, and 95% of people here use and recommend Nvidia. Hell, most everyone here saying Nvidia are scumbags for doing this uses Nvidia GPUs, and you're all waiting for Pascal; you're gonna buy it when it comes out.

You're all gonna wait and see if AMD's Polaris is any good, but you trust the scumbags to give you a good card. Guess what? You're giving these scumbags the bags of money they pay the devs with to sabotage the AMD cards... So this is gonna end only one way: AMD will look slow, everyone buys Nvidia, AMD goes under... Thanks everyone for ruining PC gaming, hope you're proud of yourselves.

Looking forward to the reactions saying I'm a dumbass and I don't know what I'm talking about... I'll throw this reaction in your face when it turns out I'm right...

I have no signature


I'd almost be pissed, if it wasn't for the fact that nobody should fucking care about a fucking crap relaunch.

Also, I say that a consumer protection advocacy group should mandate that Nvidia Gameworks features can be turned off and are fully fucking disclosed to the end user.

-------

Current Rig

-------

