Well, it looks like HBAO+ is behind all that corruption in Gears of War.

4 minutes ago, Humbug said:

I think the dev will be able to patch it in later. It's just that they have to get the patch approved by Microsoft?

According to PCPer, anything that significant needs to be done during development; retrofitting it afterwards is extremely difficult.

 

They had to work directly with Oxide just to get FCAT support built into the game engine; that took a lot of back and forth, and it still doesn't work properly, or on all GPUs.

Multi-GPU support is going to be significantly more difficult, especially since NVIDIA and AMD can't get in there directly, and their long-standing driver profile system is defunct under UWP.


7 minutes ago, Misanthrope said:

I'd almost be pissed, if it wasn't for the fact that nobody should fucking care about a fucking crap relaunch.

 

Also, I say that a consumer protection advocacy group should mandate that Nvidia GameWorks features are capable of being turned off and fully fucking disclosed to the end user.

 

7 minutes ago, Helly said:

This is gonna happen more often. It's Nvidia's fault for making it run like crap on pretty much everything, and Microsoft's/the devs' fault for using it (and taking money for using it). Nvidia is losing the performance advantage going into DX12 big time. They will do ANYTHING to make AMD look slow(er). They still hold 70% of the market, and 95% of people here use and recommend Nvidia. Hell, most everyone here saying Nvidia are scumbags for doing this uses Nvidia GPUs, and you're all waiting for Pascal; you're gonna buy it when it comes out.

 

Except this has nothing to do with NV. Did you even look up anything else related to the Windows Store and games? How UWP forces restrictions on games, and removes features that have been there for decades?

Never mind that in newly released games like Fallout 4 and Rise of the Tomb Raider, HBAO+ actually runs better on AMD hardware than on NVIDIA's.

This is all squarely on MS, their developers and their horrific UWP system for the Windows Store. 

http://www.overclock3d.net/reviews/gpu_displays/rise_of_the_tomb_raider_pc_performance_retested_with_new_amd_drivers/6

http://www.overclock3d.net/reviews/gpu_displays/fallout_4_retested_hbao_performance_impact/1

Please, folks, look up PCPer's work and articles related to UWP and games, and the same from Ars Technica, on how massively bad it is for everyone. It makes even Ubisoft's terrible GameWorks implementations or Batman: Arkham Knight look good.


7 minutes ago, Humbug said:

LOL no.

not even close.

 

OK, I'll admit it, it's changed somewhat over the last few months, but it's still "get a 970/380(X)/390". I would like to see more AMD-only recommendations. Also, please understand that I'm not an AMD fanboy, even though it might look like that. I just hate Nvidia :P

 

Oh, and also, less talk about how awesome Pascal is gonna be... more Polaris, please ;)


5 minutes ago, Valentyn said:

 

 

Except this has nothing to do with NV. Did you even look up anything else related to the Windows Store and games? How UWP forces restrictions on games, and removes features that have been there for decades?

Never mind that in newly released games like Fallout 4 and Rise of the Tomb Raider, HBAO+ actually runs better on AMD hardware than on NVIDIA's.

This is all squarely on MS, their developers and their horrific UWP system for the Windows Store.

http://www.overclock3d.net/reviews/gpu_displays/rise_of_the_tomb_raider_pc_performance_retested_with_new_amd_drivers/6

http://www.overclock3d.net/reviews/gpu_displays/fallout_4_retested_hbao_performance_impact/1

Please, folks, look up PCPer's work and articles related to UWP and games, and the same from Ars Technica, on how massively bad it is for everyone. It makes even Ubisoft's terrible GameWorks implementations or Batman: Arkham Knight look good.

Actually, you completely misunderstood: whenever an Nvidia GameWorks feature is included in a game, it should be clearly labeled and clearly disclosed. This would be the responsibility OF MICROSOFT (obviously, they're the publisher). I was actually on your side all along.


1 minute ago, Misanthrope said:

Actually, you completely misunderstood: whenever an Nvidia GameWorks feature is included in a game, it should be clearly labeled and clearly disclosed. This would be the responsibility OF MICROSOFT (obviously, they're the publisher). I was actually on your side all along.

Ah, terribly sorry. So far, though, I've just seen needless NV bashing; it's gotten out of hand, to the point that people have completely forgotten that even the Rise of the Tomb Raider MS Store version is still a horrible mess as well.

People just need to boycott the MS Store until they wake up and actually care about PC gamers.


43 minutes ago, Prysin said:

To be honest, even for MS, this is a new level of retarded.

Usually they are just obnoxious, not straight up sabotaging shit.

 

Although, I wonder what their next move will be... a monthly subscription to use D3D12 or later?

Either way, as long as I have one fully Windows 7 compatible computer that can't upgrade to Windows 10, none of them will be running MSX and its DX12.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

18 minutes ago, Helly said:

 

OK, I'll admit it, it's changed somewhat over the last few months, but it's still "get a 970/380(X)/390". I would like to see more AMD-only recommendations.

 

On this forum and places like it you see loads of AMD recommendations. Just go to the graphics card section and browse.

 

AMD's problem in the GPU space is a marketing one. A lot of BS gets spread about their drivers and such, which, combined with the brand strength of the 'GeForce' name, causes an issue for AMD. So, for example, if you have one bad experience with an AMD GPU, you will get a lot of negative reinforcement saying AMD sucks. NVIDIA's marketing is strong enough that even when people have serious technical issues with NVIDIA, they are more likely to chalk it up to bad luck.

 

In the GPU space AMD doesn't seem to have a problem with technology or performance: their drivers are solid, and they can use the same architecture for years and still outperform NVIDIA at most price points... But they do have a marketing problem.

 


19 minutes ago, Humbug said:

In the GPU space AMD doesn't seem to have a problem with technology or performance: their drivers are solid, and they can use the same architecture for years and still outperform NVIDIA at most price points... But they do have a marketing problem.

 

 

If you go to Reddit, it's become a meme to always say "should have gotten the AMD 390" because it gets recommended so much.

Personally, my biggest issue with AMD is their drivers, which is why I moved from a FirePro-based OpenCL workstation to an NV CUDA one.

It's more of an issue for me with games, as I like to be able to do both on my system. AMD needs better and faster CrossFire support in their drivers, they need to fix the long-running clock speed fluctuation bug, and they most certainly need CrossFire and FreeSync windowed mode support.

Give me those and I'll be on dual Polaris, just like I was on dual Tahiti before Maxwell.

I do have to hand it to them: they are amazing at squeezing extra performance out of cards over the years. Who knew that when the 7970 launched, only able to compete with the GTX 580, we'd be here now with its architecture, refined into the Tonga core of the 380, still working hard.

Even the 290X is amazing: it matched the Titan at launch, and as the refined 390X it now goes up against and even beats the GTX 980.

 

On the note of marketing, however, AMD are useless. I'm still salty about them being silent on PureHair and its base being TressFX 3.0. They need to get people to look at them.


Even more details on why Gears of War is a massive steaming pile of dung: it's still using the original release of Unreal Engine 3 with everything else tacked on, and then forced into the terrible UWP system's restrictions.

 

@DocSwag

http://www.extremetech.com/gaming/223858-the-new-gears-of-war-ultimate-edition-is-a-dx12-disaster

 

 

Quote

Unlike Ashes of the Singularity or Fable Legends, Gears of War Ultimate Edition was never designed to be a DX12 — or even a DX11 — title. When Digital Foundry reviewed the game last August, it noted:

 

 

 

While Gears of War 4 is in development using Unreal Engine 4, Gears Ultimate instead opts for more familiar ground – the original 2006 source code. From the beginning, the Ultimate Edition was designed to capture the original experience as accurately as possible while updating its presentation for the current generation. More recent versions of Unreal Engine 3, and even UE4, were considered early in development, but the decision to stick with the original codebase was made in order to preserve the original simulation. (emphasis added)

 

 

 

 

Gears of War Ultimate Edition isn’t a new implementation of a classic game, it’s built on the same source code and engine as its 10-year-old predecessor. That means everything The Coalition did to bring the game into the modern age, like adding 4K support and higher-quality textures, was done with a version of the Unreal Engine that was barely out of diapers. Not even the latest version of UE3 supports DX12 — but Microsoft decided to stuff it into a decade-old title and shove it into the Windows Store. However they hacked the engine to implement DirectX 12, there’s no way that the 2006-era Unreal engine could ever be considered a good candidate for the process.

 

 

 


9 minutes ago, Valentyn said:

 

If you go to Reddit, it's become a meme to always say "should have gotten the AMD 390" because it gets recommended so much.

Personally, my biggest issue with AMD is their drivers, which is why I moved from a FirePro-based OpenCL workstation to an NV CUDA one.

It's more of an issue for me with games, as I like to be able to do both on my system. AMD needs better and faster CrossFire support in their drivers, they need to fix the long-running clock speed fluctuation bug, and they most certainly need CrossFire and FreeSync windowed mode support.

Give me those and I'll be on dual Polaris, just like I was on dual Tahiti before Maxwell.

I do have to hand it to them: they are amazing at squeezing extra performance out of cards over the years. Who knew that when the 7970 launched, only able to compete with the GTX 580, we'd be here now with its architecture, refined into the Tonga core of the 380, still working hard.

Even the 290X is amazing: it matched the Titan at launch, and as the refined 390X it now goes up against and even beats the GTX 980.

On the note of marketing, however, AMD are useless. I'm still salty about them being silent on PureHair and its base being TressFX 3.0. They need to get people to look at them.

That's the downside of open source shit.

Once you "give it away" you cannot just go "we made that"... because PureHair is a modified version. We do not know HOW modified, but it IS modified. Thus AMD would be in the wrong to claim "look at that, that's our tech". It is not; it is Crystal Dynamics' tech.


Just now, Prysin said:

That's the downside of open source shit.

Once you "give it away" you cannot just go "we made that"... because PureHair is a modified version. We do not know HOW modified, but it IS modified. Thus AMD would be in the wrong to claim "look at that, that's our tech". It is not; it is Crystal Dynamics' tech.

They don't need to claim it's their tech; they can, however, claim it's a perfect example of what's to come with GPUOpen. They provided Crystal Dynamics and Nixxes with TressFX, and it was modified and redesigned to perfectly fit their needs and their game, while allowing amazing performance on all platforms.

That's the beauty of it. Instead they said nothing; even NVIDIA mentions PureHair on their website, citing how it's similar tech to HairWorks and how brilliant it is.


13 hours ago, Citadelen said:

I just noticed it says Games of War instead of Gears of War in the title...

Lol, whoops, will fix.


If you ask me this is MS's fault, as they implemented it pretty badly, and it was also them who made it impossible to disable PhysX.

However, all the Nvidia points being made are saying GameWorks should die in general. In my opinion the problem here is MS, but GameWorks is also bad in general. This case probably isn't Nvidia's fault.

Either way, Gears of War seems to be a total failure.


http://wccftech.com/nvidia-gameworks-visual-corruption-gears-war-ultimate-edition/

Quote

We have also tested the game again with the newly released patch. This patch was intended to fix the visual corruption issue with ambient occlusion turned on. However, it appears that all this patch does to address the issue is forcibly disable Ambient Occlusion, even if enabled through the menu.

MS getting desperate.
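Going by that quote, the "fix" simply ignores the user's menu choice. Purely as a hypothetical C++ illustration of the behavior WCCFTech describes (none of these names are from the actual game code):

struct UserSettings {
    bool ambientOcclusion; // what the player picked in the menu
};

// Hypothetical sketch of the patched behavior: the menu toggle still
// exists and is still read, but the renderer forces AO off regardless.
bool IsAmbientOcclusionEnabled(const UserSettings& settings)
{
    (void)settings.ambientOcclusion; // user's choice is ignored
    return false;                    // HBAO+ forcibly disabled
}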


20 minutes ago, Valentyn said:

They don't need to claim it's their tech; they can, however, claim it's a perfect example of what's to come with GPUOpen. They provided Crystal Dynamics and Nixxes with TressFX, and it was modified and redesigned to perfectly fit their needs and their game, while allowing amazing performance on all platforms.

That's the beauty of it. Instead they said nothing; even NVIDIA mentions PureHair on their website, citing how it's similar tech to HairWorks and how brilliant it is.

Well, it's no shock that AMD's marketing department is completely devoid of all common sense.

Literally, they should fire all of them for sheer incompetence.


17 hours ago, QueenDemetria said:

This is why Microsoft needs to abandon gaming entirely, and let someone with more passion and understanding of their customers run a game service.

Don't you mean Nvidia? Microsoft understands and knows gaming inside out.


46 minutes ago, Valentyn said:

 That means everything The Coalition did to bring the game into the modern age, like adding 4K support and higher-quality textures, was done with a version of the Unreal Engine that was barely out of diapers. Not even the latest version of UE3 supports DX12 — but Microsoft decided to stuff it into a decade-old title and shove it into the Windows Store. However they hacked the engine to implement DirectX 12, there’s no way that the 2006-era Unreal engine could ever be considered a good candidate for the process.

Ya, according to what we have been told by Croteam, Valve, and Chris Roberts, there is no benefit to simply making a game engine DX12 or Vulkan compatible. To actually get the benefit you have to refactor the engine at a more fundamental level; otherwise there are loads of other bottlenecks which make Vulkan/DX12 pointless.
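To put that in concrete terms, here is a rough C++ sketch of what DX12-style frame submission looks like. This is a generic illustration with made-up names, not anything from the Gears codebase, and GPU fencing/synchronization is omitted for brevity. The CPU-side win DX12 offers is recording command lists on several threads and submitting them in one batch, a structure a 2006-era single-threaded renderer simply doesn't have:

#include <d3d12.h>
#include <thread>
#include <vector>

// Illustrative only: device, queue, allocators and lists are assumed to be
// created elsewhere, and fencing before allocator reuse is omitted.
void SubmitFrame(ID3D12CommandQueue* queue,
                 std::vector<ID3D12CommandAllocator*>& allocators,
                 std::vector<ID3D12GraphicsCommandList*>& lists)
{
    // Each worker thread records its own slice of the frame's draw calls.
    std::vector<std::thread> workers;
    for (size_t i = 0; i < lists.size(); ++i) {
        workers.emplace_back([&, i] {
            allocators[i]->Reset();
            lists[i]->Reset(allocators[i], nullptr); // no initial pipeline state
            // ... record this thread's draw commands here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One batched submission replaces thousands of per-draw, driver-validated
    // immediate-context calls; this is where the CPU savings live.
    std::vector<ID3D12CommandList*> batch(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(batch.size()), batch.data());
}

An engine that still walks the scene and issues draws one at a time on a single thread pays the porting cost without ever touching that path, which is presumably why bolting DX12 onto a 2006 engine gains nothing.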

 

But in the case of Gears of War, I guess we will not have a DX11 version to compare against.


Forget about NVidia and Microsoft for a second; how in the world of yarn did the developers manage to keep so much of the original code and implementation?

 

This level of developer laziness is absolutely deplorable. This is neither the fault of NVidia, whose work has been in GOW from the get-go, nor of Microsoft, which had the GOW series as a console exclusive with a GFWL caveat.


18 minutes ago, Humbug said:

Ya, according to what we have been told by Croteam, Valve, and Chris Roberts, there is no benefit to simply making a game engine DX12 or Vulkan compatible. To actually get the benefit you have to refactor the engine at a more fundamental level; otherwise there are loads of other bottlenecks which make Vulkan/DX12 pointless.

But in the case of Gears of War, I guess we will not have a DX11 version to compare against.

The worst part is, I don't even think DX11 could be properly supported. Unreal Tournament 3 only officially launched in 2007, to showcase the Unreal 3 engine in all its glory.

 

Which means the 2006 version wasn't even completely finished to Epic's own standards for use. Never mind that the engine was inherently built for DX9 at the time, let alone DX10 or 11.



Why am I not surprised that Nvidia did this?


Not surprised by the reaction here, especially when the forum is full of kids who easily point fingers at anybody.

The ones that should be blamed here are the Microsoft devs, for poorly implementing features. But what can you expect? They're a bunch of console-peasant devs who know little about PC. And also, it's a MICROSOFT STORE only game. The options you have will be very limited. You have to deal with a lot of bullshit. Try playing RoTTR there.


1 hour ago, DocSwag said:

If you ask me this is MS's fault, as they implemented it pretty badly, and it was also them who made it impossible to disable PhysX.

However, all the Nvidia points being made are saying GameWorks should die in general. In my opinion the problem here is MS, but GameWorks is also bad in general. This case probably isn't Nvidia's fault.

Either way, Gears of War seems to be a total failure.

If the game was built on UDK, then PhysX is used as the core physics engine, as there is no support for Havok. And since there's no option to disable it, I'm guessing it was built on UDK.

 

Looking through gameplay, there aren't any PhysX-specific particles or fabrics or anything like that, just the core game physics. At that point it's no heavier than if it were built on Havok, albeit with worse multithreading when running on the CPU, which shouldn't be an issue anyway unless you've got a potato PC. Remember, it's a physics engine that does more than just pretty effects.
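For anyone who hasn't touched it, "core game physics" means the entire rigid-body world is stepped through PhysX every frame, so there is nothing to switch off without breaking the simulation. A minimal C++ sketch using the modern PhysX 3.x API for illustration (the 2006-era UE3 actually shipped with the older Ageia PhysX 2.x, and none of this is from the game's code):

#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc desc(physics->getTolerancesScale());
    desc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2); // 2 CPU worker threads
    desc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(desc);

    // The game loop steps the entire world simulation on the CPU.
    for (int frame = 0; frame < 60; ++frame) {
        scene->simulate(1.0f / 60.0f); // kick off the solver for one step
        scene->fetchResults(true);     // block until results are ready
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}

The cpuDispatcher line is also where the CPU-side multithreading mentioned above gets configured.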


26 minutes ago, Texhnlyze said:

Not surprised by the reaction here, especially when the forum is full of kids who easily point fingers at anybody.

The ones that should be blamed here are the Microsoft devs, for poorly implementing features.

And the game devs themselves.

Link to comment
Share on other sites

Link to post
Share on other sites
