[Updated 11/4/15] Fallout 4 to NOT feature Nvidia Gameworks, but Nvidia is still working with Bethesda on the game.

ChrisxIxCross

IMHO Witcher 3 is EXACTLY what GameWorks should be (once they patched in a tessellation slider). Graphical enhancements that are optional yet demanding enough to actually push current and next-gen hardware to its limits.

So 64x or whatever amount of tessellation HairWorks used is pushing the limits of current technology.

Sorry.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


So 64x or whatever amount of tessellation HairWorks used is pushing the limits of current technology.

Sorry.

I did say "once they patched in a tessellation slider".

 

8x is a more reasonable limit to use today, but who knows - maybe when everyone is playing this game at 4K and thinking its graphics suck, the 64x HairWorks will be a highlight...

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


I did say "once they patched in a tessellation slider".

 

8x is a more reasonable limit to use today, but who knows - maybe when everyone is playing this game at 4K and thinking its graphics suck, the 64x HairWorks will be a highlight...

kek, yeah

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


That started with one of the older Batman games and Crysis, I think - none of it can be proven, and from my own reading and investigation it's not true. Nvidia is not gimping AMD. Gimping AMD implies there is some code embedded in the game that purposefully makes Radeon cards run the game worse than they could even when the Nvidia effects aren't used. Someone mentioned Crysis had that issue with water being rendered under the world when it didn't need to be there, which is plausible, but that's the only case I've seen so far that could point to some meddling on Nvidia's end.

 

Also, there's a reason those effects are labeled "Nvidia Gameworks" or have "Nvidia" next to the effect name in the menus.

 

inb4 someone replies saying the contrary and labels Nvidia as being as bad as Hitler, when AMD does all the same things

well sir...

if you knew that the competition would perform worse if you threw shitloads of tessellation at them... and instead of, you know, using different techniques which would probably yield the same visual result AND probably the same or better FPS (tessellation is pretty hard on any GPU, really), Nvidia enters the stage and spams tessellation like a 14-year-old spams chat in an MMO... then isn't it blatantly obvious what's going on?

Excessive use of something the competition does worse, JUST because it is blatantly obvious that the competition does worse... well, yeah. It does not need to be code that nerfs AMD; this is sort of the "legal" way.... code gimping AMD would literally be an anti-competitive move and could get Nvidia sued for ridiculous amounts by the FTC and the EU given their market share.

On the flip side, using a technique that hits your own hardware hard too, not just AMD, just because you know AMD will get hit much harder... yeah, well, it's just "circumstantial evidence"... until you realize Nvidia does this whenever they add "The Way It's Meant to Be Played".... it's an abusive tactic that has yet to be "discovered as a deliberate action".


well sir...

if you knew that the competition would perform worse if you threw shitloads of tessellation at them... and instead of, you know, using different techniques which would probably yield the same visual result AND probably the same or better FPS (tessellation is pretty hard on any GPU, really), Nvidia enters the stage and spams tessellation like a 14-year-old spams chat in an MMO... then isn't it blatantly obvious what's going on?

Excessive use of something the competition does worse, JUST because it is blatantly obvious that the competition does worse... well, yeah. It does not need to be code that nerfs AMD; this is sort of the "legal" way.... code gimping AMD would literally be an anti-competitive move and could get Nvidia sued for ridiculous amounts by the FTC and the EU given their market share.

On the flip side, using a technique that hits your own hardware hard too, not just AMD, just because you know AMD will get hit much harder... yeah, well, it's just "circumstantial evidence"... until you realize Nvidia does this whenever they add "The Way It's Meant to Be Played".... it's an abusive tactic that has yet to be "discovered as a deliberate action".

 

Wait so what are you referring to exactly? What game?


There's no doubt Nvidia is putting pressure on AMD's weaknesses with GameWorks.

There is no game studio that would willingly publish a game with dedicated code to gimp AMD cards.

Nvidia is probably convincing game studios that they'll need less developer time to port the game (if it is a console title) with GameWorks. That might explain some of the bad ports that have been released. Game studios are most likely happy to cut expenses. However, GameWorks has started gathering negative publicity. Let's see how it continues in the future.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Wait so what are you referring to exactly? What game?

Any Nvidia title past 2013 has had some varying degree of heavy tessellation in it. Literally ALL of them.

You cannot always change the settings, but it's there; if you dig through the ini files or force it off in CCC, you see better performance.

Yes, it does make the game look better.

Yes, it does make Maxwell look better.

Yes, all we can do is deal with it.

But I know you are illogically stubborn when discussing the possibility that Nvidia MAY have done something shady.

So I will do the most sensible thing in response.


Any Nvidia title past 2013 has had some varying degree of heavy tessellation in it. Literally ALL of them.

You cannot always change the settings, but it's there; if you dig through the ini files or force it off in CCC, you see better performance.

Yes, it does make the game look better.

Yes, it does make Maxwell look better.

Yes, all we can do is deal with it.

But I know you are illogically stubborn when discussing the possibility that Nvidia MAY have done something shady.

So I will do the most sensible thing in response.

 

Well, they did say Maxwell had more tessellation horsepower than their last-gen cards, so I would imagine they'd want to utilize that. But if it can be proven that they purposefully made the game with a stupidly high tessellation setting that cannot be changed at all (talking outside of GameWorks), then there may be something to that.

But I know you are illogically stubborn when discussing the possibility that Nvidia MAY have done something shady.

So I will do the most sensible thing in response.

I'm not illogically stubborn; I'm just not going to change my mind because of illogical whining from the fanboy brigade. All I see some of the users here say is "GameWorks is a black box" and all this other bullcrap that cannot be proven. I provide context and evidence, yet it's dismissed because it doesn't fit with what those users want the truth to be.

You can go and look at any argument that I've given and it's almost always been the logical one. I haven't stepped into any fanboyism because I don't want to look like some kind of shill. If and when Nvidia fucks up (and they do it a lot), I see it. I'm not a blind sheep.


So 64x or whatever amount of tessellation HairWorks used is pushing the limits of current technology.

Sorry.

This doesn't matter at all; it's an optional Nvidia feature that would not even be in the game without them.

Nvidia has blocked those features entirely for AMD users before, in games like CoD: Ghosts; there is zero reason for an AMD user to complain about Witcher 3 performance.

It's like Nvidia users complaining that Battlefield 4 runs as fast on a 7970 as a 780 because of Mantle.

 

well sir...

if you knew that the competition would perform worse if you threw shitloads of tessellation at them... and instead of, you know, using different techniques which would probably yield the same visual result AND probably the same or better FPS (tessellation is pretty hard on any GPU, really), Nvidia enters the stage and spams tessellation like a 14-year-old spams chat in an MMO... then isn't it blatantly obvious what's going on?

Excessive use of something the competition does worse, JUST because it is blatantly obvious that the competition does worse... well, yeah. It does not need to be code that nerfs AMD; this is sort of the "legal" way.... code gimping AMD would literally be an anti-competitive move and could get Nvidia sued for ridiculous amounts by the FTC and the EU given their market share.

On the flip side, using a technique that hits your own hardware hard too, not just AMD, just because you know AMD will get hit much harder... yeah, well, it's just "circumstantial evidence"... until you realize Nvidia does this whenever they add "The Way It's Meant to Be Played".... it's an abusive tactic that has yet to be "discovered as a deliberate action".

First, you don't understand why tessellation is even used; it's not to make objects look prettier.

Tessellation allows dynamic altering of object meshes, which is more efficient as a LOD scheme than the traditional mesh swapping, and at the same time it heavily reduces VRAM usage, which is why Nvidia uses it.

It also has the side effect that you get barely any pop-in, and it saves development time.
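For anyone who hasn't seen how that LOD angle works in practice: the factor fed to the tessellator is usually computed per patch from camera distance. Here is a minimal sketch in plain C++, purely illustrative and not taken from GameWorks or any particular engine - the names and thresholds are made up:

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Tessellation factor for one patch: near patches get up to maxFactor
// (e.g. 8, 16 or 64), far patches fall back to 1 (the base mesh). One
// low-poly mesh in VRAM stands in for a whole chain of LOD meshes.
float PatchTessFactor(const Vec3& patchCenter, const Vec3& cameraPos,
                      float nearDist, float farDist, float maxFactor) {
    const float d = Distance(patchCenter, cameraPos);
    // 0 at nearDist or closer, 1 at farDist or farther.
    const float t = std::clamp((d - nearDist) / (farDist - nearDist), 0.0f, 1.0f);
    // Blend from maxFactor down to 1 as the patch recedes.
    return std::max(1.0f, maxFactor * (1.0f - t));
}

In a real renderer this logic would sit in the hull shader's patch-constant function, but the idea is the same: geometry detail scales continuously with distance instead of being swapped between pre-built meshes.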

They aren't gimping the performance of AMD cards; AMD cards simply suck at tessellation. That's not Nvidia's problem, that's a problem with AMD's GPUs.

And tessellation performance on Nvidia has been better than AMD's since Fermi; it was a massive focus for them in the 500 series, where they released two tessellation tech demos. They have been pushing tessellation since long before GameWorks, and that's a good thing.

It's no different from Ashes of the Singularity running better on AMD because it's compute-focused, which is what AMD cards are better at.

Nvidia and AMD GPUs are focused on different aspects of graphics, and depending on the game's goals you get different results; it has nothing to do with gimping either one.

If you have a compute-heavy game it will run better on current AMD cards; if you have a game that does a lot of mesh switching through tessellation, which many open-world games do, it will run better on Nvidia.

RTX2070OC 


 

Well, they did say Maxwell had more tessellation horsepower than their last-gen cards, so I would imagine they'd want to utilize that. But if it can be proven that they purposefully made the game with a stupidly high tessellation setting that cannot be changed at all (talking outside of GameWorks), then there may be something to that.

I'm not illogically stubborn; I'm just not going to change my mind because of illogical whining from the fanboy brigade. All I see some of the users here say is "GameWorks is a black box" and all this other bullcrap that cannot be proven. I provide context and evidence, yet it's dismissed because it doesn't fit with what those users want the truth to be.

You can go and look at any argument that I've given and it's almost always been the logical one. I haven't stepped into any fanboyism because I don't want to look like some kind of shill. If and when Nvidia fucks up (and they do it a lot), I see it. I'm not a blind sheep.

You can never prove whether it is intentional or not that all "The Way It's Meant to Be Played" titles feature heavy tessellation.

Remember, it does NOT need to be x64... x8 or x16 would be plenty for ANY Nvidia card to gain a 5+ fps lead over the corresponding AMD card....

Now how can you disprove this?

By forcing tessellation off in both CCC and NCP... however, is there any guarantee that Nvidia hasn't "hardlocked" their own GPUs to use the game's tessellation settings no matter what, to make sure such testing can NEVER reveal the issue at hand?


You can never prove whether it is intentional or not that all "The Way It's Meant to Be Played" titles feature heavy tessellation.

Remember, it does NOT need to be x64... x8 or x16 would be plenty for ANY Nvidia card to gain a 5+ fps lead over the corresponding AMD card....

Now how can you disprove this?

By forcing tessellation off in both CCC and NCP... however, is there any guarantee that Nvidia hasn't "hardlocked" their own GPUs to use the game's tessellation settings no matter what, to make sure such testing can NEVER reveal the issue at hand?

 

I don't know. I don't make the games.

 

I just know the whole GameWorks conspiracy is a load of crap and an overreach of what's actually going on.


I don't know. I don't make the games.

 

I just know the whole GameWorks conspiracy is a load of crap and an overreach of what's actually going on.

GameWorks is just poorly made in the first place... it gimps BOTH sides...

Tessellation spam is prevalent especially in Nvidia titles. Now, a few titles doing it is one thing. Pretty much all of them doing it falls under "once is a coincidence, twice can still be a coincidence, thrice is definitely fishy".


Good to see things haven't changed much in the last 8 months on LTT lol. No shortage of people screaming Nvidia is the devil!!

You can't be serious.  Hyperthreading is a market joke?

 

 


GameWorks is just poorly made in the first place... it gimps BOTH sides...

Tessellation spam is prevalent especially in Nvidia titles. Now, a few titles doing it is one thing. Pretty much all of them doing it falls under "once is a coincidence, twice can still be a coincidence, thrice is definitely fishy".

 

Tessellation isn't the confirmed dominant factor in the poorly optimized titles. We had one game where tessellation was confirmed to be jacked up, and that was The Witcher 3. They even fixed it post-launch, like any normal developer that actually cares about their titles after release would. So even then, I don't know why that's a factor in your case against GameWorks, because that issue was tied to HairWorks - and even then, if you had a Radeon 285 you could run HairWorks at a certain setting and not have any major drops.

 

The problem I have with you lot is that you're tying this issue to Nvidia's library, and that's just not correct. You guys almost always put the blame on Nvidia when anyone in their right mind would put the blame on the development studio.

 

Sure, make the case for the effects being poorly coded. I gotcha. But what engine is even offering proper agnostic alternatives to the likes of HairWorks, FleX, PhysX? None. Nvidia has engineers that specialize in that stuff and they send them to the studios to help with the game if they ask for it, from what I understand.

 

I've given examples of games that use the Gameworks effects well before, but again, that gets thrown out the window because it doesn't fit in with the truth you guys want this to be.

 

Now, don't interpret this like I'm some "Nvidiot" (my favorite), "Nvidia shill", "Nvidia apologist" or whatever name you wanna call me. I'm just telling you there's more to this than just screaming that Nvidia is pulling some shit. They do pull some shit, and they need to get their shit together, but this conspiracy crap isn't it.


Tessellation isn't the confirmed dominant factor in the poorly optimized titles. We had one game where tessellation was confirmed to be jacked up, and that was The Witcher 3. They even fixed it post-launch, like any normal developer that actually cares about their titles after release would. So even then, I don't know why that's a factor in your case against GameWorks, because that issue was tied to HairWorks - and even then, if you had a Radeon 285 you could run HairWorks at a certain setting and not have any major drops.

 

The problem I have with you lot is that you're tying this issue to Nvidia's library, and that's just not correct. You guys almost always put the blame on Nvidia when anyone in their right mind would put the blame on the development studio.

 

Sure, make the case for the effects being poorly coded. I gotcha. But what engine is even offering proper agnostic alternatives to the likes of HairWorks, FleX, PhysX? None. Nvidia has engineers that specialize in that stuff and they send them to the studios to help with the game if they ask for it, from what I understand.

 

I've given examples of games that use the Gameworks effects well before, but again, that gets thrown out the window because it doesn't fit in with the truth you guys want this to be.

 

Now, don't interpret this like I'm some "Nvidiot" (my favorite), "Nvidia shill", "Nvidia apologist" or whatever name you wanna call me. I'm just telling you there's more to this than just screaming that Nvidia is pulling some shit. They do pull some shit, and they need to get their shit together, but this conspiracy crap isn't it.

Read my comment:

Part A: GameWorks is poorly made.

Part B: tessellation spam in Nvidia games.

 

Tessellation spam was a thing prior to GameWorks. It is unrelated to GameWorks as a whole.

 

Nvidia LOVES tessellation. AMD sucks at tessellation.

There are NO SURPRISES HERE. Now put on your reading glasses and stay tuned.

 

The whole thing I am arguing, and rightfully so, is that pretty much all Nvidia titles, WITH OR WITHOUT GAMEWORKS ADDITIONS, run with mid to high tessellation - either as a "fixed constant", and thus not adjustable in-game, or with an in-game slider.

 

In either case, we know that Nvidia applies a metric ton of tessellation. But why?

Well, we also know they are stingy with their VRAM and memory bus... as @JasonRoGo pointed out, tessellation eases the load on VRAM.... what a funny coincidence...

So each game Nvidia fiddles with is optimized very well. Purely for Nvidia. NO SURPRISE THERE, AT LEAST.

 

Funnily enough, why didn't Nvidia just do like AMD? Why didn't they just sell 3GB cards and a 256-bit memory bus? Then they wouldn't need as much tessellation spam, and not to mention they would generally do better and their cards would live a little longer before hitting the "VRAM limit" as newer games push more and more VRAM usage....

There is no answer to this. It is simply food for thought.

 

Even funnier is that despite tessellation being a boon, it also DOES put a strain on Nvidia's own hardware, especially as you load up x8 to x16..... So why is Nvidia so obsessed with tessellation? Is it purely because it looks good? Saves a little dev time?

NO

 

It is because AMD sucks at it and Nvidia doesn't. It is a pure advantage to them. Since A LOT of Nvidia titles don't feature a slider to adjust the tessellation level, it is nigh impossible to guess what it is set to without either reading the code or forcing higher/lower values in CCC and NCP until the visual fidelity matches the stock game - an excessively time-consuming process which won't really prove anything other than "how shit the game will run on AMD cards"....

 

The funny thing is, in AMD titles or neutral titles the tessellation spam is WAY LOWER... there may still be tessellation, but I have yet to see it exceed X8 at worst... most of them use X2 or X4.....

So why is it that Nvidia cranks the tessellation higher? Is it purely to make the game look better?

Because the difference between X8 and X16 isn't that striking visually, but it is performance-wise. So I doubt they do it for the looks.
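To put rough numbers on that (a back-of-the-envelope estimate assuming uniform tessellation of a triangle patch; the real D3D11 tessellator pattern differs slightly):

#include <cstdio>

int main() {
    const int factors[] = {2, 4, 8, 16, 64};
    for (int n : factors) {
        // Roughly n^2 sub-triangles per input triangle at factor n.
        std::printf("x%-2d -> ~%d triangles per patch\n", n, n * n);
    }
    // x8 -> ~64, x16 -> ~256 (about 4x the geometry work for a subtle
    // visual change), x64 -> ~4096 (about 64x the work of x8).
    return 0;
}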


 

 

Could you fix your formatting? I'll actually read it if you do. I promise.


Bethesda has showcased its Creation Engine, the in-house engine that powers Fallout 4. The company says the secret behind its impressive graphics is the close collaboration between its developers and artists. But in order to make the game graphically shine, it has also partnered with NVIDIA, who helped develop and optimize the game's shaders to make it even more visually impressive. One of the features NVIDIA helped with, the company says, is volumetric lighting.
 

 
(Image from Bethesda's post: Fallout4_graph01web.jpg)

 
The studio does say that the game will work on any platform; you don't need an Nvidia GPU to enjoy the graphics.
 

To create that volumetric light spilling across the scene (sometimes called “god rays”) we worked with our friends at NVIDIA, who we’ve worked with dating back to Morrowind’s cutting-edge water. The technique used here runs on the GPU and leverages hardware tessellation. It’s beautiful in motion, and it adds atmospheric depth to the irradiated air of the Wasteland. Like all the other features here, we’ve made it work great regardless of your platform.
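Bethesda's exact implementation isn't public beyond what's quoted above (GPU-side, tessellated light-shaft geometry). Purely as an illustration of the general "god rays" idea, here is the classic screen-space alternative - radial sampling toward the light source - sketched in plain C++ for a single pixel; every name and parameter is illustrative, not Bethesda's or NVIDIA's code:

struct Float2 { float x, y; };

// occlusion(uv): 1.0 where the light/sky is visible at uv, 0.0 where it is
// blocked by scene geometry. Assumed to be supplied by the renderer.
float GodRayIntensity(Float2 pixelUV, Float2 lightUV,
                      float (*occlusion)(Float2),
                      int numSamples, float decay, float exposure) {
    Float2 step = { (pixelUV.x - lightUV.x) / numSamples,
                    (pixelUV.y - lightUV.y) / numSamples };
    Float2 uv = lightUV;
    float illumination = 1.0f;
    float accum = 0.0f;
    for (int i = 0; i < numSamples; ++i) {
        // March from the light toward the pixel, accumulating unoccluded
        // light that falls off along the way (decay < 1).
        accum += occlusion(uv) * illumination;
        illumination *= decay;
        uv.x += step.x;
        uv.y += step.y;
    }
    return accum * exposure / numSamples;
}

The tessellated-geometry approach Bethesda describes trades that kind of post-process blur for actual volume geometry, which is why hardware tessellation throughput matters for it.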

 
However, the studio doesn't say how these effects will perform on AMD hardware; obviously the NVIDIA engineers know only their own GPUs from the ground up, and can and will optimize the shaders for these effects for their own graphics cards.
 
 
The company also talks about other visual effects it has developed, with impressive screenshots on its site if you are interested.
 
Source: https://bethesda.net/?utm_source=Twitter&utm_medium=social&utm_campaign=11415-FO-graphics#en/events/game/the-graphics-technology-of-fallout-4/2015/11/04/45
 
 
Looks like you AMDbros are getting screwed over again thanks to our friends at Nvidia  :). I honestly don't understand why developers agree to this stuff. I own a 980 Ti but I'm completely against this.  
 
Bethesda posted this in response to some leaked screenshots that looked...plain bad. Too bad they kinda ended up shooting themselves in the foot by angering AMD owners. 
 
It's sad to see this continuing; as much as I love games like The Witcher 3, I believe that working exclusively with Nvidia does not help the tech community as a whole.

Guide to GTX 900 Series: http://linustechtips.com/main/topic/457526-nvidia-900-series-basic-performance-guide/

Performance expert, building noob. 

There is no such thing as excess in hardware. 


Sounds like PhysX. Just buy an old GPU second hand for like 15 bucks and you'll be fine.


Sounds like PhysX. Just buy an old GPU second hand for like 15 bucks and you'll be fine.

 

That's not PhysX.


What exactly is stopping AMD from working with game developers? 

 

Seems like no foul to me. 

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


Kinda like how Nvidia fans got screwed over with Battlefront?

 

What?


Nvidia is giving a better experience to most people? Just because AMD can't be bothered investing time and money into providing the best experience for their consumers doesn't mean Nvidia is somehow in the wrong here.

i7 6700K - ASUS Maximus VIII Ranger - Corsair H110i GT CPU Cooler - EVGA GTX 980 Ti ACX2.0+ SC+ - 16GB Corsair Vengeance LPX 3000MHz - Samsung 850 EVO 500GB - AX760i - Corsair 450D - XB270HU G-Sync Monitor

i7 3770K - H110 Corsair CPU Cooler - ASUS P8Z77 V-PRO - GTX 980 Reference - 16GB HyperX Beast 1600MHz - Intel 240GB SSD - HX750i - Corsair 750D - XB270HU G-Sync Monitor
