"Nvidia Disappointed that Witcher, Cars "Tainted by False Allegations""

MegaDave91

That's not Nvidia's or the studio's fault; it's AMD's. AMD needs to hire people who can actually do the job, not a Chief Gaming Scientist who likes to stir up marketing spin to grab the scraps off the dinner table: making memes about Radeon cards "actually having 4GB of VRAM" and then claiming Nvidia is hurting their performance in GameWorks titles, a completely false accusation.

What you're saying is you want Radeon cards to be able to run Nvidia effects? Is that why you keep bringing up your arguments?

Yes, it is sustainable. AMD should get their shit together, because they could be handing Nvidia its ass. But no, they have to put Huddy out there to get all of the gullible people (AMD and Nvidia owners alike) riled up and raising their pitchforks at Nvidia for no reason.

 

It's not NVidia's fault, that AMD does not have source code access, needed for proper optimization? What? Like I said, we know for a fact, that optimization without source code access, is ineffective and extremely difficult. Hiring more people, will not change this.

 

What I'm saying, is that I want gamers to be able to use the effects, that are built into the game, so they can experience it, like the DEVS want you to. Idgaf who invented/financed the effect; I just want gamers to be able to experience the game fully, vendor neutral. Like I said, I don't understand, why any gamer would want it any other way.

 

If a game can no longer render hair, fire, physics, facial movements, etc., would you say that is ok? Sure it's an extreme example, but it could be the future scenario if a game is GameWorks exclusive and fully integrated. Why would ANY gamer want this?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


What exactly are you getting at, and how is this Nvidia's spin? It has been this way from the get-go.

Why is it a bad thing that they can't show or share Gameworks code with AMD? Nothing in the code is purposely gimping AMD, and the effects that do not perform well can be turned off. It's an issue you're trying to turn into an Nvidia-hate train, just like anyone else who refuses to see the facts.

 

Okay, while I can see your point that Nvidia did nothing wrong, we are still heading down a very slippery slope. We are already getting segmentation in monitors; do we really need segmentation in games too? This is on top of the fact that the PC games market gets screwed by consoles (not due to anyone's fault; it is just the way it is).

 

And I am 99% sure AMD said they approached CDPR about including TressFX in The Witcher, which CDPR declined because it was just too much work (or maybe AMD came too late with the offer). Either way, everyone needs to be careful.

 

Gamers want access to all games equally, publishers want to sell to all gamers, Nvidia doesn't want to be a monopoly (as shown by their "AMD is a strong competitor" speech), and AMD doesn't want to go broke.

 

Everyone (including Nvidia) needs to cop the f**k on and come to some mutual agreements. Nvidia can continue down this path, but it may come back to bite them in the ass. Nobody wants to go to the mattresses. Blood is a big expense. It's bad for business.

Rig: i7 2600K @ 4.2GHz, Larkooler Watercooling System, MSI Z68a-gd80-G3, 8GB G.Skill Sniper 1600MHz CL9, Gigabyte GTX 670 Windforce 3x 2GB OC, Samsung 840 250GB, 1TB WD Caviar Blue, Auzentech X-FI Forte 7.1, XFX PRO650W, Silverstone RV02 Monitors: Asus PB278Q, LG W2243S-PF (Gaming / overclocked to 74Hz) Peripherals: Logitech G9x Laser, QPad MK-50, AudioTechnica ATH AD700


It's not NVidia's fault, that AMD does not have source code access, needed for proper optimization? What? Like I said, we know for a fact, that optimization without source code access, is ineffective and extremely difficult. Hiring more people, will not change this.

 

What I'm saying, is that I want gamers to be able to use the effects, that are built into the game, so they can experience it, like the DEVS want you to. Idgaf who invented/financed the effect; I just want gamers to be able to experience the game fully, vendor neutral. Like I said, I don't understand, why any gamer would want it any other way.

 

If a game can no longer render hair, fire, physics, facial movements, etc., would you say that is ok? Sure it's an extreme example, but it could be the future scenario if a game is GameWorks exclusive and fully integrated. Why would ANY gamer want this?

It very much is AMD's fault. Hiring people who can go and work with developers, like Nvidia is doing, would be a good start. If they're already doing that, they sure are spread very thin or don't know what they're doing, and Huddy is just putting a spin on it.

What you're saying is you want to use Nvidia's effects on Radeon cards. You know full well that won't happen properly unless Nvidia and AMD come to some kind of agreement. IDGAF who invented or financed the effect either, and I would also like everyone to have the benefits that Nvidia users have. Too bad. We don't live in a perfect world. If you want to do something about it, go tell AMD to tell Huddy to shut up or actually do some real PR work. And for God's sake, tell them to change his title.

A game will always be able to render that stuff, so your scenario is bogus. Nvidia's effects are Nvidia's in-house created effects. Whenever devs want to use them, Nvidia's the evil one since AMD can't run the effect (even though they can turn it off) amirite?

Also, please stop using so many commas bro. It makes what you're typing super hard to read, especially since you're using way more than you'd need.


Okay, while I can see your point that Nvidia did nothing wrong, we are still heading down a very slippery slope. We are already getting segmentation in monitors; do we really need segmentation in games too? This is on top of the fact that the PC games market gets screwed by consoles (not due to anyone's fault; it is just the way it is).

 

And I am 99% sure AMD said they approached CDPR about including TressFX in The Witcher, which CDPR declined because it was just too much work (or maybe AMD came too late with the offer). Either way, everyone needs to be careful.

 

Gamers want access to all games equally, publishers want to sell to all gamers, Nvidia doesn't want to be a monopoly (as shown by their "AMD is a strong competitor" speech), and AMD doesn't want to go broke.

 

Everyone (including Nvidia) needs to cop the f**k on and come to some mutual agreements. Nvidia can continue down this path, but it may come back to bite them in the ass. Nobody wants to go to the mattresses. Blood is a big expense. It's bad for business.

PC gaming isn't getting screwed by consoles. The lord and savior of the PC gaming community, AMD, are the ones powering the stupid things. Blame them. We're not getting segmentation in games; one company is leaving another company in the dust (for right now, it seems). It's that simple.

I don't know if AMD went to CDPR late or if CDPR declined, but if CDPR declined, that would upset me personally, even as an Nvidia user. As much as they delayed the game, they could have made another exception, or even put it in their pipeline ahead of schedule in the first place.

As far as I know, AMD are the ones who don't want to come to some kind of agreement, like @Mr Moose has pointed out time and time again. If those articles are true, AMD are the ones holding PC gaming back. Lol.


It very much is AMD's fault. Hiring people who can go and work with developers, like Nvidia is doing, would be a good start. If they're already doing that, they sure are spread very thin or don't know what they're doing, and Huddy is just putting a spin on it.

What you're saying is you want to use Nvidia's effects on Radeon cards. You know full well that won't happen properly unless Nvidia and AMD come to some kind of agreement. IDGAF who invented or financed the effect either, and I would also like everyone to have the benefits that Nvidia users have. Too bad. We don't live in a perfect world. If you want to do something about it, go tell AMD to tell Huddy to shut up or actually do some real PR work. And for God's sake, tell them to change his title.

A game will always be able to render that stuff, so your scenario is bogus. Nvidia's effects are Nvidia's in-house created effects. Whenever devs want to use them, Nvidia's the evil one since AMD can't run the effect (even though they can turn it off) amirite?

Also, please stop using so many commas bro. It makes what you're typing super hard to read, especially since you're using way more than you'd need.

 

You seem to forget that AMD created an entire graphics API with EA! And also TressFX with Crystal Dynamics. So idk what you are on about.

 

The graphics market is what we consumers want it to be. If we say we don't accept this tendency, it will change. If we support it, it will continue. I'm sad to see that a lot of people in here seem to support it. That is bad for gamers and consumers in general.

 

Lots of APEX stuff will not run properly, if at all, so no, it's not "bogus". I don't care if they are in-house. When they are part of the game, I want to be able to use them. Again, why would anyone be against vendor-neutral effects and tech? Turning it off still means the gamer misses out on something the game is shown in videos as having.

 

Commas are placed differently than in Danish, so my usage might not be proper.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


PC gaming isn't getting screwed by consoles. The lord and savior of the PC gaming community, AMD, are the ones powering the stupid things. Blame them. We're not getting segmentation in games; one company is leaving another company in the dust (for right now, it seems). It's that simple.

I don't know if AMD went to CDPR late or if CDPR declined, but if CDPR declined, that would upset me personally, even as an Nvidia user. As much as they delayed the game, they could have made another exception, or even put it in their pipeline ahead of schedule in the first place.

As far as I know, AMD are the ones who don't want to come to some kind of agreement, like @Mr Moose has pointed out time and time again. If those articles are true, AMD are the ones holding PC gaming back. Lol.

 

We are getting screwed. Exclusives for Xbox, PlayStation, and PC too, for that matter. I am not blaming anyone. I am not a PCMR guy. But merely by the fact that consoles have a large cash-spending market, we get screwed. GTA coming out a year after consoles? Bad last-minute ports? Again, I'm not blaming anyone. Everyone is just looking out for their best interests, and that is common sense.

 

But the fact is it just causes shit between all parties. Like I said, blood is bad for business. There is no immediate threat at the moment but things could take a turn for the worse.

 

I am not taking AMD's side too much either. I love AMD for their openness, but their shit-talking tactics are not helping, and they do seem childish. Some of those quotes are cringeworthy. Although I wouldn't be surprised if these recent controversies have nothing to do with game performance and everything to do with selling 390Xs in a couple of weeks. Rattle up the market and then show off your new shiny card. This is especially important now that people are getting tied into GPU/monitor platforms. It ain't too easy now to just sell your GPU and switch sides without losing a lot of cash.

Rig: i7 2600K @ 4.2GHz, Larkooler Watercooling System, MSI Z68a-gd80-G3, 8GB G.Skill Sniper 1600MHz CL9, Gigabyte GTX 670 Windforce 3x 2GB OC, Samsung 840 250GB, 1TB WD Caviar Blue, Auzentech X-FI Forte 7.1, XFX PRO650W, Silverstone RV02 Monitors: Asus PB278Q, LG W2243S-PF (Gaming / overclocked to 74Hz) Peripherals: Logitech G9x Laser, QPad MK-50, AudioTechnica ATH AD700


We are getting screwed. Exclusives for Xbox, PlayStation, and PC too, for that matter. I am not blaming anyone. I am not a PCMR guy. But merely by the fact that consoles have a large cash-spending market, we get screwed. GTA coming out a year after consoles? Bad last-minute ports? Again, I'm not blaming anyone. Everyone is just looking out for their best interests, and that is common sense.

The studios want the big money. If they have a gold title, they're going to release it on consoles and focus on consoles. So if you want to blame someone for late releases or bad ports, blame the studios making the games. If consoles didn't exist, then gaming as we know it would be even more niche than it is now, and we likely wouldn't have the titles that we do today.

But the fact is it just causes shit between all parties. Like I said, blood is bad for business. There is no immediate threat at the moment but things could take a turn for the worse.

Nvidia and AMD always leapfrog. The difference this year is that GameWorks is showing up in more games, and now people have something to blame for AMD's poor performance. I guarantee you, if GameWorks wasn't in Witcher 3 at all, everyone would either take the performance as it is and not complain, or they'd complain to CDPR about bad performance; because with Nvidia out of the picture, the ones clearly at fault are the studio and/or the vendor's drivers.

I am not taking AMD's side too much either. I love AMD for their openness, but their shit-talking tactics are not helping, and they do seem childish. Some of those quotes are cringeworthy. Although I wouldn't be surprised if these recent controversies have nothing to do with game performance and everything to do with selling 390Xs in a couple of weeks. Rattle up the market and then show off your new shiny card. This is especially important now that people are getting tied into GPU/monitor platforms. It ain't too easy now to just sell your GPU and switch sides without losing a lot of cash.

I have serious doubts that it has anything to do with the 390X coming up; I think it has more to do with how AMD lets their marketing teams do marketing. People like Richard Huddy just stir stuff up out of spite. They did it during the whole 970 crap, and they did it when OriginPC quit carrying Radeon cards.


  • "Don't be a dick" - Wil Wheaton.
  • "Be excellent to each other" - Bill and Ted.

The moderator team will be keeping a careful eye on this thread - any posts found to violate the Code of Conduct will be removed, and warning points will be handed out. If the topic starts to get out of hand, it will be closed.

 

Keep it civil please. ;)

"Be excellent to each other" - Bill and Ted
Community Standards | Guides & Tutorials | Members of Staff


How about they fix it instead of bitching about how they didn't do it when they did?

There is nothing for them to fix. Why should Nvidia optimize for AMD? AMD should be doing all that themselves. It looks to me like CDPR just didn't optimize their game for either Nvidia or AMD, which we know happens to pretty much all PC games that come out. It's now up to the GPU manufacturers, AMD and Nvidia, to optimize for the game separately. If AMD didn't, then that's their own fault. And to be honest, the game doesn't run very well on any GPU by my standards.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


This steals the focus from the real problem: the game runs terribly (and looks terrible) with HairWorks/GameWorks disabled, but no one is talking about that.


This steals the focus from the real problem: the game runs terribly (and looks terrible) with HairWorks/GameWorks disabled, but no one is talking about that.

Most people (both NVIDIA and AMD users) are saying it runs well with GameWorks disabled.

I started my build with two EVGA FTW 970s in SLI. I returned both of them when the VRAM issue was brought to light. Five months hiding it in the dark before admitting to the false marketing. GG, Nvidia.

ROG X570-F Strix AMD R9 5900X | EK Elite 360 | EVGA 3080 FTW3 Ultra | G.Skill Trident Z Neo 64gb | Samsung 980 PRO 
ROG Strix XG349C Corsair 4000 | Bose C5 | ROG Swift PG279Q

Logitech G810 Orion Sennheiser HD 518 |  Logitech 502 Hero

 


 

Seriously, did anyone expect NVIDIA custom-made settings to run exceedingly well on AMD hardware? And does anyone honestly have that much of an issue with NVIDIA working to optimize games for their hardware? Just because a game runs better on an NVIDIA card doesn't mean AMD was sabotaged, and just because a benchmark shows an AMD card running like shit doesn't mean the idiots putting the benchmark together took off the NVIDIA-optimized settings.

 

I still don't get this debate. 

 

No one expects it, but as the Hairworks fix shows, it's something that absolutely could have run perfectly fine on AMD cards if it hadn't been given an Nvidia logo and had run at a more reasonable level of tessellation. It doesn't even run well on Kepler or older Nvidia GPUs, because they don't have the sheer tessellation output of Maxwell. (Which is why Nvidia made it so egregious: to make people want to upgrade to Maxwell cards.)
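(For clarity, the "fix" I'm talking about is, as far as I know, just capping the tessellation factor on the AMD side in the driver: roughly Catalyst Control Center > Gaming > 3D Application Settings > Tessellation Mode set to "Override application settings", then Maximum Tessellation Level at 8x or 16x. The exact menu names may differ between driver versions, so treat that path as a rough sketch.)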

4K // R5 3600 // RTX2080Ti


Seriously, did anyone expect NVIDIA custom-made settings to run exceedingly well on AMD hardware? And does anyone honestly have that much of an issue with NVIDIA working to optimize games for their hardware? Just because a game runs better on an NVIDIA card doesn't mean AMD was sabotaged, and just because a benchmark shows an AMD card running like shit doesn't mean the idiots putting the benchmark together took off the NVIDIA-optimized settings.

 

I still don't get this debate. 

 

I expect nVidia stuff to actually work on current nVidia cards though. And Hairworks clearly does not. If you spent 800 bucks on a 780ti this time last year you get almost the same performance as an AMD card when you turn on Hairworks. In that chart, the GTX 780 has HALF the performance of the 980 with Hairworks enabled. In fact, according to that chart the 290X outperforms the GTX 780 at Hairworks by something like 50%!

 

I'd be tempted to accuse nVidia of sabotaging their own customers' experience in Witcher 3 before accusing them of doing that to AMD customers on the basis of that chart.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


No one expects it, but as the Hairworks fix shows, it's something that absolutely could have run perfectly fine on AMD cards if it hadn't been given an Nvidia logo and had run at a more reasonable level of tessellation. It doesn't even run well on Kepler or older Nvidia GPUs, because they don't have the sheer tessellation output of Maxwell. (Which is why Nvidia made it so egregious: to make people want to upgrade to Maxwell cards.)

x16 tessellation obviously adds to the look of hairworks.  Even at x8, there's significant fidelity loss.  Getting angry that NVIDIA made hairworks x16 tessellation in the year 2015 is pretty dumb. 

 

That argument is basically saying that NVIDIA should cripple their advanced custom graphical option simply because AMD's hardware doesn't like tessellation. That's like saying foliage should have had less tessellation so that the Xbox One would get better framerates. R9 2XX AMD cards have always had shitty tessellation, even two years ago. The fact that it's causing problems on a tessellation-heavy feature should come as no surprise.

 

NVIDIA custom settings are going to run better on NVIDIA hardware. Whether you agree with them or not, they don't detract from the base game or its settings at all. If Witcher 3 simply had no hairworks at all and no HBAO+, leading to a shittier maxed-out level of graphical fidelity but better framerates on pointless benchmarks, people would honestly complain less. Because that's all this boils down to: what we've been shown in poorly planned benchmarks. Anyone with an AMD card who turns off hairworks will run the game fine. Frankly, hairworks runs like dog shit on my 980 anyway.

 

 

We can't demand more advanced graphical settings and then be pissed that, when they're added, they end up being heavily demanding. They're options. Just because a game can't be maxed out easily doesn't mean it's unoptimized to shit, or sabotaged.

 

 

With this mindset people are spewing, we'll just go backwards and have fewer graphical options: no more optional hairworks experiments, no more optional HBAO+ or PCSS, and for what? So Johnny the 15-year-old can feel better about his purchase of an R9 280X now that it runs the game at a higher framerate maxed out? We lower the bar just so people feel better about their hardware?

 

There are obviously issues with just awful optimization in some games, but when it comes to optional settings that aren't normally in games, like the ones I mentioned above, why should anyone care? It's extra.


I expect nVidia stuff to actually work on current nVidia cards though. And Hairworks clearly does not. If you spent 800 bucks on a 780ti this time last year you get almost the same performance as an AMD card when you turn on Hairworks. In that chart, the GTX 780 has HALF the performance of the 980 with Hairworks enabled. In fact, according to that chart the 290X outperforms the GTX 780 at Hairworks by something like 50%!

 

I'd be tempted to accuse nVidia of sabotaging their own customers' experience in Witcher 3 before accusing them of doing that to AMD customers on the basis of that chart.

 

The Kepler issue is another thing entirely, and I agree with you on part of that.

 

But either way, it's hairworks. The most frivolous, arbitrary, pointless graphical addition to the game ever, and frankly it still runs like dog shit on Maxwell. It's an optional setting that isn't anything close to being an integral part of the game.


Poor AMD... victims of their own bullshit rebrands

 

You could say the same for nVidia: the 600 series, 600 Ti series, 700 series and 700 Ti series are the same chips, same rebrands! Get your facts straight. The 700 series was a huge fail in performance. nVidia is good at marketing manipulation.

EOC folding stats - Folding stats - My web folding page stats

 

Summer Glau quote: "The future is worth fighting for." - Serenity

 

My linux setup: CPU: I7 2600K @4.5Ghz, MM: Corsair 16GB vengeance @1600Mhz, GPU: 2 Way Radeon his iceq x2 7970, MB: Asus sabertooth Z77, PSU: Corsair 750 plus Gold modular

 

My gaming setup: CPU: I7 3770K @4.7Ghz, MM: Corsair 32GB vengeance @1600Mhz, GPU: 2 Way Gigabyte RX580 8GB, MB: Asus sabertooth Z77, PSU: Corsair 860i Platinum modular


x16 tessellation obviously adds to the look of hairworks.  Even at x8, there's significant fidelity loss.  Getting angry that NVIDIA made hairworks x16 tessellation in the year 2015 is pretty dumb. 

 

That argument is basically saying that NVIDIA should cripple their advanced custom graphical option simply because AMD's hardware doesn't like tessellation. That's like saying foliage should have had less tessellation so that the Xbox One would get better framerates. R9 2XX AMD cards have always had shitty tessellation, even two years ago. The fact that it's causing problems on a tessellation-heavy feature should come as no surprise.

 

NVIDIA custom settings are going to run better on NVIDIA hardware. Whether you agree with them or not, they don't detract from the base game or its settings at all. If Witcher 3 simply had no hairworks at all and no HBAO+, leading to a shittier maxed-out level of graphical fidelity but better framerates on pointless benchmarks, people would honestly complain less. Because that's all this boils down to: what we've been shown in poorly planned benchmarks. Anyone with an AMD card who turns off hairworks will run the game fine. Frankly, hairworks runs like dog shit on my 980 anyway.

 

 

We can't demand more advanced graphical settings and then be pissed that, when they're added, they end up being heavily demanding. They're options. Just because a game can't be maxed out easily doesn't mean it's unoptimized to shit, or sabotaged.

 

 

With this mindset people are spewing, we'll just go backwards and have fewer graphical options: no more optional hairworks experiments, no more optional HBAO+ or PCSS, and for what? So Johnny the 15-year-old can feel better about his purchase of an R9 280X now that it runs the game at a higher framerate maxed out? We lower the bar just so people feel better about their hardware?

 

There are obviously issues with just awful optimization in some games, but when it comes to optional settings that aren't normally in games, like the ones I mentioned above, why should anyone care? It's extra.

I think there are serious diminishing returns with increased tessellation, and I would have really liked an in-game option to tweak it myself. I agree with the rest of your post though.

I just take issue with Nvidia constantly injecting proprietary bullshit into the marketplace. No one wants to go back to the days of having three different GPUs in their system because certain effects only work on one brand or the other.

4K // R5 3600 // RTX2080Ti


I think there are serious diminishing returns with increased tessellation, and I would have really liked an in-game option to tweak it myself. I agree with the rest of your post though.

I just take issue with Nvidia constantly injecting proprietary bullshit into the marketplace. No one wants to go back to the days of having three different GPUs in their system because certain effects only work on one brand or the other.

Disagreeing with gameworks and its boundaries is fine.  But accusing NVIDIA of sabotaging everyone's video cards, in a way that would make you think NVIDIA's engineers went to a small African town themselves and shot all the children there, is incredibly counter-productive. 


You could say the same for nVidia: the 600 series, 600 Ti series, 700 series and 700 Ti series are the same chips, same rebrands! Get your facts straight. The 700 series was a huge fail in performance. nVidia is good at marketing manipulation.

Those rebrands didn't damage Nvidia's market share anywhere near as badly as AMD's did.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


x16 tessellation obviously adds to the look of hairworks.  Even at x8, there's significant fidelity loss.  Getting angry that NVIDIA made hairworks x16 tessellation in the year 2015 is pretty dumb. 

 

 

It's set to 64x by default. I think you're thinking of HairworksAA, which is set to 16x.

 

These settings are absolutely unnecessary for just rendering hair. 

 

Tessellation from 16x and 64x is unnoticeable in Witcher 3...it's clearly to push the sales of Maxwell GPUs. 


It's set to 64x by default. I think you're thinking of HairworksAA, which is set to 16x.

 

These settings are absolutely unnecessary for just rendering hair. 

 

Tessellation from 16x and 64x is unnoticeable in Witcher 3...it's clearly to push the sales of Maxwell GPUs. 

 

I'm going to be getting Witcher 3 soon (hopefully next week) and I'll be able to judge what everyone keeps saying, like you just said: "Tessellation from 16x and 64x is unnoticeable in Witcher 3...it's clearly to push the sales of Maxwell GPUs."

 

I'm not saying you guys are 100% wrong; I just have serious doubts that it's only there to push Maxwell sales. But if it actually does use an unrealistically high setting, I'll be able to see it for myself.


I'm going to be getting Witcher 3 soon (hopefully next week) and I'll be able to judge what everyone keeps saying, like you just said: "Tessellation from 16x and 64x is unnoticeable in Witcher 3...it's clearly to push the sales of Maxwell GPUs."

 

I'm not saying you guys are 100% wrong; I just have serious doubts that it's only there to push Maxwell sales. But if it actually does use an unrealistically high setting, I'll be able to see it for myself.

 

How are you going to find out, when you cannot change the tessellation setting on NVidia cards?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


How are you going to find out, when you cannot change the tessellation setting on NVidia cards?

 

I have a 280x too.


How are you going to find out, when you cannot change the tessellation setting on NVidia cards?

.ini config files.
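To expand on that a bit, and I haven't dug through the files myself, so take this as a rough sketch: the usual suggestion is to edit the game's user config (typically Documents\The Witcher 3\user.settings) and lower the HairWorks-related values under [Rendering], along these lines:

[Rendering]
HairWorksLevel=1
HairWorksAALevel=2

where, as far as I can tell, HairWorksLevel is 0 = off, 1 = Geralt only, 2 = everything, and HairWorksAALevel controls the AA applied to the hair. The key names and value ranges are from memory and may differ in your game version, so back the file up before changing anything. On an AMD card you can also just cap the tessellation factor with the driver's tessellation override instead of touching game files.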

