
AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance

It's pointing out the poor tessellation performance of older Nvidia cards and AMD's current ones.

 

That's my current train of thought.  I'll happily change that if there is a better hypothesis. I don't buy half the conspiracy BS people are touting. 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


That doesn't even make sense. You keep telling everyone how wonderful AMD is with their inventions of RAM and HBM etc., except you conveniently keep forgetting about Hynix's involvement. You seem to forget that AMD did not invent nor perfect GDDR5; it was developed and showcased by the German company Qimonda. Hynix got involved and made it on a smaller 60nm chip. Just because AMD was the first to put it onto a GPU doesn't make it their tech or their innovation.

 

Seriously, you can't dismiss a class from a soapbox.

I haven't touted how wonderful AMD is at all. As if I would need to mention Hynix's involvement in manufacturing the product, or JEDEC for the standard; I guess I was reaching too high, as this is basic stuff you guys should already know. Most people don't really care, as HBM would still be off in the distance unless someone stepped up and worked out the problems with the memory technology. AMD was the one to solved the puzzle meanwhile opening it up to the industry. If them doing such a thing offends you in any way, then you need better things to do with your time than downplaying progress.


That's my current train of thought.  I'll happily change that if there is a better hypothesis. I don't buy half the conspiracy BS people are touting. 

I honestly can't see how people can think old graphics cards can match new ones in every single way. If that were the case, no one would even bother upgrading.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I haven't touted how wonderful AMD is at all.

 

Really, how about in your last two posts:

 

snip... AMD has made HBM available for Nvidia, which will use it with Pascal the same way Nvidia obtained GDDR3 and GDDR5. :rolleyes:

 

AMD was the one to solved the puzzle meanwhile opening it up to the industry.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


Really, how about in your last two posts:

Also, 'to solved the puzzle'. I just had to do that.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Really, how about in your last two posts:

I can't see where you're linking facts to some kind of fanboyism (we all know you're headed there).


I can't see where you're linking facts to some kind of fanboyism (we all know you're headed there).

 

Here are some facts; see if you can agree with them in the same way:

 

AMD have run a poor business over the last 10 years, with many bad decisions that affect their ability to compete.

 

AMD's issues with tessellation are the root cause of HairWorks being a problem.

 

AMD refused to accept free CUDA when it was offered.

 

AMD turned down free PhysX when it was offered.

 

AMD's market share is low because they are not selling product, both CPUs and GPUs; this is because many independent reviews place AMD products behind Intel and Nvidia.

 

People don't always want to save a few dollars; sometimes they are happy to pay for something that is better.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


Here are some facts; see if you can agree with them in the same way:

 

AMD have run a poor business over the last 10 years, with many bad decisions that affect their ability to compete.

 

AMD's issues with tessellation are the root cause of HairWorks being a problem.

 

AMD refused to accept free CUDA when it was offered.

 

AMD turned down free PhysX when it was offered.

 

AMD's market share is low because they are not selling product, both CPUs and GPUs; this is because many independent reviews place AMD products behind Intel and Nvidia.

 

People don't always want to save a few dollars; sometimes they are happy to pay for something that is better.

People have different perspectives on subjects due to varying knowledge of, and opinions about, them.

 

AMD's been led by some pretty big idiots over the years, which is what ultimately put the company in financial desperation (you can't blame the entire company for poor leadership). That's like beating the shit out of a dog for pissing on your floor when you refuse to train it.

 

AMD's issues with tessellation are due to their financial status. Without being able to invest in pushing their latest architecture to new products, they will indeed face problems that could have been easily resolved with a little invested money (which they don't have, hence the rebranding of products). If the R9 290X were on GCN 1.2, it would hit roughly ~57 FPS going by this chart, which tends to be within the margin of where the card falls performance-wise in most other games.

[image: Witcher 3 benchmark chart]

 

AMD doesn't need CUDA when there are more widely adopted (open) platforms in the compute industry (OpenCL). Now AMD has HSA, which is essentially (though not entirely) their own CUDA, and it's also rumored to be getting support for discrete cards.

 

AMD doesn't like proprietary software, like most of us developers. Keep in mind AMD has even planned an "OpenWorks" project to shut Nvidia's offering down by providing a completely open-source framework that offers all of the same functionality as GameWorks. I wouldn't blame them for turning it down (it has low adoption anyway, hence why it hit Git). You can only lock down software for so long, and Nvidia is learning that the hard way.

 

AMD's market share is low for numerous reasons. One could point at their slacking nature in both the CPU and GPU departments; in the GPU department it's not so much slacking as trying too hard to be conservative, pushing old architectures out on newly branded cards. Things would be a bit different in terms of their GPU sales if the R200 series were entirely based on GCN 1.2. Although no one disputes that Bulldozer was a flop.

 

Some people do, although most tend to buy what's best for their budget. If all you had was $150-200, then by all means get an R9 280X instead of a GTX 750 Ti/GTX 960.

 

As you can see, I'm far from an "AMD bible thumper" in that regard. I state things how I see them and nothing more. I'm also not afraid of sticking up for an underdog in the industry when people make wild and crazy accusations that hold zero weight. It would be no different than someone attacking Intel or Nvidia with complete nonsense. The objective isn't to be objective but to be right. I personally enjoy coming on LTT and spreading my knowledge. That's why I'm here every single day (pretty much).


AMD doesn't need CUDA when there are more widely adopted (open) platforms in the compute industry (OpenCL). Now AMD has HSA, which is essentially (though not entirely) their own CUDA, and it's also rumored to be getting support for discrete cards.

You do realize why AMD's workstation GPUs are doing badly? Even though they keep tanking the price?

AMD most certainly needs CUDA. Without it, AMD's solution is no solution at all for many customers.

AMD could have tanked the price to 1/10th, and Nvidia would still hold a majority of the market share.


You do realize why AMD's workstation GPUs are doing badly? Even though they keep tanking the price?

AMD most certainly needs CUDA. Without it, AMD's solution is no solution at all for many customers.

AMD could have tanked the price to 1/10th, and Nvidia would still hold a majority of the market share.

Are you familiar with how much of even your standard desktop software uses OpenCL? It's certainly used in a lot more places than CUDA.
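For a concrete sense of what vendor-neutral means here, a minimal sketch (my own illustration, not anything from this thread) of an OpenCL host program in C that lists every compute platform and device on a machine. The identical code runs against AMD, Nvidia, or Intel runtimes, which is exactly the lock-in-free property CUDA can't offer. Assumes an OpenCL SDK is installed; build with something like cc probe.c -lOpenCL (probe.c is just a hypothetical filename).

/* probe.c: enumerate all OpenCL platforms and devices, vendor-agnostic */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
        fprintf(stderr, "no OpenCL runtime found\n");
        return 1;
    }
    for (cl_uint p = 0; p < nplat; ++p) {
        char name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof name, name, NULL);
        printf("platform: %s\n", name);  /* e.g. AMD, Intel, or NVIDIA's runtime */

        cl_device_id devs[8];
        cl_uint ndev = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < ndev; ++d) {
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof name, name, NULL);
            printf("  device: %s\n", name);
        }
    }
    return 0;
}

Worth noting that even Nvidia ships an OpenCL runtime alongside CUDA, which is part of why so much desktop software can target OpenCL without caring whose GPU is installed.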


Are you familiar with how much of even your standard desktop software uses OpenCL? It's certainly used in a lot more places than CUDA.

Are you familiar with how little that has to do with anything I preached about?

My point still stands: as it currently looks, AMD's solution is no option at all for many Nvidia customers (in the workstation segment).


You did read where AMD used the words "completely sabotaged"? If that isn't petty, and an example of how they would make any claim they thought they could get away with, then there is no such thing.

 

We can only go on what they are saying and what claims the industry is making. Given this information, it is more probable that AMD just made a poor choice when they turned down CUDA and PhysX.

 

"AMD" or one individual? was that an official press release from AMD? Was it merely more rhetoric from their "Games Scientist?" I have heard that POV from more than just AMD employees. An opinion of an employee is not "AMD's" like the opinion of a Chief level officer. I have heard much misgiving over gameworks specifically, from inside AMD, not so much on PhysX (it's old news), but more lamenting the closed system keeping them from optimizing, not claiming premeditated sabotage. 

 

Oh, they DEFINITELY made a poor choice. Even without the guarantee, they should have put a team on getting AMD hardware up on Nvidia tech. Open up their market base, improve their outlook; then, even IF Nvidia cuts them off, it's Nvidia being the bad guy and AMD the put-upon waif, not Nvidia the exasperated benefactor and AMD the shortsighted or poorly managed underdog.


People have different perspectives on subjects due to varying knowledge of, and opinions about, them.

 

AMD's been led by some pretty big idiots over the years, which is what ultimately put the company in financial desperation (you can't blame the entire company for poor leadership). That's like beating the shit out of a dog for pissing on your floor when you refuse to train it.

 

AMD's issues with tessellation are due to their financial status. Without being able to invest in pushing their latest architecture to new products, they will indeed face problems that could have been easily resolved with a little invested money (which they don't have, hence the rebranding of products). If the R9 290X were on GCN 1.2, it would hit roughly ~57 FPS going by this chart, which tends to be within the margin of where the card falls performance-wise in most other games.

[image: Witcher 3 benchmark chart]

 

AMD doesn't need CUDA when there are more widely adopted (open) platforms in the compute industry (OpenCL). Now AMD has HSA, which is essentially (though not entirely) their own CUDA, and it's also rumored to be getting support for discrete cards.

 

AMD doesn't like proprietary software, like most of us developers. Keep in mind AMD has even planned an "OpenWorks" project to shut Nvidia's offering down by providing a completely open-source framework that offers all of the same functionality as GameWorks. I wouldn't blame them for turning it down (it has low adoption anyway, hence why it hit Git). You can only lock down software for so long, and Nvidia is learning that the hard way.

 

AMD's market share is low for numerous reasons. One could point at their slacking nature in both the CPU and GPU departments; in the GPU department it's not so much slacking as trying too hard to be conservative, pushing old architectures out on newly branded cards. Things would be a bit different in terms of their GPU sales if the R200 series were entirely based on GCN 1.2. Although no one disputes that Bulldozer was a flop.

 

Some people do, although most tend to buy what's best for their budget. If all you had was $150-200, then by all means get an R9 280X instead of a GTX 750 Ti/GTX 960.

 

As you can see, I'm far from an "AMD bible thumper" in that regard. I state things how I see them and nothing more. I'm also not afraid of sticking up for an underdog in the industry when people make wild and crazy accusations that hold zero weight. It would be no different than someone attacking Intel or Nvidia with complete nonsense. The objective isn't to be objective but to be right. I personally enjoy coming on LTT and spreading my knowledge. That's why I'm here every single day (pretty much).

 

Yeah, that's about what I thought you'd say.

 

In other words: no, you can't just say those things, because even though they are true, they don't have the positive AMD spin you like to put on everything.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


AMD's been led by some pretty big idiots over the years, which is what ultimately put the company in financial desperation (you can't blame the entire company for poor leadership). That's like beating the shit out of a dog for pissing on your floor when you refuse to train it.

What kind of logic is that? Of course we will blame AMD for poor decisions. We don't track down each individual responsible for those decisions and then blame them.

I am pissed at Intel for pretty much abandoning the performance crowd. I am pissed at Intel for that, not the individual who decided to do it. I am pissed at Samsung for not including a microSD slot and a removable battery in the S6. I am pissed at Samsung, not the individual person who suggested it and pushed it through during the design process.

See where I am going? When a company does something bad, it is the company's fault. I don't even know why you care that we blame AMD and not some individual. Is it just because it hurts you to see "AMD" and "bad" in the same sentence?

 

 

AMD's issues with tessellation are due to their financial status. Without being able to invest in pushing their latest architecture to new products, they will indeed face problems that could have been easily resolved with a little invested money (which they don't have, hence the rebranding of products). If the R9 290X were on GCN 1.2, it would hit roughly ~57 FPS going by this chart, which tends to be within the margin of where the card falls performance-wise in most other games.

The reason why AMD's tessellation performance is bad doesn't matter. The numbers won't get better because you say they don't have enough money to make them better.

It's nice to see that you at least can admit that AMD is behind in terms of tessellation performance.

 

 

It would be no different than someone attacking Intel or Nvidia with complete nonsense.

You mean you would stand up for Intel and Nvidia if, purely hypothetically here, some Dick accused one of them of injecting things into games with the sole intention of ruining, let's say, AMD's performance, and the person making that claim had 0 evidence to support it? In fact, there was quite a lot of evidence to support the idea that he was wrong.

It's nice to hear that you would stand on the rational side which questions those accusations, instead of doing a bunch of mental gymnastics to try and rationalize the claims.

 

=)


It's pointing out the poor tessellation performance of older Nvidia cards and AMD's current ones.

 

The problem is that a 780 Ti, for instance, has better tessellation performance in benchmarks than a 970, and close to a 980. So why is there such a huge gap in HairWorks performance, then? Something's not right.

 

Take a look at the benchmark in the middle, right under frame timing:

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/Grafikkarten-Benchmarks-1159196/

 

A 285 should not be neck and neck with a Titan, and the 780 Ti should not be beaten by a 290X by 12-19%. That is without HairWorks. Nvidia has gimped their 700 series.

 

Here are some facts; see if you can agree with them in the same way:

 

AMD's issues with tessellation are the root cause of HairWorks being a problem.

 

AMD refused to accept free CUDA when it was offered.

 

AMD turned down free PhysX when it was offered.

 

Depends on how you see it. Ultimately, the poor performance of HairWorks is due to it being horribly optimized, for instance by being wasteful with tessellation. But with all the flak HairWorks has gotten, it's not even the worst part of GameWorks: like the other *Works under VisualFX, it only depends on DX11. A lot of the other effects depend on APEX (advanced PhysX), and they will be a stuttery, horrible mess on AMD.
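To put "wasteful" into rough numbers: for an ordinary triangle patch, tessellated triangle output grows roughly with the square of the tessellation factor, which is why capping the factor in the driver helps AMD users so much. A back-of-the-envelope sketch in C (my own illustration, assuming the quadratic triangle-patch model; HairWorks' actual isoline amplification will differ):

/* Approximate geometry cost at a few tessellation factors.
   For a triangle patch with edge/inside factor f, the output is on
   the order of f*f triangles; the exact count depends on the domain. */
#include <stdio.h>

int main(void) {
    const int factors[] = {8, 16, 32, 64};
    for (int i = 0; i < 4; ++i) {
        int f = factors[i];
        printf("factor %2dx -> ~%4d triangles per patch (~%.0fx the 8x cost)\n",
               f, f * f, (f * f) / 64.0);
    }
    return 0;
}

By this estimate, the 64x factor HairWorks reportedly requests costs on the order of 64 times the geometry of an 8x cap, which is why the driver-side tessellation override is such an effective band-aid.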

 

What should AMD use CUDA for? CUDA support does not automatically result in full GameWorks support and optimization; just look at the 700 series. As for professional use, like video editing, I see your point. But I have to wonder for how long, until CUDA is scrapped.

 

We really don't know how "free" it was. Back then, physics was primarily Havok, which would run fine on AMD CPUs, so why go into this?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


The problem is that a 780 Ti, for instance, has better tessellation performance in benchmarks than a 970, and close to a 980. So why is there such a huge gap in HairWorks performance, then? Something's not right.

 

Take a look at the benchmark in the middle, right under frame timing:

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/Grafikkarten-Benchmarks-1159196/

 

A 285 should not be neck and neck with a Titan, and the 780 Ti should not be beaten by a 290X by 12-19%. That is without HairWorks. Nvidia has gimped their 700 series.

 

 

Depends on how you see it. Ultimately, the poor performance of HairWorks is due to it being horribly optimized, for instance by being wasteful with tessellation. But with all the flak HairWorks has gotten, it's not even the worst part of GameWorks: like the other *Works under VisualFX, it only depends on DX11. A lot of the other effects depend on APEX (advanced PhysX), and they will be a stuttery, horrible mess on AMD.

 

What should AMD use CUDA for? CUDA support does not automatically result in full GameWorks support and optimization; just look at the 700 series. As for professional use, like video editing, I see your point. But I have to wonder for how long, until CUDA is scrapped.

 

We really don't know how "free" it was. Back then, physics was primarily Havok, which would run fine on AMD CPUs, so why go into this?

I don't know what benchmarks you're looking at, but the R9 285 was far from the Titan X in performance.

Edit: Oh, the original Titan. You do realise that that card gets beaten by the R9 290X and GTX 970? In most games. And the R9 290X seems to be doing very well.

Edit 2: You yourself mentioned before that OpenCL and CUDA have advantages and disadvantages; people use what suits the task at hand. And Nvidia offers both. The fewer features that distinguish graphics cards (features, not performance), the better things get for the consumer due to more competition.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I don't know what benchmarks you're looking at, but the R9 285 was far from the Titan X in performance.

Edit: Oh, the original Titan. You do realise that that card gets beaten by the R9 290X and GTX 970? In most games. And the R9 290X seems to be doing very well.

 

Yeah, I meant the original Titan, which is a 780 with double precision and 6GB VRAM. It should not be close to a 285, but a 290. Something's just wrong with Kepler cards.

 

The 290X is doing well, which shows how well optimized Witcher 3 really is. But that is without HairWorks. You can look at HairWorks in the bench above; just press the "HairWorks" tab. The blue bars marked "An" (on) have it enabled, and "Aus" (off) don't.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Yeah, I meant the original Titan, which is a 780 with double precision and 6GB VRAM. It should not be close to a 285, but a 290. Something's just wrong with Kepler cards.

 

The 290X is doing well, which shows how well optimized Witcher 3 really is. But that is without HairWorks. You can look at HairWorks in the bench above; just press the "HairWorks" tab. The blue bars marked "An" (on) have it enabled, and "Aus" (off) don't.

Nvidia has had some 'problems' with their drivers lately, such as disabling mobile GPU overclocking again (it affects more than Maxwell-based GPUs, BTW), while even AMD's old drivers seem to be handling things just fine. TBH, it's highly likely that Nvidia is trying to pull a fast one, with making AMD look better as an unintentional side effect. Which they kind of are, since reducing the tessellation is easy for any AMD user but not for those with Nvidia cards who want things to look decent, meaning AMD cards can make the game look and perform better, to a certain point, after the tweaking. Long story short, both companies are as bad as each other and are out to make a profit, and sometimes the methods used by them work, and other times they don't.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


You mean you would stand up for Intel and Nvidia if, purely hypothetically here, some Dick accused one of them of injecting things into games with the sole intention of ruining, let's say, AMD's performance, and the person making that claim had 0 evidence to support it? In fact, there was quite a lot of evidence to support the idea that he was wrong.

It's nice to hear that you would stand on the rational side which questions those accusations, instead of doing a bunch of mental gymnastics to try and rationalize the claims.

 

The thing is, can anyone even claim Nvidia uses GameWorks to sabotage AMD if you can just turn the settings off?

 

To me, people with AMD cards complaining about GameWorks is like someone with a 960 complaining that he/she can't run the game on ultra. I consider those settings "premium"; it's just like ultra: the settings are only available to a small part of the playerbase. If you're really that set on having HairWorks, then you're SOL. Buy an Nvidia card (which still gets hammered by HairWorks anyway, so whatever).

 

It's really not that different from other Nvidia features (DSR, adaptive Vsync, ...); it just has to be implemented in the game itself. I can't imagine that CDPR would've turned it down if AMD had offered them TressFX and HDAO as alternatives (Edit: insert ON TIME here, apparently) to HairWorks and HBAO+.

 

Now, if the game ran badly on ALL settings on AMD cards, that would be something else, but that ISN'T the case. Only HairWorks is an issue, and it's not worth complaining about.

"It's a taxi, it has a FARE METER."


The thing is, can anyone even claim Nvidia uses GameWorks to sabotage AMD if you can just turn the settings off?

 

To me, people with AMD cards complaining about GameWorks is like someone with a 960 complaining that he/she can't run the game on ultra. I consider those settings "premium"; it's just like ultra: the settings are only available to a small part of the playerbase. If you're really that set on having HairWorks, then you're SOL. Buy an Nvidia card (which still gets hammered by HairWorks anyway, so whatever).

 

It's really not that different from other Nvidia features (DSR, adaptive Vsync, ...); it just has to be implemented in the game itself. I can't imagine that CDPR would've turned it down if AMD had offered them TressFX and HDAO as alternatives to HairWorks and HBAO+.

 

Now, if the game ran badly on ALL settings on AMD cards, that would be something else, but that ISN'T the case. Only HairWorks is an issue, and it's not worth complaining about.

Especially when the settings can be adjusted in the CCC (Catalyst Control Center).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


The thing is, can anyone even claim Nvidia uses GameWorks to sabotage AMD if you can just turn the settings off?

 

To me, people with AMD cards complaining about GameWorks is like someone with a 960 complaining that he/she can't run the game on ultra. I consider those settings "premium"; it's just like ultra: the settings are only available to a small part of the playerbase. If you're really that set on having HairWorks, then you're SOL. Buy an Nvidia card (which still gets hammered by HairWorks anyway, so whatever).

 

It's really not that different from other Nvidia features (DSR, adaptive Vsync, ...); it just has to be implemented in the game itself. I can't imagine that CDPR would've turned it down if AMD had offered them TressFX and HDAO as alternatives to HairWorks and HBAO+.

 

Now, if the game ran badly on ALL settings on AMD cards, that would be something else, but that ISN'T the case. Only HairWorks is an issue, and it's not worth complaining about.

 

If it's a part of the game, then yes. There is a huge difference between a filter like anti-aliasing being proprietary (don't care) and an actual in-game effect that is part of the game's graphics. A 290X is just as premium as a 970, if not more so. So yes, there is a difference when you, as an AMD user, get less of the graphics than what you see in the videos, even with a high-end AMD card.

 

Actually, CDPR did turn down TressFX, as they were too far into production. Also, CDPR has had problems with resources (manpower), so I doubt they'd have had time to implement a third hair effect. But that is also kind of the point: it should not be necessary to have a third effect. Standard and premium should be enough, as long as the latter is not vendor-biased.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Nvidia has had some 'problems' with their drivers lately, such as disabling mobile GPU overclocking again (it affects more than Maxwell-based GPUs, BTW), while even AMD's old drivers seem to be handling things just fine. TBH, it's highly likely that Nvidia is trying to pull a fast one, with making AMD look better as an unintentional side effect. Which they kind of are, since reducing the tessellation is easy for any AMD user but not for those with Nvidia cards who want things to look decent, meaning AMD cards can make the game look and perform better, to a certain point, after the tweaking. Long story short, both companies are as bad as each other and are out to make a profit, and sometimes the methods used by them work, and other times they don't.

 

Yes, disabling MOBILE GPU overclocking, such a horrible thing. Not. How many does that affect, really? You're beating a dead horse here, as far as I am concerned. Do you know how 100% of laptops, AIOs, and anything with a mobile GPU is designed and created? I do. They have a certain thermal and power limit, and they pretty much max it out. Sure, if you have 10 fans blowing underneath with the bottom off, you can MAYBE overclock (power is the issue), but it's not the smartest idea in the world. It's up to the MANUFACTURER of said product, NOT NVIDIA, to decide what that thermal and power limit is. Then that manufacturer sets the clock speeds accordingly. A fixed power supply with very tight thermal limits... it's going to AND HAS led to dead laptops.


Yes, disabling MOBILE GPU overclocking, such a horrible thing. Not. How many does that affect, really? You're beating a dead horse here, as far as I am concerned. Do you know how 100% of laptops, AIOs, and anything with a mobile GPU is designed and created? I do. They have a certain thermal and power limit, and they pretty much max it out. Sure, if you have 10 fans blowing underneath with the bottom off, you can MAYBE overclock (power is the issue), but it's not the smartest idea in the world. It's up to the MANUFACTURER of said product, NOT NVIDIA, to decide what that thermal and power limit is. Then that manufacturer sets the clock speeds accordingly. A fixed power supply with very tight thermal limits... it's going to AND HAS led to dead laptops.

You don't get it. It's the components in the laptop that get damaged by some people, so it's up to the laptop manufacturer, not Nvidia, to sort that out. Nvidia knows that people can squeeze a lot of performance out of mobile GPUs. This, for example, is the record my friend set on his MSI gaming laptop: http://hwbot.org/submission/2850826_higleb_unigine_heaven___xtreme_preset_geforce_gtx_860m_1240.53_dx11_marks

 

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


This.

 

If anything, Nvidia gimped their own cards (Kepler).

 

GCN GPUs perform fine in the game.

 

They didn't gimp their own cards. The non-Maxwell cards don't have good enough tessellation performance to run HairWorks. When Maxwell launched, Nvidia even said their Maxwell GPUs would have (2 or 3?) times the tessellation performance of Kepler.

 

Am I the only one just waiting for TotalBiscuit to do a video on this topic and introduce a bit of sanity into this argument?

 

Not saying this with any ill will, but there are a few people on this forum, including me, who are keeping a level head with this and are just looking at facts rather than drawing up conspiracy theories. TB would probably say something along the lines of "*cue English accent* Turn the damn effect off if you can't run it! The game runs fine without the effect." He, like other people on this forum, is going to actually do his research on what the issue is with Witcher 3 and see there is no such gimping or code locking of any kind going on.

 

*sigh*

 

Look, as an AMD owner, I still think Richard needs to, well:

 

-snipped the gif-

 

He has made his point before, and there is nothing to gain from just throwing more flames on the fucking fire. The feature can be fucking turned off, for fuck's sake. That makes it close to a non-fucking-issue; it's not like the non-HairWorks hair looks disgusting or makes all characters go fucking bald. This actually doesn't help us AMD gamers and just furthers his fucking agenda.

 

And for what? Is Nvidia gonna change? Fat fucking chance. This would just marginalize AMD gamers even further. Wait for a fucking game where you can't turn off GameWorks to make a big stink; otherwise this is just pandering.

 

You're doing it wrong. If you own a Radeon card, you're supposed to draw ridiculous conclusions and side with the Chief Gaming Scientist of AMD without question.


If it's a part of the game, then yes. There is a huge difference between a filter like anti-aliasing being proprietary (don't care) and an actual in-game effect that is part of the game's graphics. A 290X is just as premium as a 970, if not more so. So yes, there is a difference when you, as an AMD user, get less of the graphics than what you see in the videos, even with a high-end AMD card.

 

You're not entitled to use Nvidia software just because they added it into the game; that is just retarded.

Am I going to be pissed now that HD 7970 users get, through Mantle, the performance of a GTX 780 in Battlefield 4?

No, because AMD worked hard to produce the software; they spent the R&D and worked together with the developer, and I have no right to use it as I don't own any of their products.

And the same applies the other way around. There is literally no debate here besides fanboys feeling entitled to get extra features for free.

 

RTX2070OC 


This topic is now closed to further replies.