AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance

You're not entitled to use Nvidia software just because they added it to the game; that's an absurd position.

Am I going to be pissed now that HD 7970 users get GTX 780-level performance through Mantle in Battlefield 4?

No, because AMD worked hard to produce that software: they spent the R&D money and worked together with the developer, and I have no right to use it since I don't own any of their products.

And the same applies the other way around. There is literally no debate here beyond fanboys feeling entitled to get extra features for free.

 

Again, Mantle had absolutely no influence on the gaming experience of Nvidia users.

 

But you're missing the point. What do we, as gamers, want out of the industry? Segregation and a gimped experience for one or both vendors, or equal experiences in content, effects, and graphics for all gamers? That is why we all spend obscene amounts of money on a piece of tech that goes obsolete in a few years: to play awesome games. Why would any gamer condone proprietary tech that gimps the experience for fellow gamers?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Again, Mantle had absolutely no influence on the gaming experience of Nvidia users.

Neither does HairWorks, or anything else in the GameWorks library, for AMD users. You can turn the effects off.

But you're missing the point. What do we, as gamers, want out of the industry? Segregation and a gimped experience for one or both vendors, or equal experiences in content, effects, and graphics for all gamers? That is why we all spend obscene amounts of money on a piece of tech that goes obsolete in a few years: to play awesome games. Why would any gamer condone proprietary tech that gimps the experience for fellow gamers?

Nothing Nvidia gives developers gimps performance for the red team. That's a myth you guys keep spreading, and I still cannot figure out why. I do have a guess, though: you're easily susceptible to AMD's marketing tactics, and to users on the internet who make a claim that you just run with.


If it's a part of the game, then yes. There is a huge difference between a filter like anti-aliasing being proprietary (I don't care about that) and an actual in-game effect that is part of the game's graphics. A 290X is just as premium as a 970, if not more so. So yes, there is a difference when you, as an AMD user, get less of the graphics than what you see in the videos, even with a high-end AMD card.

 

Actually, CDPR did turn down TressFX, as they were too far into production. CDPR has also had problems with resources (manpower), so I doubt they'd have had time to implement a third hair effect. But that is also kind of the point: it should not be necessary to have a third effect. Standard and premium should be enough, as long as the latter is not vendor-biased.

 

If AMD was late to the party, then that's AMD's fault, not Nvidia's or CDPR's. Nvidia also provided a couple of people to help CDPR with the implementation. They spent money on this; AMD did not. If you don't do shit in a group assignment, do you still expect good grades?

 

You are actually contradicting yourself here, though: AA can be proprietary, but hair physics cannot? Both provide a different visual appearance, albeit one more visible to most people in the case of HairWorks. That being said, HairWorks is inherently heavy on the GPU. Even on Nvidia GPUs it comes at a fairly hefty cost (honestly, it takes SLI 970s or better to even run decently). Nvidia developed HairWorks using tessellation, and Nvidia helped implement HairWorks in TW3. They went the extra mile to ensure their customers got the premium service they think they deserve; AMD did not. Nvidia's customers indirectly paid for HairWorks, since Nvidia developed it with the money earned from their purchases. That's the premium feature they get for choosing Nvidia. AMD customers chose AMD, and they get AMD's features.

 

My point still stands: this is no different from in-driver features. It's a feature that requires a certain type of hardware (in this case, hardware with sufficient tessellation performance) in order to work properly. The only difference is that it requires in-game implementation, which is fine if the developer is willing to put in the extra work. You don't even get less graphics; you can still turn it on if you're willing to live with the hefty performance impact. It's almost hilarious how you try to imply that HairWorks is an essential feature, too. I'd say it's an experimental feature at best, in the state it is in right now.

 

Yes, disabling MOBILE GPU overclocking, such a horrible thing. Not. How many people does that really affect? You're beating a dead horse here, as far as I am concerned. Do you know how 100% of laptops, AIOs, and anything else with a mobile GPU is designed and built? I do. They have a certain thermal and power limit, and they pretty much max it out. Sure, if you have ten fans blowing underneath with the bottom panel off, you can MAYBE overclock (power is the real issue), but it's not the smartest idea in the world. It's up to the MANUFACTURER of the product, NOT NVIDIA, to decide what that thermal and power limit is, and the manufacturer then sets the clock speeds accordingly. A fixed power supply with very tight thermal limits is going to lead, and HAS led, to dead laptops.

I overclocked a Laptop GPU once. Then I brought it in for repair after it blew up a month later. 0/10 wouldn't recommend.

 

If you're buying a laptop for overclocking, you're doing it wrong. Also, the AMD card in my laptop doesn't allow overclocking either, so why is this even a discussion?

"It's a taxi, it has a FARE METER."


You don't get it. It's the components in the laptop that get damaged in some cases, so it's up to the laptop manufacturer to sort out anything that isn't Nvidia's part. Nvidia knows that people can squeeze a lot of performance out of mobile GPUs. This, for example, is the record my friend set on his MSI gaming laptop: http://hwbot.org/submission/2850826_higleb_unigine_heaven___xtreme_preset_geforce_gtx_860m_1240.53_dx11_marks

 

 

It's more than just the surrounding components; you actually can damage the GPU/CPU itself with overclocking. It comes down to what you're bumping up, power for instance, and then there are the thermals, where you don't always have room to play around. Nvidia knows what their GPUs can do, yes. BUT it is also up to the manufacturer to DESIGN the laptop. How about you slam a 980M into a Dell XPS 13 Touch and then overclock it? What do you think will happen, even if you somehow manage to fit it in? Is it Nvidia's fault you can't get that GPU in there? No, technically it is Dell's, since the machine wasn't designed for it. Likewise, if you have only 100 W to play with and your system is already using 98 W, what happens when you bump it up to 110 W? The heat story is the same: everything in a laptop is designed for a specific purpose, power consumption, thermal dissipation with and without fans, and so on.
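To put rough numbers on that power-budget point, here is a quick back-of-the-envelope sketch (my own illustration in Python, not vendor data) using the common dynamic-power approximation P ~ f * V^2. Every wattage, clock, and voltage below is hypothetical; the only point is how quickly a small overclock eats a fixed laptop budget.

# Back-of-the-envelope sketch, not vendor data: dynamic power scales roughly
# with frequency times voltage squared (P ~ f * V^2). All numbers below are
# hypothetical, chosen only to illustrate how thin a laptop's headroom is.

def scaled_power(base_watts, base_mhz, base_mv, new_mhz, new_mv):
    """Estimate the new power draw from the P ~ f * V^2 approximation."""
    return base_watts * (new_mhz / base_mhz) * (new_mv / base_mv) ** 2

BUDGET_W = 100.0       # hypothetical total platform power budget
STOCK_SYSTEM_W = 98.0  # hypothetical stock draw of the whole machine
STOCK_GPU_W = 45.0     # hypothetical share of that drawn by the GPU

# A 10% core overclock that needs a 5% voltage bump to stay stable:
oc_gpu_w = scaled_power(STOCK_GPU_W, base_mhz=1000, base_mv=1000,
                        new_mhz=1100, new_mv=1050)
oc_system_w = STOCK_SYSTEM_W - STOCK_GPU_W + oc_gpu_w

print(f"GPU draw: {STOCK_GPU_W:.1f} W -> {oc_gpu_w:.1f} W")
print(f"System: {oc_system_w:.1f} W against a {BUDGET_W:.0f} W budget")
# Roughly 107.6 W against a 100 W budget: already past the limit,
# with the cooling sized for the original 98 W.

The exact figures don't matter; the point is that a design sitting at 98 W of a 100 W envelope has no headroom for even a modest bump unless the manufacturer planned for it.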

 

 

 I overclocked a Laptop GPU once. Then I brought it in for repair after it blew up a month later. 0/10 wouldn't recommend.

 

If you're buying a laptop for overclocking, you're doing it wrong. Also, the AMD card in my laptop doesn't allow overclocking either, so why is this even a discussion?

 

Because fanboys will use anything to attack, lie about, and degrade a company.

 

Yeah, I haven't done it myself, but I know people that have been in that situation and were wondering why Nvidia would allow it in the first place.


 

If AMD was late to the party, then that's AMD's fault, not Nvidia's or CDPR's. Nvidia also provided a couple of people to help CDPR with the implementation. They spent money on this; AMD did not. If you don't do shit in a group assignment, do you still expect good grades?

 

You are actually contradicting yourself here, though: AA can be proprietary, but hair physics cannot? Both provide a different visual appearance, albeit one more visible to most people in the case of HairWorks. That being said, HairWorks is inherently heavy on the GPU. Even on Nvidia GPUs it comes at a fairly hefty cost (honestly, it takes SLI 970s or better to even run decently). Nvidia developed HairWorks using tessellation, and Nvidia helped implement HairWorks in TW3. They went the extra mile to ensure their customers got the premium service they think they deserve; AMD did not. Nvidia's customers indirectly paid for HairWorks, since Nvidia developed it with the money earned from their purchases. That's the premium feature they get for choosing Nvidia. AMD customers chose AMD, and they get AMD's features.

 

My point still stands: this is no different from in-driver features. It's a feature that requires a certain type of hardware (in this case, hardware with sufficient tessellation performance) in order to work properly. The only difference is that it requires in-game implementation, which is fine if the developer is willing to put in the extra work. You don't even get less graphics; you can still turn it on if you're willing to live with the hefty performance impact. It's almost hilarious how you try to imply that HairWorks is an essential feature, too. I'd say it's an experimental feature at best, in the state it is in right now.

 

AMD has been optimizing Witcher 3 with CDPR for a long time, so they invested resources in the devs as well. We don't know CDPR's reasoning. They stated it was too late, but considering the half-year delay, it might just as well have been resource limitations. Even if we assume it really was too late, is that a good reason to deny gamers access to a proper experience?

 

No contradiction here. AA is a filter and doesn't change the graphics or effects per se. At 4K, AA is almost pointless. There are plenty of standardized types of AA out there, and Nvidia's proprietary ones aren't really that useful anyway.

Does that not just prove that HairWorks is stupidly unoptimized for everyone? AMD invented TressFX years ago, so I think they've done plenty. So let me ask you: if TressFX 3 in Deus Ex: Mankind Divided were closed source, so Nvidia could not optimize for it, and it ran like shit on Nvidia hardware, would you condone that?

 

I don't agree: optimization is necessary either way (just look at the 700 series), so source-code access is necessary either way. But it doesn't change the fact that this effect runs at 64x tessellation, which is insanely wasteful.

Experimental? It's a graphical effect released in a final build of a released game. Calling it experimental or a beta is silly.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Can we take five minutes off from the fan war and acknowledge some mutual information for a second? I decided to dig into the HairWorks thing a bit more, because I find the relationships between developers and hardware manufacturers interesting, and I managed to find some details I did not know before.

 

http://www.forbes.com/sites/jasonevangelho/2015/05/21/amd-is-wrong-about-the-witcher-3-and-nvidias-hairworks/

 

No idea how accurate this article is, but it states that the claims made by Huddy are unfounded, because GameWorks was shown in Witcher 3 over a year ago and had been in development for over two years. That raises the following question: did AMD not attempt to ask the developers to use TressFX before the game's development got under way, or did CDPR outright refuse to do it? If what Huddy said about working with AMD from the beginning is true, one would think TressFX would be involved in some way or another. The coexistence of these technologies is not impossible, and we know this for a fact based on some other AAA titles. They also claim Nvidia said CDPR could work with other vendors too, so it's not as if Nvidia took away AMD's only fighting chance to compete in this specific title.

 

I skipped a lot of the posts from pages 5-9, because the spread of misinformation was getting out of hand, so sorry if this was already posted. Just thought it might shed more light on the topic.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


AMD has been optimizing Witcher 3 with CDPR for a long time, so they invested resources in the devs as well. We don't know CDPR's reasoning. They stated it was too late, but considering the half-year delay, it might just as well have been resource limitations. Even if we assume it really was too late, is that a good reason to deny gamers access to a proper experience?

 

No contradiction here. AA is a filter and doesn't change the graphics or effects per se. At 4K, AA is almost pointless. There are plenty of standardized types of AA out there, and Nvidia's proprietary ones aren't really that useful anyway.

Does that not just prove that HairWorks is stupidly unoptimized for everyone? AMD invented TressFX years ago, so I think they've done plenty. So let me ask you: if TressFX 3 in Deus Ex: Mankind Divided were closed source, so Nvidia could not optimize for it, and it ran like shit on Nvidia hardware, would you condone that?

 

I don't agree: optimization is necessary either way (just look at the 700 series), so source-code access is necessary either way. But it doesn't change the fact that this effect runs at 64x tessellation, which is insanely wasteful.

Experimental? It's a graphical effect released in a final build of a released game. Calling it experimental or a beta is silly.

 

Yeah, TressFX was 100% finished when it was released, and it looked fantastic too /s

 

Maybe not experimental; that was just me not choosing the right word for it. But at the very least it is nowhere near finished. I'm also talking about the tech, not the specific implementations.

 

TressFX is a different story, though. It's open source by nature, but even if it weren't, I wouldn't complain. I don't care, because simulated hair in games is bleh. If you let such a small setting take away from a still fantastic-looking game, then your priorities are wrong. I love great graphics, don't get me wrong, but I don't care for all the special bells and whistles; I still value framerate over looks in the end. I don't encourage closed stuff like GameWorks, not at all, but I'm also against blaming a company for doing it.

 

As long as it is optional, I don't see a problem with it. Would it be better to have a centralized system? Sure, but that isn't the case right now, and complaining about it isn't going to help anyone at this point. If TressFX 3 is any good, devs will pick it up, and Nvidia will ultimately start working on it themselves. In the state it was in in Tomb Raider, however, I can't blame them for making their own version, because DAMN, that was ugly. There's a reason it's not widely used at all. I hope the next version looks better; otherwise I'll still have no reason to enable it, even if it barely impacts performance.

"It's a taxi, it has a FARE METER."


Yeah, TressFX was 100% finished when it was released, and it looked fantastic too /s

 

Maybe not experimental; that was just me not choosing the right word for it. But at the very least it is nowhere near finished. I'm also talking about the tech, not the specific implementations.

 

TressFX is a different story, though. It's open source by nature, but even if it weren't, I wouldn't complain. I don't care, because simulated hair in games is bleh. If you let such a small setting take away from a still fantastic-looking game, then your priorities are wrong. I love great graphics, don't get me wrong, but I don't care for all the special bells and whistles; I still value framerate over looks in the end. I don't encourage closed stuff like GameWorks, not at all, but I'm also against blaming a company for doing it.

 

As long as it is optional, I don't see a problem with it. Would it be better to have a centralized system? Sure, but that isn't the case right now, and complaining about it isn't going to help anyone at this point. If TressFX 3 is any good, devs will pick it up, and Nvidia will ultimately start working on it themselves. In the state it was in in Tomb Raider, however, I can't blame them for making their own version, because DAMN, that was ugly. There's a reason it's not widely used at all. I hope the next version looks better; otherwise I'll still have no reason to enable it, even if it barely impacts performance.

 

I think you are misinformed. TressFX 1.0 was done when it released with Tomb Raider. I thought it set a new high standard for hair, and I still do. Crystal Dynamics had to fix a bug in the game for Nvidia users, but that was it. TressFX will be on version 3 in Deus Ex: Mankind Divided.

 

HairWorks has been included in three AAA games now, so stating that it's not done is simply incorrect. That does not mean we won't see upgrades or new versions of it, of course, but it is not released as a beta.

 

Oh, I think Witcher 3 looks amazing, even without HairWorks. My issue is more general, in that I don't like closed, exclusive graphics effects that give certain gamers less of an experience (at least graphically) just because they chose another brand of GPU.

 

Again, I'm not necessarily against middleware made by graphics vendors, but making it closed and unoptimized is a problem, I think.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I think you are misinformed. TressFX 1.0 was done when it released with Tomb Raider. I thought it set a new high standard for hair, and I still do. Crystal Dynamics had to fix a bug in the game for Nvidia users, but that was it. TressFX will be on version 3 in Deus Ex: Mankind Divided.

 

HairWorks has been included in three AAA games now, so stating that it's not done is simply incorrect. That does not mean we won't see upgrades or new versions of it, of course, but it is not released as a beta.

 

Oh, I think Witcher 3 looks amazing, even without HairWorks. My issue is more general, in that I don't like closed, exclusive graphics effects that give certain gamers less of an experience (at least graphically) just because they chose another brand of GPU.

 

Again, I'm not necessarily against middleware made by graphics vendors, but making it closed and unoptimized is a problem, I think.

 

Have you seen how it looks? If that's what you call "done", then by all means, be my guest. I expected more back then. As I've said, I hope TressFX 3 will impress, but we'll see.

 

I was, again, talking about the technology. Neither solution actually looks good, in my opinion, and that's what I mean by "experimental". This tech is relatively new and unfortunately not good enough for me; I still prefer polygonal hair over the solutions that exist now. In my opinion, it doesn't look good enough to be considered a "must have" setting, far from it.

"It's a taxi, it has a FARE METER."


Can't win the hardware war, might as well win the PR war.

I wish Intel would just crush them already and then hand the x86 license to Samsung or some other competitor (Samsung seems like an interesting one).

Intel and Nvidia would each love access to AMD's IP, though neither can afford to let AMD die. If AMD dies, Intel has to deal with Samsung, IBM, Nvidia, or potentially Apple on the x86 front. If AMD dies and Intel gets hold of all that graphics IP, Nvidia is screwed: Intel dGPUs using that HDL on a better process, with a CUDA license, would mean the end of Nvidia's dGPU business.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I think you are misinformed. TressFX 1.0 was done when it released with Tomb Raider. I thought it set a new high standard for hair, and I still do. Crystal Dynamics had to fix a bug in the game for Nvidia users, but that was it. TressFX will be on version 3 in Deus Ex: Mankind Divided.

 

HairWorks has been included in three AAA games now, so stating that it's not done is simply incorrect. That does not mean we won't see upgrades or new versions of it, of course, but it is not released as a beta.

 

Oh, I think Witcher 3 looks amazing, even without HairWorks. My issue is more general, in that I don't like closed, exclusive graphics effects that give certain gamers less of an experience (at least graphically) just because they chose another brand of GPU.

 

Again, I'm not necessarily against middleware made by graphics vendors, but making it closed and unoptimized is a problem, I think.

 

I do agree with you on vendors working closely on software designed only for their hardware, as it has promoted (and will continue to promote) an unhealthy relationship with developers. What I would much rather see is Nvidia or AMD getting into the business of releasing software that improves the experience for all gamers, regardless of which graphics vendor they chose. They can make money from licensing the software; there's no point in making it exclusive. The problem is that Nvidia and AMD absolutely refuse to use anything made by each other. We've seen this before with CUDA, Mantle, etc., and we will see it for as long as they both exist.

 

As long as this unhealthy relationship exists, there will always be a perceived bias from the community. When someone using an AMD graphics card experiences a hiccup in a game that shows "Nvidia: The Way It's Meant to Be Played" on a title card, that will be their prime target for blame. The same goes the other way around.

 

At the end of the day, it all comes down to business. Nvidia wants to inflate the importance of their GPUs, and AMD wants to do the same. They will do whatever it takes to entice people to purchase their hardware over a competitor's. If that means making exclusive software featured only on their hardware, one cannot blame them for it. People often claim AMD is devoid of guilt because they offered Mantle to Nvidia for free and because it's open source, but using it could have put Nvidia at AMD's mercy, and I am certain everyone understands that. No company wants to rely on a competitor unless it is absolutely necessary, and even then, they hate it.

 

Situations like these tend to suck for the consumer, because they feel they are being stripped of an experience based on a "wrong" choice. From what I am reading, it seems AMD and CDPR are working on making sure everyone gets a great experience with it, so let's hope they figure something out. From what I can already see, there are workarounds that alleviate most of the issues. I am going to stop this long wall of text here, because I forgot exactly where I was going with it.

 

Snap into a Slim Jim.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Have you seen how it looks? If that's what you call "done", then by all means, be my guest. I expected more back then. As I've said, I hope TressFX 3 will impress, but we'll see.

 

I was, again, talking about the technology. Neither solution actually looks good, in my opinion, and that's what I mean by "experimental". This tech is relatively new and unfortunately not good enough for me; I still prefer polygonal hair over the solutions that exist now. In my opinion, it doesn't look good enough to be considered a "must have" setting, far from it.

 

I have, and I like it compared to the alternatives. But what do you think is wrong with it? What would you like to be changed?

 

This is what TressFX 3 looks like in Deus Ex:

 

[Image: Dawn-Engine-3.jpg]

 

Is it because it's too "shiny"?

 

I do agree with you on vendors working closely on software designed only for their hardware, as it has promoted (and will continue to promote) an unhealthy relationship with developers. What I would much rather see is Nvidia or AMD getting into the business of releasing software that improves the experience for all gamers, regardless of which graphics vendor they chose. They can make money from licensing the software; there's no point in making it exclusive. The problem is that Nvidia and AMD absolutely refuse to use anything made by each other. We've seen this before with CUDA, Mantle, etc., and we will see it for as long as they both exist.

 

Well, Nvidia will use Vulkan, I'm sure. TressFX also worked more than fine on Nvidia. I get your point, but since AMD generally uses open tech that Nvidia can use, I don't think it's completely fair to criticize them to the same degree. But yeah, industry standards like Adaptive-Sync really are the best solution for consumers, and that goes for middleware as well.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Intel and Nvidia would each love access to AMD's IP, though neither can afford to let AMD die. If AMD dies, Intel has to deal with Samsung, IBM, Nvidia, or potentially Apple on the x86 front. If AMD dies and Intel gets hold of all that graphics IP, Nvidia is screwed: Intel dGPUs using that HDL on a better process, with a CUDA license, would mean the end of Nvidia's dGPU business.

Can you explain this further?


Can you explain this further?

 

It means the x86 license would become available to any of those companies, allowing them to become a competitor in the CPU market.

 

Well, Nvidia will use Vulkan, I'm sure. TressFX also worked more than fine on Nvidia. I get your point, but since AMD generally uses open tech that Nvidia can use, I don't think it's completely fair to criticize them to the same degree. But yeah, industry standards like Adaptive-Sync really are the best solution for consumers, and that goes for middleware as well.

 

I understand that AMD tries to keep everything open, but history has shown us that both companies tend to avoid software produced by the other. Vulkan will be an exception, because it is the direct replacement for OpenGL, and it would be stupid for Nvidia not to support it. Vulkan will be an interesting subject in its own right, because if you look at Khronos's list of supporting companies, even Microsoft is on it. That strikes me as odd, considering Vulkan is slated to compete with DX12.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


It means the x86 license would become available to any of those companies, allowing them to become a competitor in the CPU market.

There's always VIA...

I don't really see any of them having much interest in x86 either.


I have, and I like it compared to the alternatives. But what do you think is wrong with it? What would you like to be changed?

 

This is what TressFX 3 looks like in Deus Ex:

 

[Image: Dawn-Engine-3.jpg]

 

Is it because it's too "shiny"?

 

 

My problem lies more with the movement, though the hair often looks too much like a shampoo commercial as well.

 

More often than not the hair looks either weightless or too heavy. In Tomb Raider, the hair flew in all directions as if there were no friction or weight to it. HairWorks is slightly better, but on animals it moves too much. It all looks WAY too soft (especially considering those are wild animals, and Geralt doesn't really bathe THAT often).

 

It's hard to explain what exactly I don't like, but this should give you a general idea. 

 

Tangled (the movie) is an example of good hair in CGI: even though it looks shiny (that's probably a style choice), it has the right amount of weight to it.
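For what it's worth, that "weightless versus too heavy" feel mostly comes down to a couple of tuning knobs in the simulation. Here is a minimal strand sketch of my own (plain Verlet integration with distance constraints, written in Python for readability; it is not TressFX or HairWorks code) where the damping and gravity values are exactly what push the result toward floaty shampoo-commercial hair or stiff, limp hair.

GRAVITY = -9.8   # m/s^2, pulls strand points down along y
DAMPING = 0.98   # near 1.0 = floaty and bouncy, lower = limp and "heavy"
SEGMENT = 0.02   # rest length between neighbouring points, in metres
DT = 1.0 / 60.0  # simulation timestep (one 60 fps frame)

def step(points, prev_points, root):
    """Advance one frame; points/prev_points are lists of [x, y] pairs."""
    # Verlet integration: velocity is implied by (current - previous).
    for i in range(1, len(points)):
        x, y = points[i]
        px, py = prev_points[i]
        vx, vy = (x - px) * DAMPING, (y - py) * DAMPING
        prev_points[i] = [x, y]
        points[i] = [x + vx, y + vy + GRAVITY * DT * DT]
    points[0] = list(root)  # the root point follows the head/scalp bone

    # A few relaxation passes keep each segment near its rest length,
    # so the strand bends instead of stretching.
    for _ in range(4):
        for i in range(len(points) - 1):
            ax, ay = points[i]
            bx, by = points[i + 1]
            dx, dy = bx - ax, by - ay
            dist = max((dx * dx + dy * dy) ** 0.5, 1e-6)
            corr = (dist - SEGMENT) / dist * 0.5
            if i == 0:
                # the root is pinned, so only its child point moves
                points[i + 1] = [bx - dx * corr * 2, by - dy * corr * 2]
            else:
                points[i] = [ax + dx * corr, ay + dy * corr]
                points[i + 1] = [bx - dx * corr, by - dy * corr]
    return points, prev_points

Push DAMPING toward 1.0 and the strand keeps all of its momentum and whips around weightlessly; drop it toward 0.9 and it hangs like wet rope. The same basic tech can read as shampoo commercial or as too heavy purely from how those values are tuned.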

"It's a taxi, it has a FARE METER."


There's always VIA...

I don't really see any of them having much interest in x86 either.

 

I can totally see both Samsung and Nvidia wanting a stake in x86, Apple too. Nvidia is already experimenting with SoCs using Tegra, and Samsung would gladly expand their market presence into absolutely anything. Apple would mostly want it as the final touch on keeping the hardware side of their products in-house. I do not know if they would all have the resources to compete directly with Intel (Apple probably would, but they would not cater to the average consumer like Intel does), but I know they would still jump at the opportunity for x86.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


I can totally see both Samsung and Nvidia wanting a stake in x86, Apple too. Nvidia is already experimenting with SoCs using Tegra, and Samsung would gladly expand their market presence into absolutely anything. Apple would mostly want it as the final touch on keeping the hardware side of their products in-house. I do not know if they would all have the resources to compete directly with Intel (Apple probably would, but they would not cater to the average consumer like Intel does), but I know they would still jump at the opportunity for x86.

What segment of the market would Samsung target with x86?

Same question for Nvidia; I see Nvidia having more interest in ARM.

Why wouldn't Apple just go full ARM with an upscaled ARM core? That would make much more sense.


It means the x86 license would become available to any of those companies, allowing them to become a competitor in the CPU market.

 

 

I understand that AMD tries to keep everything open, but history has shown us that both companies tend to avoid software produced by the other. Vulkan will be an exception, because it is the direct replacement for OpenGL, and it would be stupid for Nvidia not to support it. Vulkan will be an interesting subject in its own right, because if you look at Khronos's list of supporting companies, even Microsoft is on it. That strikes me as odd, considering Vulkan is slated to compete with DX12.

 

I just want to comment on the first point. While it's correct that a sale of AMD voids the x86 contracts and licenses, it is important to remember that Intel uses AMD IP as well, so those contracts and licensing deals would have to be remade either way.

 

Yeah, I agree. That is why I support open industry standards a lot more than GameWorks and the like.

 

It's not that odd, since Vulkan has to run properly on Windows as well.

 

My problem lies more with the movement, though the hair often looks too much like a shampoo commercial as well.

 

More often than not the hair looks either weightless or too heavy. In Tomb Raider, the hair flew in all directions as if there were no friction or weight to it. HairWorks is slightly better, but on animals it moves too much. It all looks WAY too soft (especially considering those are wild animals, and Geralt doesn't really bathe THAT often).

 

It's hard to explain what exactly I don't like, but this should give you a general idea. 

 

Tangled (the movie) is an example of good hair in CGI: even though it looks shiny (that's probably a style choice), it has the right amount of weight to it.

 

The only big issue I saw with TR was the no-gravity thing. But yeah, when you look at the tech demos, all of these techniques are just a bit too bouncy. Sadly, I have not seen TressFX 3 in motion, but if it's anything like version 2, it won't be that different.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I made a mistake; it was Richard Huddy who stated this:

 

http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/

 

As for the contracts, they are under heavy NDA, so they will never be disclosed in public. All we can do is take Huddy's word for it. So that's up to the individual.

 

As for exclusivity, Nvidia has apparently stated that they do not prevent devs from using other tech, so that is good. I really wish we could get a TressFX 3 patch, but that is not going to happen. Either way, changing the tessellation multiplier pretty much solves the issue of HairWorks smashing performance.

 

Exactly: Huddy said this, and there is zero reason for me, or anyone else for that matter, to take his word for it.

 

As far as the contracts go: as you said, they are under NDA, so I'm not sure why you automatically assumed they are exclusive when you haven't even seen them. I work for IBM, and our contracts are under NDA as well, not because IBM is doing something illegal or trying to cripple the competition, but because they contain sensitive information about customers and business partners.

CPU: AMD Ryzen 7 3800X Motherboard: MSI B550 Tomahawk RAM: Kingston HyperX Predator RGB 32 GB (4x8GB) DDR4 GPU: EVGA RTX3090 FTW3 SSD: ADATA XPG SX8200 Pro 512 GB NVME | Samsung QVO 1TB SSD  HDD: Seagate Barracuda 4TB | Seagate Barracuda 8TB Case: Phanteks ECLIPSE P600S PSU: Corsair RM850x

 

 

 

 

I am a gamer, not because I don't have a life, but because I choose to have many.

 


It's more than just the surrounding components; you actually can damage the GPU/CPU itself with overclocking. It comes down to what you're bumping up, power for instance, and then there are the thermals, where you don't always have room to play around. Nvidia knows what their GPUs can do, yes. BUT it is also up to the manufacturer to DESIGN the laptop. How about you slam a 980M into a Dell XPS 13 Touch and then overclock it? What do you think will happen, even if you somehow manage to fit it in? Is it Nvidia's fault you can't get that GPU in there? No, technically it is Dell's, since the machine wasn't designed for it. Likewise, if you have only 100 W to play with and your system is already using 98 W, what happens when you bump it up to 110 W? The heat story is the same: everything in a laptop is designed for a specific purpose, power consumption, thermal dissipation with and without fans, and so on.

 

 

 

Because fanboys will use anything to attack, lie about, and degrade a company.

 

Yeah, I haven't done it myself, but I know people that have been in that situation and were wondering why Nvidia would allow it in the first place.

Tell that to my friend running an overclocked GTX 860M and i7. Both are kept well within thermal limits, and both have been fine for over a year.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I do agree with you on vendors working closely on software designed only for their hardware, as it has promoted (and will continue to promote) an unhealthy relationship with developers. What I would much rather see is Nvidia or AMD getting into the business of releasing software that improves the experience for all gamers, regardless of which graphics vendor they chose. They can make money from licensing the software; there's no point in making it exclusive. The problem is that Nvidia and AMD absolutely refuse to use anything made by each other. We've seen this before with CUDA, Mantle, etc., and we will see it for as long as they both exist.

 

As long as this unhealthy relationship exists, there will always be a perceived bias from the community. When someone using an AMD graphics card experiences a hiccup in a game that shows "Nvidia: The Way It's Meant to Be Played" on a title card, that will be their prime target for blame. The same goes the other way around.

 

At the end of the day, it all comes down to business. Nvidia wants to inflate the importance of their GPUs, and AMD wants to do the same. They will do whatever it takes to entice people to purchase their hardware over a competitor's. If that means making exclusive software featured only on their hardware, one cannot blame them for it. People often claim AMD is devoid of guilt because they offered Mantle to Nvidia for free and because it's open source, but using it could have put Nvidia at AMD's mercy, and I am certain everyone understands that. No company wants to rely on a competitor unless it is absolutely necessary, and even then, they hate it.

 

Situations like these tend to suck for the consumer, because they feel they are being stripped of an experience based on a "wrong" choice. From what I am reading, it seems AMD and CDPR are working on making sure everyone gets a great experience with it, so let's hope they figure something out. From what I can already see, there are workarounds that alleviate most of the issues. I am going to stop this long wall of text here, because I forgot exactly where I was going with it.

 

Snap into a Slim Jim.

 

Good points. I personally believe that proprietary tech actually drives innovation further than open source does. While open source leads to companies pushing hardware for faster and better performance, innovation and options come from proprietary tech. Because of proprietary tech we have AMD trying to compete with G-Sync, and now we have the option of FreeSync. After a while the better tech will win out and become the standard. However, if Nvidia had made G-Sync open source, AMD would have had no motivation to invest valuable resources into FreeSync, and thus there would be no options and no new development or research.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Good points. I personally believe that proprietary tech actually drives innovation further than open source does. While open source leads to companies pushing hardware for faster and better performance, innovation and options come from proprietary tech. Because of proprietary tech we have AMD trying to compete with G-Sync, and now we have the option of FreeSync. After a while the better tech will win out and become the standard. However, if Nvidia had made G-Sync open source, AMD would have had no motivation to invest valuable resources into FreeSync, and thus there would be no options and no new development or research.

 

There are instances where that holds and instances where the opposite holds, and it starts to get iffy in the face of competing standards, open or not, or when subjective valuation or non-factual, tenuously connected information skews what people buy.

 

Proprietary tech spurs competition, but open tech does as well, and where proprietary tech may push parallel investments, open tech allows for parallel production and development. The daguerreotype did not corner the market as an open technology, but it did spur advancement and derivative works; VHS did beat out Betamax, but many would say that was a case of the inferior tech winning. Neither model trumps the other; both have existed side by side throughout history. I could go into philosophical or ethical differences and examples, but that doesn't matter for this discussion. When you get down to it, both have their pros and cons, but from the standpoint of the consumer, i.e. us, neither is better in and of itself. The real question is how we get the best bang for our buck, the most value out of the purchases and investments we make. And while we may all have different opinions on which model we prefer, we can all agree that we want the best possible performance and longevity for the money we spend, and that we want our previous investments to keep some form of value as the marketplace shifts over time. The real arguments and antipathy stem from the feeling of malinvestment, of contrived limitations and stumbling blocks put between us and our purchases by the powers that be. On the hardware, software, and regulatory sides there are many players pushing things back and forth, and in the end the problems WE have stem from those machinations affecting US and our purchases, preventing us from getting the value we wanted from our investments.

 

There's a lot of economics we could get into on this subject, but in the end it's a question of the money we earn being spent on items we hope to get a certain amount of enjoyment from, with many factors over time extending or paring back that return on our investment.


AMD did a livestream earlier in response to some of the criticism directed at them, along with more information on why they think GameWorks messed up their plans big time:

 

http://www.twitch.tv/amd/v/5334646

 

(Starts at 9:20)


AMD did a livestream earlier in response to some of the criticism directed at them, along with more information on why they think GameWorks messed up their plans big time:

 

http://www.twitch.tv/amd/v/5334646

 

(Starts at 9:20)

Good watch.

16:30 also talks about the tessellation issues the game has:

"Why use 64x tessellation when 16-32x has no distinguishable difference, and in fact some pixels are already carrying multiple polygons and have no way to convey that extra detail?"

 

To differentiate their product stacks between Kepler and Maxwell, I guess... the 780 Ti would no doubt be neck and neck with Maxwell's 970 if the game were "optimized/balanced/levelled out equally" as far as tessellation goes.
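Some quick arithmetic makes that point concrete. This is not HairWorks' actual pipeline; I'm just assuming, for illustration, that the tessellation factor is the number of segments each strand gets split into, and picking hypothetical values for on-screen strand length and strand count.

# Rough arithmetic, not HairWorks' real pipeline: assume the tessellation
# factor is simply the number of segments per hair strand. All numbers here
# are made up for illustration.

STRAND_PIXELS = 40           # hypothetical on-screen length of one fur strand
STRANDS_ON_SCREEN = 200_000  # hypothetical number of visible strands

for factor in (8, 16, 32, 64):
    px_per_segment = STRAND_PIXELS / factor
    vertices = STRANDS_ON_SCREEN * (factor + 1)
    print(f"{factor:>2}x: {px_per_segment:.1f} px per segment, "
          f"~{vertices / 1e6:.1f} M strand vertices")

#  8x: 5.0 px per segment, ~1.8 M strand vertices
# 16x: 2.5 px per segment, ~3.4 M strand vertices
# 32x: 1.2 px per segment, ~6.6 M strand vertices
# 64x: 0.6 px per segment, ~13.0 M strand vertices  <- sub-pixel segments

On these made-up numbers, 64x roughly quadruples the geometry relative to 16x while the extra segments land below a pixel each, which is why forcing the factor down in the driver recovers most of the performance with little visible change.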

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


This topic is now closed to further replies.