
AMD FreeSync VS Nvidia G-Sync (Tom's Hardware)

Rekx

Heyyo,

That makes absolutely no sense to me... they had ways of setting AA on HairWorks from the beginning, but not tessellation? You see how that doesn't make sense, right? I think it was patch 1.7 that added a slider for "NVIDIA HairWorks Preset", which is probably the tessellation slider.


Of course, I haven't bought a COD game in years lol, so I have no idea how COD: Ghosts handles HairWorks either, but maybe someone can give some insight into that???

Sadly I have no idea how to code beyond basic scripting from GTA III... so tbh I have no idea how easy or hard it is to implement and tweak.

After reading over what frame rate control is? That's the same as using RivaTuner Statistics Server and setting a max fps in that... NVIDIA has it built into their driver as well, but it's a hidden tweak that can be accessed via NVIDIA Inspector. It's still a decent idea but it's nothing new tbh. I found that using a framerate cap of 30fps on Final Fantasy XIII and XIII-2 made the games a lot more playable, as they wouldn't constantly flip between slower and faster animations in combat from frametime variance. I even plopped down a guide to help those with the same issue as me.

http://steamcommunity.com/sharedfiles/filedetails/?id=388731782
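For anyone wondering what a frame cap actually does, here's a minimal Python sketch of the general idea (an assumption about how RTSS-style limiters behave, not their actual implementation): any frame that finishes early gets padded out, so frame times stay constant instead of bouncing around.

```python
# Sketch of an external fps limiter: pad short frames so every frame takes the
# same amount of wall-clock time. Real limiters hook the game's present call;
# this is only meant to show why a 30 fps cap smooths out frametime variance.
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def render_frame():
    pass  # stand-in for the game's actual per-frame work

for _ in range(300):  # run roughly 10 seconds at 30 fps
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)  # sleep away the leftover budget
```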

True, NVIDIA could very well adopt AMD's FreeSync... but ultimately? It's their choice not to. In the end? A better solution would have been for VESA to make an industry standard and publish it themselves, instead of what we have now of "here's the basics of how it works, go make your own solution with it", which has caused this splintering...

That's my hope and dream too... more performance out of my i7-3770K would be handy! :)

No doubt. Mainly? DirectX 9 support from game devs needs to really die or become an afterthought or something lol. I seriously don't know anyone on Vista or XP that games... their PCs are essentially social media machines running Google Chrome or Firefox with Adblock Plus and malware blocking, to keep me from having to fix the things so darn much. :P

World of Tanks is definitely the prime example of how their BigWorld engine needs a port from DirectX 9 to DirectX 11 or heck! skip that entirely and go to DirectX 12! That game used to drive me nuts! :(

My PC is linked in my sig for specs...

[screenshot: World of Tanks performance on my system]

so yeah... I've got a beefy enough PC, with even a dedicated SSD for World of Tanks... 48.4 fps... why? The BigWorld engine (aka the BugWorld engine) is single-threaded... even though quad-core CPUs have been out for almost a decade (the Intel Core 2 Quad Q6600 came out in early 2007).

Both GPUs are at 50% utilization and 60c and 1.3GB VRAM used...

about 4GB of RAM used on the game...

CPU? Four cores on IDLE, one at 44% (physics is my guess), one at 23% (FMOD sound system), one at 13% (Windows OS) and one at a high load of 85%... thanks, Wargaming. Fix your damn game engine already... BugWorld... you annoy me lol.

The game doesn't have multi-threading... it has "idle core detection", which shifts the computing to the most idle thread of the bunch... oh yay. :P
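Out of curiosity, here's a rough Python sketch of what that kind of "idle core detection" could amount to, versus real multi-threading (purely my guess at the concept using psutil, not how BigWorld actually implements it): the one busy game thread just gets moved onto whichever core is currently least loaded.

```python
# Illustrative only: pick the least-loaded core and pin the process to it.
# This doesn't split the work across cores, it just relocates the single
# heavy thread - which is the complaint being made above.
import psutil

def move_to_most_idle_core() -> int:
    loads = psutil.cpu_percent(interval=0.5, percpu=True)  # sample per-core load
    most_idle = min(range(len(loads)), key=lambda i: loads[i])
    psutil.Process().cpu_affinity([most_idle])  # Windows/Linux only
    return most_idle

if __name__ == "__main__":
    print("Pinned to core", move_to_most_idle_core())
```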

To their credit though, in patch 9.9 they revamped the lighting and shadow shaders and implemented a REALLY good version of anti-aliasing: Temporal Super-Sampling Anti-Aliasing (TSSAA). It's definitely one of the best versions of AA I've seen yet. :)

So it is a lot more optimized now than when I took that screenshot, back in version 9.5 I think it was.

And I thought War Thunder had bad performance.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


The kinds of setups you're talking about are most definitely in the "prosumer" category. At that point, you've left the regular consumer market and stepped into the prosumer/high-end enthusiast market.

But to say they aren't for gamers/gaming is a little off, considering there are gamers out there, and situations for gaming, that would require you to run 2- and 3-way SLI/CrossFire. I do agree they are the prosumer chips, but I would argue they aren't de facto workstation chips.


I am surprised nobody has posted this yet:

[attached chart: FreeSync vs G-Sync blind test results]

 

These are the results from the blind tests.

60% says G-Sync was better.

21% says FreeSync was better.

19% says they were the same.

 

I don't think the results here matter that much though, since variable refresh rate seems to be a few years away from becoming common, and by then we will probably have far better monitors with variable refresh rate on the market (at least on the AMD side). Hopefully by then we will have monitors capable of variable refresh rate on both AMD and Nvidia hardware. Right now the situation is pretty crap for consumers when you have to pick a side.



I am surprised nobody has posted this yet:


 

These are the results from the blind tests.

60% says G-Sync was better.

21% says FreeSync was better.

19% says they were the same.

 

I don't think the results here matter that much though, since variable refresh rate seems to be a few years away from becoming common, and by then we will probably have far better monitors with variable refresh rate on the market (at least on the AMD side). Hopefully by then we will have monitors capable of variable refresh rate on both AMD and Nvidia hardware. Right now the situation is pretty crap for consumers when you have to pick a side.

 

I thought about it, but I didn't want to have a five page discussion with someone who didn't like the outcome.

 

As far as results go, they are a good, solid result for a preliminary study.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


That makes absolutely no sense to me... they had ways of setting AA on HairWorks from the beginning, but not tessellation? You see how that doesn't make sense, right? I think it was patch 1.7 that added a slider for "NVIDIA HairWorks Preset", which is probably the tessellation slider.

Of course, I haven't bought a COD game in years lol, so I have no idea how COD: Ghosts handles HairWorks either, but maybe someone can give some insight into that???

Sadly I have no idea how to code beyond basic scripting from GTA III... so tbh I have no idea how easy or hard it is to implement and tweak.

After reading over what frame rate control is? That's the same as using RivaTuner Statistics Server and setting a max fps in that... NVIDIA has it built into their driver as well, but it's a hidden tweak that can be accessed via NVIDIA Inspector. It's still a decent idea but it's nothing new tbh. I found that using a framerate cap of 30fps on Final Fantasy XIII and XIII-2 made the games a lot more playable, as they wouldn't constantly flip between slower and faster animations in combat from frametime variance. I even plopped down a guide to help those with the same issue as me.

http://steamcommunity.com/sharedfiles/filedetails/?id=388731782

True, NVIDIA could very well adopt AMD's FreeSync... but ultimately? It's their choice not to. In the end? A better solution would have been for VESA to make an industry standard and publish it themselves, instead of what we have now of "here's the basics of how it works, go make your own solution with it", which has caused this splintering...

 

Well, we know devs can get certain degrees of GameWorks access. The lowest one is basic access, where GameWorks effects are just black-boxed DLLs. Those effects are essentially shaders called as functions; you call the effect with certain parameters, like colour, anchor points, etc. CDPR has publicly stated that they cannot optimize the effect, which implies that they only have basic access to GameWorks. If so, it is very easy to assume they cannot set the tessellation factor themselves. I believe we have not seen such a setting in any HairWorks game.

A preset is always a predefined setting covering several smaller settings. We know HairWorks has 3 main settings (Off/Geralt Only/All) and (I guess) 4 AA settings (Off/2x/4x/8x). The presets seem to be combinations of only those two. Otherwise we should have seen a dedicated tessellation setting after all (remember that the patch only added a GUI for settings that already existed in the INI files).
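To illustrate the "preset = bundle of smaller settings" idea, here's a hypothetical Python sketch; the preset names, keys and values are made up for illustration based on the settings listed above, not the game's real INI entries.

```python
# Hypothetical mapping from a "HairWorks Preset" to the two underlying settings.
# Note there is no tessellation factor anywhere - which is the point being made.
HAIRWORKS_PRESETS = {
    "Low":    {"hairworks": "Geralt Only", "hairworks_aa": 0},
    "Medium": {"hairworks": "All",         "hairworks_aa": 2},
    "High":   {"hairworks": "All",         "hairworks_aa": 4},
    "Ultra":  {"hairworks": "All",         "hairworks_aa": 8},
}

def apply_preset(name: str) -> dict:
    """Return the individual settings a preset would write back to the INI."""
    return HAIRWORKS_PRESETS[name]

print(apply_preset("High"))  # {'hairworks': 'All', 'hairworks_aa': 4}
```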

 

No one cares about COD anyways :lol:

 

I believe there are two more levels to GameWorks: access to the source code, and a license to directly change that code. But I don't remember exactly, so feel free to correct me (I believe it was in the No BS Podcast with Nvidia).

 

Yeah, I doubt it is anything revolutionary, but it makes sense to have it and it should be used as standard. I always use it, because getting 5000+ fps in loading screens is dumb and bad for the capacitors on your GPU/PSU. The problem is that G-Sync will never go above the max VRR window, but forces VSync internally in the monitor (so it throws away frames, afaik). (Edit: I forgot that Nvidia changed the driver settings to allow a fluid framerate that can exceed the max VRR, but Nvidia had VSync on for these tests.) In this test certain VSync settings were applied on the G-Sync system (in BF4), but not on the AMD system.

I generally like the testing done by Tom's Hardware, and they did a really nice job with it, almost scientific. I LOVE the constructive criticism they give themselves, and that they gave AMD the chance to give their two cents. I think everyone should read the last parts of this test, as it brings up some good points. After all, any game going above 90 fps would give tearing on the FreeSync system. Anyone with such a system should use frame rate control to stay below 90 fps (although it's a genuine critique that FRC only works on DX10+, so it's no good on Valve titles, for instance).
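As a quick illustration of that last point, here's a minimal sketch (my assumption, not AMD's actual driver logic) of how you'd pick a frame rate cap just under a monitor's VRR ceiling so the game never leaves the FreeSync window:

```python
# Pick an FRC target a few fps under the monitor's max VRR so frames never
# exceed the FreeSync window and start tearing.
def freesync_frame_cap(vrr_min_hz: int, vrr_max_hz: int, margin_fps: int = 3) -> int:
    cap = vrr_max_hz - margin_fps
    # Don't cap below the bottom of the window, or you trade tearing for judder.
    return max(cap, vrr_min_hz)

print(freesync_frame_cap(40, 90))   # 87 (the ~90 Hz ceiling mentioned above; the 40 Hz floor is an assumption)
print(freesync_frame_cap(30, 144))  # 141
```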

 

FreeSync is AMD's proprietary driver that utilizes Adaptive-Sync. Adaptive-Sync is the hardware standard adopted by VESA, so they did what you're asking of them. All Nvidia needs to do is write a driver to utilize this hardware standard, something that seems to be fairly easy for Nvidia, as they made a laptop driver that did just that.

 

Unless you want to run SLI or CrossFire at either a) higher than 8x/8x or b) 3-way.

 

No gamer should ever use more than 2 GPUs, as the gain diminishes dramatically above 2 cards. Even 2 cards are not useful in a lot of games. Also, I'm not sure any single-GPU card is able to utilize more than 8x PCIe 3.0. Either way, such a setup is still prosumer/professional (workstation). Again, most people only have dual cores, and a tiny portion has more than 4 cores (most of which are probably some sort of AMD system, I assume).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


The question was whether they were fans. That's basically the same kind of thing as fanboy, the latter's just the extreme version where people go internet crusading for their team.

Yep.



Heyyo,

And I thought War Thunder had bad performance.

Lol fuck no. War Thunder runs great and also pushes a lot more detail to the screen. DirectX 11 is optimized very nicely in War Thunder... whereas World of Tanks, running on the BugWorld engine, really needs to ditch DirectX 9. :P

 

These are the results from the blind tests.

60% says G-Sync was better.

21% says FreeSync was better.

19% says they were the same.

 

I don't think the results here matter that much though, since variable refresh rate seems to be a few years away from becoming common, and by then we will probably have far better monitors with variable refresh rate on the market (at least on the AMD side). Hopefully by then we will have monitors capable of variable refresh rate on both AMD and Nvidia hardware. Right now the situation is pretty crap for consumers when you have to pick a side.

That is true as well. I keep thinking about upgrading, but I know if I upgrade in the future? I'll want a 2560x1440 IPS panel that is 27" or so with a 144Hz refresh rate and a form of adaptive sync... but I'm probably another year or two off from that. I'll probably wait and see how NVIDIA's Pascal works out or what AMD's HBM2-based GPUs are like... heck, by then? Maybe the AMD Fury air/X will drop in price, same with the GTX 980 Ti... and as you said Lawlz, the selection of adaptive-sync monitors should improve too.

That's the main thing holding me back tbh: cost, and of course the limitations of certain monitors. Heck, so far in life? I've been happy even with 60Hz VSync... I might eventually just move up to a 144Hz monitor regardless of adaptive sync and keep VSync on, depending on pricing. :P

 

Well, we know devs can get certain degrees of GameWorks access. The lowest one is basic access, where GameWorks effects are just black-boxed DLLs. Those effects are essentially shaders called as functions; you call the effect with certain parameters, like colour, anchor points, etc. CDPR has publicly stated that they cannot optimize the effect, which implies that they only have basic access to GameWorks. If so, it is very easy to assume they cannot set the tessellation factor themselves. I believe we have not seen such a setting in any HairWorks game.

A preset is always a predefined setting covering several smaller settings. We know HairWorks has 3 main settings (Off/Geralt Only/All) and (I guess) 4 AA settings (Off/2x/4x/8x). The presets seem to be combinations of only those two. Otherwise we should have seen a dedicated tessellation setting after all (remember that the patch only added a GUI for settings that already existed in the INI files).

After more searching? "NVIDIA HairWorks Preset" is indeed the in-game setting for tessellation... so CD Projekt Red can make up all the excuses they want... but just like Warner Bros and the PC version of Batman: Arkham Knight's performance in general? I'm sure they knew how terrible the performance was with NVIDIA HairWorks in The Witcher 3, and yet they still released the game like that? Do you really think their game testers would have missed NVIDIA HairWorks' terrible performance issues? If so? Then they should have fired those guys lol...

Heck! Look at how they didn't even mention they scaled back the graphics in The Witcher 3 compared to all the demos they put out! Same shit Ubisoft did with Watch_Dogs... I regret not demanding a refund on Watch_Dogs... $60 I'll never get back. What a freaking mess that game was. Learnt my lesson the hard way about preordering Ubisoft games...

Another prime example of "wtf, CD Projekt Red?": did you ever play The Witcher 2 when it released? It probably made Batman: Arkham Knight look like a half-decent port hahaha... insanely terrible performance woes... I remember having a lot less grief with The Witcher 1 on release, and that one was buggy and its performance wasn't very good either... Witcher 1 is still my fav in the series and I'm sure I'll be the only one here on the forums with that opinion lol, but it definitely intrigued me the most, especially storyline-wise, of the three thus far.

After thinking more about The Witcher 2? Yeah, CD Projekt Red should definitely take a closer look at their game testers... because dayum! That game was garbage at release... I especially remember the broken combat roll... the animation would start, and Geralt would lazily start turning AS he rolled instead of starting his roll in the direction you chose... so you got hit by whichever attack you were trying to dodge EVERY TIME and of course Geralt was busy trying to finish his combat roll so you couldn't even retaliate!!! HOW DID THEY MESS THAT UP!?!? :(

No one cares about COD anyways :lol:

Lol! That is true, but I'm kind of curious about HairWorks in that game and whether it also suffered from the same lack of graphical settings or not.

I believe there are two more levels to GameWorks: access to the source code, and a license to directly change that code. But I don't remember exactly, so feel free to correct me (I believe it was in the No BS Podcast with Nvidia).

Yeah, I doubt it is anything revolutionary, but it makes sense to have it and it should be used as standard. I always use it, because getting 5000+ fps in loading screens is dumb and bad for the capacitors on your GPU/PSU. The problem is that G-Sync will never go above the max VRR window, but forces VSync internally in the monitor (so it throws away frames, afaik). (Edit: I forgot that Nvidia changed the driver settings to allow a fluid framerate that can exceed the max VRR, but Nvidia had VSync on for these tests.) In this test certain VSync settings were applied on the G-Sync system (in BF4), but not on the AMD system.

I generally like the testing done by Tom's Hardware, and they did a really nice job with it, almost scientific. I LOVE the constructive criticism they give themselves, and that they gave AMD the chance to give their two cents. I think everyone should read the last parts of this test, as it brings up some good points. After all, any game going above 90 fps would give tearing on the FreeSync system. Anyone with such a system should use frame rate control to stay below 90 fps (although it's a genuine critique that FRC only works on DX10+, so it's no good on Valve titles, for instance).

I agree with this... I think AMD should even have a popup or something that says "The scaler in *insert plugged-in monitor name* only supports AMD FreeSync up to *insert Hz*. Enabling AMD FreeSync by default will cap the framerate at *insert equivalent fps* via frame rate control. Do you want the default settings applied for maximum FreeSync compatibility?" It would probably solve quite a few issues... well, unless Firefox's CEO got butt-hurt again over a default settings option... HAAYYYOOOOO!!! C'mon Firefox, tell me again how default settings on the Windows 10 upgrade aren't cool, yet you changed my default search engine to Yahoo in Firefox update 34 WITHOUT MY PERMISSION OR ANY OPTION? What a hypocrite. :P

FreeSync is AMD's proprietary driver that utilizes Adaptive-Sync. Adaptive-Sync is the hardware standard adopted by VESA, so they did what you're asking of them. All Nvidia needs to do is write a driver to utilize this hardware standard, something that seems to be fairly easy for Nvidia, as they made a laptop driver that did just that.

I think I didn't make myself quite clear on it. I'm saying? Instead of relying on hardware vendors to make their own interface? I think VESA should have made it so "you use your system only for DisplayPort Adaptive-Sync. No proprietary systems." That way? If NVIDIA really wanted G-Sync? They'd have to create their own interface... would they have gone so far as to create their own interface? Probably not... so right then and there VESA could have prevented this splinter, much like in the past with media formats, like the Microsoft-backed HD DVD players versus Sony's Blu-ray. The difference between adaptive sync and those media format wars? I doubt there will be a clear victor, and there shall forever be this dividing gap, which does suck for gamers in general... meh, I guess in the end it's better than the dirty stuff in PC desktops of the past, like the proprietary RAM in my old Compaq PC that wouldn't work with just any brand of RAM... gross.

No gamer should ever use more than 2 GPUs, as the gain diminishes dramatically above 2 cards. Even 2 cards are not useful in a lot of games. Also, I'm not sure any single-GPU card is able to utilize more than 8x PCIe 3.0. Either way, such a setup is still prosumer/professional (workstation). Again, most people only have dual cores, and a tiny portion has more than 4 cores (most of which are probably some sort of AMD system, I assume).

Now, those Steam hardware surveys for CPUs? That's how many PHYSICAL cores. So that means there's a crap-ton of people with Intel i3s and Pentium G3258s and such. Intel has been slapping out epic dual-core CPUs since Conroe back in 2006, with stuff like the E6600. My guess though? Intel i3. They have two physical cores with hyper-threading to make them act like a quad core, and it seems to work very nicely, as Intel i3s can still dish out some good gaming performance. This would also include the older AMD Athlon X2 series, but those are quite old.

So yes, technically the Intel i3 is a dual-core CPU, but it has hyper-threading, which makes it perform about the same as an AMD hexa-core... so I feel that Valve should have gone by how many threads of processing power CPUs have, as that would have made more sense. Bundling the Pentium G3258 in with the Intel i3-4160 doesn't seem like a very fair comparison. :P

Or dat frametime variance in The Witcher 3 on the G3258... stuttering a lot. Then again, a Titan X paired with a G3258 is a little bit of an unrealistic situation, but you get the idea. :P

The quad cores? That would be AMD's FX-4000 series and Athlon X4s and such, yeah, but that also means the Intel i5 and even their older Core 2 Quad series like the Q6600 from early 2007.
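To make the physical-versus-logical distinction concrete, here's a tiny Python snippet using psutil; the comment about what the Steam survey counts just reflects the claim above, not something I've verified.

```python
# Physical cores vs logical processors on the machine running this.
import psutil

physical = psutil.cpu_count(logical=False)  # what the survey reportedly counts
logical = psutil.cpu_count(logical=True)    # what a Hyper-Threaded chip exposes to games

print(f"Physical cores: {physical}, logical processors: {logical}")
# An i3-4160 would report 2 physical / 4 logical, while a Pentium G3258 would
# report 2 / 2 - the two chips being lumped together in the survey.
```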

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case


After more searching? "NVIDIA HairWorks Preset" is indeed the in-game setting for tessellation... so CD Projekt Red can make up all the excuses they want... but just like Warner Bros and the PC version of Batman: Arkham Knight's performance in general? I'm sure they knew how terrible the performance was with NVIDIA HairWorks in The Witcher 3, and yet they still released the game like that? Do you really think their game testers would have missed NVIDIA HairWorks' terrible performance issues? If so? Then they should have fired those guys lol...

Heck! Look at how they didn't even mention they scaled back the graphics in The Witcher 3 compared to all the demos they put out! Same shit Ubisoft did with Watch_Dogs... I regret not demanding a refund on Watch_Dogs... $60 I'll never get back. What a freaking mess that game was. Learnt my lesson the hard way about preordering Ubisoft games...

Lol! That is true, but I'm kind of curious about HairWorks in that game and whether it also suffered from the same lack of graphical settings or not.

 

Do you have a source on that? Maybe Nvidia got cold feet and gave them added access. I don't have the game, so I cannot confirm it myself. CDPR is one of the most consumer-friendly and open devs in the industry, so they tend to get the benefit of the doubt. Also, GameWorks effects tend to be delivered in full close or very close to launch, so it's too easy to just blame CDPR for this. Arkham Knight was a disaster for many reasons, but those projects are not comparable tbh.

 

The problem with critiquing WD and W3 is that people don't seem to understand the difference between cinematography and actual graphical fidelity. Sure, both titles have been downgraded, but not at all to the extent that people have claimed and criticized. The Witcher 3 was shown with heavy contrast and few colours, which looks nice, as contrast naturally does, but causes eye fatigue within an hour of gaming or so. CDPR did, however, publicly talk about downgrading the game: http://forums.cdprojektred.com/threads/33437-GRYOnline-pl-interview-with-CDPR-studio-lead-Adam-Badowski-translation?highlight=badowski

 

As for Witcher 2, idk, it's not really relevant. CDPR is/was a small studio, and what they have achieved is pretty impressive. W2 is massive compared to W1 from what I hear, so it takes a lot more, but at least CDPR doesn't abandon their games. They release lots of patches, and even made the W2 Enhanced Edition free for everyone who had the standard version. No one else does that kind of thing.

 

Like Witcher 3, COD: Ghosts' performance tanked, losing a third of the fps: http://www.geforce.com/whats-new/guides/call-of-duty-ghosts-graphics-and-performance-guide#9

The difference between low and high is the number of hairs, not the tessellation factor. Maybe that is the same for the Witcher 3 preset?

 

Odd that all 3 titles we've been talking about are GameWorks titles. Especially when talking about downgrading.

 

I agree with this... I think AMD should even have a popup or something that says "The scaler in *insert plugged-in monitor name* only supports AMD FreeSync up to *insert Hz*. Enabling AMD FreeSync by default will cap the framerate at *insert equivalent fps* via frame rate control. Do you want the default settings applied for maximum FreeSync compatibility?" It would probably solve quite a few issues... well, unless Firefox's CEO got butt-hurt again over a default settings option... HAAYYYOOOOO!!! C'mon Firefox, tell me again how default settings on the Windows 10 upgrade aren't cool, yet you changed my default search engine to Yahoo in Firefox update 34 WITHOUT MY PERMISSION OR ANY OPTION? What a hypocrite. :P

 

They could just set FRC as the default in FreeSync mode, but then people would whine about a lower average and max fps. People are dumb, it's incredible. The point is that some (if not all) of the comparison was a little unfair, as FreeSync would tear above 90 fps but G-Sync would not. That can be changed with a simple setting. FRC not working on DX9 titles is a very valid critique though, as well as the cap maxing out at 95 instead of 144/145.

 

I think I didn't make myself quite clear on it. I'm saying? Instead of relying on hardware vendors to make their own interface? I think VESA should have made it so "you use your system only for DisplayPort Adaptive-Sync. No proprietary systems." That way? If NVIDIA really wanted G-Sync? They'd have to create their own interface... would they have gone so far as to create their own interface? Probably not... so right then and there VESA could have prevented this splinter, much like in the past with media formats, like the Microsoft-backed HD DVD players versus Sony's Blu-ray. The difference between adaptive sync and those media format wars? I doubt there will be a clear victor, and there shall forever be this dividing gap, which does suck for gamers in general... meh, I guess in the end it's better than the dirty stuff in PC desktops of the past, like the proprietary RAM in my old Compaq PC that wouldn't work with just any brand of RAM... gross.

 

Well, Nvidia is a VESA member and has full access to the DP platform/standard. They have to live up to those standards, but what Nvidia did was go out of spec and send proprietary information over the DP connection. I'm not sure VESA can do anything about it. The only thing would be to sue Nvidia and revoke their membership, but that is not going to happen, and it would make little sense. Also bear in mind that DP can carry a USB signal, so Nvidia could just use that to send the extra G-Sync info (who knows, maybe they do).

 

AMD proposed Adaptive-Sync to VESA, which adopted it. This is the industry standard. I don't blame Nvidia for focusing on their own proprietary solution that they directly make money off of, but they are at fault for a split market and for not adopting the industry standard. Let's just hope AS becomes mandatory in DisplayPort 1.3+.

Now, those Steam hardware surveys for CPUs? That's how many PHYSICAL cores. So that means there's a crap-ton of people with Intel i3s and Pentium G3258s and such. Intel has been slapping out epic dual-core CPUs since Conroe back in 2006, with stuff like the E6600. My guess though? Intel i3. They have two physical cores with hyper-threading to make them act like a quad core, and it seems to work very nicely, as Intel i3s can still dish out some good gaming performance. This would also include the older AMD Athlon X2 series, but those are quite old.

So yes, technically the Intel i3 is a dual-core CPU, but it has hyper-threading, which makes it perform about the same as an AMD hexa-core... so I feel that Valve should have gone by how many threads of processing power CPUs have, as that would have made more sense. Bundling the Pentium G3258 in with the Intel i3-4160 doesn't seem like a very fair comparison. :P

Or dat frametime variance in The Witcher 3 on the G3258... stuttering a lot. Then again, a Titan X paired with a G3258 is a little bit of an unrealistic situation, but you get the idea. :P

The quad cores? That would be AMD's FX-4000 series and Athlon X4s and such, yeah, but that also means the Intel i5 and even their older Core 2 Quad series like the Q6600 from early 2007.

 

Pentiums do not have Hyper-Threading. i3s do, but just because an i3 has 4 logical processors does not mean it can properly run games that demand quad-core CPUs. Some can, but only because those games are not demanding and/or don't use the 3rd and 4th threads that much. If you are buying a budget PC today and you want to play games that are either out now in the AAA market or coming in the future, not getting an actual quad-core CPU would be a huge mistake. I know the Pentium can be OC'd a lot and so on, but people who bought it for gaming rigs are idiots, unless they planned an upgrade to an i5 down the line.

 

Look at this post from another thread: http://linustechtips.com/main/topic/431047-analyst-slams-amd-for-distasteful-multi-million-dollar-bonuses/page-3#entry5782736

 

A Pentium G3450 has a 9 fps minimum where the AMD 860K has a 29 fps minimum in GTA V, making the Pentium useless. Sure, an i3 is better, but are they comparable in price? Because you have to compare apples to apples when it comes to price.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I just got my G-Sync monitor two days ago. It's the Predator 1440p 144Hz IPS G-Sync. It's very nice and G-Sync is definitely worth it. The price is high, but there is no reason to upgrade a monitor and not get G-Sync; you would be stupid not to get one.

 

 

$500 AUD for a 27" 144Hz 1080p monitor, or $1000 for a 27" 144Hz 1440p G-Sync monitor (or $750 for a 24" 1080p 144Hz G-Sync). I chose the former as it is HALF THE PRICE for almost the same graphical clarity.

 

There is nowhere near enough choice, yet.

This is what I think of Pre-Ordering video games: https://www.youtube.com/watch?v=wp98SH3vW2Y


$500 AUD for a 27" 144Hz 1080p monitor, or $1000 for a 27" 144Hz 1440p G-Sync monitor (or $750 for a 24" 1080p 144Hz G-Sync). I chose the former as it is HALF THE PRICE for almost the same graphical clarity.

 

There is nowhere near enough choice, yet.

There is a pretty noticeable difference for me. I have a 1080p monitor on my left and my 2560x1440 in the centre. It's annoying to move my cursor onto the 60Hz panel after the 144Hz one, though.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


Do you have a source on that? Maybe Nvidia got cold feet and gave them added access. I don't have the game, so I cannot confirm it myself. CDPR is one of the most consumer-friendly and open devs in the industry, so they tend to get the benefit of the doubt. Also, GameWorks effects tend to be delivered in full close or very close to launch, so it's too easy to just blame CDPR for this. Arkham Knight was a disaster for many reasons, but those projects are not comparable tbh.

It is true that NVIDIA GameWorks seems to have been added with only a few months left in development, and that's probably why they didn't bother with AMD's TressFX to include the two different techs, unlike Rockstar and GTA V...

http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/2/

So why did CD Projekt Red choose to include HairWorks but not AMD's TressFX? It's entirely possible to include tech from both companies; indeed, Rockstar's recent PC release of GTA V includes tech from both AMD and Nvidia.

The basic problem is that there's an additional amount of time and cost attached to including two very different types of technology that produce largely the same effect. According to AMD's Huddy, the company "specifically asked" CD Projekt Red if it wanted to put TressFX in the game following the revelation that HairWorks was causing such a large drop in performance, but apparently the developer said "it was too late."

So with that in mind? It really seems like NVIDIA HairWorks was a rush job and they didn't test it properly, just said "fuck it" and released it the way it was... much like what I said they did with Witcher 2... but with Witcher 2? How did they not catch the terrible combat roll bugs before release? They focused so much on changing combat in The Witcher 2 from the first game... I dunno. I noticed it right away in that prologue/tutorial section when I figured out it didn't work lol. :P

I dunno, just historically? CD Projekt Red have had bad releases and had to patch heavily, hence the Enhanced Edition re-releases of their games.

http://www.pcgamer.com/the-witcher-2-review/

The new combat system is a more mixed bag. As before, the gimmick is that you use a steel sword against humans, a silver one against monsters, along with several simple magic spells to stun, burn and otherwise tip the balance in your favour. Between fights, you mix magic potions to adjust your stats in various directions, and lay down traps. Instead of pointing and selecting like before though, every attack is a direct interaction with the game: mouse-clicks for fast and slow strikes, and hotkeys to hurl magic and bombs, parry attacks and roll. This works well against one or two opponents at once, but a mix of long, non-interruptible animations and bad targeting can make fighting groups a pain.

Oddly, this is especially problematic early on, when Geralt has almost no stamina, his spells are weak, you can't block more than a couple of hits at a time, rear attacks deal 200% damage, and you can easily be obliterated by random encounters. Many early skills aren't about making Geralt a better fighter but stopping him being a crap one. This means that combat can be much harder at the start of the first chapter than anywhere else in the game, with little sense of escalation outside of specific boss fights.

LOL! I totally forgot about that too! The terrible stun-lock animation when you got hit! In a patch (looking back, I think it was v1.2) they essentially gave you, for free, the "skill" that prevents stun-lock animations on Geralt. :P

But before that "skill"? You'd get stun-locked and you couldn't do the combat roll to get away because Geralt took too long to even turn around hahaha... :P

 

The problem with critiquing WD and W3 is that people don't seem to understand the difference between cinematography and actual graphical fidelity. Sure, both titles have been downgraded, but not at all to the extent that people have claimed and criticized. The Witcher 3 was shown with heavy contrast and few colours, which looks nice, as contrast naturally does, but causes eye fatigue within an hour of gaming or so. CDPR did, however, publicly talk about downgrading the game: http://forums.cdprojektred.com/threads/33437-GRYOnline-pl-interview-with-CDPR-studio-lead-Adam-Badowski-translation?highlight=badowski

 

As for Witcher 2, idk, it's not really relevant. CDPR is/was a small studio, and what they have achieved is pretty impressive. W2 is massive compared to W1 from what I hear, so it takes a lot more, but at least CDPR doesn't abandon their games. They release lots of patches, and even made the W2 Enhanced Edition free for everyone who had the standard version. No one else does that kind of thing.

Well, the whole Enhanced Edition thing started because The Witcher 1's reviews sucked and they wanted a do-over to try to get review outlets to have a second look and potentially fix those bad review scores. The Witcher 2 didn't have nearly as many problems with release review scores (tbh that surprised me after playing it myself), and I did end up rage-quitting the game until a few patches came out... I still think they should have gotten rid of the meditation-only potion drinking and reverted back to potions in combat... which they eventually did in The Witcher 3, and to be honest? That is my favorite improvement in that game. Making alchemy useful again.

Like Witcher 3, COD: Ghosts' performance tanked, losing a third of the fps: http://www.geforce.com/whats-new/guides/call-of-duty-ghosts-graphics-and-performance-guide#9

The difference between low and high is the number of hairs, not the tessellation factor. Maybe that is the same for the Witcher 3 preset?

 

Odd that all 3 titles we've been talking about are GameWorks titles. Especially when talking about downgrading.

What about Witcher 1 and 2? They have no GameWorks at all and they were broken as fuck on release. Both by CD Projekt Red. :P

They could just set FRC as the default in FreeSync mode, but then people would whine about a lower average and max fps. People are dumb, it's incredible. The point is that some (if not all) of the comparison was a little unfair, as FreeSync would tear above 90 fps but G-Sync would not. That can be changed with a simple setting. FRC not working on DX9 titles is a very valid critique though, as well as the cap maxing out at 95 instead of 144/145.

I'm sure AMD can fix FRC for DirectX 9 though. RivaTuner Statistics Server does it on everything, so it should be a matter of back-porting or something. Then again, one of the only recent titles I can think of with DirectX 9 is World of Tanks, and they've talked about porting their game to DirectX 11... of course that comes right around the time Microsoft was talking about DirectX 12 lol. :P

 

Well, Nvidia is a VESA member and has full access to the DP platform/standard. They have to live up to those standards, but what Nvidia did was go out of spec and send proprietary information over the DP connection. I'm not sure VESA can do anything about it. The only thing would be to sue Nvidia and revoke their membership, but that is not going to happen, and it would make little sense. Also bear in mind that DP can carry a USB signal, so Nvidia could just use that to send the extra G-Sync info (who knows, maybe they do).

 

AMD proposed Adaptive-Sync to VESA, which adopted it. This is the industry standard. I don't blame Nvidia for focusing on their own proprietary solution that they directly make money off of, but they are at fault for a split market and for not adopting the industry standard. Let's just hope AS becomes mandatory in DisplayPort 1.3+.

What I'm saying is VESA should have come up with it long before NVIDIA even started teasing G-Sync. I'm sure once NVIDIA dumped so many funds into R&D for G-Sync, they thought it would be a poor move to ditch all that effort. If you think back on it? Why has it taken so long for adaptive sync to even become a thing? Relying on AMD and NVIDIA to implement their own systems is where the flaw occurred... that, or maybe Intel should have come up with it, much like they did with USB and how that became the hardware standard for device connectivity instead of the earlier mess of serial and parallel and such. Yuck.

Pentiums do not have Hyper-Threading. i3s do, but just because an i3 has 4 logical processors does not mean it can properly run games that demand quad-core CPUs. Some can, but only because those games are not demanding and/or don't use the 3rd and 4th threads that much. If you are buying a budget PC today and you want to play games that are either out now in the AAA market or coming in the future, not getting an actual quad-core CPU would be a huge mistake. I know the Pentium can be OC'd a lot and so on, but people who bought it for gaming rigs are idiots, unless they planned an upgrade to an i5 down the line.

 

Look at this post from another thread: http://linustechtips.com/main/topic/431047-analyst-slams-amd-for-distasteful-multi-million-dollar-bonuses/page-3#entry5782736

 

A Pentium G3450 has a 9 fps minimum where the AMD 860K has a 29 fps minimum in GTA V, making the Pentium useless. Sure, an i3 is better, but are they comparable in price? Because you have to compare apples to apples when it comes to price.

Yeah that's true, but I'm saying that throwing the i3 series into the same CPU group as the Pentium dual cores is an unfair assessment. The i3-4160 matches an overclocked AMD FX-9590... but then again, who knows what patch version they were running in that test. I'm sure with the latest patches an AMD FX-9590 could beat a dual-core, hyper-threaded i3-4160. I'm pretty sure Digital Foundry even did some benchmarks, which can be found on YouTube, where an FX-8350 was keeping up with i5s or even beating them in certain games that were properly multi-threaded. I found GTA V used all eight logical processors without issue on my Intel i7-3770K.

I dunno, maybe AMD could have also set up some kind of bronze/silver/gold standard for FreeSync too. I find it misleading that the FreeSync range on monitors isn't openly advertised by the manufacturers of these monitors, and depending on the range of VRR support they could have a badge system in place. Yeah, FreeSync isn't a monetized system and AMD put it out there for others to use, but they should have some kind of optional certification or something to make it clearer for the end user... kind of like 80 Plus certification on PSUs: a voluntary certification for efficiency. At least then the public would have more confidence in their purchase choices.

I myself don't really favor NVIDIA over AMD, I'm just saying this situation could have played out differently and heck, it potentially still could... but I don't see NVIDIA G-Sync going anywhere, and if anything? Depending on how scalers improve to let FreeSync monitors hit the full VRR range like NVIDIA's G-Sync? The price of higher-performance AMD FreeSync monitors could potentially match G-Sync's costs, and then it'll come down to a matter of GPU preference.

I still keep thinking about that VisionTek AMD Fury X I saw on Amazon Canada for $700 CAD, and pairing it with, heck, a half-decent 2560x1440 TN FreeSync monitor (I've been using TN all my life so I don't overly care about the cost markup for IPS tbh lol, I care more about aliasing), and even capping the framerate to 5 fps below the VRR maximum to play it smoothly. I'm sure anything over 60 fps would look great, and I don't think running my GPU at max load all the time is what I want to do, as my room heats up as it is with my current multi-GPU setup hitting 75ºC on my reference GTX 680 coolers... oh damn, just checked it meow and they're showing as out of stock... oh well. Maybe next year once HBM2 comes out and the price of the Fury X drops, I might reconsider it.

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case


What I'm saying is VESA should have come up with it long before NVIDIA even started teasing G-Sync. I'm sure once NVIDIA dumped so many funds into R&D for G-Sync, they thought it would be a poor move to ditch all that effort. If you think back on it? Why has it taken so long for adaptive sync to even become a thing? Relying on AMD and NVIDIA to implement their own systems is where the flaw occurred... that, or maybe Intel should have come up with it, much like they did with USB and how that became the hardware standard for device connectivity instead of the earlier mess of serial and parallel and such. Yuck.

 

 

It seems like a pretty easy fix now, but it wasn't an entirely trivial problem. And it probably wasn't ever going to be addressed unless someone saw an opportunity to make money off it. Nvidia did. It's unfortunate that their solution is completely proprietary, but it's what it took to convince them they could make money off the investment. I just wish they'd give it a certain period of time, then either embrace Adaptive-Sync or somehow make G-Sync an open standard. Locking us all into either "AMD-optimized" or "Nvidia-optimized" monitors from now on is just sad.


It is true that NVIDIA GameWorks seems to have been added with only a few months left in development, and that's probably why they didn't bother with AMD's TressFX to include the two different techs, unlike Rockstar and GTA V...

http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/2/

So with that in mind? It really seems like NVIDIA HairWorks was a rush job and they didn't test it properly, just said "fuck it" and released it the way it was... much like what I said they did with Witcher 2... but with Witcher 2? How did they not catch the terrible combat roll bugs before release? They focused so much on changing combat in The Witcher 2 from the first game... I dunno. I noticed it right away in that prologue/tutorial section when I figured out it didn't work lol. :P

I dunno, just historically? CD Projekt Red have had bad releases and had to patch heavily, hence the Enhanced Edition re-releases of their games.

http://www.pcgamer.com/the-witcher-2-review/

 

You have to bear in mind that Witcher 3 is an Nvidia-sponsored title, meaning they probably have a contract dictating the usage of GameWorks (nothing wrong with that per se). Considering W3 was already delayed half a year (as was Watch Dogs, interestingly), it's simply not realistic to believe the game would be delayed even more for full testing of GameWorks effects. After all, they do work as intended, just poorly optimized, something it still seems CDPR was not able/allowed to do anything about (basic GameWorks access).

 

Still don't know anything about Witcher 2, but afaik that was an NVidia title as well. But CDPR was still a small developer back then, so them making errors can be forgiven, especially considering their excellent patch work, and free enhanced edition upgrades. Witcher 1 and 2 might have sucked at release but they don't seem to anymore.

 

 

I'm sure AMD can fix FRC for DirectX 9 though. RivaTuner Statistics Server does it on everything, so it should be a matter of back-porting or something. Then again, one of the only recent titles I can think of with DirectX 9 is World of Tanks, and they've talked about porting their game to DirectX 11... of course that comes right around the time Microsoft was talking about DirectX 12 lol. :P

 

Maybe they can, but honestly I doubt AMD will waste resources on an obsolete API like DX9. I mean, we are 3 iterations past it now with DX12.

Not all games will be DX12, as it requires a lot of hard work to code closer to the hardware on a lower-level API like DX12. That is why Microsoft released DX11.3 too, for developers who still prefer a higher-level API.

 

What I'm saying is VESA should have come up with it long before NVIDIA even started teasing G-Sync. I'm sure once NVIDIA dumped so many funds into R&D for G-Sync, they thought it would be a poor move to ditch all that effort. If you think back on it? Why has it taken so long for adaptive sync to even become a thing? Relying on AMD and NVIDIA to implement their own systems is where the flaw occurred... that, or maybe Intel should have come up with it, much like they did with USB and how that became the hardware standard for device connectivity instead of the earlier mess of serial and parallel and such. Yuck.

 

Remember that most new technologies made in VESA are technologies proposed by its members, like AMD. Nvidia innovated synced refresh rates to begin with (and kudos to Nvidia for that), but since they chose to keep it to themselves as a proprietary, locked technology, there really wasn't anything VESA could do. They did what they could by adopting AMD's version, which AMD proposed to VESA.

 

Remember that Adaptive-Sync IS the industry standard; it just needs to be implemented in hardware. That is all there is to it. Controlling it is done via drivers that instruct the GPU to send the correct signals and commands. AMD's FreeSync is no more proprietary than AMD's DisplayPort driver.

 

Yeah that's true, but I'm saying that throwing the i3 series into the same CPU group as the Pentium dual cores is an unfair assessment. The i3-4160 matches an overclocked AMD FX-9590... but then again, who knows what patch version they were running in that test. I'm sure with the latest patches an AMD FX-9590 could beat a dual-core, hyper-threaded i3-4160. I'm pretty sure Digital Foundry even did some benchmarks, which can be found on YouTube, where an FX-8350 was keeping up with i5s or even beating them in certain games that were properly multi-threaded. I found GTA V used all eight logical processors without issue on my Intel i7-3770K.

I dunno, maybe AMD could have also set up some kind of bronze/silver/gold standard for FreeSync too. I find it misleading that the FreeSync range on monitors isn't openly advertised by the manufacturers of these monitors, and depending on the range of VRR support they could have a badge system in place. Yeah, FreeSync isn't a monetized system and AMD put it out there for others to use, but they should have some kind of optional certification or something to make it clearer for the end user... kind of like 80 Plus certification on PSUs: a voluntary certification for efficiency. At least then the public would have more confidence in their purchase choices.

 

Yeah but I wasn't the one who did that. All I'm saying is that the Pentium is useless as a gaming chip.

Not sure if the i3 performance is due to GTA V preferring Intel CPUs or what. It's just a single game, so people have a tendency to overreact, I guess.

 

AMD does have a certification program, which allows monitor vendors to put the FreeSync logo on the box. This is to prevent a monitor vendor from making an Adaptive-Sync monitor with a VRR window of like 4 Hz, say 40-44 Hz, which would be useless. I think the problem will fix itself as the monitor controller vendors make better controllers/scalers with better AS support.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Heyyo,

 

It seems like a pretty easy fix now, but it wasn't an entirely trivial problem. And it probably wasn't ever going to be addressed unless someone saw an opportunity to make money off it. Nvidia did. It's unfortunate that their solution is completely proprietary, but it's what it took to convince them they could make money off the investment. I just wish they'd give it a certain period of time, then either embrace Adaptive-Sync or somehow make G-Sync an open standard. Locking us all into either "AMD-optimized" or "Nvidia-optimized" monitors from now on is just sad.

Meh, even if it was proprietary? It's true too that NVIDIA could have licensed it out, with royalties of course, much like Intel does with the x86 architecture for AMD's CPUs... and then Intel promptly blocked NVIDIA from making x86 CPUs, which is a damn shame, as that could have led to an even more interesting CPU market. We've all seen how NVIDIA's Tegra SoCs are badass running on ARM with the NVIDIA Shield series of products... it makes you truly wonder how an NVIDIA x86 CPU (with an integrated GPU, of course) could have fared...

@Notional, another thing I don't like is how AMD was blaming the tessellation settings months before the game released, yet users figured out that lowering tessellation via Catalyst Control Centre fixed it? Why couldn't AMD just put out a driver that did that automatically on specific GPUs?

http://www.forbes.com/sites/jasonevangelho/2015/05/21/amd-is-wrong-about-the-witcher-3-and-nvidias-hairworks/

 

Let’s assume Huddy’s claim of working with the developer “from the beginning” is true. The Witcher 3 was announced February 2013. Was 2+ years not long enough to approach CD Projekt Red with the possibility of implementing TressFX? Let’s assume AMD somehow wasn’t brought into the loop until as late as Gamescom 2014 in August. Is 9 months not enough time to properly optimize HairWorks for their hardware? (Apparently Reddit user “FriedBongWater” only needed 48 hours after the game’s release to publish a workaround enabling better performance of HairWorks on AMD hardware, so there’s that.)

Of course AMD's reply here:

http://www.twitch.tv/amd/v/5335751

Right around 11 minutes in, they even talk about lowering the tessellation in the CCC! I dunno, instead of blaming NVIDIA for sabotaging AMD GPUs, I really think they should have put out a fix or an announcement either on release day or before, like "hey, the default x64 tessellation setting in The Witcher 3 is fucked, so we recommend forcing x16 via the CCC", instead of what actually happened, with them all "the problem is NVIDIA didn't release the source code" or "offer them the option to go into their CCC and set their own tessellation settings"... But if that's the case? Why didn't they even tell people about that option instead of waiting for them to realize it's fucked from the get-go? It honestly sounds like AMD was more focused on trying to convince them to put in TressFX two months before the game came out than on bothering to make a fix or announce a way for AMD users to enjoy HairWorks... show them the option, kind of thing. At this point it honestly sounds more like a publicity stunt to try and mar the image of NVIDIA's GameWorks than an effort to fix the issue *grumbles*.
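Just to make concrete what "a driver that did that automatically" could mean, here's a hypothetical sketch of a per-game tessellation override profile; the executable name, the format, and the 16x value (taken from the community workaround discussed above) are illustrative, not AMD's actual Catalyst profile system.

```python
# Hypothetical per-application tessellation clamp, the kind of profile a driver
# could ship so users don't have to set the override in CCC themselves.
TESSELLATION_OVERRIDES = {
    "witcher3.exe": 16,  # HairWorks asks for 64x by default; the workaround caps it at 16x
}

def clamp_tessellation(exe_name: str, requested_factor: int) -> int:
    """Return the tessellation factor the driver would actually allow."""
    cap = TESSELLATION_OVERRIDES.get(exe_name)
    return min(requested_factor, cap) if cap is not None else requested_factor

print(clamp_tessellation("witcher3.exe", 64))   # 16
print(clamp_tessellation("othergame.exe", 64))  # 64 (no override)
```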

Another thing from the Twitch link? Robert Hallock even says, around 12 minutes in, that GTA V was fine due to there being no exclusive partnership... but NVIDIA never barred CD Projekt Red from working with AMD... so to me it sounds like Rockstar did a better job of working with hardware vendors on their game than CD Projekt Red did.

... but that whole optimization thing makes me think of id Software's RAGE and the broken ATI drivers at its release.

http://kotaku.com/5847761/why-was-the-pc-launch-of-rage-such-a-cluster

 

"We knew that all older AMD drivers, and some Nvidia drivers would have problems with the game, but we were running well in-house on all of our test systems. When launch day came around and the wrong driver got released, half of our PC customers got a product that basically didn't work. The fact that the working driver has incompatibilities with other titles doesn't help either. Issues with older / lower end /exotic setups are to be expected on a PC release, but we were not happy with the experience on what should be prime platforms."

So AMD knew months beforehand that older drivers wouldn't work with RAGE, yet didn't get a correct driver out in time? I ran RAGE with no issues on my GTX 460 on release day with the latest drivers; the latest drivers from ATI? What you see in the screenshot in that Kotaku article. My two friends with ATI cards (one with a CrossFire setup) couldn't get a half-decent working game at release and had to wait for ATI to put out drivers.

Of course there was the whole Project CARS and PhysX argument too, with claims that AMD didn't even bother talking with Slightly Mad Studios... and only after the game came out did AMD finally say they're working with them:

http://steamcommunity.com/app/234630/discussions/0/613957600528900678/

https://twitter.com/amd_roy/status/596361439016685569

So who do you believe, AMD or the game developer? What about the Skyrim issues on AMD GPUs at release too, with broken Crossfire drivers for four months?

I dunno, to me it sounds so-so-so much like game developers having issues with hardware vendors... but it mainly just seems to be AMD, and I'm sure a massive part of that is their smaller driver development team. I just seriously don't like the way AMD handled the whole Witcher 3 situation. I don't think CD Projekt Red has made any statement on this, but it would be interesting to hear their take... my guess? They probably don't want to say anything and potentially burn bridges, since AMD and NVIDIA are the only two choices for PC gaming hardware, and they don't want to risk alienating their audience, which is us, the consumers.

Anywho? I'd better not derail this thread any more. I don't think it's healthy that we spun it off into a discussion of GameWorks. I'm not gonna reply to any more of that since I was more interested in talking about FreeSync and G-Sync anyways. :P

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case


Heyyo,

 

Meh, even if it was proprietary? It's true that NVIDIA could have licensed it out, with royalties of course, much like Intel licenses the x86 architecture to AMD... though Intel also promptly blocked NVIDIA from making x86 CPUs, which is a damn shame, as that could have led to an even more interesting CPU market. We've all seen how badass NVIDIA's Tegra SoCs are running on ARM in the NVIDIA Shield series of products... it makes you truly wonder how an NVIDIA x86 CPU (with an integrated GPU, of course) could have fared...

 

AMD doesn't pay royalties to Intel for the x86 license. They have a cross-licensing agreement, because Intel uses AMD extensions to x86, like x86-64. Intel actually paid AMD a lump sum when they settled things in 2009.

 

Nvidia's Denver CPU cores were a bit of a flop really: inconsistent performance on the Tegra K1, and they dropped their own custom CPU architecture for the Tegra X1.


@Notional , Another thing I don't like is how AMD was blaming the tessellation settings months before the game released, yet it was users who figured out that lowering tessellation via Catalyst Control Centre fixed it. Why couldn't AMD just put out a driver that did that automatically on specific GPUs?

http://www.forbes.com/sites/jasonevangelho/2015/05/21/amd-is-wrong-about-the-witcher-3-and-nvidias-hairworks/

 

Of course AMD's reply here:

http://www.twitch.tv/amd/v/5335751

Right around 11 minutes in they even talk about lowering the tessellation in the CCC! I dunno, instead of blaming NVIDIA for sabotaging AMD GPUs, I really think they should have put out a fix or an announcement on or before release day, something like "hey, the default x64 tessellation setting in The Witcher 3 is fucked, so we recommend forcing x16 via the CCC." Instead, what we actually got was "the problem is NVIDIA didn't release the source code" and "offer them the option to go into their CCC and set their own tessellation settings"... But if that's the case, why didn't they even tell people about that option instead of waiting for them to realize it's fucked from the get-go? It honestly sounds like AMD was more focused on trying to convince CD Projekt Red to put in TressFX two months before the game came out than on making a fix or announcing a way for AMD users to enjoy HairWorks... show them the option, kind of thing. At this point it honestly sounds more like a publicity stunt to mar the image of NVIDIA's GameWorks than an attempt to fix the issue *grumbles*.

 

The reason AMD implemented a manual tessellation setting in Catalyst in the first place was NVIDIA-sponsored titles like Batman: Arkham Origins and the Crysis 2 DX11 patch, which suffered from extreme over-tessellation: http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd/2

 

You also need to bear in mind that AMD working with CDPR does not give them access to contracts between CDPR and any third-party company. Those contracts are under NDA, so AMD would have had no way of knowing about any GameWorks implementation in The Witcher 3.

 

This is just speculation, but maybe setting the tessellation multiplier manually in CCC dictates that multiplier for ALL tessellation, not just HairWorks. In that case, it might affect performance negatively elsewhere. I usually set it to AMD Optimized. They might have done exactly what you ask, but AMD's critique is not just about the massively excessive multiplier; it's also about the lack of source code access. The latter is the important part, as AMD cannot optimize for any of these tessellation-based GameWorks effects.
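Purely to illustrate that speculation (made-up names, not actual Catalyst code), a driver-level cap would presumably just be a global clamp applied before the tessellator runs, which is why it would hit every tessellated surface and not only HairWorks:

USER_CAP = 16  # e.g. the "x16" you pick in Catalyst Control Centre
def effective_tess_factor(requested):
    # The game (or a GameWorks library) asks for some factor,
    # and the driver silently clamps it before tessellation happens.
    return min(requested, USER_CAP)
print(effective_tess_factor(64))  # HairWorks asking for x64 -> runs at x16
print(effective_tess_factor(8))   # a modestly tessellated water patch -> untouched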

 

I'm not entirely sure what it is you want AMD to do that is actually possible. But of course AMD also uses it as an opportunity to criticize their competitor.

 

Another thing from the Twitch link? Robert Hallock even says around 12 minutes in that GTA V was fine because there was no exclusive partnership... but NVIDIA never barred CD Projekt Red from working with AMD... so to me it sounds like Rockstar did a better job of working with hardware vendors on their game than CD Projekt Red did.

 

GTA V is not a GameWorks title; it only uses simple NVIDIA things like HBAO+ and includes AMD's equivalent technologies, so it's quite vendor agnostic. Also, your claim about CDPR is unfounded. AMD got to work with CDPR (and did), but that never gave AMD access to anything GameWorks, due to NDAs.

It's not so much a case of better work as a case of not being sponsored and not relying on a lot of proprietary third-party middleware. After all, Rockstar don't need any company's money; they have enough.

 

... but that whole optimization thing makes me think of id Software's RAGE and the broken ATi drivers at release.

http://kotaku.com/5847761/why-was-the-pc-launch-of-rage-such-a-cluster

 

So AMD knew months beforehand that older drivers wouldn't work with RAGE, yet didn't get a correct driver out in time? I ran RAGE with no issues on my GTX 460 on release day with the latest drivers, yet on the latest drivers from ATi you got what you see in the screenshot in that Kotaku article. My two friends with ATi cards (one with a Crossfire setup) couldn't get a half-decent working game at release and had to wait for ATi to put out new drivers.

 

RAGE is a completely different beast, just like Wolfenstein: The New Order. Both of those games use the id Tech engine, and the problem is that they are OpenGL based. OpenGL is a horribly buggy, redundant, obsolete pile of crap. NVIDIA loves it though, as they tend to rewrite entire shaders for specific games, ending up with proprietary code tailored to those specific games. That is the reason NVIDIA does so well in those two games, and also why OpenGL is so shit that hardly anyone uses it. Vulkan should change all of that though.
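To be clear about what I mean by rewriting shaders, the general shape of it (totally made-up names, definitely not real driver code) is something like this:

# Hypothetical sketch of per-game shader replacement inside a driver.
HAND_TUNED = {
    # (detected executable, shader hash) -> vendor-written replacement source
    ("rage.exe", "deadbeef"): "optimized_replacement.glsl",
}
def pick_shader(exe_name, shader_hash, original_source):
    # Recognized game + shader? Swap in the hand-tuned version;
    # otherwise compile whatever the game shipped.
    return HAND_TUNED.get((exe_name, shader_hash), original_source)
print(pick_shader("rage.exe", "deadbeef", "original.glsl"))

That kind of per-title special-casing looks great in the games the vendor bothered to hand-tune, but the fast path simply doesn't exist anywhere else.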

 

This article is excellent on the subject, well worth the read: http://www.extremetech.com/gaming/182343-why-we-cant-have-nice-things-valve-programmer-discusses-wretched-state-of-opengl

 

 

Anywho? I'd better not derail this thread any more. I don't think it's healthy that we spun it off into a discussion of GameWorks. I'm not gonna reply to any more of that since I was more interested in talking about FreeSync and G-Sync anyways. :P

 

These are the kinds of discussions everyone can learn a lot from. Unless the OP or a moderator is annoyed, it should be OK. It's somewhat relevant anyways :)

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Heyyo,

 

AMD doesn't pay royalties to Intel for the x86 license. They have a cross-licensing agreement, because Intel uses AMD extensions to x86, like x86-64. Intel actually paid AMD a lump sum when they settled things in 2009.

 

Nvidia's Denver CPU cores were a bit of a flop really: inconsistent performance on the Tegra K1, and they dropped their own custom CPU architecture for the Tegra X1.

It's true nowadays, yeah, that AMD doesn't pay royalties, but that's not why Intel paid AMD... it was because of Intel using bribery to try to limit AMD CPU availability. That's some dirty laundry.

http://www.fudzilla.com/13300-amd-x86-is-now-royalty-free

We all know that NVIDIA is protecting their investment with G-Sync, but I still say that at the very least they could have licensed it out like Intel did anyways... it's just that NVIDIA shouldn't follow suit with Intel and try to use bribery to limit AMD GPU stock lol. :P

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case

