Search the Community
Showing results for tags 'gameworks'.
-
Source: http://www.guru3d.com/news-story/nvidia-gameworks-sdk-3-1-released.html Earlier today, NVIDIA announced that their new GameWorks SDK (version 3.1) is now available to developers. The update adds new techniques for lighting and shadows, as well as two new PhysX physical simulations in beta form. But the big news is the following: it looks like NVIDIA is finally allowing developers to access the dreaded "black box" portion of the GameWorks libraries, albeit only select ones for now. Even so, this is still fantastic news, because it allows developers to better optimise their games with NVIDIA's lighting and physics effects. It also allows AMD to finally optimise their drivers before a game that includes NVIDIA's GameWorks is released.

My take is the following: I personally feel that NVIDIA has done this to combat AMD's GPUOpen initiative and to further improve the image of GameWorks within the gaming and game developer communities. Whilst this is a good first step, I believe NVIDIA should open up all of their GameWorks libraries as soon as possible and put an end to the constant controversies regarding GameWorks and its performance-related issues on both AMD and NVIDIA cards.
-
I wanna learn game dev, so I thought I'd learn GameWorks and GPUOpen. Will cards from both camps play nicely together for this? I've heard PhysX turns off if an AMD card is detected, and I'm guessing similar things happen with other NVIDIA-specific features. Links would be appreciated.
-
Hey guys! I wasn't sure where to post this, but I guess it counts as news, since no one has done this before (at least not to my knowledge). And it's one of those PC power showcases, since real-time ocean simulation and buoyancy is not a simple thing to accomplish efficiently. For an in-depth look, check out my blog post about this subject. UPDATE: NVIDIA wrote about it on their blog.

So, I made this real-time ocean simulation with one thousand ships. There are no visual tricks involved; every single ship is its own physical object, with physical buoyancy simulation.

Details: Running at 1920x1080 with 133% render target (screen percentage). Graphics settings are all on Epic (the maximum in Unreal Engine). Runs at: avg. ~45 FPS, min. ~30 FPS, max ~60 FPS. At this point, 95% of the time, my CPU is the bottleneck. Yes, it uses NVIDIA GameWorks (WaveWorks), and I do have a couple of other NV features integrated in my branch of the engine to really optimize it for VR, but honestly GameWorks is really only as good as its implementation, khm Batman khm; at its core it's a very polished library of algorithms.

Made and run on this machine:
Motherboard: Z97-DELUXE (NFC & WLC)
PSU: Cooler Master V Series V850
CPU: Intel® Core™ i7-4790K @ 4.6 GHz
CPU cooler: Hydro Series™ H100i Extreme Performance
GPU: GeForce GTX TITAN X
RAM: G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 2400 MHz
SSD0: Samsung SSD 850 PRO 256GB
SSD1: Samsung SSD 850 EVO 500GB
HDD: WD Green 2TB
Case: NZXT Phantom 410 Black

Thanks!
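For anyone curious how per-ship buoyancy like the above is usually driven by a simulated heightfield, here is a minimal sketch of the general idea. It is not the author's actual code: the SampleWaveHeight() helper is a hypothetical stand-in for querying the ocean surface (WaveWorks exposes displacement data, but the exact call is an assumption here), hull rotation is omitted, and the submersion factor is deliberately crude.

```cpp
// Minimal per-object buoyancy sketch (illustrative only, not the author's code).
#include <vector>
#include <cmath>

struct Vec3 { float x, y, z; };   // z is treated as "up" here

struct BuoyancyPoint {
    Vec3  localOffset;   // sample point on the hull, in the ship's local frame
    float volumeShare;   // fraction of the hull volume this point represents
};

// Stand-in for the ocean query: a real integration would read the simulated
// heightfield / displacement data instead of this single analytic wave.
float SampleWaveHeight(float x, float y) {
    return 0.5f * std::sin(0.1f * x) * std::cos(0.1f * y);
}

// Returns the total upward buoyancy force on one ship for this frame.
Vec3 ComputeBuoyancy(const std::vector<BuoyancyPoint>& points,
                     const Vec3& shipPosition,
                     float hullVolume,
                     float waterDensity = 1000.0f,   // kg/m^3
                     float gravity      = 9.81f)
{
    Vec3 force{0.0f, 0.0f, 0.0f};
    for (const BuoyancyPoint& p : points) {
        // World-space position of this sample point (rotation omitted for brevity).
        float wx = shipPosition.x + p.localOffset.x;
        float wy = shipPosition.y + p.localOffset.y;
        float wz = shipPosition.z + p.localOffset.z;

        float waterHeight = SampleWaveHeight(wx, wy);
        float depth = waterHeight - wz;            // how far this point is submerged
        if (depth <= 0.0f) continue;               // above the surface: no buoyancy

        // Archimedes: F = rho * g * displaced volume, clamped per sample point.
        float submerged = std::fmin(depth, 1.0f);  // crude submersion factor
        force.z += waterDensity * gravity * hullVolume * p.volumeShare * submerged;
    }
    return force;
}
```

The physics step would then apply this force (plus drag and torque from the offset sample points, which are skipped above) to each ship's rigid body every frame.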
- 34 replies
- Tagged with: unreal engine, waveworks (and 2 more)
-
Hi! For a while I have been planning to get a 390X or 380X, but whenever I bring up the topic, my brother keeps shooting down those options and telling me to buy an Nvidia card because of software and unlocked parts of the card that supposedly let developers give the game better graphics and textures. I remember him throwing out words like 'Nvidia GameWorks' and 'PhysX'. Do those things actually result in better textures and such in reality? Thanks!
-
Edit: Alright, evil is not the correct word here. But there are issues with GameWorks you should know about. Saw this video today. It's a bit long, but watch the whole thing; it's definitely worth your time. https://www.youtube.com/watch?feature=player_embedded&v=O7fA_JC_R5s I think we need AMD in the race now more than ever... I've been using Nvidia for a while, but I will most likely go AMD for my 2016 GPU upgrade.
-
Nvidia GameWorks is 'throttling' performance on AMD cards, causing some graphics features like PhysX to render on the CPU rather than on the AMD card itself. Nvidia is also slowing down performance on older GPUs that run modern games developed with the help of GameWorks. Here is the link to some insight on what GameWorks is actually doing and how it's affecting the overall PC gaming experience.
- 22 replies
- Tagged with: graphics cards, amd (and 5 more)
-
I have no opinion on the matter; these are just the accusations made against NVIDIA.

GameWorks
NVIDIA's GameWorks is a proprietary software suite that game developers can use to create games supported across many GPUs and the SKUs of each chip. The issue with GameWorks is that it infamously makes games run like s**t on AMD graphics cards. As a result, many high-end AMD cards such as the R9 390X can be beaten by a GTX 960 in certain games (Project Cars; this game will probably be mentioned a lot in this post).

PhysX
PhysX is a physics library that is accelerated directly on NVIDIA GPUs; for AMD users to utilise it, they need to run PhysX on their CPU. Advanced PhysX also can't be run on any AMD card no matter what, and NVIDIA have been accused of preventing AMD graphics card users from using a secondary NVIDIA GPU solely to run PhysX. PhysX is a fair use of competitive advantage, as there are very effective ways to accomplish the same thing on other products.

Planned Obsolescence and AMD slowdowns
Supposedly NVIDIA have been finding ways to make older graphics cards run slower than Maxwell cards, mainly in a GameWorks title called Project Cars. It should be noted that the GTX 780 is being beaten by the GTX 960; this should not be occurring. It should also be noted that the 290X can't even maintain 60 FPS in the game, despite the 290X being in the same tier as the GTX 980, which smashes the 290X with 70 FPS using the benchmarker's configuration. The AMD results are also slightly less consistent in terms of minimum to average FPS. Benchmarks are credited to Techspot.com; the URL is at the bottom of the post.

Overuse of Tessellation
Back in 2011, when Crysis 2 received its DX11 patch, the game in certain scenes used extreme tessellation on objects that either didn't require it or couldn't be seen at all, examples being the Jersey barriers and the water being rendered under the map. The reason this was done is that NVIDIA knew their cards could handle tessellation better than AMD/ATI cards, so they overused tessellation to tank the competition's FPS. It should also be noted that this reduced the potential FPS of NVIDIA's own cards. More recently, the GameWorks title The Witcher 3 by CD PROJEKT RED was discovered to use 64x tessellation on Geralt's hair; this made no visual difference but did affect performance on AMD cards more adversely than on NVIDIA's. As there is no reason to do this, it can be assumed that NVIDIA knew that extreme tessellation would cause a bigger FPS drop for the competition than for their own cards. Any AMD user having issues with this can use the AMD Crimson software to cap the maximum tessellation factor at a more reasonable value such as 16x. (A rough sense of how much extra geometry 64x produces compared with 16x is sketched below.)

If you found the article interesting, maybe you would be interested in participating in the poll I made; I want to get everyone's opinion. I realise the post seems biased, but in truth this was mainly about the shady practices of NVIDIA. AMD may not be exempt from shady business practices themselves, but this post is about NVIDIA. This post took forever to make, so bear that in mind before you rip me to shreds calling me an AMD fanboy despite me owning an NVIDIA SHIELD tablet and a GTX 760. Benchmark source: http://www.techspot.com/review/1000-project-cars-benchmarks/page2.html
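To put rough numbers on the tessellation argument above, here is a small back-of-the-envelope sketch. The per-patch triangle estimate and the patch count are illustrative assumptions (the exact counts depend on the tessellator's partitioning mode and on the scene), but they show why a 64x factor is so much heavier than 16x for geometry that ends up sub-pixel anyway.

```cpp
#include <cstdio>

// Rough estimate: uniformly subdividing each edge of a triangle patch into
// N segments yields about N*N sub-triangles (the exact count depends on the
// D3D11 tessellator's partitioning mode, so treat this as an approximation).
long long TrianglesPerPatch(int tessFactor) {
    return static_cast<long long>(tessFactor) * tessFactor;
}

int main() {
    const int patches = 10000;  // hypothetical number of hair/geometry patches in view
    for (int factor : {16, 64}) {
        long long perPatch = TrianglesPerPatch(factor);
        std::printf("tess factor %2d: ~%lld triangles/patch, ~%lld triangles in view\n",
                    factor, perPatch, perPatch * patches);
    }
    // 16x -> ~256 triangles/patch, 64x -> ~4096 triangles/patch: roughly 16 times
    // the geometry load for a difference that is hard to see on thin hair strands.
    return 0;
}
```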
-
http://www.overclock3d.net/articles/gpu_displays/nvidia_gameworks_comes_to_war_thunder/1 The engine appears to be single-threaded for me, yet somehow they managed to get GameWorks features in there... Looking forward to seeing how this plays out, but my FPS is bad enough due to the game only using one core, so I'll likely leave it off... Q&A on the WT forums: http://forum.warthunder.com/index.php?/topic/269416-development-dagor-engine-40/page-9#entry5234982
-
Hello LTT forum. I have been pretty extensively testing the performance of Rise of the Tomb Raider on my personal rig (details in profile) and have been dumbfounded by the performance. A quick rundown of hardware: 5930K OC'd to 4 GHz on all cores, two EVGA 980 Tis in SLI OC'd to +103 MHz, a G-Sync 3440x1440 monitor at 95 Hz, 2x Samsung 850 Pro in RAID 0, and 2.8 GHz quad-channel RAM at C14/1T.

I downloaded the GameReady driver (at the time of writing, 361.75 WHQL), launched the title, and set every setting to its maximum possible slider to get a baseline frame rate and begin scaling features back from there. Much to my surprise, all-max settings ran at ~20 fps in the game's menus. This is all due to SSAA x4, which creates a completely impossible workload of rendering 6880x2880 and then downsampling to the monitor's resolution. I was honestly surprised it was able to render even those 20 frames, and I can't help but wonder what kind of machine this option is even there for. Once I set the AA to a non-intensive FXAA (SMAA was not available at the time of testing), the framerate became a stable 70-75 and I started the game to check real-time performance. There is an extensive FPS cost comparison for every feature of the graphical options available on GeForce.com here: GeForce Tomb Raider FPS Cost Comparison.

So here is where things get interesting. The game averages a respectable 65 FPS; however, it dips in almost all close-up scenarios down to 48 fps due to PureHair, which renders 30k strands of physics objects at Very High. Setting this to just "On" significantly cuts the number of strands but gives little to no performance increase unless the camera is cinematically close to Lara. Cinematics with multiple characters on screen (in close-up) can chug the framerate down to the low 40s, sometimes even less. G-Sync does an amazing job keeping up the immersion, but this shouldn't really be a problem for dual GPUs running at Gen3 x16 with a processor that has the lanes to support the bandwidth.

Tessellation is another interesting animal. First of all, you HAVE to have tessellation enabled if you want even a semblance of immersion; flat textures look ancient and unappealing in modern gaming. It is fairly dishonest of GeForce.com to suggest disabling it if you're "in need of extra performance". Tessellated surfaces are everywhere in this game, including where you cannot see them (which goes to the planned obsolescence arguments made many times against Nvidia). Turning off PureHair and tessellation runs the game at 95 fps at 95%, with occasional dips to the high 80s when many characters are on screen in an open environment. When I turn off one of my graphics cards and run the game on a single one, the same settings run at 80 fps at 95% (with occasional dips to the low 70s). Turning on PureHair and tessellation brings the framerate to 58 fps on a single card, dipping below 30 in some cinematics. What in the actual hell, Nvidia?

Don't get me wrong, the game looks great. Completely next-gen feeling, especially with HBAO+. It's some of the best lighting I have ever seen, and dark environments look immersive and detailed (using an IPS monitor). I have looked into independent benchmarks on AMD cards and they are pitiful (unless you disable PureHair and tessellation, of course). I have looked into 970 benchmarks and they too are not much better.

I have never seen top-end cards fail to dominate current-gen titles (with the exception of ridiculous things like Crysis and synthetic benchmarks). With the way AA is going, 6GB will very soon not be enough for even Med/High texture quality, forget things like DSR (what kind of dreamer even came up with that super-sampling concept?). If I, on a monstrous rig, cannot run a maxed-out current-gen title, I can imagine the pain of others who purchased $500 cards less than a year ago and can't maintain a decent framerate, let alone 60+. Amanwithplan
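As a quick sanity check on the SSAA x4 numbers above, here is a tiny back-of-the-envelope calculation. It assumes shading cost scales roughly with pixel count, which is a simplification, but it shows why 4x super-sampling at 3440x1440 is such a brutal workload:

```cpp
#include <cstdio>

int main() {
    const long long nativeW = 3440, nativeH = 1440;
    // SSAA x4 renders at twice the resolution on each axis, then downsamples.
    const long long ssaaW = nativeW * 2, ssaaH = nativeH * 2;   // 6880 x 2880

    long long nativePixels = nativeW * nativeH;   // ~4.95 million pixels
    long long ssaaPixels   = ssaaW * ssaaH;       // ~19.8 million pixels

    std::printf("native: %lld pixels, SSAA x4: %lld pixels (%.1fx the shading work)\n",
                nativePixels, ssaaPixels, double(ssaaPixels) / nativePixels);
    // With roughly 4x the pixels to shade per frame, a drop from ~70-75 fps (FXAA)
    // to ~20 fps (SSAA x4) is in the expected ballpark, since not all per-frame
    // costs scale with resolution.
    return 0;
}
```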
- 22 replies
-
Source: http://wccftech.com/nvidia-responds-witcher-3-gameworks-controversy/ (...) It’s quite unfortunate that Nvidia has taken this recent turn with GameWorks towards locking code and limiting control. A future where the competitors’ only choice is to fill the game with even more proprietary code of their own to compete is not one that gamers or developers will want or appreciate. Read more: http://wccftech.com/nvidia-responds-witcher-3-gameworks-controversy/#ixzz3aUcXvQ2v

The way I see it, proprietary technology and its use in games is taking a dangerous turn. What was a nuisance in the past is now becoming an actual problem, especially since previously any day-one performance issues could usually be fixed with a post-release patch, like when AMD featured TressFX in Tomb Raider (2013), which was patched to run efficiently on Nvidia cards. Now the tables have turned, and as a CDPR representative stated in the article, in the case of The Witcher 3 it is not possible to run what are essentially Nvidia-locked features on both AMD and Nvidia cards with similar results. A shame, really, that we are forced to choose one vendor over the other in order to get "the full experience".
-
Source: https://youtu.be/2Fi1QHhdqV4?t=1h14m35s (1:14:35). He basically said this: if you are able to deliver a driver weeks/months later for The Witcher 3, then you could have done that driver on day 1... you just didn't!

And he is right! If AMD had invested time into optimizing their drivers for TW3 and had them ready at launch, like they did with GTA5 for example, the whole issue wouldn't have existed in the first place. GTA5 is a GameWorks title and yet you haven't heard a peep from AMD; they even had a day-1 driver. What exactly prevented them from investing the same time and resources into Project CARS and TW3? Because in the end, they did release optimized drivers for these games. Even more damning is the fact that "the fix" originated in the community and was only later spread by AMD. GameWorks was a "black box" before and still remained a "black box" when the optimized drivers were ready <_< In the end, it's not a matter of whether AMD has access to it, but rather whether AMD is willing to provide proper support to their customers.
- 234 replies
- Tagged with: tom petersen, amd (and 2 more)
-
Nvidia has been hard at work advertising GameWorks and not allowing AMD to optimise these games for their own hardware. A good example is Project Cars, which is integrated with PhysX [and, to some extent, The Witcher 3]. What do you guys think about their aggressive marketing? I personally think we need to stop Nvidia's monopolization. Also note, I am an Nvidia user, and I love ShadowPlay.
- 110 replies
- Tagged with: discussion, nvidia (and 4 more)
-
According to an Ars Technica post ( http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/ ), the performance of The Witcher 3: Wild Hunt is badly crippled because AMD cards can't run HairWorks properly, as the technology's source code is unavailable to AMD. This not only causes The Witcher 3 to run better on Nvidia cards but also gives Nvidia an unfair advantage. Not owning an Nvidia card myself, I went looking and found that performance really does differ quite a bit between AMD and Nvidia. And it's not only HairWorks that is crippling performance, but GameWorks itself: many Project Cars fans who owned AMD cards were disappointed by the game's very, very poor performance, and many sought to blame AMD even though AMD was not the culprit; it was Nvidia's GameWorks. Project Cars was built with GameWorks, technology exclusive to Nvidia, meaning AMD cannot access the code and thus cannot optimise its drivers for it, which leads to a lot of unnecessary processing and, in turn, worse performance. The team behind Project Cars knew this but still opted to hop on the Nvidia bandwagon and optimise for Nvidia only. Again, I feel this is very anti-competitive, much like the Intel compiler lawsuit.
-
http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/ "I asked AMD's chief gaming scientist Richard Huddy, a vocal critic of Nvidia's GameWorks technology, about AMD's involvement with CD Projekt Red, and the support it had reportedly failed to provide to the developer: "That's an unfortunate view, but that doesn't follow from what we're seeing," said Huddy. "We've been working with CD Projeckt Red from the beginning. We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that...it's wrecked our performance, almost as if it was put in to achieve that goal.""
-
According to AMD, TressFX running on Nvidia hardware is faster than HairWorks running on Nvidia hardware. HairWorks is part of Nvidia's closed-DLL GameWorks library that developers have been criticizing. The source code to HairWorks isn't public, so no one can get their hands on it to optimize it without permission from Nvidia. In contrast, the latest version of TressFX (2.2) can be downloaded for free from AMD's website. The code is completely open to anyone, including Nvidia, which is why it runs well on Nvidia hardware. The TressFX 2.0 demo can be downloaded for free by anyone here. Full article
-
This is somewhat of a follow-up to ExtremeTech's previous coverage of Nvidia's GameWorks program. For those who don't know, GameWorks is an Nvidia program through which developers are offered Nvidia-optimized implementations of specific in-game effects as graphics libraries. The effects include things like tessellation, lighting, and ambient occlusion. These libraries are effectively "black boxes": developers are prohibited not only from optimizing them for AMD or Intel GPUs, but also from allowing any vendor other than Nvidia to implement its own optimizations or rendering techniques. A number of prominent developers have criticized Nvidia's GameWorks program, describing it as a "black box", and some went as far as to call it "unusable", since developer control over the game's code is stripped and hidden away and only given back under strict conditions, under which no modifications are allowed to benefit the hardware of any company except Nvidia. The list of developers includes Kostas Anagnostou, senior graphics programmer at Radiant Worlds; John W. Kloetzli, Jr., a graphics programmer on the Civilization team at Firaxis Games; Angelo Pesce, a rendering technical director; and Johan Andersson, technical director on Frostbite at Electronic Arts. http://forums.overclockers.co.uk/showthread.php?t=18592187 SOURCE
-
Ubisoft’s Watch Dogs is the latest PC title to take advantage of Nvidia’s GameWorks, a robust collection of tools that allow game developers to produce a visual experience which epitomizes Nvidia’s rallying cry: “The Way It’s Meant To Be Played.” Developers license these proprietary Nvidia technologies like TXAA and ShadowWorks to deliver a wide range of realistic graphical enhancements to things like smoke, lighting, and textures. Nvidia engineers typically work closely with the developers on the best execution of their final code. Recent examples of Nvidia GameWorks titles include Batman: Arkham Origins, Assassin’s Creed IV: Black Flag, and this week’s highly anticipated Watch Dogs.

As you’re suspecting from the headline, Nvidia’s GameWorks is only good news for Nvidia, their development partners, and their GPU users. That’s logical, and it serves a sizable slice of the market. According to AMD’s Robert Hallock, it’s terrible news for the PC gaming ecosystem on the whole. “Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products,” Hallock told me in an email conversation over the weekend.

But wait, it stands to reason that AMD would be miffed over a competitor having the edge when it comes to graphical fidelity and features, right? Hallock explains that the core problem is deeper: “Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.”

So a partner studio like Ubisoft can suggest or write enhancements to the GameWorks libraries, but AMD isn’t allowed to see those changes or suggest their own. “The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines,” Hallock continues. “This change coincides with NVIDIA’s decision to remove all public Direct3D code samples from their site in favor of a ‘contact us for licensing’ page. AMD does not engage in, support, or condone such activities.”

I asked Robert Hallock about this specifically, and he explains that they had “very limited time with the title and [we've] been able to implement some respectable performance improvements thanks to the skill of our driver engineers. Careful performance analysis with a variety of internal tools have allowed us to profile this title, despite deliberate obfuscation attempts, to improve the experience for users.” AMD will release a new driver to the public this week which reflects those improvements. (It’s the same driver I conducted my testing with.)

Unfortunately my conversation with Hallock didn’t end with a silver lining: “I am uncertain if we will be able to achieve additional gains due to the unfortunate practices of the Gameworks program,” he remarked. http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/
-
So up till now we had only heard AMD's side of the story. Forbes got in touch with Nvidia's Director of Engineering, who explains that their technology in no way purposefully makes games run worse on AMD cards. He claims their contracts do not restrict access to anyone. To be honest, I expected this; I just don't know why AMD would jump like this. The full article is on Forbes: http://www.forbes.com/sites/jasonevangelho/2014/05/28/nvidia-fires-back-the-truth-about-gameworks-amd-optimization-and-watch-dogs/
-
Greetings, today I'm going to talk about Nvidia's technology suite called GameWorks. It's a competitive move against AMD, and a good one for us Nvidia gamers. In short, NVIDIA GameWorks™ is a pack of Nvidia technologies (VisualFX, PhysX, Core SDK, OptiX, sample code & tools) with whose help modern games can look next-gen and become more photorealistic and interactive. Nvidia also provides code optimization, debug tools, profiling, and software support. Now, let's talk in detail about the technologies used in GameWorks, starting with the components of VisualFX, which are used for rendering and visual effects in games:

HBAO+ (Enhanced Horizon-Based Ambient Occlusion) - Part of NVIDIA ShadowWorks. HBAO+ uses a physically-based algorithm that approximates the ambient occlusion integral with depth-buffer sampling. In other words, it generates higher-quality SSAO while increasing the definition, quality, and visibility of the AO shadowing; it improves upon existing ambient occlusion techniques to add richer, more detailed, more realistic shadows around objects that occlude rays of light. Compared to previous techniques, HBAO+ is faster, more efficient, and significantly better. Another technology included in ShadowWorks is Advanced Soft Shadows, which provides high-quality in-game shadows, rendering cascaded shadows and many light sources simultaneously. For more details about HBAO+ visit this link: http://www.geforce.com/hardware/technology/hbao-plus

TXAA (Temporal Anti-Aliasing) - Part of NVIDIA PostWorks, which also includes other post-process technologies like Depth of Field. TXAA is a film-style AA whose main goal is to remove temporal aliasing in games, especially in motion (sometimes called crawling and flickering). TXAA is in most cases superior to other high-end AA alternatives, delivering visuals comparable to 8x MSAA or better at a much lower performance cost. Where FXAA is aimed at maximum performance, TXAA is aimed at the best visuals with minimal performance loss. For more details, visit this link: http://www.geforce.com/hardware/technology/txaa/technology

NVIDIA FaceWorks - Another great technology from Nvidia, meant to render realistic faces in games (eyes, skin). I hope everyone remembers the famous "Digital Ira" Nvidia demo, which showed a stunningly rendered human face with the help of DX11. For more details visit: http://www.nvidia.com/coolstuff/demos#!/lifelike-human-face-rendering You can download and enjoy the demo here: http://us.download.nvidia.com/downloads/cool_stuff/demos/SetupFaceWorks.exe

NVIDIA WaveWorks - This feature provides maximum realism for waves and water in general, accounting for things like wind, its speed and direction, just like in the real world.

NVIDIA HairWorks - The title speaks for itself: it provides realistic fur and hair rendering. In this video you can see that Riley and the wolves have around 10k strands of fur being rendered. This technology will also be used in The Witcher 3.

NVIDIA GI Works (Global Illumination) - Used in scenes with indirect illumination, where objects are lit by multiple sources. This feature gives scenes amazing visuals.

NVIDIA Turbulence - Used for realistic, dynamically moving fog, dust, and smoke.

Conclusion: All these features make Nvidia simply the best option for gamers. Nvidia - The way it's meant to be played!
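To make the HBAO+ description above a little more concrete, here is a toy sketch of the general idea behind screen-space ambient occlusion: estimating how occluded a pixel is from nearby depth-buffer samples. This is emphatically not NVIDIA's actual HBAO+ algorithm (which marches horizon angles per direction on the GPU and is heavily optimized); the function, its parameters, and the thresholds below are illustrative assumptions only.

```cpp
#include <vector>
#include <algorithm>

// Toy screen-space AO: for one pixel, count how many nearby depth samples sit
// in front of it (within a plausible range) and darken it accordingly.
// Illustrates "approximating the occlusion integral with depth-buffer samples";
// real HBAO+ instead marches horizon angles per direction and runs on the GPU.
float AmbientOcclusionAt(const std::vector<float>& depth, // linear view-space depth
                         int width, int height,
                         int x, int y,
                         int radius = 4,          // sample radius in pixels
                         float rangeCheck = 0.5f) // ignore occluders too far in front
{
    const float centerDepth = depth[y * width + x];
    int samples = 0, occluded = 0;

    for (int dy = -radius; dy <= radius; ++dy) {
        for (int dx = -radius; dx <= radius; ++dx) {
            if (dx == 0 && dy == 0) continue;
            int sx = std::clamp(x + dx, 0, width - 1);
            int sy = std::clamp(y + dy, 0, height - 1);
            float sampleDepth = depth[sy * width + sx];

            // A neighbor closer to the camera than the center pixel (but not by
            // an implausibly large amount) counts as an occluder.
            float diff = centerDepth - sampleDepth;
            if (diff > 0.0f && diff < rangeCheck) ++occluded;
            ++samples;
        }
    }
    // 1.0 = fully lit, 0.0 = fully occluded; multiply into the ambient term.
    return 1.0f - static_cast<float>(occluded) / static_cast<float>(samples);
}
```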