[Mini-News] Metal Gear Solid V will be bundled with Nvidia graphics cards

Bouzoo

That's pretty much what I'm saying. Basically, don't touch a GW title within 6 months of release.

 

Well there was nothing wrong with Witcher 3.

i7 9700K @ 5 GHz, ASUS DUAL RTX 3070 (OC), Gigabyte Z390 Gaming SLI, 2x8 HyperX Predator 3200 MHz


Grr, Nvidia! Must be angry because wheels in head are stuck!



this might actually sell me a 970 :(

Intel i5-3570K/ Gigabyte GTX 1080/ Asus PA248Q/ Sony MDR-7506/ MSI Z77A-G45/ NH-D14/ Samsung 840 EVO 256GB + Seagate Barracuda 3TB/ 16GB HyperX Blue 1600MHz/ 750W PSU/ Corsair Carbide 500R

 


Well there was nothing wrong with Witcher 3.

The only issue with the game for me is their latest performance-killing patch, 1.07.

 

I rolled back to 1.06, installed the latest free DLC, and I'm having a blast again. Even at launch I had no issues playing the game at 40 fps on Medium settings at 1200p on a GTX 580.

5950X | NH D15S | 64GB 3200MHz | RTX 3090 | ASUS PG348Q + MG278Q

 


Well there was nothing wrong with Witcher 3.

The exception that proves the rule, since it and GTA V (sorta) are the only real examples of it working out well recently. TW3 was made by a dev with plenty of PC experience though.


The exception that proves the rule, since it and GTA V (sorta) are the only real examples of it working out well recently. TW3 was made by a dev with plenty of PC experience though.

I think that's the important thing. Some companies out there (Iron Galaxy, among others) are just terrible at getting games to run well.

5950X | NH D15S | 64GB 3200MHz | RTX 3090 | ASUS PG348Q + MG278Q

 


I think that's the important thing. Some companies out there (Iron Galaxy, among others) are just terrible at getting games to run well.

Yes. My point is that it seems like lazier devs turn to Nvidia for help (though for Knight, the blame falls 100% on WB's shoulders).


The game was obviously not finished at the time of the first SS. Perhaps they implemented something later on that increased draw calls. We can't be sure they increased the number of draw calls on purpose to reduce performance, unless you can prove they did something like what ExtremeTech found in Crysis 2.

 

No, they were all beta builds. The point is that there was a huge performance drop out of nowhere on AMD cards alone, despite AMD having worked with the dev and made beta drivers that worked great. There is seemingly no graphical change between the builds, so whatever got changed under the hood rendered all of AMD's optimizations void. The dev then went on to blame everything on AMD. I'm not saying they purposely sabotaged AMD, only that they blamed AMD for their own performance decrease.

 

The API Overhead Test measures API performance by looking at the balance between frame rate and draw calls. It increases the number of draw calls until the frame rate drops to 30 fps and then reports the result in draw calls per second. 13k per frame would mean that to get 60 fps you'd need 780,000 draw calls per second. And from my API Overhead tests in my previous post, my 4670K can do around 900k at 30 fps, which means the sustainable number at 60 fps would be significantly lower, right? So 900k vs 1.2 million is still a big difference, and it matters.
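As a minimal back-of-the-envelope sketch of that arithmetic in Python, using only the figures quoted above (the 13k per frame and ~900k per second numbers come from these posts, not from my own testing):

# Draw call budget, using the figures quoted above.
DRAW_CALLS_PER_FRAME = 13_000   # reported per-frame draw calls
TARGET_FPS = 60

# Draw calls per second needed to sustain the target frame rate.
required_per_second = DRAW_CALLS_PER_FRAME * TARGET_FPS
print(f"Required: {required_per_second:,} draw calls/s")   # 780,000

# API Overhead Test throughput quoted for the i5-4670K (measured at 30 fps).
measured_per_second = 900_000
print(f"Headroom: {measured_per_second / required_per_second:.2f}x")   # ~1.15x, not much margin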

 

But that does not explain the massive performance loss. I agree there is a large difference in favour of NVidia, though that should change with DX12. In the case of Project CARS there aren't any GameWorks effects, so it is probably just wasteful draw calls in the graphics engine; it should have been a DX12 game.

 

You're paying $60 for the game, not for GW effects. Those effects are Nvidia's tech and they are optional. Nvidia invested a lot of money and effort into creating them to provide something cool for their users. They even send out their engineers to help game devs implement them. They are in no way obliged to even let AMD users use the effects. Yet they do, and then AMD users blame them for the higher performance impact, which is 100% AMD's fault and not Nvidia's. Ridiculous if you ask me.

 

I disagree; those GameWorks effects are part of the game, and the consumer pays for those too. Otherwise you could just say you only pay for the low graphics settings and everything above is optional. Sure, those settings are optional, but you still pay for them. If we say GameWorks is NVidia-exclusive and should be viewed as such, what we end up with is market segregation based on graphics card vendor. That is very much console territory, and not at all PCMR. I would hate for the market to end up like that. It just creates vendor lock-in and/or ruins competition. Neither is good for anyone but NVidia.

 

Obligated? No, not at all. NVidia can do whatever they want with GW, but I as a consumer have every right to criticize them for it.

 

That's completely different. This has nothing to do with the source code, but with tessellation. How many times do I have to repeat this? I've already provided proof that shows the 290X is as slow as a GTX 660 in tessellation. You can't make a 290X perform like a 980 in tessellation with optimization. That's impossible. You can't fix hardware issues with software.

 

Not really. Maybe I'm misunderstanding what you are saying, but it sounds like you think that any effect based on a DX11 technology cannot be optimized for at all. That simply is not true. Just because AMD's tessellation performance hasn't been up to NVidia's speed in their previous-gen cards does not mean you cannot optimize how the game uses these effects. When you have a 7,000-line shader like HairWorks, there are plenty of ways to make it run better. Not all of it is tessellation, after all; resource handling, draw calls, etc. also matter.

 

The exception that proves the rule, since it and GTA V (sorta) are the only real examples of it working out well recently. TW3 was made by a dev with plenty of PC experience though.

 

Pretty much this. But GTAV does not use GameWorks effects per se. Just some optimized things like HBAO+ and similar.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Sorry for half necro-ing this, but does anyone know Newegg/Nvidia's policy on getting a code if a promo starts soon after you bought the card? I just ordered two 980s yesterday and I wonder if there's any way I can get in on this.


Sorry for half necro-ing this, but does anyone know Newegg/Nvidia's policy on getting a code if a promo starts soon after you bought the card? I just ordered two 980s yesterday and I wonder if there's any way I can get in on this.

I believe the promo is handled by Nvidia, so Nvidia actually gives out keys with the graphics cards. I don't think they will have extras for cards that weren't bundled with it. Nvidia's promos have always been very strict about the date of purchase.

i7 6700K - ASUS Maximus VIII Ranger - Corsair H110i GT CPU Cooler - EVGA GTX 980 Ti ACX2.0+ SC+ - 16GB Corsair Vengeance LPX 3000MHz - Samsung 850 EVO 500GB - AX760i - Corsair 450D - XB270HU G-Sync Monitor

i7 3770K - H110 Corsair CPU Cooler - ASUS P8Z77 V-PRO - GTX 980 Reference - 16GB HyperX Beast 1600MHz - Intel 240GB SSD - HX750i - Corsair 750D - XB270HU G-Sync Monitor

Any ape optimizing a GameWorks feature would turn the tessellation factor down, making the performance impact much smaller than it is now. Devs can't do it because they lack the access or the rights, and Nvidia won't do it for obvious reasons. It also makes heavy use of AA and some other settings that I'd guess are a bit on the extreme side as well. It incentivizes buying new hardware in droves, since to run it at max you need pretty ridiculous hardware.

 

Tessellation beyond a certain point produces no visual improvement. It's great if your card is good at tessellation, but it needs to serve a real purpose. Even if I want a well-lit room, that doesn't mean I use 100 lamps to do it.
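To put a rough number on that diminishing return, here's a minimal sketch (the quadratic growth is an approximation; exact triangle counts depend on the patch type and partitioning mode):

# Triangle count per patch grows roughly with the square of the
# tessellation factor, so the cost keeps climbing long after the
# visual difference stops being noticeable.
def approx_triangles(tess_factor: int) -> int:
    """Approximate triangles produced per patch at a given factor."""
    return tess_factor ** 2

for factor in (8, 16, 32, 64):
    print(f"factor {factor:>2}: ~{approx_triangles(factor):>5} triangles per patch")
# factor  8: ~   64 triangles per patch
# factor 16: ~  256 triangles per patch
# factor 32: ~ 1024 triangles per patch
# factor 64: ~ 4096 triangles per patch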

 

Not that I care much. Minor smoke effects and such are of no big interest to me. Stuff like Crysis 2 and the garbage we had in Arkham Origins I do care a bit more about, because I can't stop the negative performance impact by turning off GameWorks effects.

 

I'm wary of this game. Ground Zeroes ran well for me, but the news at the end of this game's development and the microtransaction debacle are holding me off. All the same, this bundle benefits me as well, since I can buy the game code for ~20-25€ from somebody, just like I could with The Witcher 3, since not everyone buying the GPUs wants the game.

