Batman: Arkham Knight doesn't like AMD cards... or any PC at all

Techhog

I think the entire industry would prefer to wipe the launch of BF4 from their minds. That was a proper disaster top to bottom. 

And yet Watch Dogs didn't, and still doesn't, get the same treatment?

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2GHz @ 1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3TB; Phanteks 350x; Corsair RM750.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD

Spoiler

sex hahaha


And yet Watch Dogs didn't, and still doesn't, get the same treatment?

 

Well, double standards and all that, which I don't agree with. 

 

AMD-sponsored titles can be just as screwy as Nvidia-sponsored titles; it's not one or the other. And if we're being very honest, simply because of the money and resources Nvidia has, they can get to fixes faster. It's not an insult towards AMD, I'm rooting for them to succeed, but that's just how the cookie crumbles.


What about BF4, Crysis 3, BFH, GTA V (for the first week or two on AMD)? Not to mention that for the first month I had major stability issues with Civ: BE...

 

Do those games not count?

 

I'm talking about performance here. All those games ran just fine at launch, FPS-wise. BF4 crashed a lot because it was rushed to market several months early: some dingleberry of an EA exec got it into his head that people only buy one war game a year, so it had to launch straight after COD did.

 

As for GTA V, that was just waiting for a driver to optimize it (sure, that shouldn't be necessary, but welcome to gaming 2008+). The game runs very well, on both NVidia and AMD.

 

Not sure what you mean about Crysis 3? That game has always run really well and efficiently.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Yeah, just change fucking tessellation to 4x or 8x if you cannot turn off the GameWorks stuff. Also, I expect these reports to be the usual whining from Huddy and AMD when the game performs just as badly on Nvidia cards, just like that stupid cars game and The Witcher 3.

-------

Current Rig

-------


I hope this doesn't turn into a fight, but I know that it will. WB Games has warned AMD users that Arkham Knight does not run well on AMD cards. How badly? Just take a look at the revised minimum requirements:

[Image: revised minimum system requirements]

Yes, you read that right. A 7950 will perform like a 660 in this game. My opinion? Both AMD and Rocksteady are at fault. There is no way that things could get that bad if only one party is to blame. AMD users should cancel preorders immediately, as it looks like most of you would have a better experience on PS4 than on PC. Of course, PS4 uses AMD hardware. I'll leave it to you to think about what that means.

 

Source: https://www.pixeldynamo.com/news/gaming/2015/06/22/69607/batman-arkham-knight-pc-suffers-amd-card-issues-system-requirements-updated/

AMD isn't at fault; the publisher who picks the cancer that is GameWorks is the problem. The devs and AMD could be working together for a long time to optimize the game, and when it's done Nvidia introduces its GameWorks build, which fucks up the performance, meaning AMD has to optimize it again from scratch for the new build. I bet AMD spends more resources/man-hours optimizing a GameWorks title than Nvidia puts into integrating it. That's just not gonna work in the long term.

Whoever buys a GameWorks title is killing PC gaming, that's my opinion!


Yeah, just change fucking tessellation to 4x or 8x if you cannot turn off the GameWorks stuff. Also, I expect these reports to be the usual whining from Huddy and AMD when the game performs just as badly on Nvidia cards, just like that stupid cars game and The Witcher 3.

 

"equally as bad"?

 

[Image: ArkhamOrigins-DX111.png]

 

 

It was actually Batman: Arkham Origins that led to AMD including a tessellation multiplier setting in Catalyst (that, and NVidia's "helpful" involvement in DX11 for Crysis 2, with the world's most complex concrete slab. That slab literally had more polygons than a character).

 

Let's not beat around the bush. This only happens on GameWorks titles, and it happens far too often to be a coincidence. Fair enough that NVidia screws over their own users with 64x tessellation on HairWorks, but it's worse for AMD users, with all the crap.
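To put numbers on that 64x complaint, here's a rough back-of-the-envelope sketch (Python; the patch count is invented, and the quadratic scaling is just the usual approximation for DX11 triangle-domain tessellation, not anything measured from HairWorks):

# Rough sketch: how triangle output balloons with the tessellation factor.
# Assumes triangles per patch scale roughly with the square of the factor,
# the usual back-of-envelope model for DX11 triangle-domain tessellation.

def approx_triangles(base_patches: int, factor: int) -> int:
    """Approximate triangles produced at a given tessellation factor."""
    return base_patches * factor * factor

patches = 10_000  # hypothetical patch count for a hair mesh
for factor in (4, 8, 16, 64):
    print(f"{factor:2d}x -> ~{approx_triangles(patches, factor):,} triangles")

# By this model, 64x generates (64/8)**2 = 64 times the triangles of an
# 8x cap, for detail that's hard to even see on moving hair.

Which is exactly why a driver-side cap costs so little image quality relative to the frame time it saves.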

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


The only exception is Battlefield 4, but as most know, that was EA's fault for launching the game when it wasn't finished.

 

They're all fucking unfinished when they release. That's why most games have a multi-gigabyte day-one patch and day-one gamers are basically paying full retail to be beta testers. The assets aren't ready, shit is still bugged to hell, if it's online then the infrastructure isn't going to run smoothly, drivers aren't ready, and they decide to lock the last third of the game behind another 20 bucks.

 

I'm done with paying full price for games. I'll wait for them to go on sale on Steam or Amazon, or end up in the bargain bin at the electronics store. They don't give me the experience I expect when I'm handing them 50-odd euro.

 

The issues in this topic specifically always happen with titles that include GameWorks. People criticized Mantle heavily for segmenting the market (and they perhaps had a point); GameWorks does exactly the same thing. It exists purely to stop people buying AMD cards and to force nVidia card owners to upgrade every generation, as the latest technology runs like dog shite on the older cards and nVidia (who are supposed to always have great launch-day drivers) drag their heels with support for them.

 

There is no reason whatsoever why the GTX 700 series performance issues slipped through QA for the Witcher and Project Cars driver releases. None whatsoever. Any company that gives a damn about the hardware currently deployed makes sure it runs properly with new software. All it would have taken to catch those issues would be somebody installing a 770, a 780 and a 780 Ti and seeing how shittily they performed compared to the 960. That would have taken about an hour. Not even an intern on their first day could have missed those issues. I work in QA for server software and we support various appliances. We make damn sure it fucking works before the software hits our CDN.
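For what it's worth, the gate I'm describing is trivial to automate. A minimal sketch in Python, assuming you already have a benchmark harness that spits out an average FPS per card (every name and number below is made up for illustration):

# Hypothetical driver-QA regression gate: compare per-card FPS for a new
# driver build against the previous release, and refuse to ship on a drop.
# Baseline numbers and card list are invented for illustration.

BASELINE_FPS = {"GTX 770": 52.0, "GTX 780": 61.0, "GTX 780 Ti": 70.0, "GTX 960": 48.0}
TOLERANCE = 0.95  # flag any card that loses more than 5% versus baseline

def regressed_cards(results: dict[str, float]) -> list[str]:
    """Return the cards whose FPS fell below tolerance * baseline."""
    return [card for card, fps in results.items()
            if fps < BASELINE_FPS[card] * TOLERANCE]

# Pretend these came out of an overnight benchmark run on the new driver:
new_driver = {"GTX 770": 31.0, "GTX 780": 38.0, "GTX 780 Ti": 44.0, "GTX 960": 49.0}

failed = regressed_cards(new_driver)
if failed:
    raise SystemExit(f"Driver regressed on: {', '.join(failed)} - do not ship.")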

 

What I took away from that fiasco is that nVidia only cares about the current generation of cards. They only acknowledged the issues after about half the internet kicked up a fuss. The fix took them a couple of days to develop, review, pass through QA and deploy. That means it could have been in the release version, had they cared enough to check the older hardware.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Haha @EarthboundHero

Have fun with your R9 280 and 4690K now.......

except my system will prob run the game even shittier than his.

Eh, I'm sure it'll run fine at low settings. Not gonna cancel my pre-order because it might run like shit for a month or two. Plus, I'm planning on upgrading to a 970 soon.

Spoiler

Prometheus (Main Rig)

CPU-Z Verification

Laptop: 

Spoiler

Intel Core i3-5005U, 8GB RAM, Crucial MX 100 128GB, Touch-Screen, Intel 7260 WiFi/Bluetooth card.

 Phone:

 Game Consoles:

Spoiler

Softmodded Fat PS2 w/ 80GB HDD, and a Dreamcast.

 

If you want my attention quote my post, or tag me. If you don't use PCPartPicker I will ignore your build.


"amd can't afford to optimize and help devs so obviously it's nvidia's fault"

 

this thinking lol

if you have to insist you think for yourself, i'm not going to believe you.


We still don't know what the issue is. If it's tessellation again, then the problem isn't really going to apply to the Fury X.

It takes about 10 seconds to put a cap on tessellation. It's not exactly hard to fix the issues.
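Conceptually, that cap is nothing more than a clamp the driver applies to whatever factor the game requests. A toy illustration in Python, in the spirit of Catalyst's tessellation override slider (obviously not real driver code):

# Toy model of a driver-side tessellation override. The real Catalyst
# setting lives inside the driver; this just shows the idea: clamp the
# application's requested factor to the user's cap.

def effective_tess_factor(app_requested: float, user_cap: float | None) -> float:
    """Return the factor actually used: the app's value, clamped to the cap."""
    if user_cap is None:      # override disabled: let the application decide
        return app_requested
    return min(app_requested, user_cap)

print(effective_tess_factor(64.0, 8.0))   # HairWorks asks for 64x, capped -> 8.0
print(effective_tess_factor(64.0, None))  # no cap -> 64.0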

 

 

[Image: n2mm3oC.png]

 

PS: These are my settings for Ark; that's why it's completely shut down :P


"amd can't afford to optimize and help devs so obviously it's nvidia's fault"

 

this thinking lol

 

No, it's the developers' fault for being shit at their jobs.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Not a 660 Ti; a 660. The 7950 these days should be going up against the 670/760.

It's between a 760 and a 770.

Thats that. If you need to get in touch chances are you can find someone that knows me that can get in touch.


No, it's the developers' fault for being shit at their jobs.

No, it's also AMD's fault. AMD and NVIDIA both fix up the majority of the mess of code that developers leave in their games through driver updates (why do major releases get day-one drivers, hm?). However, NVIDIA does it faster, and better, because they aren't in the same hole AMD got themselves into.

if you have to insist you think for yourself, i'm not going to believe you.


It's between a 760 and a 770.

More between a 660 Ti and a 670. The 770 matches the R9 280X/HD 7970 GHz; putting the 7950 between those cards is like saying it's between a 7870 GHz and a 7970 GHz. Like, no shit.

if you have to insist you think for yourself, i'm not going to believe you.


2GB/3GB for minimum memory requirements? I'm calling bullshit. So many devs seem to have ridiculous minimum specs, but their games can run on much, much less.

 

I wouldn't say that 2GB of memory is ridiculous.
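Some back-of-the-envelope arithmetic on why 2GB isn't an outlandish floor (a sketch with invented but plausible asset counts; real engines stream, compress and reuse far more cleverly than this):

# Crude 1080p VRAM budget. Render targets are cheap; resident textures
# are what actually fill a 2-3GB card. Asset counts are made up.

MiB = 1024 * 1024
width, height = 1920, 1080

color_buffer = width * height * 4        # RGBA8 back buffer
depth_buffer = width * height * 4        # D24S8 depth/stencil
gbuffer = 4 * color_buffer               # assume four G-buffer targets

texture_2k = 2048 * 2048 * 4 * 4 // 3    # uncompressed RGBA8 2K texture + mips (~4/3x)
textures = 100 * texture_2k              # hypothetical: 100 resident 2K textures

targets = color_buffer + depth_buffer + gbuffer
print(f"render targets: {targets / MiB:>7.0f} MiB")
print(f"textures:       {textures / MiB:>7.0f} MiB")
print(f"total:          {(targets + textures) / MiB:>7.0f} MiB")  # roughly 2.1 GiB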

The biggest  BURNOUT  fanboy on this forum.

 

And probably the world.


More between a 660 Ti and a 670. The 770 matches the R9 280X/HD 7970 GHz; putting the 7950 between those cards is like saying it's between a 7870 GHz and a 7970 GHz. Like, no shit.

[Image: 43897.png]

It's a 7950, not a 7850.

Thats that. If you need to get in touch chances are you can find someone that knows me that can get in touch.


 

 

Let's not beat around the bush. This only happens on GameWorks titles, and it happens far too often to be a coincidence. Fair enough that NVidia screws over their own users with 64x tessellation on HairWorks, but it's worse for AMD users, with all the crap.

 

You DO know that it had nothing to do with tessellation and everything to do with drivers, right? The 390X drivers got backported to the 290X, and with them it can run full HairWorks at default tessellation with FPS comparable to a 970 or 980. Talk about screwing customers over: AMD rebranded their 290X into a 390X, then released drivers that made the 390X perform as well as a 980 in The Witcher 3 WITH HAIRWORKS, and users have to hack those drivers to make them compatible with the 290X. AMD is screwing you so much you don't even know they're drilling your behind any more.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


No, it's also AMD's fault. AMD and NVIDIA both fix up the majority of the mess of code that developers leave in their games through driver updates (why do major releases get day one drivers, hm?), however NVIDIA does it faster, and better, because they aren't in the same hole AMD got themselves into.

 

And whose fault is it that the code needs to be 1. fixed and 2. refactored so it actually executes in a reasonable amount of time with less wasted resources? That would be the guys who wrote the code in the first place.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


And whose fault is it that the code needs to be 1. fixed and 2. refactored so it actually executes in a reasonable amount of time with less wasted resources? That would be the guys who wrote the code in the first place.

You're forgetting that this is just about every game developer ever, not just one. It's for reasons like this that only AMD and NVIDIA can remain competitive in the dedicated GPU market.

if you have to insist you think for yourself, i'm not going to believe you.


"amd can't afford to optimize and help devs so obviously it's nvidia's fault"

 

this thinking lol

 

That comment might be right if this happened in all games, but obviously it doesn't; it only happens with GameWorks. You need to be blind to not see the link...


That comment might be right if this happened in all games, but obviously it doesn't; it only happens with GameWorks. You need to be blind to not see the link...

 

There's none as blind as those that don't want to see, and none as deaf as those that don't want to hear.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


The same can be said for the devs. Without the devs, do you expect AMD to blindly optimize for a game?

 

EDIT: Say you have a trainer and an athlete. Without the athlete, the trainer is meaningless. Without the trainer, the athlete can only do so much...

OMG, how much of a fanboy do you need to be not to get this? If it were lazy devs, both vendors would be having issues. It's not; it's only AMD that appears to be having issues, implying that Nvidia is helping the devs and the devs are working with Nvidia. The devs' job and skill set is making games, not dealing with drivers. The vendors' skill set goes much deeper, and I guarantee that the guys who work on drivers for the vendors could walk through the code for a game with ease, given that driver code is far more complex and, unless I'm mistaken, is assembly code: the code that is used to write the code that games are written in.

 

This is not how it works on every other platform, like Android, Mac, or even the consoles.

[Image: Thats_nice.jpg]

 

 

 

 

 

 

I'll just sit here and wait for a point.

"if nothing is impossible, try slamming a revolving door....." - unknown

my new rig bob https://uk.pcpartpicker.com/b/sGRG3C#cx710255

Kumaresh - "Judging whether something is alive by it's capability to live is one of the most idiotic arguments I've ever seen." - jan 2017


That comment might be right if this happened in all games, but obviously it doesn't; it only happens with GameWorks. You need to be blind to not see the link...

The problem with the assumption that it's GameWorks is that there are actually GameWorks titles that AMD cards do perfectly fine in, if not better than NVIDIA cards in some cases. If AMD cards managed to run fine in other GameWorks titles, why is this not one of those titles? Lazy AMD or a reluctant developer, whichever.

 

I understand AMD is small and NVIDIA has gotten rather large, but it's bothersome that people shift the blame to the biggest entity, not the one that makes the most sense.

if you have to insist you think for yourself, i'm not going to believe you.


This topic is now closed to further replies.