[Forbes] Why 'Watch Dogs' Is Bad News For AMD Users -- And Potentially The Entire PC Gaming Ecosystem

"Open" is not strictly inherently "Open Source". You can have a programming API that has plenty of documentation and support, yet still have its source code undisclosed.

And yet if Nvidia had said yes to Mantle, AMD would have given them all the tools they needed. But they said no, and so AMD's API is proprietary? I guess I don't know how to think either, because I can't see the logic behind that...

When will Nvidia make an offer to AMD about TXAA? Never?


It's just ONE game!

 

I think you just found the perfect phrase to be engraved at the top of the slippery slope.

 

"It's just ONE game!"

 

             -Game publishers, prior to the videogame crash of 2015-16

-------

Current Rig

-------


I don't understand why devs would keep AMD from helping them, when the consoles themselves have AMD hardware in them! This just goes to show that most devs don't know what the fuck they are doing and/or don't have gamers' best interests at heart.

"Same rules since the first man picked up the first stick and beat the second man's ass with it."


So by that logic, you would like to see future games that AMD partners on crippled for Nvidia GPUs, right? You would be fine if a 780 Ti were to struggle to keep up with an R9 280X in those games?

I don't care about the 280X or AMD GPUs in general; I have Nvidia.

| CPU: i7 3770k | MOTHERBOARD: MSI Z77A-G45 Gaming | GPU: GTX 770 | RAM: 16GB G.Skill Trident X | PSU: XFX PRO 1050w | STORAGE: SSD 120GB PQI +  6TB HDD | COOLER: Thermaltake: Water 2.0 | CASE: Cooler Master: HAF 912 Plus |


By the way, if AMD wants an escape plan, they should concentrate their energies on the following: make a damn impressive Linux driver and partner with Valve on a similar, yet open source, program for SteamOS games and possibly the Source 2 engine. Just go with the indies and Valve; it will eventually pay off.

-------

Current Rig

-------


It's just ONE game!

Batman: Arkham Origins, Assassin’s Creed IV: Black Flag, and this week’s highly anticipated Watch Dogs.

 

 

Three games.

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-D15

 

 

 


This just shows how immature Nvidia actually is.


I think you just found the perfect phrase to be engraved at the top of the slippery slope.

 

"It's just ONE game!"

 

             -Game publishers, prior to the videogame crash of 2015-16

 

LOL!  :lol:

 

 

... Oh wait! NO please GOD NO!  :(

"Experience is what you get when you don't get what you want" - Dan Stanford

Project Sandrock (Ended): http://linustechtips.com/main/gallery/album/75-project-sandrock/

Project Murasame Liger (WIP): http://linustechtips.com/main/gallery/album/400-project-murasame-liger/


LOL!  :lol:

 

 

... Oh wait! NO please GOD NO!  :(

 

This is terrible for big-budget titles, but for us gamers it probably means you'll see a permanent Steam sale on ALL titles. Plus, indies are pretty resilient against this type of stuff.

-------

Current Rig

-------


And yet if Nvidia had said yes to Mantle, AMD would have given them all the tools they needed. But they said no, and so AMD's API is proprietary? I guess I don't know how to think either, because I can't see the logic behind that...

When will Nvidia make an offer to AMD about TXAA? Never?

If Nvidia had said "yes" to Mantle, there would probably have been an agreement to keep the source code undisclosed to third parties. The API would still be proprietary, yes, but it would be owned/managed by two companies instead of one. And that's if they planned to keep it closed source forever; there's no reason they couldn't have switched to an open source model later as well, but I digress.

 

TXAA will likely never be offered to AMD, because this whole situation has demonstrated that Nvidia is being selfish and greedy (kinda ironic, since "invidia" means "jealousy").


I guess I don't know how to read then:

"Open" is an extremely vague term and does not mean the same thing as "open source". Both the drivers required to run Mantle as well as the APIs themselves are strictly closed source, just like GameWorks for Nvidia.

We don't know how "open" Mantle was to Nvidia either. AMD might have demanded licensing fees, or cross-licensing that Nvidia wasn't willing to agree to, or maybe Nvidia would have needed to make big architectural changes that wouldn't have been worth it.

Bottom line, saying that Mantle is open source is like saying PhysX is open source, a very untrue statement. I wouldn't even call Mantle "open".


This is big BS.

 

I have an R9 290 card and I will pay the same price for the game as an Nvidia card owner, yet I am being f**** by Nvidia.

 

In the case of Mantle games, if you play them in DX11, Nvidia cards provide the same gaming experience as an AMD card (e.g. BF4 and Thief). So you guys with Nvidia cards who are trying to defend Nvidia's move here: cut the BS. This is no good for the whole gaming world.


My gosh what have I started?  :D

OK... I can't enable any of those effects on my low-budget Nvidia or AMD card without HUGE penalties! And I think the average gamer is just like me! :)

FX-8320 4.2GHz@1.280v & 4.5GHz Turbo@1.312v | Thermalright HR-02 w/ TY-147 140mm + Arctic Cooling 120mm | VRM cooled by AMD stock cooler fan 70mm 0-7200 RPM PWM controlled via Speedfan | Gigabyte GA-990XA-UD3 | Gigabyte HD 7970 SOC @ R9 280X | 120GB Kingston HyperX 3K | 2TB Toshiba DT01ACA200 | 1TB WD Green | Zalman Z11 + Enermax 140mm TB Apollish RED + 2x Deepcool 120mm and stock fans running @5V | Single Channel Patriot 8GB (1333MHz) + Dual Channel 4GB & 2GB Kingston NANO Gaming (1600MHz CL9) = 14GB 1,600 Jigahurtz 10-10-9-29 CR1@1.28V | Sirtec High Power 500W | ASUS Xonar DG | Logitech F510 | Sony MDR-XD200 | Edifier X220 + Edifier 3200 | A4Tech XL-747H 3600dpi | A4Tech X7-200MP | decent membrane keyboard | Philips 236V3LSB 23" 1080p@71Hz

               
Sorry for my English....


No need to worry since the game itself is utter shit.

You, sir, are shit; the game is fantastic. But Nvidia is being a huge dick hogging the industry. I am very angry atm.


Tbh, someone needs to tell AMD: deal with it.

 

Afaik, developers don't have to use GameWorks if they hate it (as stated in earlier posts), so I don't know what the big deal is. AMD should start developing quality drivers that don't crash my Photoshop and Premiere just because they can (I've been a happy Nvidia user ever since) instead of crying about what Nvidia is doing. God forbid someone invests time and money into something and then doesn't offer it for free.

 

I'd like to see market share data stating that AMD is holding 40% of GPU market share. If someone has it, I'd like to have a look at it.

 

So, I guess fanboys and haters should take over from here.  :P

CPU: AMD Ryzen 7 3800X Motherboard: MSI B550 Tomahawk RAM: Kingston HyperX Predator RGB 32 GB (4x8GB) DDR4 GPU: EVGA RTX3090 FTW3 SSD: ADATA XPG SX8200 Pro 512 GB NVME | Samsung QVO 1TB SSD  HDD: Seagate Barracuda 4TB | Seagate Barracuda 8TB Case: Phanteks ECLIPSE P600S PSU: Corsair RM850x

 

 

 

 

I am a gamer, not because I don't have a life, but because I choose to have many.

 


You, sir, are shit; the game is fantastic. But Nvidia is being a huge dick hogging the industry. I am very angry atm.

 

I agree with your sentiment on the dick hogging (pun intended), however:

 

1) Tone it down, no need to call anybody "shit"

 

2) The game does look like shit from all the footage I've seen on Twitch. All the pretty light effects only hide the hideous fucking textures, and the oh-so-acclaimed gameplay (#ResolutionISJustANumber)? Yeah, fucking boring as shit.

-------

Current Rig

-------


I agree with your sentiment on the dick hogging (pun intended), however:

 

1) Tone it down, no need to call anybody "shit"

 

2) The game does look like shit from all the footage I've seen on Twitch. All the pretty light effects only hide the hideous fucking textures, and the oh-so-acclaimed gameplay (#ResolutionISJustANumber)? Yeah, fucking boring as shit.

Well, I've been playing it on ultra with the highest textures and I tell you that you're wrong. You have to play the game to see; don't judge by a stream or video.


Well, I've been playing it on ultra with the highest textures and I tell you that you're wrong. You have to play the game to see; don't judge by a stream or video.

 

Look at the other thread; the test results actually confirm this: the game is just an awful fucking broken mess that doesn't look nearly as good as you would expect for the performance it devours:

 

http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,9.html

-------

Current Rig

-------


And nothing's wrong with this news. If you want to play Nvidia-optimized games with better performance, buy an Nvidia card. If not, don't complain.

It is one thing to pay a dev to implement performance improvements for your architecture. It's an entirely different matter when your improvements outright sabotage the competition. Nvidia pays devs to use GameWorks, and in doing so the devs seem to sign an exclusivity clause with Nvidia, meaning AMD cannot get access to optimize for a new game until very close to release. That is essentially anti-consumer and anti-trust.

 

I don't care about the 280X or AMD GPUs in general; I have Nvidia.

Wow, you missed the point so badly you almost went into orbit. If a $300 AMD card started beating a $700 Nvidia card because AMD paid devs to implement code that would not just improve performance on AMD but also sabotage performance on Nvidia, you would not be very pleased, would you?

 

----

GameWorks is very bad news for the entire industry, at the cost of us, the consumers. Even the devs getting paid hate it:

http://linustechtips.com/main/topic/137965-developers-criticze-nvidias-gameworks-program-on-twitter-for-its-blackbox-nature/?hl=gameworks

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Look at the other thread; the test results actually confirm this: the game is just an awful fucking broken mess that doesn't look nearly as good as you would expect for the performance it devours:

 

http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,9.html

Well, you know this happens when a game gets played early without a proper patch and drivers, right? I tell you this: I make games, and drivers play a big part in a game's performance. But as I said, you're judging it the wrong way; I'm honestly telling you right now it is a wonderful-looking game. All those streams and videos aren't running ultra, nor do they have good quality.


So Nvidia has a program where developers put some extra time into making the game look good and run well only on Nvidia hardware? Yeah, AMD would NEVER do anything like that. I mean, say, inventing a whole new API that's only compatible with AMD GPUs but boosts performance significantly for games that use it... totally not something AMD would do.

The difference is the proprietary, closed nature of the technology. Mantle is open; Nvidia is free to support it if they want to put in the time and effort to do so. They just don't want to, and with DX12 supposedly doing the same types of things, I don't blame them. It just doesn't change the fact that GameWorks is an intentionally closed and guarded program; AMD couldn't work with it if they wanted to.


OMG AMD DON'T GET NO HBAO+, PhysX, AND TXAA. WHY YOU DO DIS TO US NVIDIAR.

 

That's what this sounds like... Nvidia got their hands on this game; AMD gets their hands on games too. The only difference is that it's easier to hate on Nvidia all the time and champion AMD. I'm not saying that Nvidia isn't a scumbag company, but that doesn't mean AMD hasn't been a scumbag as well with other games.

 

 

Look at the other thread; the test results actually confirm this: the game is just an awful fucking broken mess that doesn't look nearly as good as you would expect for the performance it devours:

 

http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,9.html

These benchmarks are eerily different from the ones shown in the article; here the 770 clearly loses to the 290X. Why, when this is clearly an Nvidia GameWorks title and the article fervently leans on its "the GTX 770 is 5-6 fps above the 290X" argument?

AD2000x Review  Fitear To Go! 334 Review

Speakers - KEF LSX

Headphones - Sennheiser HD650, Kumitate Labs KL-Lakh


I hope this isn't a repeat of AC4. That was bs. 

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer Predator XB241H (1080p, 144Hz Gsync), LG 1080p ultrawide, (all mounted) directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear Defiant (solderless swappable switches), g600, mounted mic and other stuff. 

Laptop docking area- 2 1440p korean monitors mounted, one AHVA matte, one samsung PLS gloss (very annoying, yes). Trashy Razer Blackwidow Chroma... I mean the J key doesn't click anymore. I got a Model M I use on it too, but it's time for a new keyboard. Some edgy Utechsmart mouse similar to the g600. Hooked to a laptop dock for both of my Dell Precision laptops. (not only docking area)

Shelf- i7-2600 non-k (has vt-d), 380t, some ASUS sandy itx board, intel quad nic. Currently hosts shared files, setting up as pfsense box in VM. Also acts as spare gaming PC with a 580 or whatever someone brings. Hooked into laptop dock area via usb switch

