
Watch Dogs Graphics On PS4, Xbox One Are Equal To PC's High Settings

JAKEBAB

Remember kids: FPS and resolution are just numbers.

ヽ༼ຈل͜ຈ༽ノ raise your dongers ヽ༼ຈل͜ຈ༽ノ


It feels as though no games ever leave the BETA stage anymore, until about 3 years after it officially releases. - Shd0w2 2014


Technically, next gen will look as good as PC since current gen is now Xbone/PS4 lel

 

 


For me the graphics are good enough, even if they aren't on par with the E3 2012 reveal.
Also, Far Cry 3 was way worse, nobody made a big deal about it, and the game was still my favorite of 2012.
Fc3.jpg?t=original&k=af8b9bdb
I bought the game for the gameplay and for funny things like this xD
u6RJ3oq.png
http://imgur.com/u6RJ3oq

 

RTX2070OC 


The day one patch isn't out, and the optimized Nvidia drivers aren't out either. Imgur also compresses images a ton.


The day one patch isn't out, and the optimized Nvidia drivers aren't out either. Imgur also compresses images a ton.

Drivers have been out for a while now, the 337.81 drivers; there are driver tweaks and an SLI profile...

Something has to be wrong with the leaked build.

 

My pictures aren't compressed: I downsampled them to 1080p and they're under the Imgur size limit, so Imgur doesn't recompress them...

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


The game looks good enough.

I have only one thing to complain about: poor optimization.

 

Really, my CPU usage is at 30% while playing this game. The framerate very often drops to 1-5ish and then goes back up to 60.

Also, it uses 6.5GB of RAM. That's quite a lot.

 

Really, it's just a console port, and a badly optimized one. Hopefully a patch will fix these problems.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


I was playing Crysis 3 on ultra settings with 2 GB of RAM on 64-bit Windows; the game and the OS together used 1.5-1.7 GB, and that game used my CPU cores and Hyper-Threading very well. (screenshot is down below)

This shit is using 2 GB of my VRAM, 6.5 GB of my RAM, and 25% of my CPU, and still doesn't have better graphics than Crysis 3.

Crytek knows how to make the best games. I wish this game were on CryEngine (Crytek would make it look better than reality).

 

23ecb0c2fc93.jpg

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


Drivers have been out for a while now, the 337.81 drivers; there are driver tweaks and an SLI profile...

Something has to be wrong with the leaked build.

 

My pictures aren't compressed: I downsampled them to 1080p and they're under the Imgur size limit, so Imgur doesn't recompress them...

Those drivers are about 2 months old; I'm sure Nvidia is testing some newer ones right now. Your pictures are also fucking beautiful, dude.


Those drivers are about 2 months old; I'm sure Nvidia is testing some newer ones right now. Your pictures are also fucking beautiful, dude.

Are you sure you read that correctly? 337.81 was released a couple of weeks ago; it's been hidden on the website, but you can download it with the direct link.



Are you sure you read that correctly? 337.81 was released a couple of weeks ago; it's been hidden on the website, but you can download it with the direct link.

I meant stable versions, not beta :P


I meant stable versions, not beta :P

That doesn't matter at all; games almost always get their optimized drivers in beta form first...



The game looks good enough.

I have only one thing to complain about: poor optimization.

 

Really, my CPU usage is at 30% while playing this game. The framerate very often drops to 1-5ish and then goes back up to 60.

Also, it uses 6.5GB of RAM. That's quite a lot.

 

Really, it's just a console port, and a badly optimized one. Hopefully a patch will fix these problems.

Obviously it should not drop to 1-5 fps. But we do not know if this is the final retail version, or if there is a day 1 patch out that solves things like this. We do not know if the settings actually work properly either.

 

However, this game does have advanced AI for the NPCs (compared to the ultra-low AI in most open world sandbox games) and a wind simulation system integrated into the physics engine. That requires CPU power. What is the point of having a powerful CPU if it's not used?

 

Using as much RAM as possible is a good thing. Finally, 64-bit games are coming out that can use a lot of RAM. This is especially nice in open world sandbox games, where you want loading to be at a minimum. What is the point of having 8GB of RAM if you don't use it?

I remember playing Crysis 1 in 2008 in 64-bit. I had 4GB of RAM back then and could use 3.5GB for the game and 0.5GB for Vista 64-bit. To think it would take 6 years for 64-bit to start becoming normal is a little pathetic, really.
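For context on why 64-bit matters here: the old ceiling follows directly from pointer width. A minimal sketch of the arithmetic (not tied to any particular game):

```python
# A 32-bit process can address at most 2^32 bytes, which is why games of
# that era were capped near 4 GiB (and usually 2-3 GiB in practice on
# Windows). A 64-bit address space removes that ceiling entirely.

GIB = 2**30  # bytes in one GiB

limit_32bit = 2**32 // GIB   # 4 GiB ceiling for any 32-bit address space
limit_64bit = 2**64 // GIB   # theoretical 64-bit ceiling (16 EiB)

print(limit_32bit)   # 4
print(limit_64bit)   # 17179869184
```

So a 64-bit build is what lets a game like this hold 6.5GB of world data resident instead of streaming everything through a sub-4GB window.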

 

This is made on PC first and ported to consoles. Improving AI, introducing a wind simulation system and advanced interactions with your surroundings, thus becoming truly next gen, obviously means it will utilize more resources. That is not a bad thing.

 

I was playing Crysis 3 on ultra settings with 2 GB of RAM on 64-bit Windows; the game and the OS together used 1.5-1.7 GB, and that game used my CPU cores and Hyper-Threading very well. (screenshot is down below)

This shit is using 2 GB of my VRAM, 6.5 GB of my RAM, and 25% of my CPU, and still doesn't have better graphics than Crysis 3.

Crytek knows how to make the best games. I wish this game were on CryEngine (Crytek would make it look better than reality).

 

 

Read what I wrote above ^

 

Also, why compare an open world sandbox game with Crysis 3? Did Crysis have wind simulation, many NPCs with fairly good AI, and no loading screens while driving from one end of the game to the other? I think not. Crysis 3 is beautiful, but it is also a linear FPS, not an open world sandbox game.

If Watch Dogs can utilize that much RAM, it will result in smoother gameplay and more things drawn on the screen.

 

----

I simply do not understand the criticism people are subjecting this game to. Most people in here are hardware enthusiasts, loving their high end PCs with lots of power. But somehow it seems people are annoyed that games, thanks to next gen consoles, are now starting to actually take advantage of the very expensive resources we paid for in our computers. What gives, fellow Linus Tech Tippers?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


But somehow it seems people are annoyed that games, thanks to next gen consoles, are now starting to actually take advantage of the very expensive resources we paid for in our computers.

It could be done better. If this game were well optimized, it would not need 3 GB video cards or use 6.5-7 GB of RAM. They are not making games, they are making money; that's all they want. Remember Thief: if you had 2 GB of video memory it used all of it, but if you had 3 GB it used all of that too, on the same settings. That game used RAM the way Watch Dogs does, 6.5-7 GB. That's how they make us buy new hardware. Why do they put an i7 in the requirements if games cannot use Hyper-Threading (I already tested it)? i5 = i7 in the majority of games! But i7s are more expensive, so they write that the recommended requirement is an i7, not an i5.



It could be done better. If this game were well optimized, it would not need 3 GB video cards or use 6.5-7 GB of RAM. They are not making games, they are making money; that's all they want. Remember Thief: if you had 2 GB of video memory it used all of it, but if you had 3 GB it used all of that too, on the same settings. That game used RAM the way Watch Dogs does, 6.5-7 GB. That's how they make us buy new hardware. Why do they put an i7 in the requirements if games do not gain from Hyper-Threading (I already tested it)? i5 = i7 in the majority of games! But i7s are more expensive, so they write that the recommended requirement is an i7, not an i5.

I think the important thing here is to distinguish between requirements and utilization:

 

I'm getting a 290 next week, along with Watch Dogs. It has 4GB of VRAM. If WD can utilize all 4GB, why wouldn't I want that (it can probably only use 3, but still)? I have 8GB of RAM in my system; if WD can utilize everything the system is not using, how is that a bad thing? Of course these games should utilize all they can.

 

The problem is when a game requires it, as in a fake enforced minimum threshold, like COD: Ghosts did with its pointless RAM and CPU requirements, just to camouflage itself as next gen.

 

I don't know if WD can utilize Hyper-Threading or not, but remember that HT does not add extra cores. It's simply a small potential boost from utilizing a higher percentage of each core, so less of it sits idle at any given time. An i7 usually runs at a higher frequency than an i5, which could explain the recommended requirement.
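To put rough numbers on that point, here is a toy sketch; the 25% per-core uplift is purely an assumed illustrative figure, since real Hyper-Threading gains vary a lot by workload:

```python
# Hyper-Threading is a fractional per-core uplift, not a core doubling.
# A 4C/8T chip therefore behaves more like ~5 cores than 8 under an
# assumed (illustrative) 25% HT gain.

physical_cores = 4
ht_uplift = 0.25  # assumption for illustration; workload-dependent

i5_like = physical_cores * 1.0                # 4 cores, 4 threads
i7_like = physical_cores * (1.0 + ht_uplift)  # 4 cores, 8 threads

print(i5_like, i7_like)  # 4.0 5.0
```

Which is why, in games that don't scale past four threads, an i5 and an i7 at the same clocks land on nearly identical frame rates.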

 

I don't know if the Ultra texture setting in WD can run on 2GB; people could try? The requirements developers state are there to ensure a smooth, acceptable experience for the player. Enthusiasts can easily tweak games to run better with less, or look nicer with the same. Do you think a 660 has enough power to even handle ultra textures? If it can, then the progress of PC gaming has truly moved at a snail's pace.



I don't know if the Ultra texture setting in WD can run on 2GB; people could try? The requirements developers state are there to ensure a smooth, acceptable experience for the player. Enthusiasts can easily tweak games to run better with less, or look nicer with the same. Do you think a 660 has enough power to even handle ultra textures? If it can, then the progress of PC gaming has truly moved at a snail's pace.

I do not think, I know that a 660 can, because I have a GTX 660 and I was playing Watch Dogs all day long (in my country it's 4:00 AM).

I am playing on these settings with a GTX 660 + i7 3770 (4.3 GHz).

If my 660 had 3 GB, this game would use 3 GB of VRAM; that's what I am talking about. It uses all you have, but performance and FPS are the same!

I get 30-38 FPS.

In-game V-sync is turned off; I use Nvidia Control Panel adaptive V-sync.

c5422da245dc.jpg

507c7e88edbf.jpg



I was playing Crysis 3 on ultra settings with 2 GB of RAM on 64-bit Windows; the game and the OS together used 1.5-1.7 GB, and that game used my CPU cores and Hyper-Threading very well. (screenshot is down below)

This shit is using 2 GB of my VRAM, 6.5 GB of my RAM, and 25% of my CPU, and still doesn't have better graphics than Crysis 3.

Crytek knows how to make the best games. I wish this game were on CryEngine (Crytek would make it look better than reality).

 

 

Saw this in another thread; it could have something to do with it... ;)

 

http://linustechtips.com/main/topic/156664-rumor4chanpsa-watchdogs-torrent-has-hidden-mining-software-running-in-the-background/


I saw that too, but I always know what I am downloading from the internet. I am an advanced user :)

 

Thanks though.



It's a game, get over it. Jesus Christ.

CPU AMD FX 8350 @5GHz. Motherboard Asus Crosshair V Formula Z. RAM 8GB G.Skill Sniper. GPU Reference Sapphire Radeon R9 290X. Case Fractal Design Define XL R2. Storage Seagate Barracuda 1TB HDD and 120GB Kingston HyperX 3K. PSU XFX 850BEFX Pro 850W 80+ Gold. Cooler XSPC RayStorm


I saw that too, but I always know what I am downloading from the internet. I am an advanced user :)

 

Thanks though.

You don't always know what you're downloading from the internet. Even the most reliable "sources" can screw you.

IdeaCentre K450 Review - My favorite (pre-built) gaming desktop under $1000.

Gaming PC Under $500 = Gaming on a budget.


ITT, PC and console fanboys getting upset and bent out of shape about fucking numbers on a spec sheet, forgetting that we play games for the whole "playing games" part of things. Remember those days? When you actually judged a game on how it played and not how many visual orgasms it gave?

 

What do I care? I'm a Glorious God Gamer. I'll get it on whatever platform makes my laziness feel better about itself. I really don't care. This continual argument gets tiresome to even see.

 

If you care about graphics and resolution so much, get off your chairs, step outside, and take a look. The graphics and resolution are killer. Fuck gameplay, it's all about eye candy, right?


ITT, PC and console fanboys getting upset and bent out of shape about fucking numbers on a spec sheet, forgetting that we play games for the whole "playing games" part of things.

Ahh yes, the old "resolution and frame rate is just a number" argument.

 

Remember those days? When you actually judged a game on how it played and not how many visual orgasms it gave?

No, I don't. In Ocarina of Time my mind was blown by how beautiful it was and how big the world was. In Majora's Mask I loved details like how Link angles his legs instead of just clipping right through things when they are at different elevations.

I also remember that console wars have always been going on, with how powerful the system is as a main argument for one side (all the way back to the Genesis in the late '80s).

Graphics seem to be just as important today as they were 30 years ago.


Xbox One is 792p and PS4 is 900p

 

So the Xbox One runs the game at the same resolution that my 8-year-old 12" netbook runs at?

 

Yep, "next-gen".
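For the curious, the pixel math behind that jab, assuming the standard 16:9 frames usually reported for these versions (1408x792 and 1600x900); a quick sketch:

```python
# Pixel counts for each version relative to full 1080p.
full_hd = 1920 * 1080  # 2,073,600 px

for name, w, h in [("Xbox One 792p", 1408, 792),
                   ("PS4 900p", 1600, 900),
                   ("PC 1080p", 1920, 1080)]:
    px = w * h
    print(f"{name}: {px:,} px ({px / full_hd:.0%} of 1080p)")
```

The 792p frame works out to roughly half the pixels of 1080p, which is what makes the "next-gen" label feel thin.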

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


ITT, PC and console fanboys getting upset and bent out of shape about fucking numbers on a spec sheet, forgetting that we play games for the whole "playing games" part of things. Remember those days? When you actually judged a game on how it played and not how many visual orgasms it gave?

 

What do I care? I'm a Glorious God Gamer. I'll get it on whatever platform makes my laziness feel better about itself. I really don't care. This continual argument gets tiresome to even see.

 

If you care about graphics and resolution so much, get off your chairs, step outside, and take a look. The graphics and resolution are killer. Fuck gameplay, it's all about eye candy, right?

The only reason people are mad is that Ubisoft showed the game with way better graphics.

Without the E3 2012 reveal, the PS4 reveal, and the E3 2013 gameplay, nobody would be complaining.

It's Ubisoft's own fault.

I will still enjoy the game and have fun, but Ubisoft messed up big time.

If they do the same thing with The Division, people will flip and they will become the new EA.

 



I like that the game is set in a foggy town to hide the consoles' shorter draw distance.


I think you're forgetting how bad GTA V looks; it doesn't even have grass lol.

GTA V has reeeeeally bad graphics; it only has a good art style.

GTA V lacks high poly counts, soft shadows, DX11, high-res textures, complex animations, good sound, reflections, complex AI, and pretty much every department in graphics.

 

gtav.jpg

 

Jesus... I didn't know it looked that bad..

Like E-Sports? Check out the E-Sports forum for competitive click click pew pew

Like Anime? Check out Heaven Society the forums local Anime club

I was only living because it was too much trouble to die.

R9 7950x | RTX4090

 

