
Watch Dogs Graphics On PS4, Xbox One Are Equal To PC's High Settings

JAKEBAB

It's not a port; PC is the main platform.

The PC version better look good or else people will be mad.

Also, I don't think YouTube does the game justice.

This is a PS4 screenshot and it looks really good:

-snip-

 

Yeah, it looks good... up close. Look at the flat, blank edges on the steps on the left-hand side, and where's the detail distance? I see lots of fog in this game... bit like Minecraft.

Falcon: Corsair 750D 8320at4.6ghz 1.3v | 4GB MSI Gaming R9-290 @1000/1250 | 2x8GB 2400mhz Kingston HyperX Beast | Asus ROG Crosshair V Formula | Antec H620 | Corsair RM750w | Crucial M500 240GB, Toshiba 2TB, DarkThemeMasterRace, my G3258 has an upgrade path, my fx8320 doesn't need one...total cost £840=cpu£105, board£65, ram£105, Cooler £20, GPU£200, PSU£88, SSD£75, HDD£57, case£125.

 CASE:-NZXT S340 Black, CPU:-FX8120 @4.2Ghz, COOLER:-CM Hyper 212 EVO, BOARD:-MSI 970 Gaming, RAM:-2x4gb 2400mhz Corsair Vengeance Pro, GPU: SLI EVGA GTX480's @700/1000, PSU:-Corsair CX600m, HDD:-WD green 160GB+2TB toshiba
CASE:-(probably) Cooltek U1, CPU:-G3258 @4.5ghx, COOLER:-stock(soon "MSI Dragon" AiO likely), BOARD:-MSI z87i ITX Gaming, RAM:-1x4gb 1333mhz Patriot, GPU: Asus DCU2 r9-270 OC@1000/1500mem, PSU:-Sweex 350w.., HDD:-WD Caviar Blue 640GB
CASE:-TBD, CPU:-Core2Quad QX9650 @4Ghz, COOLER:-OCZ 92mm tower thing, BOARD:-MSI p43-c51, RAM:-4x1GB 800mhz Corsair XMS2, GPU: Zotac GTX460se @800/1000, PSU:-OCZ600sxs, HDD:-WD green 160GB
BlueJean-A: CASE:-Black/Blue Sharkoon T9, CPU:-Phenom2 x4 B55 @3.6Ghz/1.4v, COOLER:-FX8320 Stock HSF, BOARD:-M5A78L-M/USB3, RAM:-4GB 1333mhz Kingston low profile at 1600mhz, GPU:-EVGA GTX285, PSU:-Antec TP550w modu, STORAGE:-240gb M500+2TB Toshiba
CASE:-icute zl02-3g-bb, CPU:-Phenom2 X6 1055t @3.5Ghz, COOLER:-Stock, BOARD:-Asrock m3a UCC, RAM:2x2GB 1333mhz Zeppelin (thats yellow!), GPU: XFX 1GB HD6870xxx, PSU:-some 450 POS, HDD:-WD Scorpio blue 120GB
CASE:-Packard Bell iMedia X2424, Custom black/red Aerocool Xpredator fulltower, CPU's:-E5200, C2D [email protected] (so e8500), COOLER:-Scythe Big shuriken2 Rev B, BFG gtx260 sp216 OC, RAM:-tons..
Gigabyte GTX460, Gigabyte gt430,
GPU's:-GT210 1GB,  asus hd6670 1GB gddr5, XFX XXX 9600gt 512mb Alpha dog edition, few q6600's
PICTURES CASE:-CIT mars black+red, CPU:-Athlon K6 650mhz slot A, COOLER:-Stock, BOARD:-QDI Kinetiz 7a, RAM:-256+256+256MB 133mhz SDram, GPU:-inno3d geforce4 mx440 64mb, PSU:-E-Zcool 450w, STORAGE:-2x WD 40gb "black" drives,
CASE:-silver/red raidmax cobra, CPU:-Athlon64 4000+, COOLER:-BIG stock one, BOARD:-MSI something*, RAM:-(matched pair)2x1GB 400mhz ECC transcend, GPU:-ati 9800se@375core/325mem, PSU:-pfft, HDD:-2x maxtor 80gb,
PICTURES CASE:-silver/red raidmax cobra (another), CPU:-Pentium4 2.8ghz prescott, COOLER:-Artic Coolering Freezer4, BOARD:-DFI lanparty infinity 865 R2, RAM:-(matched pair)2x1GB 400mhz kingston, GPU:-ati 9550@375core/325mem, PSU:-pfft, HDD:-another 2x WD 80gb,
CASE:-ML110 G4, CPU:-xeon 4030, COOLER:-stock leaf blower, BOARD:-stock raid 771 board, RAM:-2x2GB 666mhz kingston ECC ddr2, GPU:-9400GT 1GB, PSU:-stock delta, RAID:-JMicron JMB363 card+onboard raid controller, HDD:-320gb hitachi OS, 2xMaxtor 160gb raid1, 500gb samsungSP, 160gb WD, LAPTOP:-Dell n5030, CPU:-replaced s*** cel900 with awesome C2D E8100, RAM:-2x2GB 1333mhz ddr3, HDD:-320gb, PHONE's:-LG optimus 3D (p920) on 2.3.5@300-600mhz de-clock (batteryFTW)

But we can run 4K ultra :P See, consoles in 7 years' time will only just manage 1440p :D

My Setup :P


Skylake: I7-6700|MSI B150 GAMING M3|16GB GSKILL RIPJAWS V|R9 280X (WILL BE 1070)|CRUCIAL MX300 + WD BLACK 1TB

 

 


Ok, 900p on PS4 and 792p on Xbox One (upscaled, aka blurred, to 1080p) on "high" settings at a jittery 30fps is NOT the same as true 1080p "high" settings at a buttery-smooth 60fps+ on PC.

 

I'm normally not one to nit-pick, but it won't look the same, nor will it 'feel' the same. Especially if you're stuck using a game pad. Mouse + KB FTW! :P

My Systems:

Main - Work + Gaming:


Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:


FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:


SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:


MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:


Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


Guys, here is PC gameplay at Ultra settings (don't forget YouTube has strong compression):

Also, Ultra textures need 3GB of VRAM.

 

This is quite the dog and pony show: on the one hand, the lighting effects and such are really well done, so on a dark night it looks nice. And some of the textures do have at least a decent degree of fidelity, like the asphalt on the streets having that linear pattern when the headlights are on it.

But then when he's hiding by the hotel you see the wall and plant textures that look like fucking shit, because it happens to be a much better-lit area. You can also find examples of this on the NPCs: in very typical sandbox fashion, the player character is very detailed but the NPCs use a dramatically smaller polygon count.

-------

Current Rig

-------


Well, bit the bullet and got it for £24 + day one DLC lol :D

 

Hope it's decent :c

 Motherboard  ROG Strix B350-F Gaming | CPU Ryzen 5 1600 | GPU Sapphire Radeon RX 480 Nitro+ OC  | RAM Corsair Vengeance DDR4 3000MHz 2x8Gb | OS Drive  Crucial MX300 525Gb M.2 | WiFi Card  ASUS PCE-AC68 | Case Switch 810 Gunmetal Grey SE | Storage WD 1.5tb, SanDisk Ultra 3D 500Gb, Samsung 840 EVO 120Gb | NAS Solution Synology 413j 8TB (6TB with 2TB redundancy using Synology Hybrid RAID) | Keyboard SteelSeries APEX | Mouse Razer Naga MMO Edition Green | Fan Controller Sentry LXE | Screens Sony 43" TV | Sound Logitech 5.1 X530


So an HD 7850 can run Watch Dogs at 900p on high settings? Not too bad, but I had a stuttery experience at 30 fps.


Maybe it'll just be really easy to run.

Intel 4670K /w TT water 2.0 performer, GTX 1070FE, Gigabyte Z87X-DH3, Corsair HX750, 16GB Mushkin 1333mhz, Fractal R4 Windowed, Varmilo mint TKL, Logitech m310, HP Pavilion 23bw, Logitech 2.1 Speakers


Well, I won't use anti-aliasing, HBAO+ or motion blur, so who gives a fk.


Well, bit the bullet and got it for £24 + day one DLC lol :D

 

Hope it's decent :c

wtf how did u get it so cheap??

 

Here it is :(

-snip-


wtf how did u get it so cheap??

 

Here it is :(

-snip-

Lol, that sucks. Well, I don't know the ToS of the forum, so PM me if you want to know the site I got it at.

 Motherboard  ROG Strix B350-F Gaming | CPU Ryzen 5 1600 | GPU Sapphire Radeon RX 480 Nitro+ OC  | RAM Corsair Vengeance DDR4 3000MHz 2x8Gb | OS Drive  Crucial MX300 525Gb M.2 | WiFi Card  ASUS PCE-AC68 | Case Switch 810 Gunmetal Grey SE | Storage WD 1.5tb, SanDisk Ultra 3D 500Gb, Samsung 840 EVO 120Gb | NAS Solution Synology 413j 8TB (6TB with 2TB redundancy using Synology Hybrid RAID) | Keyboard SteelSeries APEX | Mouse Razer Naga MMO Edition Green | Fan Controller Sentry LXE | Screens Sony 43" TV | Sound Logitech 5.1 X530


This is quite the dog and pony show: on the one hand, the lighting effects and such are really well done, so on a dark night it looks nice. And some of the textures do have at least a decent degree of fidelity, like the asphalt on the streets having that linear pattern when the headlights are on it.

But then when he's hiding by the hotel you see the wall and plant textures that look like fucking shit, because it happens to be a much better-lit area. You can also find examples of this on the NPCs: in very typical sandbox fashion, the player character is very detailed but the NPCs use a dramatically smaller polygon count.

They can't do it better.

Those textures already use over 2GB of VRAM.

Imagine everything full of 2K textures like in Crysis 3; then you would need 4-6GB of VRAM.

And if you have 10-20 NPCs on screen, it would destroy your GPU if they're too high-poly.

That's why in Battlefield 4 the soldiers in multiplayer have way less detail than the soldiers in the campaign.

 

RTX2070OC 


They can't do it better.

Those textures already use over 2GB of VRAM.

Imagine everything full of 2K textures like in Crysis 3; then you would need 4-6GB of VRAM.

And if you have 10-20 NPCs on screen, it would destroy your GPU if they're too high-poly.

That's why in Battlefield 4 the soldiers in multiplayer have way less detail than the soldiers in the campaign.

 

This is why it pisses me off when people say "oh, who needs a 6GB VRAM GPU" -_-


Guys, here is PC gameplay at Ultra settings (don't forget YouTube has strong compression):

Also, Ultra textures need 3GB of VRAM.

I'm not excited by what I see, compared to the hardware it needs.

A modded Skyrim with 1GB of VRAM and 4 cores looks the same (or at least that's my opinion).


They can't do it better.

Those textures already use over 2GB of VRAM.

Imagine everything full of 2K textures like in Crysis 3; then you would need 4-6GB of VRAM.

And if you have 10-20 NPCs on screen, it would destroy your GPU if they're too high-poly.

That's why in Battlefield 4 the soldiers in multiplayer have way less detail than the soldiers in the campaign.

 

Umm, yes they can: we have cards with 6GB of VRAM and people with SLI and CrossFire setups. The only reason they "can't" is because they fucking lied: the main version was never the PC version, it was the console versions, and those puny pieces of shit of course can't handle better textures and such. Again, this qualifies the game as a port.

If there's enough interest in the game, I'm sure the PC gaming community will produce proper textures with mods, though that remains to be seen, and the game looks dull as shit from what I was able to see on Twitch. Though I might be completely wrong on that.

-------

Current Rig

-------


This is what Sleeping Dogs looked like in 2012:

-snip-

So yeah...

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


Resolution is just a number. FPS is just a number. Graphics is just a word. 

 

But those are just sentences. Console people don't do sentences.


I don't think the "recommended system requirements" for the PC would reflect what we see on the consoles:

 

Recommended Requirements

CPU: Core i7-3770 4-core 3.4GHz / FX-8350 (GD CPU hardware score: 10)

GPU: GeForce GTX 560 Ti / Radeon HD 7850 (GD GPU hardware score: 9)

RAM: 8 GB (GD RAM hardware score: 9)

OS: Win 7 64-bit

DirectX: DX 11

Storage: 20 GB

 

Those specs are at least an order of magnitude above the PS4. Meaning that yes, even with the "overhead" on the PC, it should still be able to handle something that looks a hell of a lot fucking better than the PS4 version. So if you look at these numbers it just doesn't add up: either the game is gimped on the PC and will look like the consoles, making these requirements bullshit, or that statement was just outright fucking lies to appease console owners.

 

I can speak at least for RAM and CPU, and the potential justification for the recommended settings (and why the PS4 could muster that version).

 

First let me take the RAM. Speaking as a RAM-heavy user, I know the limitations of RAM on a PC versus counting RAM on a console. On the console you have a section of RAM which is guaranteed; on the PC that is not the case. My own system is a good example. It has 8GB of RAM, and I have written programs that needed 2GB of RAM (not in total, but one contiguous memory chunk of 2GB *to the programmers out there: I tried a solution that required less RAM, but it was too slow*). So, a 2GB chunk, and the program worked perfectly. Then, after using the computer for a while with a few browsers open, I could no longer get that 2GB chunk. The point is, 8GB of system RAM can be effectively "smaller", or less usable, than console RAM. After all, console developers know roughly how much they can get away with.
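Something like this sketch shows the idea (hypothetical sizes and fallback, not my actual program): on a PC the request for one contiguous 2GB block can simply fail depending on what else is running, whereas a console title gets its budget guaranteed.

```cpp
#include <cstddef>
#include <iostream>
#include <new>

int main() {
    // Ask for one contiguous 2 GB chunk, the way a game might for a large
    // asset pool or working buffer. On a console this budget is guaranteed;
    // on a PC it depends on what else is using memory (browsers, background
    // apps), and on a 32-bit build also on address-space fragmentation.
    const std::size_t chunkBytes = std::size_t{2} * 1024 * 1024 * 1024;

    // nothrow new returns nullptr instead of throwing, which is how the
    // program notices that the "guaranteed" budget isn't actually there.
    char* chunk = new (std::nothrow) char[chunkBytes];
    if (chunk == nullptr) {
        std::cerr << "Couldn't get a contiguous 2 GB block -- fall back to "
                     "smaller (and in my case much slower) chunks.\n";
        return 1;
    }

    std::cout << "Got the 2 GB block.\n";
    delete[] chunk;
    return 0;
}
```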

 

Another point on the RAM, which I am less confident talking about but will give a shot anyway: the consoles do have unified RAM (or equivalent), which can make a great difference in RAM usage. All the models in the game don't have to be duplicated into VRAM and constantly swapped out. You can literally load up all the models and textures you need and run the collision physics on them, without having to copy them across.
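To make that concrete, here's a purely illustrative model (no real graphics API, made-up sizes): with split memory the mesh effectively exists twice, once for the CPU and once for the GPU, while a unified pool only needs the one copy.

```cpp
#include <cstring>
#include <vector>

// Illustrative model only -- no real graphics API calls here.
struct Mesh {
    std::vector<float> vertices;  // CPU-side copy, e.g. for collision physics
};

// Stand-in for "upload to VRAM" on a PC with a discrete GPU: a second copy
// of the same data, so the mesh counts against both system RAM and VRAM.
std::vector<float> uploadToVram(const Mesh& m) {
    std::vector<float> vramCopy(m.vertices.size());
    std::memcpy(vramCopy.data(), m.vertices.data(),
                m.vertices.size() * sizeof(float));
    return vramCopy;
}

int main() {
    Mesh city{std::vector<float>(50000000)};  // ~200 MB of vertex data
    auto vramCopy = uploadToVram(city);       // another ~200 MB on a split-memory PC
    // On a unified-memory console the GPU could read the original allocation
    // directly, so this duplicate (and the constant swapping) goes away.
    (void)vramCopy;
    return 0;
}
```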

 

 

So the next thing is the CPU. This is more speculation, as I don't know how they wrote the game, but I do know about multi-core processing. The biggest thing I would note is that programs designed to use 8 cores cannot always be brought down to 4 stronger cores. For one, if you have 8 threads running on 4 cores there will be overhead in managing the threads so that no one thread starves for too long. 8 threads running on a guaranteed 8 cores requires less overhead, as you know they won't starve and will run at the given speeds. This might be a dumb analogy, but I am going to say it anyway. I can juggle with one hand, and it doesn't matter which hand I use. Say I had to juggle 100 times with 2 separate sets of balls and I need to make sure each set gets juggled equally (I can't just juggle one set 100 times and then the other 100 times; I need to go a, a, b, b, a, b, ...etc). It would take a while switching the sets out each time, and I can't juggle both sets at once as I can only use one hand at a time. If I brought my brother in, though, we could do the task in a lot less time, since I could juggle one set and he juggles the other. Bad analogy aside, the fact is that if something was specifically designed to exploit 8 cores, you will need those 8 cores, or sacrifice things when you have fewer cores. (I think it was in one of Linus's test benches that when you went from a quad core down to 2 cores you lost an AI or two, and 8 down to 4 is a more extreme case.)
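As a rough sketch of the scheduling point (generic busy work, nothing from any actual engine): spawn 8 workers and compare that with how many hardware threads the machine actually offers; once workers outnumber cores, the OS has to time-slice them, which is exactly the juggling overhead described above.

```cpp
#include <atomic>
#include <iostream>
#include <thread>
#include <vector>

int main() {
    const unsigned workers = 8;  // workload designed around 8 console cores
    const unsigned hw = std::thread::hardware_concurrency();  // what this PC offers

    std::cout << "Running " << workers << " workers on "
              << hw << " hardware threads\n";

    std::atomic<long long> total{0};
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i) {
        pool.emplace_back([&total] {
            long long local = 0;
            for (long n = 0; n < 50000000; ++n)  // busy work standing in for AI/physics
                local += n % 7;
            total += local;
        });
    }
    for (auto& t : pool) t.join();

    // With workers > hardware threads, the same total work gets time-sliced by
    // the scheduler instead of running fully in parallel, so per-thread
    // progress is slower and less predictable than on a guaranteed 8 cores.
    std::cout << "done, checksum = " << total << "\n";
    return 0;
}
```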

 

This next point might be completely wrong, but it might be correct (and it is unlikely anyone on this forum would know the answer). Both consoles, upon launch, announced they had modified Jaguar units (and by the looks of the articles the PS4 might be using 2 quad-core modules to get 8 cores). The issue I see is that they don't mention how they modified the CPUs. Not all CPUs are equal: one CPU might be capable of 10 floating-point add/subtract/divide/multiply operations but only 5 integer adds per cycle, versus another that does 5 FP adds and 10 integer adds each cycle. My point is, with the consoles, their CPU might have been tweaked to handle more game-typical processing. Or they could have added extra instruction sets which greatly improve performance (yes it is x86, but being x86-compliant just means there is a minimum instruction set they have to support). Another option is that they could be leveraging the GPU to do extra processing (yes, you have things like OpenCL on PC, but the PS4's implementation at least is a lot more seamless).

 

*As an interesting note, 10*0.5 is faster than 10/2... in some cases it can be 30% faster depending on the language.* So console makers like Sony might figure out which operations the CPU performs most often while gaming and optimize those, while sacrificing less common ones.
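For anyone curious, here is a tiny, unscientific benchmark of that multiply-versus-divide remark (the timings are only illustrative, and for a constant power of two most compilers will do this rewrite for you anyway):

```cpp
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> data(10000000, 3.0);
    using clock = std::chrono::steady_clock;

    volatile double sink = 0.0;  // keeps the loops from being optimized away

    auto t0 = clock::now();
    for (double x : data) sink = sink + x / 2.0;  // divide by a constant
    auto t1 = clock::now();
    for (double x : data) sink = sink + x * 0.5;  // multiply by the reciprocal
    auto t2 = clock::now();

    auto us = [](clock::time_point a, clock::time_point b) {
        return std::chrono::duration_cast<std::chrono::microseconds>(b - a).count();
    };
    std::printf("divide:   %lld us\n", static_cast<long long>(us(t0, t1)));
    std::printf("multiply: %lld us\n", static_cast<long long>(us(t1, t2)));
    return 0;
}
```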

 

Anyway, I am not sure how much of this applies to Watch Dogs specifically, but sometimes I feel that PC people overlook the fact that some major optimizations can come from slight tweaks made to the hardware. Saying it is based on a certain CPU/GPU model can be a good baseline, but it doesn't mean the numbers can't be completely off from real-world performance.

 

Sorry for writing so much... I started writing and it got away from me.

0b10111010 10101101 11110000 00001101


So in other words: despite claims of the PC being the "main" version, the game is intentionally gimped to accommodate consoles, which makes this a second-class port at best. It certainly looks like that if you watch the video, so you might be on to something.

-------

Current Rig

-------


Are they using the same cover system as Splinter Cell: Blacklist?  Looks similar.

My PC specifications are in my profile.


I can speak at least for RAM and CPU, and the potential justification for the recommended settings (and why the PS4 could muster that version).

-snip-

 

1) The SDKs available to devs on the next-gen consoles can only use 6 cores, so that 8-core thing is nonsense. Dualshockers ran a tech article on Infamous Second Son that shows 6 cores being used. 2 cores are forever locked to the OS and to their dumb ad venture *cough* online service.

2) They are garbage AMD mobile chips with gimped cache, clocked at 1.6 and 1.7GHz. These CPUs suck.

3) We already have people posting screenshots of over 60 FPS on i5s, and I read on skidrow's Facebook that it plays great on an i3 (so that statement by Ubisoft was BS as well), which shows that the 4770K recommendation was BS, just like it was in Thief, which recommended an i7 and played better on an Ivy Bridge i3 than on an 8350, or Wolfenstein, which claimed an i7 and then ran just as well on an i5 while the FX-6300 was well below it.

Textures were wrecked in this game because the consoles have garbage GPUs. It is really that simple. For the same reason vanilla Skyrim looked like @$@^.

"Their CPU might have been tweaked to handle more game-typical processing." Uh huh. Which is why Titanfall looked like this on XB1 (see below), and why ESO is struggling to even release on consoles, and why Planetside 2 has still yet to be seen on the PS4 even though it was optimized for half a year on PC with the OMFG patch to use more cores/threads (and is still slower on an 8350 than on an i5). Much definitive gaming CPU on the Xbox One. Look at those leet FPS lows in Titanfall. I think a SFF Dell for 100 bucks off Craigslist and a 750 Ti would blow the Xbox One away.

 

-snip-

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Textures were wrecked in this game because the consoles have garbage GPUs. It is really that simple. For the same reason vanilla Skyrim looked like @$@^.

 

The textures have nothing to do with the consoles, but with the low amount of VRAM most GPUs have.

The Ultra textures use over 2GB of VRAM, which is already more than most PC gamers have.

 

RTX2070OC 


The textures have nothing to do with the consoles, but with the low amount of VRAM most GPUs have.

The Ultra textures use over 2GB of VRAM, which is already more than most PC gamers have.

 

This is the game on max settings. AA and filtering can make it look OK, but you can do that with any crap-texture game. The PC version looks like the downgrade (from the earlier PC footage) with better AA/AF. I see nothing close to the pre-release footage here. Ubisoft saying a GTX 670 maxes the game says it all.

 

http://www.youtube.com/watch?v=AF-mAwbMAmg

 

Add to this that the game is gigantic in scope and is only 14 gigs? And that includes tons and tons of audio? The textures in this game were downgraded because of the crappy consoles. Period. This game is the same size as Mass Effect 3.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Not when I can run the game at 4K on Ultra settings and inject 32x AA and anisotropic filtering. Also mods. I hope the game works with mods.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


1) The SDKs available to devs on the next-gen consoles can only use 6 cores, so that 8-core thing is nonsense.

-snip-

 

I am going to talk about the PS4 vs the Xbox One here, because I will admit the One is garbage (the Kinect apparently is why they sacrificed on the other components).

1) Okay, so 6 cores... I will use the same justification as for the RAM. You can still design for 6 full cores and have a problem trying to run it on 4 cores, and at the same time you can argue the OS and such on Windows will be consuming part of a core (or the user could be running things like music in the background and consuming cores).

2) From the specs I have read, the PS4 has 32KB of L1 cache per core, 2MB of L2 cache per module (so 4MB total), and for L3... I can't find an answer. Compare this to a Haswell: L1 64KB, L2 256KB per core (8 cores = 2MB total), L3 2-8MB. So I ask, how is the PS4's cache gimped? Also, clock speed is only a factor if you know the instructions per cycle (or, more importantly for games, the floating-point operations per cycle). I would much rather have a system that handles 10 floats every cycle at 1GHz over one that does 4 floats every cycle at 2GHz (10 × 1GHz = 10 billion float ops per second versus 4 × 2GHz = 8 billion).

3) Like I was trying to express in my first post, perhaps unsuccessfully, companies need to overprovision those specs because they have no clue what a user is doing with their computer. So it probably could run on 8GB and a standard CPU... but if you get someone like me, who has a few programs doing processing work in the background as well, then it won't actually run. (In my case I would most likely hit the RAM limit.)

Just because it runs smoothly on an i3 doesn't mean you aren't sacrificing things. I can't remember which video, but in one Linus video I believe Slick commented that fewer AIs would appear when you have fewer cores... so who is to say they aren't removing features (it just isn't obvious unless you compare against a machine running the recommended specs)? Or perhaps they aren't running it in an intense area. When there is a lot going on, maybe those i3s won't run it as smoothly as others have said.

To address your textures comment: that has nothing to do with the consoles at all... textures are by far one of the easiest things to swap out when porting a game, so don't blame the consoles for that... perhaps blame the people who still complain about 20GB file sizes.

To address the comment about CPU modifications: not only did I say might, but pointing out examples of games that don't benefit doesn't mean such tweaks don't exist. At the beginning of the PS3's life cycle many people didn't use the full potential of the Cell (yes, for different reasons), but eventually people figured out how to use its power. If extra instruction sets have been added to the CPUs, it could take a while before companies start using them (as they are still just getting used to the PS4's SDK). Also, games that were developed with only a few cores in mind are even harder to convert to 6 cores (especially if those calculations need to end up with exactly the same result). And I would like to point out that many studios didn't get a PS4 development kit until right before the launch (they used a PC implementation of the SDK, which can be good for porting a game, but can't be used for optimization).

I am not saying that a PS4 can beat a high-end system, but I am saying I don't think it is holding back games. I also contend that it is raising the standard, as many people's PCs can't even handle PS4-level graphics: http://store.steampowered.com/hwsurvey/

0b10111010 10101101 11110000 00001101


I am going to talk about the PS4 vs the Xbox One here, because I will admit the One is garbage (the Kinect apparently is why they sacrificed on the other components).

-snip-

 

Oh please. The console CPU is like an FX-6300 clocked at 1.6GHz in games, and it can't even match that. These consoles are garbage. Their CPUs are garbage, and we can see that from their framerates in games; their GPUs are garbage, and the RAM used as VRAM in the Xbox One is garbage squared.

The high RAM requirement is due to laziness by the publisher on the port. The newer DirectX does that memory tiling crap which makes RAM a non-issue, but they aren't going to make the game DirectX 11.2 since only a minority of potential buyers use Win 8.

The only reason these pathetic consoles can even run this game is a low-level API. An i3 kicks butt in BF4 on Mantle with an R9 290, which is like 3-4 times the GPU power of these consoles. Stop selling these consoles as awesome tech. They are like 2008 mid-range PCs with a low-level API. With a low-level API, an i3 with an R9 270 would kill these things. So would an FX-6300. On an i5? It doesn't matter; it is so far ahead it didn't need the low-level API, but it is getting it anyway. A Steam Box with an i3, DX12 and an 850 Ti (should be out by then) will make these consoles look like a joke in value and performance.

 

http://pclab.pl/art55953-3.html

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.

