
Watch Dogs Graphics On PS4, Xbox One Are Equal To PC's High Settings

JAKEBAB

Oh please. The console CPU is like an FX-6300 clocked at 1.6 GHz, and it can't even match that. These consoles are garbage: their CPUs are garbage, as the framerates in their games show; their GPUs are garbage; and the RAM used as VRAM in the Xbox One is garbage squared.

 

The high RAM requirement is due to publisher laziness on the port. The newer DirectX does that memory-tiling trick, which makes RAM a non-issue, but they aren't going to make the game DirectX 11.2 when the vast minority of potential buyers is on Windows 8.
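For a rough sense of what that tiling buys you, here is some back-of-the-envelope math with made-up but plausible numbers (a 16K virtual texture, 64 KB tiles; nothing here is from Watch Dogs itself):

# Back-of-the-envelope math on D3D11.2 tiled resources (illustrative only).
# BC3/DXT5 compression stores 4x4 texel blocks in 16 bytes = 1 byte/texel.
tex_w = tex_h = 16384                             # one 16K x 16K texture
full_mb = tex_w * tex_h / 2**20                   # 256 MB if fully resident
tile_bytes = 64 * 1024                            # tiles are 64 KB each
total_tiles = tex_w * tex_h // tile_bytes         # 4096 tiles in total
visible_tiles = 300                               # suppose ~300 are on screen
resident_mb = visible_tiles * tile_bytes / 2**20  # ~18.8 MB actually committed
print(full_mb, resident_mb)                       # 256.0 vs ~18.8

Only the committed tiles need physical memory, which is why tiled resources can cut the RAM footprint so sharply.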

 

The only reason these pathetic consoles can even run this game is a low-level API. An i3 kicks butt in BF4 on Mantle with an R9 290, which is like 3-4 times the GPU power of these consoles. Stop selling these consoles as awesome tech; they are like 2008 mid-range PCs with a low-level API. On a low-level API, an i3 with an R9 270 will kill these things. So would an FX-6300. On an i5? It doesn't matter; it is so far ahead it never needed the low-level API, but it is getting it anyway. A Steam Box with an i3, DX12, and an 850 Ti (should be out by then) will make these consoles look like a joke in value and performance.

 

http://pclab.pl/art55953-3.html

Again, I know the One is horrible (even with optimizations, its RAM implementation alone is such a large bottleneck that tricks at the CPU/GPU level wouldn't matter).

 

I really hate it when people call developers lazy; it is all about cost versus reward. Why should developers spend extra money to appease the vast minority? As I have also said, there can actually be a good reason for higher RAM requirements: it isn't necessarily because of porting. I would argue it has more to do with provisioning extra headroom and then using that headroom for more objects on screen.

 

And I am not selling consoles as awesome tech; I actually clearly stated that I believe consoles just set a new "low" standard, one that is higher than the average machine out there (if you read the Steam hardware survey, there are lots of people playing on inadequate machines, and I could just as easily blame them). I just want people to stop assuming consoles are horrible and blindly blaming any gaming problem on them (and then complaining that developers aren't pumping money into the upper echelon of computer enthusiasts, who don't come close to making up the dollars to justify it). The fact is, you yourself have already attributed the horrible textures and high RAM usage to the consoles, when that is not likely the real reason. I am merely defending the consoles from the fallacies surrounding them.

 

Oh, by the way, lower framerates are usually caused by the GPU, not the CPU, so stop blaming the "lower powered" CPUs for that. My mention of the CPU has nothing to do with resolution or framerate. My argument about the CPU is that sometimes eight weaker cores can outperform four very strong ones. AI and physics are a very good example: depending on the complexity, many calculations need to be done for each AI, and while a strong CPU might be able to muscle through them, you lose the dynamism of the AI because you are doing the calculations sequentially instead of in parallel. But notice how I never once argued about the GPU?
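To make that cores point concrete, here is a toy sketch using Python's multiprocessing as a stand-in for a real engine's job system (the per-agent workload is entirely made up):

# Toy illustration: updating many AI agents in parallel vs. sequentially.
# Assumes a trivial stand-in for per-agent work; real engines use job systems.
from multiprocessing import Pool

def update_agent(state):
    # stand-in for pathfinding/steering/decision logic
    return sum(i * i for i in range(state % 1000 + 1000))

if __name__ == "__main__":
    agents = list(range(10_000))
    # sequential: one strong core muscles through every agent in turn
    seq = [update_agent(a) for a in agents]
    # parallel: eight weaker cores each take a slice of the agents
    with Pool(processes=8) as pool:
        par = pool.map(update_agent, agents)
    assert seq == par  # same results; the win is wall-clock time per frame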

 

I know consoles have their pitfalls, but at least the PS4 is not as pathetic as you make it out to be. I would say that if the PS4/One had not been released, you might not see games like Watch Dogs, or if you did, they would cater to a gaming experience even worse than it is now. Look at the Steam hardware numbers: people are gaming on less-than-adequate systems.

0b10111010 10101101 11110000 00001101


The textures have nothing to do with the consoles, but with the low amount of VRAM most GPUs have.

The ultra-setting textures use over 2 GB of VRAM, which is already more than most PC gamers have.
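That 2 GB figure is easy to sanity-check with rough numbers, assuming 4K x 4K BC3-compressed textures with full mip chains (illustrative values, not the game's actual budget):

# Rough VRAM cost of high-res textures (illustrative numbers only).
# BC3/DXT5 is 1 byte per texel; a full mip chain adds ~1/3 on top.
base = 4096 * 4096 * 1            # one 4K x 4K BC3 texture: 16 MiB
with_mips = base * 4 / 3          # ~21.3 MiB with mipmaps
count = 100                       # say ~100 unique textures resident at once
total_gb = count * with_mips / 2**30
print(round(total_gb, 2))         # ~2.08 GiB -- past a 2 GB card already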

 

 

NO! I already replied to you: these cards exist. The Ultra setting should be for GPUs truly worthy of the fucking word "ultra", not for a 760 or other mid-range cards; that's what the "mid" and "high" settings are for. There is no excuse for shipping these crap textures in a PC game other than not giving a single fuck about PC gamers and not spending one or two more weeks swapping in the uncompressed textures they already have as development assets.

-------

Current Rig

-------


Again, I know the One is horrible (even with optimizations, its RAM implementation alone is such a large bottleneck that tricks at the CPU/GPU level wouldn't matter).

 

I really hate it when people call developers lazy; it is all about cost versus reward. Why should developers spend extra money to appease the vast minority? As I have also said, there can actually be a good reason for higher RAM requirements: it isn't necessarily because of porting. I would argue it has more to do with provisioning extra headroom and then using that headroom for more objects on screen.

 

And I am not selling consoles as awesome tech; I actually clearly stated that I believe consoles just set a new "low" standard, one that is higher than the average machine out there (if you read the Steam hardware survey, there are lots of people playing on inadequate machines, and I could just as easily blame them). I just want people to stop assuming consoles are horrible and blindly blaming any gaming problem on them (and then complaining that developers aren't pumping money into the upper echelon of computer enthusiasts, who don't come close to making up the dollars to justify it). The fact is, you yourself have already attributed the horrible textures and high RAM usage to the consoles, when that is not likely the real reason. I am merely defending the consoles from the fallacies surrounding them.

 

Oh, by the way, lower framerates are usually caused by the GPU, not the CPU, so stop blaming the "lower powered" CPUs for that. My mention of the CPU has nothing to do with resolution or framerate. My argument about the CPU is that sometimes eight weaker cores can outperform four very strong ones. AI and physics are a very good example: depending on the complexity, many calculations need to be done for each AI, and while a strong CPU might be able to muscle through them, you lose the dynamism of the AI because you are doing the calculations sequentially instead of in parallel. But notice how I never once argued about the GPU?

 

I know consoles have their pitfalls, but at least the PS4 is not as pathetic as you make it out to be. I would say that if the PS4/One had not been released, you might not see games like Watch Dogs, or if you did, they would cater to a gaming experience even worse than it is now. Look at the Steam hardware numbers: people are gaming on less-than-adequate systems.

 

Not getting the game to work with less RAM was lazy. BF4 with 64 players is way harder on a system, and they were just lazy with the RAM transition. Watch Dogs appears to run completely fine for the people playing it at the moment on an i3 or an FX-6300, people are getting over 60 FPS on Nehalem i5s, the recommendations were absolute BS meant to hype up the power of these consoles, and yes, the PS4 is as pathetic as I make it out to be.

 

Big deal if the PS4 has GDDR5 RAM. Overclocked system RAM makes little difference, and the CPU is the same as the XB1's except clocked 100 MHz lower. The PS4 is a 7790 in TFLOP terms at 1.84; the XB1 is a 7770 at 1.31. An R9 270 at stock is about 2.4, so these things are junk. The GDDR5 helps a lot as VRAM, but the GPU sucks, and all PC GPUs have GDDR5 VRAM anyway; the XB1's GPU sucks so much it doesn't even need GDDR5 VRAM. The console cards couldn't run gigantic textures anyway, or triple-screen, or 1440p, or 4K, so it makes no difference: you can pair 20 gigs of VRAM with those things and they are still doing 1.84 and 1.31 TFLOPS. The Final Fantasy MMO runs at low FPS, Tomb Raider has severe dips, they can't even get PlanetSide 2 to run on the thing (and when they do it will be gutted, with culling of visible players), and Titanfall runs like garbage with FPS drops.
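Those TFLOP figures fall straight out of shader count x 2 ops per clock x clock speed; a quick check using the commonly published specs:

# Single-precision TFLOPS = shaders * 2 ops/clock (FMA) * clock (GHz) / 1000
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

print(tflops(1152, 0.800))   # PS4 GPU: ~1.84
print(tflops(768, 0.853))    # Xbox One GPU: ~1.31
print(tflops(1280, 0.925))   # R9 270 at stock boost: ~2.37
print(tflops(1280, 1.050))   # R9 270 at 270X clocks: ~2.69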

 

So the developers aren't only lazy, they are complete liars as far as these PC requirements go. Thief's devs lied about an i7 being needed; Wolfenstein's devs lied too. Ubisoft lied, just like they lied when they said you needed a quad-core to play AC4. The new COD devs didn't lie, exactly: they said i7, and then it turned out they meant an i7-930, a 2008-era CPU running under 3 GHz...

 

Also, Watch Dogs is on Xbox 360 and PS3, so there goes that spiel. If the consoles died tomorrow we would have just as many games, and the money spent on advertising, lies, and BS could go into game development instead of promoting the game, getting Snoop Dogg into COD, and paying off sites like IGN to hype the hell out of Titanfall and ESO before and after release. So yeah, consoles: thanks for ruining Watch Dogs and Elder Scrolls Online. Thanks also for the 50 versions of Watch Dogs at release. We appreciate it. Thanks also for the Witcher 3 delay so that the game will run on those bricks. What else should I thank consoles for? Oh yeah: MS holding back low-level APIs so they can sell their dumb consoles, and MS buying up PC gaming devs and keeping their games off PC.

 

What can I thank console gamers for? Ridiculous CE editions, day-one DLC, pay-to-play online, preordering just to get a complete game. These consoles aren't freaking Nintendo creating new genres. They just play PC games badly, hold back tech, and pay third-party devs to keep games off PC, then pass those bribes on to the customer as more expensive games and idiotic DLC that is getting out of hand.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


I really hate it when people call developers lazy; it is all about cost versus reward. Why should developers spend extra money to appease the vast minority?

 

1) PC gaming is not the vast minority anymore. In fact, it has been shown to be bigger in overall sales than consoles.

 

2) There's no need to spend "extra money", since you don't create an intentionally blocky, low-res texture and then work your way up; you do it the opposite way: you start with high-quality images, then compress and adjust as needed. That means they already have the assets for a much better version of the game, and including them would take almost no effort or money.
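As a minimal sketch of that workflow, assuming Pillow for the resampling and made-up file names (real studios use their own asset bakers):

# Toy asset pipeline: author at full resolution, derive lower tiers from it.
# Assumes Pillow (pip install pillow); file names are invented for the example.
from PIL import Image

TIERS = {"ultra": 4096, "high": 2048, "medium": 1024, "low": 512}

def bake(source_path):
    src = Image.open(source_path)          # the original high-res asset
    for name, size in TIERS.items():
        out = src.resize((size, size), Image.LANCZOS)
        out.save(f"{name}_{source_path}")  # each tier is just a downscale

bake("brick_wall.png")

The ultra tier is essentially the source asset; every other tier falls out of it, which is the point: the expensive work was already done.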

-------

Current Rig

-------


1) PC gaming is not the vast minority anymore. In fact, it has been shown to be bigger in overall sales than consoles.

 

2) There's no need to spend "extra money", since you don't create an intentionally blocky, low-res texture and then work your way up; you do it the opposite way: you start with high-quality images, then compress and adjust as needed. That means they already have the assets for a much better version of the game, and including them would take almost no effort or money.

 

Bingo. I bet EA/BioWare has the higher-res textures for the Mass Effect series just sitting on a computer somewhere from before they caught "consolitis". It is a prevalent disease found in many games today.

 

Does EA/BioWare give us those textures as an optional download like Crysis 3 did? Nope. Why? Because then people would see how bad their consoles really are. I bet they add some of those textures to a next-gen re-release of the Mass Effect series, though.

 

Ubisoft has the high-res Watch Dogs textures on its computers and could easily make them a digital download. This game looks nothing like its reveal, and everything out of Ubisoft's mouth lately has been 100 percent BS: the recommendations, trying to make gamers think it wouldn't run on an i3, initially saying you needed a Titan/780 Ti and a 4770K for Ultra (which IGN ran with and used to promote the expensive-computer myth).

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


1) PC gaming is not the vast minority anymore. In fact, it has been shown to be bigger in overall sales than consoles.

 

2) There's no need to spend "extra money", since you don't create an intentionally blocky, low-res texture and then work your way up; you do it the opposite way: you start with high-quality images, then compress and adjust as needed. That means they already have the assets for a much better version of the game, and including them would take almost no effort or money.

1) I was referring to deathjester's remark about developer laziness for not building the game on the newest DirectX, which is Windows 8 only, to reduce the RAM requirement. You can bet they aren't going to spend extra money implementing a new DirectX when only the small number of people on it would see the reduced system requirements.

 

2) I know how it works; that is why I said it is crazy to blame the texturing problems on the consoles (per deathjester's comments), because they could easily have included the better textures in the PC "port".

 

 

 

deathjester: The PS3 and Xbox 360 versions are running very, very scaled-down textures and game models, and will likely have fewer AI-controlled objects. So my case about RAM requirements and needing the cores could still be quite accurate. I am not saying the numbers aren't on the high side compared to what you most likely need, but I am saying it is wrong to attribute that to the PS4. As I have said, look at the Steam hardware numbers, and then tell me that it sucks compared to the average PC on there. The fact is you can't compare the PS4 to the upper echelon of computers; you need to compare it to where the majority sits, because developers will almost always target the market that gives them the most money. So I still stand by my argument that, like it or not, the PS4 set a new "low" standard for games. And like I said, they add in extra buffer room for the recommended settings so the vast majority who barely meet the specs won't complain that it performs horribly (because, again, people like me, who have 8 GB of RAM and a decent CPU, would otherwise not be able to play the game, given the other things constantly running in the background of my computer). As we speak, my 8 GB system has 3 GB of RAM left, which is not a lot. So my argument still stands: they need to add in those buffers.
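You can check that headroom claim on any machine; a one-line sketch, assuming psutil is installed (pip install psutil):

# How much RAM is actually free for a game once the OS and
# background apps have taken their share.
import psutil

mem = psutil.virtual_memory()
print(f"total: {mem.total / 2**30:.1f} GiB")
print(f"available for a game: {mem.available / 2**30:.1f} GiB")
# On the 8 GB system described above, ~5 GB is already in use,
# which is exactly why recommended specs include buffer room.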

0b10111010 10101101 11110000 00001101


All these images and videos have made me realize that most games have a handful of things that look like absolute crap.

Trees in the distance look like the paper decorations you would get in a fancy drink.

All buildings are apparently empty, and they all have the same color on the walls and ceilings.

Rocks look like they are made out of cardboard.

Some leaves look like they are painted onto the ground rather than lying on top of it.

I could probably find a few more things that look really cheaply made if I kept looking. We are a long, long way from a realistic-looking, dynamic world.


1) I was referring to deathjester's remark about developer laziness for not building the game on the newest DirectX, which is Windows 8 only, to reduce the RAM requirement. You can bet they aren't going to spend extra money implementing a new DirectX when only the small number of people on it would see the reduced system requirements.

 

2) I know how it works; that is why I said it is crazy to blame the texturing problems on the consoles (per deathjester's comments), because they could easily have included the better textures in the PC "port".

 

 

 

deathjester: The PS3 and Xbox 360 versions are running very, very scaled-down textures and game models, and will likely have fewer AI-controlled objects. So my case about RAM requirements and needing the cores could still be quite accurate. I am not saying the numbers aren't on the high side compared to what you most likely need, but I am saying it is wrong to attribute that to the PS4. As I have said, look at the Steam hardware numbers, and then tell me that it sucks compared to the average PC on there. The fact is you can't compare the PS4 to the upper echelon of computers; you need to compare it to where the majority sits, because developers will almost always target the market that gives them the most money. So I still stand by my argument that, like it or not, the PS4 set a new "low" standard for games. And like I said, they add in extra buffer room for the recommended settings so the vast majority who barely meet the specs won't complain that it performs horribly (because, again, people like me, who have 8 GB of RAM and a decent CPU, would otherwise not be able to play the game, given the other things constantly running in the background of my computer). As we speak, my 8 GB system has 3 GB of RAM left, which is not a lot. So my argument still stands: they need to add in those buffers.

 

I couldn't care less about Steam hardware numbers. This isn't IGN; go pander to consoles there. Everyone here knows what is in these consoles. How many users are on Steam, and how many PS4s and Xbox Ones are actually in people's homes rather than merely shipped? A lot of people play dumb little non-graphics-intensive games on a potato laptop and buy their games off Steam. That doesn't mean you can compare a gaming PC with a console.

 

An i3 or FX-6300 with an R9 270 will run this game better than a console even without a low-level API. So would a 2008 i7-920. The R9 270 puts out nearly twice the graphics power of an Xbox One, and overclocked to 270X speeds (which they all reach) it would put out nearly a TFLOP more than the PS4.

 

This is what I think of the consoles: they can't even run 1080p in a downgraded game with neutered textures. I can only hope The Witcher 3 doesn't get neutered and the consoles simply run the game at 720p and 800p. That is what should have happened with Watch Dogs.

 

https://www.youtube.com/watch?v=ZS1QOBRH4Ac

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Umm, yes they can: we have cards with 6 GB of VRAM and people with SLI and CrossFire setups. The only reason they "can't" is because they fucking lied: the lead version was never the PC version, it was the console versions, and those puny pieces of shit of course can't handle better textures and such. Again, this qualifies the game as a port.

 

If there's enough interest in the game, I'm sure the PC gaming community will produce proper textures through mods, though that remains to be seen; the game looks dull as shit from what I was able to see on Twitch. Though I might be completely wrong about that.

You cannot "port" a game that runs on the same exact platform in my opinion. The only difference is the operating system and API calls (software level). The same code will be used for both the PC and consoles. I guess the "port" term will still stick, tho people will use it still thinking there is a architectural difference between consoles and PC's.


I couldn't care less about Steam hardware numbers. This isn't IGN; go pander to consoles there. Everyone here knows what is in these consoles. How many users are on Steam, and how many PS4s and Xbox Ones are actually in people's homes rather than merely shipped? A lot of people play dumb little non-graphics-intensive games on a potato laptop and buy their games off Steam. That doesn't mean you can compare a gaming PC with a console.

 

An i3 or FX-6300 with an R9 270 will run this game better than a console even without a low-level API. So would a 2008 i7-920. The R9 270 puts out nearly twice the graphics power of an Xbox One, and overclocked to 270X speeds (which they all reach) it would put out nearly a TFLOP more than the PS4.

 

This is what I think of the consoles: they can't even run 1080p in a downgraded game with neutered textures. I can only hope The Witcher 3 doesn't get neutered and the consoles simply run the game at 720p and 800p. That is what should have happened with Watch Dogs.

Whether you like it or not, the majority of people who purchase games do not have a true "gaming PC". Let me ask you this: if consoles didn't exist, what makes you think developers would cater to the people with gaming machines instead of those "potato laptop" people? The fact is they often go where the money is, and, whether you like it or not, the PS4 is stronger than many people's PCs. I am not saying the PS4 can even remotely compete with a true gaming machine (I know it can't), but you can't be foolish enough to assume that developers will only make games for people with those machines.

0b10111010 10101101 11110000 00001101


Meh, I don't think it looks too impressive: gray and brown and boring. I'm gonna pass on this completely.

-The Bellerophon- Obsidian 550D-i5-3570k@4.5Ghz -Asus Sabertooth Z77-16GB Corsair Dominator Platinum 1866Mhz-x2 EVGA GTX 760 Dual FTW 4GB-Creative Sound Blaster XF-i Titanium-OCZ Vertex Plus 120GB-Seagate Barracuda 2TB- https://linustechtips.com/main/topic/60154-the-not-really-a-build-log-build-log/ Twofold http://linustechtips.com/main/topic/121043-twofold-a-dual-itx-system/ How great is EVGA? http://linustechtips.com/main/topic/110662-evga-how-great-are-they/#entry1478299


You cannot "port" a game that runs on the same exact platform in my opinion. The only difference is the operating system and API calls (software level). The same code will be used for both the PC and consoles. I guess the "port" term will still stick, tho people will use it still thinking there is a architectural difference between consoles and PC's.

 

To clarify: yes, it wouldn't technically be a port, but I mockingly called it one because of the same philosophy: don't actually optimize it or make it as good as it can be on PC; just copy/paste everything, get it to run, and call it done.

-------

Current Rig

-------


It's not a port; PC is the main platform.

The PC version had better look good, or people will be mad.

Also, I don't think YouTube does the game justice.

This is a PS4 screenshot and it looks really good:

 

-snip-

 

 

Looks like Call of Duty: Modern Warfare 2 graphics on Ultra with high anti-aliasing.

 

If not, it's like Modern Warfare 2 with the graphics mod.

 

 

Infinity Ward must have some kind of alien technology, since their Modern Warfare 2 from 2009 has graphics just as good as 2014's games.


Meh, I don't think it looks too impressive: gray and brown and boring. I'm gonna pass on this completely.

To each their own, I guess. I personally think this looks outstanding.

IdeaCentre K450 Review - My favorite (pre-built) gaming desktop under $1000.

Gaming PC Under $500 = Gaming on a budget.


Whether you like it or not, the majority of people who purchase games do not have a true "gaming PC". Let me ask you this: if consoles didn't exist, what makes you think developers would cater to the people with gaming machines instead of those "potato laptop" people? The fact is they often go where the money is, and, whether you like it or not, the PS4 is stronger than many people's PCs. I am not saying the PS4 can even remotely compete with a true gaming machine (I know it can't), but you can't be foolish enough to assume that developers will only make games for people with those machines.

 

You know what? I am so sick of these mega-companies pandering their crap all over forums. Order cancelled. I am not giving Ubisoft a dime more until they stop lying and giving us downgraded crap with horrible optimization and textures that look worse than what we had in Mass Effect. This game looks like absolute garbage, like something I could have played in 2008. Seeing as people are pirating the hell out of the last-gen console versions and there are a whole 4 million Xbox Ones in people's homes? I hope Ubisoft loses its @$% on this game for lying to PC gamers and telling us we did not get a downgraded version. That is false advertising, and I am sick of Ubisoft's crap.

 

http://www.videogamer.com/pc/assassins_creed_4_black_flag/news/ubisoft_provides_statement_on_ac4_pc_optimisation_proud_of_pc_version.html

 

It isn't hard to ship two texture packs. They start with good textures; they had those textures in the pre-release footage. We got Xbox One textures and are expected to make them look good with filtering/AA, and you can only do so much with filtering and AA. If Ubisoft wants to cater to consoles and prop them up so these "next-gen consoles" sell, when one isn't selling at all? Fine.

 

I have a 4770K and a GTX card that will run these idiotic "Nvidia-exclusive" settings, which have looked like garbage in every game I've tried them in. All they are going to do is blur the hell out of the game.

 

PC gamers can create their own texture packs in their free time and drop them into the game, but Ubisoft can't put together a texture pack and give us an optional 20 GB download? Want me to buy a "deluxe edition"? How about giving me textures so my game doesn't look like garbage, instead of some stupid in-game hat?

 

The second I saw the game was 14 GB, I knew the textures were a joke. Screenshots confirmed it. I don't even care about this game anymore. I was going to play WildStar anyway, and I don't even want Watch Dogs anymore.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


After reading some of the posts in this thread, it wouldn't be surprising if most of you play games not for the story but for the graphics.

CPU: AMD Ryzen 9 3900x  GPU: ASUS Strix rtx 2080 Super RAM: Corsair Vengeance Pro RGB 8gb x4 PSU: Corsair HX850i Motherboard: ASUS Strix x570-E Storage: Samsung 840 pro, Samsung 970 evo 1tb nvme, segate 2tb Case: NZXT H510I Cooling: Corsair h100i


After reading some of the posts in this thread, it wouldn't be surprising if most of you play games not for the story but for the graphics.

Yeah, it's like people want to hate on this game for the sake of it. NO other open-world sandbox game looks better than this. (And no, insanely modded games that can only run on dual Titans don't count.)

 

So very ugly:

http://i7.minus.com/ibsWH7Wk93iftK.jpg

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Cool, go buy the game on PS3/Xbox 360 instead of PS4/Xbox One. Oh wait, you bought a PS4/Xbox One for better graphics. We don't own gaming PCs so that we can get crap ports from developers who are notorious for this stuff, who sell us false advertising before release to get us to preorder the game, only for us to find out it's the same old Ubisoft shenanigans.

 

If Ubisoft hadn't done damage control claiming the game didn't have a huge downgrade, and then shipped a 14 GB game, we wouldn't be so irritated. Got to fit all that DLC on those 500 GB console drives, though! Can't make the game too big, and the GPUs in the consoles suck, so let's just roll with bad textures.

 

This game artificially looks mediocre on PC; that is the problem. They said the PC version matched the pre-release footage. That is a blatant lie: these are not the same textures. They are the exact same textures as the next-gen consoles, and we can just apply higher AA/AF, but the game still looks mediocre. They also said this was a game developed for PC. Another lie. I have watched numerous i5 and i7 videos on YouTube with the same GPUs, and the performance is the same: 35-40 FPS lows while driving, depending on clocks, and a faster GPU doesn't help past roughly a 670/R9 280/760. Oh, and you somehow need 3 GB of VRAM for the garbage "Ultra" setting; they couldn't even optimize that. What a surprise that you are LOCKED to High with a 2 GB GPU, which Ubisoft is shouting at the top of its lungs is what the consoles run. So even though a 2 GB GTX 770 blows away the 7770-class GPU in the XB1 or the 7790-class GPU in the PS4? Ubisoft has that covered. Got to promote these consoles and limit the PC version any way they can. I am sure optimizing these laughable textures to work on a 2 GB GTX 770 was impossible (LOL). You know, because the Xbox One uses DDR3 for VRAM and system RAM, its bandwidth is a joke, and yet it runs the game with "magic". Somehow downgraded textures need 3 GB of VRAM on a "PC game".

 

This game is threaded but somehow performs the same on an i5 and an i7, and it performs even worse in the 8350 videos I watched. WTF is this game even threaded for? I see all these threads busy in the videos; what the hell is the game doing?

 

Just like AC4, this game is optimized for crap, and Ubisoft won't fix it. AC4 was pretty much optimized for two cores and had crap FPS no matter what GPU and CPU you paired with it. This game somehow uses an i7's threads but can't hold a stable 60 FPS on a 4770K? Ubisoft delayed the game to get it running on consoles yet obviously didn't spend one second optimizing it on PC.

 

Sorry, but when a 4770K and a 2 GB GTX 770 run the game worse than an Ivy Bridge i5 with a 4 GB GTX 760 (why the hell would anyone buy a 4 GB GTX 760?), or an MSI R9 280 currently on sale for 200 bucks with a promo, that is beyond stupid. Can't wait for the IGN articles claiming you need a 4 GB GTX 770 to run Ultra.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


*Sigh* What's with all this ULTRA GRAPHICS MUST LOOK BETTER nonsense? I play a game for the entertainment, which I derive mostly from the mechanics and secondly from the story; graphics should always take a back seat.

Do not  as I  do, and  not  as I say. Instead do as you may..

 

HSS Revenir: CPU=i7 5960x @4.5GHz Heatsink=Corsair H100i MOBO=ROG Rampage 5 RAM=Kingston HyperX Predator 16GB @3000MHz SSD=Corsair Neutron GTX 480GB GPU=R9 295x2 PSU=Corsair AX1500i OS=Windows 7 Ultimate


*Sigh* What's with all this ULTRA GRAPHICS MUST LOOK BETTER nonsense? I play a game for the entertainment, which I derive mostly from the mechanics and secondly from the story; graphics should always take a back seat.

 

Ultra. You need 3 GB of VRAM to display these awesome textures. How? I have no idea. Must be the awesome hair; it makes Tomb Raider's TressFX, which could somehow run on a 2 GB card, look like garbage.

 

[screenshot: vW8ohzn.jpg]

 

This is the top you can go on a 2 GB VRAM card: High.

 

[screenshot: 5wiVh5H.jpg]

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Ultra. You need 3 GB of VRAM to display these awesome textures. How? I have no idea. Must be the awesome hair; it makes Tomb Raider's TressFX, which could somehow run on a 2 GB card, look like garbage.

 

This is the top you can go on a 2 GB VRAM card: High.

 

 

Looks pretty good to me. Then again, I was still playing on the PS2 until AT LEAST 2010, so....

Do not  as I  do, and  not  as I say. Instead do as you may..

 

HSS Revenir: CPU=i7 5960x @4.5GHz Heatsink=Corsair H100i MOBO=ROG Rampage 5 RAM=Kingston HyperX Predator 16GB @3000MHz SSD=Corsair Neutron GTX 480GB GPU=R9 295x2 PSU=Corsair AX1500i OS=Windows 7 Ultimate



Ultra. You need 3 GB of VRAM to display these awesome textures. How? I have no idea. Must be the awesome hair; it makes Tomb Raider's TressFX, which could somehow run on a 2 GB card, look like garbage.

 

 

 

This is the top you can go on a 2 GB VRAM card: High.

 

 

You can easily see a difference in both detail and contrast/sharpness. What you cannot see in those pics is the difference in draw distance, or what it looks like in open spaces.

This game looks really good for an open-world sandbox game.

 

Sounds like you are mad that your 770 is out of VRAM. That's why I recommend people get a 280X instead. Be mad at Nvidia for skimping on VRAM, not at developers for moving forward on PC, "master race".

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Sounds like you are mad that your 770 is out of VRAM. That's why I recommend people get a 280X instead. Be mad at Nvidia for skimping on VRAM, not at developers for moving forward on PC, "master race".

 

Shots fired from Denmark

