
New Tomb Raider is Unoptimized at Best

afyeung

MGSV, Mad Max, and Battlefront all run fine, are well optimized, and use Denuvo Anti-Tamper.

I know, there are exceptions. But more than half of the games using Denuvo are clusterfucks.

 

Also, why does Battlefront use Denuvo? That doesn't even make sense; nobody is going to pirate a multiplayer-only game lol.


I'm just disappointed that it's 2016 and people are surprised this happened.

How many times does this have to happen before people realize this is common as fuck now?

 

The second a game is revealed to have Denuvo, there's a huge chance it's going to be a steaming pile of shit. Just look at the titles with Denuvo.

Seems like devs with shoddy products are the most likely to incorporate it, because they know nobody is going to pay for that shit.

 

"Instead of fixing our game so people actually buy it, lets not fix it and make it difficult to pirate so nobody can steal it" - lots of AAA dev mentalities these days

 

Every time I see Denuvo it's about the biggest red flag a dev can put up. Basically announcing they've fucked something up and don't care to fix it.

I'm really pissed off that this practice is still rampant and just keeps on getting worse.

Why should we be accustomed to purchasing games that are unoptimized pieces of shit from the start? (I'm going to stick to games here, because I could go on a tirade about other things, from poor QC on monitors to slippers and sandals, even cars, and so on.)

And guess what? MGSV:TPP and Dragon Age: Inquisition have Denuvo, and they aren't shitty by a long shot. Just because the list of Denuvo games has more bad titles than good ones doesn't mean any game with Denuvo is automatically shit.

And if that's the mentality of a lot of AAA devs these days, they're essentially taking one step forward and tripping backwards. That's one more way to not get a game sold.

 

 

 

Actually, that's another issue: SLI scales like crap even when it's an Nvidia-sponsored game. The FPS was fine; "unoptimized" means the FPS is all over the place for no good reason, which is what Digital Foundry reported.
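For what it's worth, "all over the place" is measurable rather than just a vibe. Here's a minimal sketch of the usual frame-time math, assuming you've logged per-frame render times in milliseconds to a one-column CSV (the frametimes.csv name is hypothetical; a FRAPS or PresentMon export can be trimmed into this shape):

```python
import csv
import statistics

def summarize(frametimes_ms):
    """Turn per-frame render times (ms) into the usual benchmark numbers:
    average FPS, 1% low FPS, and frame-time standard deviation."""
    fps = [1000.0 / ft for ft in frametimes_ms]
    worst = sorted(fps)[: max(1, len(fps) // 100)]  # slowest 1% of frames
    return {
        "avg_fps": statistics.mean(fps),
        "one_percent_low_fps": statistics.mean(worst),
        "frametime_stdev_ms": statistics.stdev(frametimes_ms),
    }

# frametimes.csv is a hypothetical one-column file of frame times in ms.
with open("frametimes.csv") as f:
    times = [float(row[0]) for row in csv.reader(f) if row]

print(summarize(times))
```

A healthy average FPS combined with a fat frame-time stdev and weak 1% lows is exactly the "all over the place for no good reason" pattern DF is describing.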

 

Well, I wonder how a single Titan X would manage. Either way, if "unoptimized" means the FPS is all over the place for no good reason, then I don't know what you'd call eating that much GPU and CPU power and giving that little FPS back for no legit reason. I mean, what engine is that new Tomb Raider game REALLY using? Is it using CryEngine but renamed or something?


Also, why does Battlefront use Denuvo? That doesn't even make sense; nobody is going to pirate a multiplayer-only game lol.

Cracked private servers.


So I tested it. At 1440p with everything maxed and the newest driver, I get a stable 30-40 frames, with the frame limit set to 144. So the game is extremely demanding at the moment, when even my very fast card can't get more frames. CPU usage is almost nonexistent; it's all about the GPU. People who don't have a very fast card will not be able to play it at max settings for now.

But the graphics are awesome. It looks outstanding.
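If anyone wants to sanity-check the "CPU usage is almost nonexistent" part on their own rig, here's a rough sketch; it assumes an Nvidia card (nvidia-smi on the PATH) plus the third-party psutil package, and simply samples both loads while the game runs:

```python
import subprocess
import psutil  # third-party: pip install psutil

def gpu_util_percent():
    """Read overall GPU utilization (%) from nvidia-smi (first GPU if several)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return float(out.splitlines()[0])

# Sample for 30 seconds while playing. A GPU pinned near 100% with no
# single CPU core maxed out means the GPU is the bottleneck, which is
# what "CPU usage is almost nonexistent" looks like in numbers.
for _ in range(30):
    cores = psutil.cpu_percent(interval=1, percpu=True)
    gpu = gpu_util_percent()
    print(f"busiest CPU core {max(cores):5.1f}%  GPU {gpu:5.1f}%")
```

Per-core numbers matter here because games are often limited by one main thread; overall CPU usage can look tiny while a single core is pegged.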

CPU: i7 6700K · MB: MSI Z170A Pro Carbon · GPU: Zotac GTX 980 Ti AMP! Extreme · RAM: 16GB DDR4 Corsair Vengeance 3000MHz · CASE: Corsair 760T · PSU: Corsair RM750i · MOUSE: Logitech G9x · KB: Logitech G910 · HS: Sennheiser GSP 500 · SC: Asus Xonar 7.1 · MONITOR: Acer Predator XB270HU · STORAGE: 1x 1TB + 2x 500GB Samsung 7200rpm HDDs, 2x 500GB Samsung 850 EVO SSDs


Two Titan X's at 1080p can only run THAT much FPS? Uhm... WHAT? That doesn't make any sense unless the engine it uses is something as taxing as the ones in Crysis 3 and Witcher 3. That doesn't seem that well optimized...

He actually has TWO Titan X's (check the "About" tab on his YouTube channel). That much FPS on TWO TITAN X'S AT 1080P! THAT ALONE speaks volumes, at least to me, in terms of what kind of shit the developers have been doing with the game (anywhere from not giving much of a fuck when porting it, to not having enough contact with AMD and Nvidia to make sure it doesn't run like ass).

 

Two things you need to know: 

 

#1 - TB is not running the game at 1080p. He's running an ROG Swift 1440p G-Sync display (also listed in "about" on YT). 

 

*EDIT* He was running the game at 1080p. Not sure why, but just wanted to make that correction.  ;)

 

#2 - He was not running the game in SLI. He was only running the game on a single Titan X. If you watch his footage you'll notice GPU1 is showing 90%+ usage and GPU2 is showing 0% usage. 
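That per-GPU readout is easy to reproduce on your own machine; here's a quick sketch, assuming an Nvidia setup with nvidia-smi available (how the indices map to "GPU1/GPU2" in his overlay is up to the driver and monitoring tool):

```python
import subprocess

# Ask nvidia-smi for per-GPU load. If one card reads 90%+ while the
# other sits at 0%, the game is effectively running on a single GPU,
# SLI enabled or not.
out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=index,name,utilization.gpu",
     "--format=csv,noheader,nounits"],
    text=True)

for line in out.strip().splitlines():
    index, name, util = (field.strip() for field in line.split(","))
    print(f"GPU{index} ({name}): {util}% load")
```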



I know, there are exceptions. But more than half of the games using Denuvo are clusterfucks.

 

Also, why does Battlefront use Denuvo? That doesn't even make sense; nobody is going to pirate a multiplayer-only game lol.

 

I guess I just don't see your logic in saying that Denuvo games are more likely to be crap. I've seen nothing that suggests Denuvo is turning well-optimized games into poorly optimized ones. Just saying "Look at the games that use Denuvo!" isn't compelling evidence, since a lot of those games could be blamed on the developers (Iron Galaxy also did the Arkham Origins port, which was also regarded as pretty bad) more so than on Denuvo.

 

And when your "rule" has nearly half of your sample size being "exceptions," it's not really a rule, is it?



I know, there are exceptions. But more than half of the games using Denuvo are clusterfucks.

 

Also, why does Battlefront use Denuvo? That doesn't even make sense; nobody is going to pirate a multiplayer-only game lol.

No, Denuvo has nothing to do with bad performance; we just all hate it. Remember when people thought JC3 would be DRM-free because of the name of the enemies? What a freaking joke lol. https://www.reddit.com/r/pcmasterrace/comments/3obluo/one_of_the_enemy_groups_in_just_cause_3_is_called/



Two things you need to know: 

 

#1 - TB is not running the game at 1080p. He's running an ROG Swift 1440p G-Sync display (also listed in "about" on YT).

 

#2 - He was not running the game in SLI. He was only running the game on a single Titan X. If you watch his footage you'll notice GPU1 is showing 90%+ usage and GPU2 is showing 0% usage. 

 

So: he was running it basically maxed at 1440p on a single Titan X. That being said, the performance he was getting is totally acceptable. 

Did you even watch the video, sir? He did use SLI at one point, but there was barely any performance increase. Not to mention the FPS was all over the place, though there was still no stuttering, which is an issue on lower-end cards (960 and lower) and AMD cards. And he WAS running the game at 1080p. I believe TB, but I also choose to accept what Digital Foundry said, since they did WAY more testing and used different cards.




Two things you need to know: 

 

#1 - TB is not running the game at 1080p. He's running an ROG Swift 1440p G-Sync display (also listed in "about" on YT).

 

#2 - He was not running the game in SLI. He was only running the game on a single Titan X. If you watch his footage you'll notice GPU1 is showing 90%+ usage and GPU2 is showing 0% usage. 

 

So: he was running it basically maxed at 1440p on a single Titan X. That being said, the performance he was getting is totally acceptable. 

1.) He said right off the bat at 0:04 that he's running the game at 1080p.

2.) Oh... my bad.

He's running it maxed, but at 1080p. That being said, I don't know what's up with the new Tomb Raider running at half the frame rate at 1080p on the 970 and the 390, when there's no real evidence of why the new engine eats up that much power and gives back so little at 1080p.


Two things you need to know: 

 

#1 - TB is not running the game at 1080p. He's running an ROG Swift 1440p G-Sync display (also listed in "about" on YT). 

 

*EDIT* He was running the game at 1080p. Not sure why, but just wanted to make that correction.  ;)

 

#2 - He was not running the game in SLI. He was only running the game on a single Titan X. If you watch his footage you'll notice GPU1 is showing 90%+ usage and GPU2 is showing 0% usage. 

"Not sure why". Because the performance sucks. That's why lol. But then again, he has over $1800 worth of GPU's. So yeah



Can someone explain to me what's with the engine the new Tomb Raider game uses, and why it has to eat that much and give that little, if it's not considered unoptimized? If it's the equivalent of what Witcher 3 or Crysis 3 use, then I may accept it. If not, then what the fuck?


Can someone explain to me what's with the engine the new Tomb Raider game uses, and why it has to eat that much and give that little, if it's not considered unoptimized? If it's the equivalent of what Witcher 3 or Crysis 3 use, then I may accept it. If not, then what the fuck?

It's called Horizon; it's a proprietary engine with a built-in editor, so it's easier to create things. Witcher 3 and Crysis 3, however, are very well optimized; that's why they're used to benchmark mid-to-high-end cards such as the 390/970. The FPS doesn't jump all over the place, and there isn't stuttering.
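"Doesn't jump all over the place" and "no stuttering" are related but separate things, and the stutter half is easy to flag in a frame-time log. A toy sketch of one common heuristic (my assumption, not necessarily what Digital Foundry does): a frame taking a few times longer than its recent neighbours reads as a hitch even when the average FPS looks healthy.

```python
import statistics

def find_stutters(frametimes_ms, window=20, spike_factor=2.5):
    """Flag frames whose render time spikes far above the local median."""
    stutters = []
    for i, ft in enumerate(frametimes_ms):
        # Median of up to `window` preceding frames (the frame itself
        # when there's no history yet).
        history = frametimes_ms[max(0, i - window):i] or [ft]
        if ft > spike_factor * statistics.median(history):
            stutters.append((i, ft))
    return stutters

# Toy data: steady ~16.7 ms frames (60 FPS) with one 80 ms hitch.
print(find_stutters([16.7, 16.6, 16.8, 80.0, 16.7, 16.9]))
# -> [(3, 80.0)]
```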



Did you even watch the video, sir? He did use SLI at one point, but there was barely any performance increase. Not to mention the FPS was all over the place, though there was still no stuttering, which is an issue on lower-end cards (960 and lower) and AMD cards. And he WAS running the game at 1080p. I believe TB, but I also choose to accept what Digital Foundry said, since they did WAY more testing and used different cards.

 

Yes, I saw that note at the beginning of his video. I just went back and edited my post. ;)  You ninja'd me. :P 

 

I didn't see when he ran SLI (I'm at work, couldn't watch the whole thing). But I'm not surprised if SLI doesn't make a difference. It's a brand new game and this is typical. 

 

Regardless, the game still runs fine and is not what I would consider "unoptimized"  *cough* FO4 *cough*. If you turn down the AA settings it's really not that demanding. 



Yes, I saw that note at the beginning of his video. I just went back and edited my post. ;)  You ninja'd me. :P

 

I didn't see when he ran SLI (I'm at work, couldn't watch the whole thing). But I'm not surprised if SLI doesn't make a difference. It's a brand new game and this is typical. 

 

Regardless, the game still runs fine and is not what I would consider "unoptimized"  *cough* FO4 *cough*. If you turn down the AA settings it's really not that demanding. 

We're not here to argue (I'm not trying to, sorry), but you do need at least a 970 to get 60fps at console quality (according to Digital Foundry). That's higher requirements than previous games, even FO4, which ran majorly better on PC despite being a poorly coded game overall. Take it easy dude; if you enjoy the game, that's awesome. Personally I'm not getting the game because of the stuttering on AMD's side, which is probably their fault as well.



It's called Horizon; it's a proprietary engine with a built-in editor, so it's easier to create things. Witcher 3 and Crysis 3, however, are very well optimized; that's why they're used to benchmark mid-to-high-end cards such as the 390/970. The FPS doesn't jump all over the place, and there isn't stuttering.

Witcher 3 and Crysis 3 are well optimized and eat up as much as they can and want, and for good reason (which I can totally accept). But with the new Tomb Raider... if that's how Horizon works, they could've just licensed Unreal Engine 4 or something; at least then there'd be a good reason why it eats up that much and doesn't give as much in return. Doesn't make sense to me.

 

 

We're not here to argue (I'm not trying to, sorry), but you do need at least a 970 to get 60fps at console quality (according to Digital Foundry). That's higher requirements than previous games, even FO4, which ran majorly better on PC despite being a poorly coded game overall. Take it easy dude; if you enjoy the game, that's awesome. Personally I'm not getting the game because of the stuttering on AMD's side, which is probably their fault as well.

So... you need a GPU that's around one, two, or three tiers under the top-tier GPU to get that much FPS at console quality? Well, shit.


"Not sure why". Because the performance sucks. That's why lol. But then again, he has over $1800 worth of GPU's. So yeah

 

I don't think that was why. He uses Shadowplay to capture his gameplay footage and probably does so with most of his games to keep the file sizes down. 

 

1.) He said right off the bat at 0:04 that he's running the game at 1080p.

2.) Oh... my bad.

He's running it maxed, but at 1080p. That being said, I don't know what's up with the new Tomb Raider running at half the frame rate at 1080p on the 970 and the 390, when there's no real evidence of why the new engine eats up that much power and gives back so little at 1080p.

 

Yeah, I edited my post but you guys both quoted me before I could correct it. ;)

 

I'll take the time to watch the DF video analysis later tonight before I comment further on the game's performance. All I can say right now is that it runs very well on my system. 



 

I think at the moment the game is bugged, or it's just crazy on memory usage; here is what EVGA Precision X tells me when I launch the game.

I am using a GTX 980 Ti OC'd to 1440MHz GPU / 1900MHz memory.

The game doesn't even care whether the area is demanding or not; it just uses full memory all the time, whatever the AA setting is. Here with SSAA x4:

[screenshot]

Here with FXAA:

[screenshot]

The good thing is I can play at 60 FPS at 2560x1080... but I can't imagine what the game would be like on my old 960 2GB...
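If anyone wants to log this properly instead of eyeballing the overlay, here's a small sketch (assumes an Nvidia card with nvidia-smi on the PATH); leave it running while flipping the AA setting and watch whether the "used" number ever moves:

```python
import subprocess
import time

def vram_used_mb():
    """Query used and total VRAM (MiB) for the first GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    used, total = out.splitlines()[0].split(",")
    return int(used), int(total)

# One sample per second. A flat, near-full "used" line across SSAA x4
# and FXAA would point to the engine pre-allocating the whole pool as
# a cache rather than scaling allocations with the setting, which is
# common behaviour and not necessarily a bug.
while True:
    used, total = vram_used_mb()
    print(f"{time.strftime('%H:%M:%S')}  VRAM {used}/{total} MiB")
    time.sleep(1)
```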



A 390(X)/970 can't even run the game at ultra 1080p/60. Damn.



We're not here to argue (I'm not trying to, sorry), but you do need at least a 970 to get 60fps at console quality (according to Digital Foundry). That's higher requirements than previous games, even FO4, which ran majorly better on PC despite being a poorly coded game overall. Take it easy dude; if you enjoy the game, that's awesome. Personally I'm not getting the game because of the stuttering on AMD's side, which is probably their fault as well.

 

"Take it easy, dude"? No worries. :) I'm not upset or anything. Not sure where you got that idea...? I'm just taking part in the conversation. :) Was just stating that it runs fine, IMO. That's all.    



The only weird thing I came across in this game so far is the AA settings. Take a look:

[screenshot]

 


 


"Take it easy, dude"? No worries. :) I'm not upset or anything. Not sure where you got that idea...? I'm just taking part in the conversation. :) Was just stating that it runs fine, IMO. That's all.    

"Take it easy" as in have fun gaming later on. You said you have great performance which is what a lot of steam reviewers have said actually. I guess it's different for everyone. But I'm definitely not picking up the game yet since I have a 390x :(



Just gave it a try (did the same intro scene that TB did) using my rig below and the following settings at 1440p. I'm running Windows 10, and I'm not using the newest Nvidia drivers (361.75); I'm using 359.06, which were the game-ready drivers for Just Cause 3 and Rainbow Six: Siege.

 

Pre-rendered cutscenes (saw two of them) seem to be locked at 30.

 

On average, I was bouncing around 35-40, with the lows being 30. Never did I see it drop to under 30, though.

 

Less demanding scenes (e.g. vistas) all hit 50 or higher.

 

Subjectively, I think it's a lot better graphically than Tomb Raider (2013), which I just played and 100%'d for the first time about two weeks ago. I'll cut down some of the graphics options I don't really like (motion blur, DoF, bloom, vignette blur, PureHair (TressFX), lens flare) when I play it long-term, though. Will probably keep the SSAA 2x.

CPU: Intel i7 7700K · MoBo: Asus Maximus Hero IX · RAM: Corsair Dominator Platinum (2x8GB 2400MHz CL10) · GPU: EVGA SC+ GTX 980Ti (6 GB)
Case: Fractal Define R5, Black, Windowed · AMC: Corsair H90 with NF-A14 PWM (Pull)  · PSU Corsair HX 750W · SSD: 500GB Samsung 840 EVO
HDDs: WD Black 2TB and 2x Red 5TB · OS: Windows 10 · KB: Logitech 710+ · Mouse: Logitech G700S 
· Monitor: Acer XB270HU

 

 


Just gave it a try (did the same intro scene that TB did) using my rig below and the following settings at 1440p. I'm running Windows 10, and I'm not using the newest Nvidia drivers (361.75); I'm using 359.06, which were the game-ready drivers for Just Cause 3 and Rainbow Six: Siege.

 

Pre-rendered cutscenes (saw two of them) seem to be locked at 30.

 

On average, I was bouncing around 35-40, with the lows being 30. Never did I see it drop to under 30, though.

 

Less demanding scenes (e.g. vistas) all hit 50 or higher.

 

Subjectively, I think it's a lot better graphically than Tomb Raider (2013), which I just played and 100%'d for the first time about two weeks ago. I'll cut down some of the graphics options I don't really like (motion blur, DoF, bloom, vignette blur, PureHair (TressFX), lens flare) when I play it long-term, though. Will probably keep the SSAA 2x.

Nice shots, dude! Yeah, SSAA x2 is super demanding, but it does look very nice from your screenshots. Basically no aliasing/shimmering at all.



Joker is the real MVP.



Joker is the real MVP.

"This guy... FUCKS!"

I'm sorry, I lost my shit when he said that.

