Mr_Troll

Rise of the Tomb Raider PC Performance Reviewed - Another Bad Port?

Recommended Posts

Any kind soul here who doesn't mind using Steam family sharing with me? No? Okay, I'll show myself out ._.


Archangel (Desktop) CPU: i5 4590 | GPU: Asus R9 280 3GB | RAM: HyperX Beast 2x4GB | PSU: SeaSonic S12G 750W | Mobo: GA-H97M-HD3 | Case: CM Silencio 650 | Storage: 1TB WD Red
Celestial (Laptop 1) CPU: i7 4720HQ | GPU: GTX 860M 4GB | RAM: 2x4GB SK Hynix DDR3 | Storage: 250GB 850 EVO | Model: Lenovo Y50-70
Seraph (Laptop 2) CPU: i7 6700HQ | GPU: GTX 970M 3GB | RAM: 2x8GB DDR4 | Storage: 256GB Samsung 951 + 1TB Toshiba HDD | Model: Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


You mean the same physics logic in TR2013 where the wind was blowing front to back and the hair was moving back to front, AGAINST the wind? ;)

Pretty much. Also, HairWorks has logic; the idea that it doesn't and is just a simple form of tessellation is a moronic thing to say and an insult to the people who spent years developing it. It actually does the same things TressFX does, with increased detail at the cost of performance.


i7 6700K - ASUS Maximus VIII Ranger - Corsair H110i GT CPU Cooler - EVGA GTX 980 Ti ACX2.0+ SC+ - 16GB Corsair Vengeance LPX 3000MHz - Samsung 850 EVO 500GB - AX760i - Corsair 450D - XB270HU G-Sync Monitor

i7 3770K - H110 Corsair CPU Cooler - ASUS P8Z77 V-PRO - GTX 980 Reference - 16GB HyperX Beast 1600MHz - Intel 240GB SSD - HX750i - Corsair 750D - XB270HU G-Sync Monitor

I can't help but think Denuvo impacts performance.

Except it doesn't actually have a performance impact? The game looks beautiful, so I don't know why people expect their 750 Tis to play this anywhere near maxed out.


QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


I knew it.

 

All the games I have zero interest in to begin with turn out to be shit ports. Can't complain.

 

Are you serious about these penguins in Norway?


I think the game looks great and plays fine. People just like to complain.


You can't be serious.  Hyperthreading is a market joke?

 

 


Out of curiosity... is it just me, or are the frame times odd on AMD cards (not sure about Nvidia ones, I don't have any)? I tried running it on an iMac 5K with an R9 M295X 4GB (a recommended-level GPU), and although it should handle 1080p at medium settings with ease, it doesn't feel that fluid when I drop below 60 FPS, even though GPU usage is fine and there's no thermal throttling. I tried on an M370X as well (the bare minimum requirement); it's pretty much the same on low settings, though that's probably just the GPU not keeping up every time.

Could it be Vsync being turned on and/or motion blur being disabled?

I can't update the drivers anyway; AMD blocks those cards from getting any because they have "Apple Computer" as the subvendor. Strangely enough, the M295X works great in TR 2013, as it does in every other game. It bugs me that I might be getting something wrong.


AMD Ryzen 5 2600 - Sapphire AMD Radeon RX 580 4GB - S340 Elite Black & White - 2x8GB G.Skill Ripjaws V - Sandisk 480 GB SSD - Gigabyte Aorus X470 Ultra Gaming - Corsair TX750M PSU

 

1 hour ago, Razzaa said:

I think the game looks great and plays fine. People just like to complain.

Dude that was awesome :D. Totally agree with you.


The Storm: Intel Core i7-5930K | Asus ROG STRIX GeForce GTX 1080 Ti | Asus ROG RAMPAGE V EDITION 10 | EKWB EK-KIT P360 with Hardware Labs Black Ice SR2 Multiport 480 | 32GB (4x8GB) Dominator Platinum SE Blackout #338/500 | 480GB SATA 2.5" SSD + 3TB 5400 RPM NAS HDD + 8TB 7200 RPM NAS HDD | Corsair 900D | Corsair AX1200i + Black/Blue CableMod cables | Corsair ML120 2-pack 2x + NB-BlackSilentPro PL-2 x3

 

Starlight (upcoming successor to The Storm): AMD Ryzen 9 3950X 16-core CPU | Asus ROG STRIX GeForce GTX 1080 Ti | Asus ROG Crosshair VIII Hero Wi-Fi | EKWB EK-KIT P360 with Hardware Labs Black Ice SR2 Multiport 480 | 32GB (4x8GB) Dominator Platinum SE Blackout #338/500 | 480GB SATA 2.5" SSD + 3TB 5400 RPM NAS HDD + 8TB 7200 RPM NAS HDD | Corsair 900D | Corsair AX1200i + Black CableMod cables | Corsair ML120 2-pack 2x + ML140 2-pack

Spoiler

"So because they didn't give you the results you want, they're biased? You realize that makes you biased, right?" - @App4that

"Brand loyalty/fanboyism is stupid." - Unknown person on these forums

"Assuming kills" - @Moondrelor

"That's not to say that Nvidia is always better, or that AMD isn't worth owning. But the fact remains that this forum is AMD biased." - @App4that

"I'd imagine there's exceptions to this trend - but just going on mine and my acquaintances' purchase history, we've found that budget cards often require you to turn off certain features to get slick performance, even though those technologies are previous gen and should be having a negligible impact" - ace42

"2K" is not 2560 x 1440 


Okay, riddle me this: why do you guys think that HairWorks is just a "tessellated pile of junk" while TressFX "runs well"? It makes no sense to me at all. They're both a means to an end; they do the same thing, and they're both computationally intensive. So why are you shitting all over Nvidia when they put hard work into creating GameWorks as a whole? Was their effort all for nothing?

...bunches of jackasses.


45 minutes ago, JurunceNK said:

Okay, riddle me this: why do you guys think that HairWorks is just a "tessellated pile of junk" while TressFX "runs well"? It makes no sense to me at all. They're both a means to an end; they do the same thing, and they're both computationally intensive. So why are you shitting all over Nvidia when they put hard work into creating GameWorks as a whole? Was their effort all for nothing?

...bunches of jackasses.

Because HairWorks uses a 64x tessellation factor for no other reason than to gimp performance enough to push people toward more expensive cards.

The Witcher 3, HairWorks on Geralt only: ~20 fps penalty.

ROTTR, TressFX at its highest setting: 2-4 fps penalty, while looking better.

People like to rip devs a new one when performance is bad; they just call it unoptimized no matter what. GameWorks is by definition unoptimized, in order to sell higher-end cards. If you don't see the problem with that, or why people hate it, then I don't know what will convince you. Oh yeah, and GameWorks will always run worse on AMD cards than on Nvidia cards. TressFX runs equally well on both.

GameWorks is a marketing ploy to sell more expensive cards and sabotage the competition, and people love Nvidia for it. The amount of dumb is astounding.
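
As a rough illustration of why that matters (my own back-of-envelope Python, with a made-up base strand count; only the 64x factor and the fps figures come from the posts above), here is how the geometry workload and the frame-time cost scale:

```python
# Back-of-envelope only: the guide segment count is hypothetical;
# the 64x factor and the fps penalties are the figures quoted above.

guide_segments = 20_000   # hypothetical authored hair segments
tess_factor = 64          # HairWorks-style subdivision factor
print(f"{guide_segments * tess_factor:,} segments after 64x tessellation")  # 1,280,000

def added_ms(base_fps, penalty_fps):
    """Extra milliseconds per frame implied by an fps drop from base_fps."""
    return 1000 / (base_fps - penalty_fps) - 1000 / base_fps

print(f"HairWorks (-20 fps from 60): +{added_ms(60, 20):.1f} ms/frame")  # ~8.3 ms
print(f"TressFX 3 (-3 fps from 60):  +{added_ms(60, 3):.1f} ms/frame")   # ~0.9 ms
```

In frame-time terms, that 20 fps drop costs roughly ten times what the 2-4 fps TressFX hit does, which is the whole complaint in a nutshell.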


Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

8 minutes ago, Notional said:

Because HairWorks uses a 64x tessellation factor for no other reason than to gimp performance enough to push people toward more expensive cards.

The Witcher 3, HairWorks on Geralt only: ~20 fps penalty.

ROTTR, TressFX at its highest setting: 2-4 fps penalty, while looking better.

People like to rip devs a new one when performance is bad; they just call it unoptimized no matter what. GameWorks is by definition unoptimized, in order to sell higher-end cards. If you don't see the problem with that, or why people hate it, then I don't know what will convince you. Oh yeah, and GameWorks will always run worse on AMD cards than on Nvidia cards. TressFX runs equally well on both.

GameWorks is a marketing ploy to sell more expensive cards and sabotage the competition, and people love Nvidia for it. The amount of dumb is astounding.

I mean, do you know how HairWorks and TressFX actually work? They use different algorithms, but they do the same thing in the end.

https://en.wikipedia.org/wiki/TressFX


13 minutes ago, JurunceNK said:

I mean, do you know how HairWorks and TressFX actually work? They use different algorithms, but they do the same thing in the end.

https://en.wikipedia.org/wiki/TressFX

No. HairWorks, like all the VisualFX parts of GameWorks (except Turbulence), is tessellation based. TressFX is compute based. They are fundamentally different in how they work.

Either way, HairWorks costing 5-10 times as much frame rate for the same or worse result is abysmal, and shows how useless the tech is.


2 minutes ago, Notional said:

No. HairWorks, like all the VisualFX parts of GameWorks (except Turbulence), is tessellation based. TressFX is compute based. They are fundamentally different in how they work.

Either way, HairWorks costing 5-10 times as much frame rate for the same or worse result is abysmal, and shows how useless the tech is.

Okay, fair enough then. I just want to share some personal experience with TressFX and see whether it's just me, or whether my laptop (circa 2013) really does run it like ass.

When I benchmarked Tomb Raider 2013 (Ultra preset, tessellation on, some other settings I've since forgotten, comparing Hair Quality Normal to TressFX), I saw about a 10 FPS difference. I need to benchmark the laptop again with that game. Love it.


1 hour ago, JurunceNK said:

Okay, fair enough then. I just want to share some personal experience with TressFX and see whether it's just me, or whether my laptop (circa 2013) really does run it like ass.

When I benchmarked Tomb Raider 2013 (Ultra preset, tessellation on, some other settings I've since forgotten, comparing Hair Quality Normal to TressFX), I saw about a 10 FPS difference. I need to benchmark the laptop again with that game. Love it.

TressFX version 1 was a resource hog, although not as bad as HairWorks.

In version 2 they introduced the master/slave principle, where only the master strands get fully simulated while the slave strands just follow their masters. This made it much more efficient.

Version 3 is what ROTTR uses and what Deus Ex: Mankind Divided will use. More efficient, better looking, and better physics.

So TR 2013 might not be the best point of comparison.
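
For anyone curious what that master/slave (guide/follow) principle looks like in practice, here's a minimal sketch of the idea in Python of my own making (toy pseudocode, not actual TressFX source): only the guide strands get a real simulation step, and each follow strand is simply offset from its guide, so piling on visible strands costs almost nothing.

```python
import numpy as np

def simulate_guide(strand, gravity=np.array([0.0, -9.8, 0.0]), dt=1 / 60):
    """Stand-in for the 'full' physics step run only on guide strands."""
    return strand + 0.5 * gravity * dt * dt

def update_hair(guide_strands, follow_offsets):
    """Guides are simulated; followers just copy their guide plus a fixed offset."""
    new_guides = [simulate_guide(g) for g in guide_strands]
    followers = [g + off for g in new_guides for off in follow_offsets]
    return new_guides, followers

# 500 simulated guides driving 16 followers each -> 8,000 rendered strands,
# but only 500 strands' worth of physics per frame.
guides = [np.zeros((32, 3)) for _ in range(500)]        # 32 vertices per strand
offsets = [np.random.uniform(-0.01, 0.01, 3) for _ in range(16)]
guides, followers = update_hair(guides, offsets)
print(len(guides), "simulated strands,", len(followers), "rendered followers")
```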


2 minutes ago, Notional said:

TressFX version 1 was a resource hog, although not as bad as HairWorks.

In version 2 they introduced the master/slave principle, where only the master strands get fully simulated while the slave strands just follow their masters. This made it much more efficient.

Version 3 is what ROTTR uses and what Deus Ex: Mankind Divided will use. More efficient, better looking, and better physics.

So TR 2013 might not be the best point of comparison.

Okay. I don't have ROTTR yet, and I want to actually play it. I've played TR2013 three times; on the third run I got 100% on hard.


8 minutes ago, JurunceNK said:

Okay. I don't have ROTTR yet, and I want to actually play it. I've played TR2013 three times; on the third run I got 100% on hard.

Nice. I have 100% of the achievements in the game. Can't wait for ROTTR to go on sale :D



This is 1080p with settings maxed. The game looks beautiful in my opinion. I'll be playing at 1440p later on; I expect to get over 60 fps.

 

 

 

1440p: this game is demanding but looks and plays great!!

 

 

 


2 hours ago, Notional said:

Nice. I have 100% of the achievements in the game. Can't wait for ROTTR to go on sale :D

You can buy it on the Windows Store for 9 bucks, btw. :)

 

 

I'm playing this game at 1440p with a 970 and everything maxed out except the foliage, and I'd say this port is definitely awesome. I couldn't get an fps counter to work because I bought it on the Windows Store, but I'd guess my fps averages 50-60 depending on the scene. It's been smooth so far.


Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


It looks like a good port, but it's quite a demanding game to run. Given how nice the game looks, it's not surprising that it's pretty hard to max out.

I think we will see an extra 5-10% performance increase on AMD cards with driver updates over the next few months. 

And maybe Nvidia drivers will gimp performance by 5-10% in order to sell Pascal cards when they come out... Nah, they should improve too. Probably... they wouldn't, would they??????

I may pick this game up this week. I loved the 2013 game: I played through it once on two GTX 560s, which ran great with good SLI scaling but sadly no TressFX; then I got a 7970 and played it a bit more, and it ran great on max settings with FXAA and TressFX at over 60 fps; then I added a second 7970 and it scaled beautifully in Crossfire, well over 100 fps. I need to play this game again on my Fury card.

 

 

 


---Intel I7 4790k@4.6GHz@1.24v---Gigabyte Z97gaming 5----H100i----240GB Kingston HyperX 3k----Samsung 250GB EVO840----3tb Seagate----4TB Western Digital Green----16GB Kingston HyperX Beast 2400Mhz----Sapphire Fury TriX----Fractal Design R4 Black Pearl ----Corsair RM850w----

19 hours ago, Mantayd17 said:

Out of curiosity... is it just me, or are the frame times odd on AMD cards (not sure about Nvidia ones, I don't have any)? I tried running it on an iMac 5K with an R9 M295X 4GB (a recommended-level GPU), and although it should handle 1080p at medium settings with ease, it doesn't feel that fluid when I drop below 60 FPS, even though GPU usage is fine and there's no thermal throttling. I tried on an M370X as well (the bare minimum requirement); it's pretty much the same on low settings, though that's probably just the GPU not keeping up every time.

Could it be Vsync being turned on and/or motion blur being disabled?

I can't update the drivers anyway; AMD blocks those cards from getting any because they have "Apple Computer" as the subvendor. Strangely enough, the M295X works great in TR 2013, as it does in every other game. It bugs me that I might be getting something wrong.

 

AMD doesn't have a driver for it yet. Basically, at the moment every AMD card is brute-forcing the performance; there are zero optimized drivers for Rise of the Tomb Raider right now.


I was playing this weekend on my rig and had a pretty good experience at 1440p. I'm running FXAA and everything else maxed, but Pure Hair off (not worth the 5-8 fps). I kept a 40-60 fps rate with my overclocked GTX 980, which is perfect given my G-Sync monitor.

I do get random stutters, mostly when large vistas come into view quickly and load all at once, or right before some cutscenes begin. Some intense scenes have me dropping frames or going into single digits, but 98% of the time it's consistent and smooth. Running FRAPS, I noticed CGI cutscenes are locked at 30 fps, and I get a screen tear in the upper portion on all of them.

Overall, though, the experience is fine and the game looks amazing. The issues above are nothing a future update couldn't fix; certainly nothing that would have me cry foul and claim it's a bad port.


CPU i5-4690K(OC to 4.4Ghz) CPU Cooler NZXT Kraken x41 Motherboard MSI Z97 Gaming 5 Memory G.Skillz Ripjaws X 16gb 2133 Video Card MSI GTX 1080 Gaming X           Case NZXT H440 Power Supply XFX XTR 750W Modular Storage Samsung 840 EVO 250gb/Seagate Barracuda 2TB Monitor Acer XB270HU G-Sync http://pcpartpicker.com/b/3CkTwP


Rise of the Tomb Raider runs fine on my 7970 at 1080p; it holds a solid 45 fps on high to very high settings.


Devices:

Desktop(s): Main Rig | CPU: R7 1700x, Ram: 16GB, GPU: GTX 1070 Ti

Server(s): My Server 

Laptop(s): Macbook Pro 13" (2015) 

Phone(s): iPhone SE (64GB), Nokia Lumia 925 

On 29/01/2016 at 4:00 PM, Prysin said:

 

We are on the brink of a new generation of GPUs; the 750 Ti is old as fuck and only supposed to function at the low end. Honestly, this is the right thing to do. People need to upgrade.

 

 

The 750 Ti isn't that old imo; I have friends who still have a 550 Ti in their PCs.



Now running the game with the following settings on an overclocked 980:

- 1440p (DSR) with FXAA and Vsync (triple buffered).

- High preset.

- Everything on, except motion blur and vignette blur, which are off.

- Pure Hair on Very High.

Runs at a solid 60 fps most of the time with the occasional dip into the 50s, and the game looks damn good. Almost no aliasing/jaggies.

 

I don't get why some people turn off, don't care about, or complain about the Pure Hair effect in this game (or TressFX in TR 2013). To me, having every aspect of the scene look as real as possible, especially the main character you play as and look at the entire time, is important. Why would you want her hair to look like an unrealistic solid block that barely moves? The Pure Hair effect in this game barely takes any performance hit; I'd much rather have Lara look as real as possible, even if it costs 2-4 fps. Just my 2 cents.


My Systems:

Gaming:

Spoiler

RUSTIC PC: FX-8350 @4.6GHz // Deepcool Gammaxx 400 // MSI 970 Gaming // AData 2x 4GB DDR3 @1600MHz // Gigabyte RX 570 Gaming 4G // Samsung 840 120GB SSD + 2x 1TB Seagate 7200 HDDs // Cooler Master V650 PSU // Vintage wooden crate enclosure // Windows 10 // Thrustmaster TMX + G27 pedals & shifter // Build Log

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Linux Mint 19 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Manjaro Gnome

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // Corsair 1x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 

This topic is now closed to further replies.
