
Rise of the Tomb Raider PC Performance Reviewed - Another Bad Port?

Mr_Troll

When I make a post, unless I am the original poster or ask for a reply, don't bother replying or quoting me because I don't read them.


Any word on if it will be getting a DX12 patch?

 

The official word is "maybe". They still think they can squeeze more performance out of DX11, but they are apparently looking into DX12. If they do release a DX12 update for the game, it will be what they called a "front end option". If we don't hear about a DX12 update by the time the next Deus Ex is out, I think it's safe to say it won't happen.


I'm really curious to see how the game runs at visual fidelity comparable to the Xbone. If it runs leagues better at the same fidelity, then it's a good port.

Intel 4670K /w TT water 2.0 performer, GTX 1070FE, Gigabyte Z87X-DH3, Corsair HX750, 16GB Mushkin 1333mhz, Fractal R4 Windowed, Varmilo mint TKL, Logitech m310, HP Pavilion 23bw, Logitech 2.1 Speakers


Yay! Another bullet dodged thanks to my refusal to pre-purchase!

Remember kids, the only difference between screwing around and science is writing it down. - Adam Savage

 

PHOΞNIX Ryzen 5 1600 @ 3.75GHz | Corsair LPX 16Gb DDR4 @ 2933 | MSI B350 Tomahawk | Sapphire RX 480 Nitro+ 8Gb | Intel 535 120Gb | Western Digital WD5000AAKS x2 | Cooler Master HAF XB Evo | Corsair H80 + Corsair SP120 | Cooler Master 120mm AF | Corsair SP120 | Icy Box IB-172SK-B | OCZ CX500W | Acer GF246 24" + AOC <some model> 21.5" | Steelseries Apex 350 | Steelseries Diablo 3 | Steelseries Syberia RAW Prism | Corsair HS-1 | Akai AM-A1

D.VA coming soon™ xoxo

Sapphire Acer Aspire 1410 Celeron 743 | 3Gb DDR2-667 | 120Gb HDD | Windows 10 Home x32

Vault Tec Celeron 420 | 2Gb DDR2-667 | Storage pending | Open Media Vault

gh0st Asus K50IJ T3100 | 2Gb DDR2-667 | 40Gb HDD | Ubuntu 17.04

Diskord Apple MacBook A1181 Mid-2007 Core2Duo T7400 @2.16GHz | 4Gb DDR2-667 | 120Gb HDD | Windows 10 Pro x32

Firebird//Phoeniix FX-4320 | Gigabyte 990X-Gaming SLI | Asus GTS 450 | 16Gb DDR3-1600 | 2x Intel 535 250Gb | 4x 10Tb Western Digital Red | 600W Segotep custom refurb unit | Windows 10 Pro x64 // offisite backup and dad's PC

 

Saint Olms Apple iPhone 6 16Gb Gold

Archon Microsoft Lumia 640 LTE

Gulliver Nokia Lumia 1320

Werkfern Nokia Lumia 520

Hydromancer Acer Liquid Z220


far cry 4, nuff said

 

???? So if I download Far Cry 4 off nosteam I'm getting 20-30% better framerates than with my UPlay copy of the game?


The game has Denuvo and GameWorks and is from a developer who sold out for Xbox One timed exclusivity.

Who in the fuck expected anything close to decent? It would be shocking if it comes out ported anywhere close to decent.

Maybe if it wasn't one of the top selling games on Steam WHEN IT ISN'T EVEN FUCKING OUT YET we would get decent ports.

 

But nope, as long as the general public chooses to be retarded we can't have nice things.

I don't know if there's any other market where consumers blindly hand over cash with complete faith in completely untrustworthy devs.

But go kickstart your Tim Schafer and get fucked for like the 8th time in a row. Literally never learn anything from mistakes.

Which is odd, because playing video games requires learning from your mistakes to progress; this group should be decent at this whole learning thing.


far cry 4, nuff said

 

On topic: I'm surprised Majestic hasn't come and said something pointless yet. Otherwise, I'm really wondering why it takes a ~85% more powerful card than the 970 to max the game out at 1440p.

 

I can't find anyone who got such bad performance that they lost 1/5 of their fps. A few fps, sure, just not double digits.

The ability to google properly is a skill of its own. 


Look at the GTX 780 Jetstream as well; it's doing extremely well compared to how Kepler performed at The Witcher 3's launch.

 

Beating out the 390X Turbo and 960 SSC. I wonder what these will look like once AMD gets a game driver out, though.

We will never "know" what this graph looks like with AMD drivers, because as usual Nvidia is early with their drivers and AMD lags behind. All benchmarks are done PRIOR to AMD being able to launch drivers, and hardly anyone ever bothers to revisit their game benchmarks afterwards.

By being too slow to release drivers, AMD has "lost" simply by not having the performance in place when the game goes up for review.


Looks like my 780ti handles it well at 1440p :D

 

Can't wait to play it, actually.

Desktop - Corsair 300r i7 4770k H100i MSI 780ti 16GB Vengeance Pro 2400mhz Crucial MX100 512gb Samsung Evo 250gb 2 TB WD Green, AOC Q2770PQU 1440p 27" monitor Laptop Clevo W110er - 11.6" 768p, i5 3230m, 650m GT 2gb, OCZ vertex 4 256gb,  4gb ram, Server: Fractal Define Mini, MSI Z78-G43, Intel G3220, 8GB Corsair Vengeance, 4x 3tb WD Reds in Raid 10, Phone Oppo Reno 10x 256gb , Camera Sony A7iii


The game has Denuvo and GameWorks and is from a developer who sold out for Xbox One timed exclusivity.

Who in the fuck expected anything close to decent? It would be shocking if it comes out ported anywhere close to decent.

Maybe if it wasn't one of the top selling games on Steam WHEN IT ISN'T EVEN FUCKING OUT YET we would get decent ports.

But nope, as long as the general public chooses to be retarded we can't have nice things.

I don't know if there's any other market where consumers blindly hand over cash with complete faith in completely untrustworthy devs.

But go kickstart your Tim Schafer and get fucked for like the 8th time in a row. Literally never learn anything from mistakes.

Which is odd, because playing video games requires learning from your mistakes to progress; this group should be decent at this whole learning thing.

It's a fucking good port tho. The fuck do you want? Obviously graphics cards (even the new ones) are going to have a bad time maxing everything out, but that's because this game contains multiple new technologies. With a couple of updates, the game's performance will get better.

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


It's a fucking good port tho. The fuck do you want? Obviously graphics cards (even the new ones) are going to have a bad time maxing everything out, but that's because this game contains multiple new technologies. With a couple of updates, the game's performance will get better.

 

Exactly. This is a Nixxes port. It doesn't really get better than this. Now, this game uses some very advanced features, all at the same time. That will kill performance. I am, however, very impressed with TressFX 3 (PureHair). A 2-4 fps penalty is insanely low for such a feature. Just think of HairWorks on Geralt and how that murdered performance. Sure, Deus Ex: Mankind Divided will have a larger penalty, as NPCs will also have TressFX, but still.

 

PC gamers are an ungrateful bunch. They whine like manchildren when a game's graphics are slightly downgraded (think Watch Dogs and The Division), yet they whine even more when a game pushes boundaries and takes a subsequent performance hit. Think of the age-old joke, "can it run Crysis?" You can't have it both ways, PCMR.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Hmmm... I call bad port. This is utterly ridiculous. A game that tanks a 280X to 23fps at 1440p? That's some bullshit right there and the studio knows it.

 

Some people were obviously cutting a lot of corners if this is what we can expect from a AAA title.

This game should be delayed until the devs can get their shit together and code this game properly. Since when did a 7-fucking80 beat a 390X?

That just pisses me off.

Lol? A 280X and you expect it to run this game at 1440p? What are you smoking, really?? It's a rebadged 7970 GHz Edition from FOUR YEARS AGO.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


far cry 4, nuff said

 

On topic: I'm surprised Majestic hasn't come and said something pointless yet. Otherwise, I'm really wondering why it takes a ~85% more powerful card than the 970 to max the game out at 1440p.

Cos 1440p is 78% more pixels than 1080p? A 970 is perfect for 1080p, so how is it anything new that you need a GPU about 80% more powerful to be able to run at about 80% higher res?
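
For anyone who wants to sanity-check that figure, here's a quick back-of-envelope calculation (purely illustrative Python, assuming the standard 1920x1080 and 2560x1440 resolutions):

# Pixel counts at 1080p and 1440p
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels

# 1440p pushes roughly 77.8% more pixels than 1080p,
# which lines up with needing a GPU about 80% more powerful.
extra = (pixels_1440p / pixels_1080p - 1) * 100
print(f"{extra:.1f}% more pixels")  # prints: 77.8% more pixels

So the ~78% figure checks out, give or take rounding.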

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


I'll legit be scared for the PC gaming industry if newer titles don't start becoming more and more demanding, implementing new technologies and challenging our beloved rigs. I hate the feeling of being held back by the consoles graphics-wise, so in some sort of weird/twisted way the fact that ROTR was not downgraded and is not a cakewalk to run makes me feel good.

I'll go out on a limb here and say that people are probably so angry at the game because we were expecting a "DX12 omg boosted 160+ fps game", since we've had it drummed into us over and over again across the years that DX12 will be the savior, the next big thing, and now that we've got it, we're still getting DX11-performance titles with DX12 eye candy.


Why gamers are surprised that their GTX 760 can't max out a brand new AAA title at max graphics and 1440p boggles my mind.

 

Guys.. you can still play it.. just turn the graphics down!

 

This is why developers show early gameplay footage with amazing graphics, and then when the game comes out it's been scaled way back... because everyone wants to "max out" the game with their $300 GPU. They literally get the exact same experience, but are no longer upset because the game is "maxed out".

 

All they really do is screw over the people that actually have good hardware.

 

This. I remember when Dying Light came out you could set a huge draw distance, which even an OCed 5960X couldn't run at more than 30 fps. Everyone complained the game was "unoptimized", and then the developer simply reduced the max draw distance you could set. And then people were happy they could max out the draw distance with their i5s at 60 fps.

i7 9700K @ 5 GHz, ASUS DUAL RTX 3070 (OC), Gigabyte Z390 Gaming SLI, 2x8 HyperX Predator 3200 MHz


The game has Denuvo and GameWorks and is from a developer who sold out for Xbox One timed exclusivity.

Who in the fuck expected anything close to decent? It would be shocking if it comes out ported anywhere close to decent.

Maybe if it wasn't one of the top selling games on Steam WHEN IT ISN'T EVEN FUCKING OUT YET we would get decent ports.

 

But nope, as long as the general public chooses to be retarded we can't have nice things.

I don't know if there's any other market where consumers blindly hand over cash with complete faith in completely untrustworthy devs.

But go kickstart your Tim Schafer and get fucked for like the 8th time in a row. Literally never learn anything from mistakes.

Which is odd, because playing video games requires learning from your mistakes to progress; this group should be decent at this whole learning thing.

Yeah, keep believing how smart you are and how everyone else is retarded. Then bitch about how the game is a shit port because your almighty 970 won't max it.


Cos 1440p is 78% more pixels than 1080p? A 970 is perfect for 1080p, so how is it anything new that you need a GPU about 80% more powerful to be able to run at about 80% higher res?

Fine, I'll give you that. 

 

I can't find anyone who got such bad performance that they lost 1/5 of their fps. A few fps, sure, just not double digits.

I mean much closer to the game's launch. IIRC the game only ran on the third core before a patch fixed it.

 

???? So if I download Far Cry 4 off nosteam I'm getting 20-30% better framerates than with my UPlay copy of the game?

see above

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


Exactly. This is a Nixxes port. It doesn't really get better than this. Now, this game uses some very advanced features, all at the same time. That will kill performance. I am, however, very impressed with TressFX 3 (PureHair). A 2-4 fps penalty is insanely low for such a feature. Just think of HairWorks on Geralt and how that murdered performance. Sure, Deus Ex: Mankind Divided will have a larger penalty, as NPCs will also have TressFX, but still.

PC gamers are an ungrateful bunch. They whine like manchildren when a game's graphics are slightly downgraded (think Watch Dogs and The Division), yet they whine even more when a game pushes boundaries and takes a subsequent performance hit. Think of the age-old joke, "can it run Crysis?" You can't have it both ways, PCMR.

I see a lot of people actually reporting that PureHair is causing them huge fps drops, but I will have to look into it more. Can't really trust people on the internet nowadays.

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


The 780 Ti is beating a Fury X. The fuck?

Not sure why they included AMD in the testing anyway, only to make them look bad? Yeah, well, they have not released driver support yet, duh.


Not sure why they included AMD in the testing anyway, only to make them look bad? Yeah, well, they have not released driver support yet, duh.

 

They did it for something rather simple: to show the game's performance on all current graphics cards for launch day.

 

People on both sides now know exactly what they can expect to get on the latest drivers from both AMD and NV. If AMD looks bad on performance due to drivers, that's entirely their own fault for being late with the drivers for the game.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


To add to the topic: the PC tech features trailer.

 

CPU: AMD Ryzen 7 3800X Motherboard: MSI B550 Tomahawk RAM: Kingston HyperX Predator RGB 32 GB (4x8GB) DDR4 GPU: EVGA RTX3090 FTW3 SSD: ADATA XPG SX8200 Pro 512 GB NVME | Samsung QVO 1TB SSD  HDD: Seagate Barracuda 4TB | Seagate Barracuda 8TB Case: Phanteks ECLIPSE P600S PSU: Corsair RM850x

 

 

 

 

I am a gamer, not because I don't have a life, but because I choose to have many.

 


They did it for something rather simple: to show the game's performance on all current graphics cards for launch day.

People on both sides now know exactly what they can expect to get on the latest drivers from both AMD and NV. If AMD looks bad on performance due to drivers, that's entirely their own fault for being late with the drivers for the game.

Yes. I'm so tired of people trying to place blame on Nvidia for AMD's own shortcomings.


They did it for something rather simple: to show the game's performance on all current graphics cards for launch day.

 

People on both sides now know exactly what they can expect to get on the latest drivers from both AMD and NV. If AMD looks bad on performance due to drivers, that's entirely their own fault for being late with the drivers for the game.

 

DF had both the 970 and the 390 at the same performance, though; not sure why the 390 from PCGameshardware was performing so poorly.

 

 

The battle between the GTX 970 and the R9 390 is fascinating. With frame-rates unlocked and settings maxed at 1080p, the Nvidia card provides a 47.5fps average, matched up against 48.8fps on AMD - but it doesn't tell the full story. Some sections of gameplay see the GTX 970 pull ahead by up to 5fps, while interior scenes, cut-scenes and close-ups see the R9 390 dominant. 

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


I hope my HTPC can handle it.

 

Anyway, I'm buying this as soon as it comes out. I'm fu**ing hyped.

What resolution are you going to play at on your HTPC?

 

Because DigitalFoundry made a video about the game.

 

Here's the video:

 

https://youtu.be/YHhPOvlnLGM?t=2m8s

CPU: AMD Ryzen 5 5600X | CPU Cooler: Stock AMD Cooler | Motherboard: Asus ROG STRIX B550-F GAMING (WI-FI) | RAM: Corsair Vengeance LPX 16 GB (2 x 8 GB) DDR4-3000 CL16 | GPU: Nvidia GTX 1060 6GB Zotac Mini | Case: K280 Case | PSU: Cooler Master B600 Power supply | SSD: 1TB  | HDDs: 1x 250GB & 1x 1TB WD Blue | Monitors: 24" Acer S240HLBID + 24" Samsung  | OS: Win 10 Pro

 

Audio: Behringer Q802USB Xenyx 8 Input Mixer |  U-PHORIA UMC204HD | Behringer XM8500 Dynamic Cardioid Vocal Microphone | Sound Blaster Audigy Fx PCI-E card.

 

Home Lab:  Lenovo ThinkCenter M82 ESXi 6.7 | Lenovo M93 Tiny Exchange 2019 | TP-LINK TL-SG1024D 24-Port Gigabit | Cisco ASA 5506 firewall  | Cisco Catalyst 3750 Gigabit Switch | Cisco 2960C-LL | HP MicroServer G8 NAS | Custom built SCCM Server.

 

 


 

DF had both the 970 and the 390 at the same performance, though; not sure why the 390 from PCGameshardware was performing so poorly.

 

Looking through their settings, unlike PCGameshardware they didn't max everything out. Eurogamer did not run with Soft Shadows on; they used anisotropic filtering at x2 instead of x16, Tessellation off (though adaptive tessellation is still used for snow deformation), and Dynamic Foliage set to Medium.

They don't state whether they're using HBAO+ either, though.

 

 

They also state the following, which contradicts their own final analysis, where they claim the GTX 970 only averages 47.5fps. How can it only average that, yet somehow keep a solid 60fps with tessellation maxed?

 

It even offers some overhead for boosting settings, where we can start by bumping anisotropic filtering up to 16x. The pursuit of 1080p60 at these same settings takes a little more grunt of course, something we can achieve using a GTX 970. Even with tessellation enabled it sits at a solid 60fps, with only The Village level causing frame-drops below. This is a one-off problem area, and the simple solution is to switch this option off - though alternatively, the issue can be mitigated to a degree via overclocking.    
5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 

