
Rise of The Tomb Raider PC Performance reviewed - Another Bad port?

Mr_Troll

Square Enix talked about this; it's a derivative of TressFX 3.0, built by AMD and Crystal Dynamics.

 

Cool. It ran pretty well in many videos I've seen and looks good too, much better than the 1.0 implementation in the 2013 TR. Not sure if I've seen an AMD logo in the game, though.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Soooo... hard to run now equals shit port? Holy fuck, you guys are conceited. A shitty port is one with technical issues: memory leaks, graphical glitches, sub-par settings options, locked FOV, limited resolution options, poor CPU optimization, etc. This game has none of that. You people are expecting groundbreaking visuals for the same performance requirements as in the past. We are solidly within the era of diminishing returns, so something punishing won't look as ahead-of-its-time as demanding games used to.

This should also be taken as an excellent example of what proper low-level APIs can do, given how this runs on the Xbox.

I wouldn't call it a bad port, but it isn't really well optimized.

Look at games like Star Wars Battlefront: it looks amazing with perfect scaling while running at 2-3x the fps.

RTX2070OC 


I wouldn't call it a bad port, but it isn't really well optimized.

Look at games like Star Wars Battlefront: it looks amazing with perfect scaling while running at 2-3x the fps.

Johan Andersson/DICE and their Frostbite engine games remain the PC industry gold standard for graphics tech, no doubt...

That said, I think the Tomb Raider devs have done a good job. Most people seem to have a good experience, which will only get better with AMD's game-specific driver in Crimson 16.1.1.

The only major issue I see is that the lowest setting is too demanding, since a 750 Ti cannot achieve smooth frame rates.


That bug was fixed within 28 hours of being discovered.

The bug was discovered a few hours after driver release.

28 hours isn't bad, considering the bug was found on a Saturday, and by Monday morning (GMT+1) it was patched.

The bug should've been found and fixed before release. There's no excuse for that. People's GPUs died because of their negligence.

Who gives a smoking pile of shit about the past?

We live in the present. And in the present day, Nvidia's drivers are increasingly buggy, whilst AMD's drivers have been increasingly stable and polished.

What hasn't changed is that Nvidia spits out drivers faster than Wall Street brokers make money, whilst AMD still takes their sweet time.

Like how they took their time with the driver for GTA V? Oh wait, they didn't; it was on time, just like Nvidia's.

I bought the game yesterday and downloaded the drivers. Usage across both GPUs? 50-FUCKING-%. I have 980 Tis in SLI and I was expecting at LEAST 100 FPS. The maximum I get is 80 FPS. GJ Square Enix.

 

GJ from Nixxes on the port indeed; SLI scaling will mostly be fixed with a driver update.

I got a near-constant 90-100+ fps in the temple areas with my system; the snow areas were the only ones with inconsistent FPS drops.

Scaling could certainly be improved, but as far as launches go, this is doing better than The Witcher 3 did for me, and that game now runs brilliantly fully maxed at 1440p, and is still a smidge behind this.
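For anyone wondering how that 80 fps figure translates into a scaling number, here's a rough sketch. The 50 fps single-card baseline is a hypothetical figure for illustration, not a measured one:

```python
# Rough SLI scaling math with hypothetical numbers: a single card
# hitting 50 fps, and an SLI pair hitting 80 fps (as reported above).

def sli_scaling(fps_single, fps_sli):
    """Return the extra performance the second GPU adds, as a percentage
    of one card (100% would be perfect scaling)."""
    return (fps_sli / fps_single - 1.0) * 100.0

single = 50.0   # hypothetical single-card fps
sli = 80.0      # reported fps with two cards

print(f"SLI scaling: {sli_scaling(single, sli):.0f}%")        # prints 60%
print(f"Perfect scaling would give {single * 2:.0f} fps")     # prints 100 fps
```

A driver profile update typically just improves that percentage; it can't exceed 100%.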

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


The game looks totally crippled compared to the first reboot.

 


Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


The game looks totally crippled compared to the first reboot.

 

 

Someone doesn't remember the 2013 Tomb Raider launch performance. TressFX killed performance, unlike Pure Hair (TressFX 3.0). It wasn't even really playable above 1080p at max settings on top dual-GPU setups, or single ones.

 

 

It took later drivers and more game optimisation, especially in regard to TressFX, to fix it all, which later gave a 35% performance increase on many systems.

 

[Benchmark charts from the 2013 Tomb Raider launch: GPU performance at 2560x1440 with FXAA and TressFX, including Titan results]

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


Rise of The Tomb Raider PC Performance Analysis – Gets Tested With AMD and NVIDIA Cards, GeForce Leads The Way

 

Either the game is that demanding, or it's so broken that you really need a 980 Ti to run it properly.

It's obviously scarred by GameWorks. A GTX 780 beats a Fury X??? What a joke, get out of here!

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


It's obviously scarred by GameWorks. A GTX 780 beats a Fury X??? What a joke, get out of here!

That HBAO+ from NVIDIA, y0, it's hardcore. Worse than HairWorks for sure.

Never mind that the game is also running TressFX, which really means it has more advanced AMD tech than NV tech; OR that, unlike NVIDIA, AMD hasn't released a driver for the game, nor even a basic CrossFire profile for it.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


That HBAO+ from NVIDIA, y0, it's hardcore. Worse than HairWorks for sure.

Never mind that the game is also running TressFX, which really means it has more advanced AMD tech than NV tech; OR that, unlike NVIDIA, AMD hasn't released a driver for the game, nor even a basic CrossFire profile for it.

Yeah, it goes to show that they didn't just rehash the game engine from 2013.

Let's hope Crimson comes soon and we can reap the performance gains.

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


I haven't really had any problems with it on my 970 so far. The only time it drops frames is in the cutscenes, for some reason, but apart from that it's been really good. It has slight FPS drops of around 5 fps sometimes, which isn't too bad considering I am running everything at max with FXAA except for the textures, which are set to High; Very High textures use more than 4GB of VRAM and cause stuttering.

CPU: 6600K @ 4.6Ghz | COOLER: H100i GTX | MOBO: Asus Z170 AR | GPU: EVGA GTX 1080 Ti Hybrid | RAM: G.Skill Trident Z RGB 16GB | 

CASE: Corsair 760T | PSU: Corsair RM750x | STORAGE: Samsung 850 EVO 250GB & Seagate 2TB | KEYBOARD: K70 RGB | MOUSE: Deathadder Elite


Yeah, it goes to show that they didn't just rehash the game engine from 2013.

Let's hope Crimson comes soon and we can reap the performance gains.

 

The Crystal Dynamics rep on the Steam forums says AMD users need to wait for the Crimson 16.1.1 drivers and install those. Sadly, they're nowhere to be seen. AMD needed to get these out much sooner, especially with the game's embargo lifting early and day-one performance analyses coming out.

They were able to get a day-one driver out for GTA V, which used both AMD and NV tech as well; this time they're behind, and it's only hurting their user base. :(

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


The Crystal Dynamics rep on the Steam forums says AMD users need to wait for the Crimson 16.1.1 drivers and install those. Sadly, they're nowhere to be seen. AMD needed to get these out much sooner, especially with the game's embargo lifting early and day-one performance analyses coming out.

They were able to get a day-one driver out for GTA V, which used both AMD and NV tech as well; this time they're behind, and it's only hurting their user base. :(

GTA V was different; they actually worked actively with Rockstar.

I don't believe AMD was actively involved in this TR, as it has nVidia banners all over the internet, and I doubt Huang let them be involved.

All I hope is it doesn't turn out like Project Cars; inb4 a 960 beats the original Titan and 780 Ti.

Because I really want to play this game.

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


totalbiscuit did a short video and said it was running pretty well.

He has two Titan Xs, though; waiting on his full port report to come out.

Looking for a job


GTA V was different; they actually worked actively with Rockstar.

I don't believe AMD was actively involved in this TR, as it has nVidia banners all over the internet, and I doubt Huang let them be involved.

All I hope is it doesn't turn out like Project Cars; inb4 a 960 beats the original Titan and 780 Ti.

Because I really want to play this game.

 

Nope; as it stands, the GTX 780 Ti is even giving the GTX 970 a spanking in current tests. Kepler cards are doing very well in this game.

Also, since TressFX is in it, AMD had to have worked directly with Nixxes (the PC port studio) to implement it as well as it has. The difference for me between Pure Hair Off, On, and Very High is only a loss of 4 fps so far, so it's been implemented amazingly well.

Contrast that with The Witcher 3's HairWorks, where NVIDIA helped directly and it still knocked off 15-20 fps for some people.

 

 

totalbiscuit did a short video and said it was running pretty well.

He has two Titan Xs, though; waiting on his full port report to come out.

 
It runs really well on my 980 Tis as well, although with everything maxed, especially textures, I am maxing out my 6GB of VRAM, so the Titan Xs with 12GB are doing very well there.
Interestingly enough, dropping textures to Low dumps VRAM usage to less than 2GB, but doesn't affect my performance at all.
 
5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


The bug should've been found and fixed before release. There's no excuse for that. People's GPUs died because of their negligence.

So you're saying mistakes cannot happen.

Also, whilst there were shitloads of rumors, there were only like 2-3 confirmed deaths. All of them were due to overclocking.

There ARE also rumors that quite a few more GPUs died because people realized this bug was there and subsequently killed their own GPUs, in an attempt to score a new GPU during the massive outcry.

The only way this bug would kill a GPU would be because IT WAS OVERCLOCKED WITH THE PROTECTION SYSTEMS TURNED OFF.

This is only possible through TriXX or MSI Afterburner. AMD OverDrive will not let you pass a certain limit, and your PC will shut down before GPU death. I know, because I've hit that limit more times than I can count.

Fun fact: the moment you try to OC an AMD GPU, you VOID YOUR WARRANTY. AMD specifically states that if you decide to OC, you are responsible for whatever happens.

If their GPUs had been at stock speeds, they wouldn't have died, simply because the stock coolers are more than capable of keeping stock settings in check, even at low fan speeds.
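The protection behaviour described above (throttle first, hard shutdown before damage) can be sketched roughly like this; the thresholds and the throttle step are made-up illustrative numbers, not AMD's actual firmware limits:

```python
# Sketch of the kind of protection loop stock GPU firmware runs:
# shed clocks as temperature climbs, cut power before the silicon cooks.
# Both thresholds below are hypothetical, for illustration only.

THROTTLE_C = 90   # start dropping clocks (hypothetical threshold)
SHUTDOWN_C = 105  # hard power-off before damage (hypothetical threshold)

def protect(temp_c, clock_mhz):
    """Return (new_clock, still_running) for one control step."""
    if temp_c >= SHUTDOWN_C:
        return 0, False                    # emergency shutdown
    if temp_c >= THROTTLE_C:
        return int(clock_mhz * 0.9), True  # shed 10% clock per step
    return clock_mhz, True

print(protect(80, 1000))   # (1000, True)  - within limits
print(protect(95, 1000))   # (900, True)   - throttling
print(protect(110, 1000))  # (0, False)    - shutdown
```

Overclocking tools that disable these checks remove exactly this safety net, which is the scenario being described.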


So you're saying mistakes cannot happen.

Also, whilst there were shitloads of rumors, there were only like 2-3 confirmed deaths. All of them were due to overclocking.

There ARE also rumors that quite a few more GPUs died because people realized this bug was there and subsequently killed their own GPUs, in an attempt to score a new GPU during the massive outcry.

The only way this bug would kill a GPU would be because IT WAS OVERCLOCKED WITH THE PROTECTION SYSTEMS TURNED OFF.

This is only possible through TriXX or MSI Afterburner. AMD OverDrive will not let you pass a certain limit, and your PC will shut down before GPU death. I know, because I've hit that limit more times than I can count.

Fun fact: the moment you try to OC an AMD GPU, you VOID YOUR WARRANTY. AMD specifically states that if you decide to OC, you are responsible for whatever happens.

If their GPUs had been at stock speeds, they wouldn't have died, simply because the stock coolers are more than capable of keeping stock settings in check, even at low fan speeds.

"So you're saying mistakes cannot happen." Ya damn skippy.

"Also, whilst there were shitloads of rumors, there were only like 2-3 confirmed deaths. All of them were due to overclocking." Even if one died, it is still unacceptable. No excuses.

"Fun fact: the moment you try to OC an AMD GPU, you VOID YOUR WARRANTY. AMD specifically states that if you decide to OC, you are responsible for whatever happens.

If their GPUs had been at stock speeds, they wouldn't have died, simply because the stock coolers are more than capable of keeping stock settings in check, even at low fan speeds." Here's a FUN FACT for you: if AMD hadn't put it out without further testing, this wouldn't have happened in the first place. I await further damage control from you.


Johan Andersson/DICE and their Frostbite engine games remain the PC industry gold standard for graphics tech, no doubt...

That said, I think the Tomb Raider devs have done a good job. Most people seem to have a good experience, which will only get better with AMD's game-specific driver in Crimson 16.1.1.

The only major issue I see is that the lowest setting is too demanding, since a 750 Ti cannot achieve smooth frame rates.

Lowest settings are JUST FINE.

Why?

We are on the brink of a new generation of GPUs; the 750 Ti is old as fuck and is only supposed to function at the low end. Honestly, this is the right thing to do. People need to upgrade.

We keep claiming that "consoles are what is killing PC gaming".

No, assholes who still run GTX 480s are what is killing PC gaming. http://anandtech.com/bench/product/1043?vs=1135

Why is this an issue? Because these motherfucking ancient GPUs don't all support DX11, and certainly don't support DX12. Yet people still fucking use 'em.

Because PC is fully forwards and backwards compatible, we "have" to include these kneejerks who run shitty hardware and refuse to upgrade. And yes, most of these people could easily afford an FX 6300 + R9 380 or an i3 4170 + R9 380. They simply don't care. And their careless, shitty attitude means the rest of us have to deal with games being DOWNGRADED to run on said hardware.

A game being too hard to run on an ultra-low-end GPU? BUY SOMETHING BETTER.


"So you're saying mistakes cannot happen." Ya damn skippy.

"Also, whilst there were shitloads of rumors, there were only like 2-3 confirmed deaths. All of them were due to overclocking." Even if one died, it is still unacceptable. No excuses.

"Fun fact: the moment you try to OC an AMD GPU, you VOID YOUR WARRANTY. AMD specifically states that if you decide to OC, you are responsible for whatever happens.

If their GPUs had been at stock speeds, they wouldn't have died, simply because the stock coolers are more than capable of keeping stock settings in check, even at low fan speeds." Here's a FUN FACT for you: if AMD hadn't put it out without further testing, this wouldn't have happened in the first place. I await further damage control from you.

You are way too naive to converse with, it seems.

Once you crawl out of your utopia, wake up and face reality.

I have one thing to tell you before I put you on my ignore list, though.

https://en.wikipedia.org/wiki/Murphy's_law

Anything that can go wrong, will go wrong.

No matter the QC, no matter the testing: apply time, and eventually shit will happen.

 


Fun fact: the moment you try to OC an AMD GPU, you VOID YOUR WARRANTY. AMD specifically states that if you decide to OC, you are responsible for whatever happens.

If their GPUs had been at stock speeds, they wouldn't have died, simply because the stock coolers are more than capable of keeping stock settings in check, even at low fan speeds.

Just a question: how would AMD, or anyone, know that the GPU has been overclocked?


Also, since TressFX is in it, AMD had to have worked directly with Nixxes (the PC port studio) to implement it as well as it has.

I don't think so; TressFX is open source, and they don't need any validation from AMD to implement it.

Also, they had TressFX in their previous engine.

Would you know if it's the old TressFX 2.0 from TR2013, or did they put in the new TressFX 3.0?

I'm so stoked to play the game; getting my PSU back on Monday.

TressFX vs. HairWorks: TressFX is just better. HairWorks does nothing but tessellate; it has no logic, and it can't be optimised.

Higher-resolution textures don't require more power to render, they just require more memory to store; GPU time goes into lighting, reflections, shadows, illumination, and post-processing.
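That storage-versus-shading point can be put into numbers. A back-of-envelope sketch, assuming uncompressed RGBA textures and the usual ~4/3 mip-chain overhead (1 + 1/4 + 1/16 + ...):

```python
# Back-of-envelope texture VRAM math: storage cost depends on resolution
# and bytes per pixel, not on how expensive the texture is to shade.
# A full mipmap chain adds roughly one third on top of the base level.

def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one texture, in MiB."""
    base = width * height * bytes_per_pixel
    if mipmaps:
        base = base * 4 // 3   # geometric series of mip levels ~ 4/3
    return base / (1024 * 1024)

# A single uncompressed 4096x4096 RGBA texture:
print(f"{texture_mib(4096, 4096):.0f} MiB")  # prints 85 MiB with mip chain
```

This is why a higher texture setting eats VRAM without costing much frame time, as long as everything still fits on the card.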

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


I don't think so; TressFX is open source, and they don't need any validation from AMD to implement it.

Also, they had TressFX in their previous engine.

Would you know if it's the old TressFX 2.0 from TR2013, or did they put in the new TressFX 3.0?

I'm so stoked to play the game; getting my PSU back on Monday.

TressFX vs. HairWorks: TressFX is just better. HairWorks does nothing but tessellate; it has no logic, and it can't be optimised.

Higher-resolution textures don't require more power to render, they just require more memory to store; GPU time goes into lighting, reflections, shadows, illumination, and post-processing.

 

It's an implementation of TressFX 3.0, as Square Enix is using it in the upcoming Deus Ex game as well.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


This topic is now closed to further replies.

