
Another day, another awful console-to-PC port release (The Last of Us: Part 1)

11 minutes ago, Kisai said:

then a RTX 3080 is the minimum requirements.

Why an RTX 3080? Pretty much all the AMD GPUs run it fine and the RTX 3060 12GB does too. You'd have to be exceedingly out of touch to play this style of game and say 47/39 FPS is not enjoyable. Nor do you need to play at Ultra, since that does basically nothing compared to High other than making it run worse, and you also have DLSS if you really want to run Ultra. This is not a competitive shooter.

 

1080p High on the RTX 3060 12GB is 63/55 btw, and all the 8GB GPUs have no issues with this configuration other than the Intel A750 (which is basically irrelevant).


23 minutes ago, leadeater said:

Why an RTX 3080? Pretty much all the AMD GPUs run it fine and the RTX 3060 12GB does too. You'd have to be exceedingly out of touch to play this style of game and say 47/39 FPS is not enjoyable. Nor do you need to play at Ultra, since that does basically nothing compared to High other than making it run worse, and you also have DLSS if you really want to run Ultra. This is not a competitive shooter.

 

1080p High on the RTX 3060 12GB is 63/55 btw, and all the 8GB GPUs have no issues with this configuration other than the Intel A750 (which is basically irrelevant).

Did you look at the graph showing how 1080p Medium was just scraping the 10GB VRAM line?

 

If it's not hitting 60fps, it's not meeting the minimum requirements, period. Maybe it's barely playable at Medium on an 8GB card. Who knows, I'm not one of those people with 50 GPUs to test.

 

But it's not an unreasonable ask that a game that came out 10 years ago perform AT LEAST as well as it did 10 years ago on BETTER hardware.

 


4 minutes ago, Kisai said:

Did you look at the graph showing how 1080p Medium was just scraping the 10GB VRAM line?

I kinda went over that in a previous post. Based on other charts in that video, 8GB cards seemed to be ok still at 1080p High and Medium regardless of what the observed VRAM usage was on the 6950XT. By "ok" I mean the lows were not dropping like the earlier screenshot posted by WereCat.

 

4 minutes ago, Kisai said:

If it's not hitting 60fps, it's not meeting the minimum requirements, period. Maybe it's barely playable at Medium on an 8GB card. Who knows, I'm not one of those people with 50 GPUs to test.

From the video, 1080p high averaged over 60fps on 3060, 3060 Ti, 3070, 6600XT+ and more. 1080p medium extends that down to 3050 and 6600. All without massive drops to lows.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


So like I said... on my computer, when it runs, it runs at between 41 and 58 fps... 5800X with an RX 570 4 GB, 2 monitors, the main one 1920x1200, the second one 1080p.

 

The game runs with all settings on lowest or off, rendering at 1280x800 with FSR2 scaling to 1920x1200 on the Quality preset. At these settings the game uses around 4700 MB of VRAM, on top of the ~850 MB the system uses - if I disabled the second monitor and switched Windows to a basic theme, that 850 MB might go down a bit.

Lowering the rendering resolution or the FSR2 quality preset doesn't seem to change much - maybe it lowers that 4700 MB by around 100-150 MB - but it makes the game look horrible... there's no point rendering at 640x360.

 

I think they just need to tweak the game to get it to use maybe around 3500 MB of VRAM at the absolute lowest, or just raise the minimum requirements to a 6 GB VRAM card like the GTX 1060, with a minimum of 8 GB on AMD, like the RX x70 / x80 / x90 8 GB cards.

In my case, having only 16 GB of actual memory may also be an issue, as the game climbs to 9-10 GB before it crashes, even with the Oodle2 decompression library changed so that it won't leak memory like crazy.


Previously : Can it run Crysis ?

 

Now : Can it run TLoU Part 1 ?

 

 

I guess ?

 

Man, I've been wanting to play TLoU Part 1 for ages, but didn't want to buy a console.

I was excited when I heard news about the port; should've known there'd be something wrong....

Holy shit that VRAM usage

 

What's next? A Bloodborne port needing 16GB of VRAM for 1080p Medium? <_<

There is approximately 99% chance I edited my post

Refresh before you reply

__________________________________________

ENGLISH IS NOT MY NATIVE LANGUAGE, NOT EVEN 2ND LANGUAGE. PLEASE FORGIVE ME FOR ANY CONFUSION AND/OR MISUNDERSTANDING THAT MAY HAPPEN BECAUSE OF IT.


On 3/30/2023 at 1:28 PM, hysel said:

 I work in the enterprise world, where software testing is paramount for a successful product release.

You can't test a 10 year old game thoroughly. Somebody may leak the plot. Set some realistic expectations! ¯\_(ツ)_/¯

 

 

But who knows how the timeline was set. Maybe they were like "with the success of the TV show, we want the PC port a week ago". But that's not how it happened - it was announced in July 2022 that the PC port would launch soon after the PS5 version.

 

The Intel Arc A770 beats the 3070 in "The Last of Us Part 1"! What a time to be alive! 🤡


Hey... it could be worse. It could be the dumpster fire that is Part 2.

I'm sure we'll have that to 'look forward' to.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


18 minutes ago, HenrySalayne said:

You can't test a 10 year old game thoroughly. Somebody may leak the plot. Set some realistic expectations! ¯\_(ツ)_/¯

 

 

But who knows how the timeline was set. Maybe they were like "with the success of the TV show, we want the PC port a week ago". But that's not how it happened - it was announced in July 2022 that the PC port would launch soon after the PS5 version.

 

The Intel Arc A770 beats the 3070 in "The Last of Us Part 1"! What a time to be alive! 🤡

A 10-year-old game's plot is most likely already quite well known anyway. 😅

 

Although, considering the game's roughly 10,000 "Mostly Negative" reviews....

The game's title might really end up as an unintentional declaration by the devs.

 

I'm guessing somebody at the company rushed the game out thinking there'd be hotfixes to save the day, and it backfired splendidly.



2 hours ago, WereCat said:

You already hit a brick wall at 1080p if you set textures to Ultra with 8GB of VRAM.... On high it works fine though.

 

Look at the 1% lows with the 8GB cards: an unplayable stutter mess. The RTX 3060 with its 12GB is beating the 3070 Ti.

 

[image attachment]

You know it's a bad port when I can play Cyberpunk, which is more demanding, at a much higher frame rate (6800 XT).

 

 

[image attachment: 1080p_Ultra.png]

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-DH15

 

 

 


1 hour ago, porina said:

I kinda went over that in a previous post. Based on other charts in that video, 8GB cards seemed to be ok still at 1080p High and Medium regardless of what the observed VRAM usage was on the 6950XT. By "ok" I mean the lows were not dropping like the earlier screenshot posted by WereCat.

 

My working theory here is that the game doesn't scale "downward" to under 10GB - it's interesting to note how it falls right below that line. Video memory usage is representative of the textures, models, shaders, etc. combined. If you over-allocate video memory, it is going to use system memory, and at that point on a PC you're in trouble. I may have 96GB of memory in the desktop, but if the GPU allocates 32GB, 8GB of that is coming out of system memory and runs at 25GB/s, not 935GB/s. And that memory has to be shared with EVERYTHING.
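As a rough back-of-the-envelope sketch of why that spill hurts, using the bandwidth figures above (the 2 GB per-frame working set is an assumed number, not a measurement from the game):

```c
/* Rough sketch: time to stream an assumed 2 GB per-frame working set from
 * VRAM vs. from system RAM over PCIe, using the bandwidths quoted above. */
#include <stdio.h>

int main(void) {
    const double working_set_gb = 2.0;   /* assumed GB of data touched per frame */
    const double vram_bw_gbps   = 935.0; /* RTX 3080-class GDDR6X bandwidth */
    const double pcie_bw_gbps   = 25.0;  /* roughly PCIe 4.0 x16, one direction */

    printf("from VRAM:         %.2f ms per frame\n", working_set_gb / vram_bw_gbps * 1000.0); /* ~2 ms  */
    printf("over PCIe/sys RAM: %.2f ms per frame\n", working_set_gb / pcie_bw_gbps * 1000.0); /* ~80 ms */
    return 0;
}
```

At ~80 ms just to move the data, you're capped around 12 fps before the GPU has drawn anything, which is exactly the kind of 1% low collapse the 8GB cards show at Ultra.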

 

 

1 hour ago, porina said:

From the video, 1080p high averaged over 60fps on 3060, 3060 Ti, 3070, 6600XT+ and more. 1080p medium extends that down to 3050 and 6600. All without massive drops to lows.

Again, if it's not running at 60fps out of the box, that is a problem. Honestly, I would be surprised if ANY new game can do 1080p60 on an x50 part.

 

Now take into account all the other problems people have, and it points right back to trying to run it on hardware that is too weak.

 

Honestly, what I wish Steam would do is put the reviewer's Steam survey results beside their reviews, so we know when reviews are based on people running high-end games on potatoes.


2 hours ago, Kisai said:

If it's not hitting 60fps, it's not meeting the minimum requirements, period.

For you*. I played this game at its original release on the PlayStation, and you want to know what the frame rate was? Exactly; at no point while playing it did I need more.

 

60 FPS is a personal requirement that you have set for yourself. At no point should you be declaring that the minimum requirement for a game should be XYZ or some high-end GPU when the game is perfectly playable on less than that; minimum requirements are not bound to your personal choices.

 

You can say you'd recommend XYZ, as there are also official recommended specifications, but similarly I wouldn't go around saying Ultra settings are a minimum or recommended requirement at any resolution.

 

The RTX 3060 meets your absurd requirement anyway; to quote myself from the post you are replying to:

Quote

1080p High on the RTX 3060 12GB is 63/55 btw

"Hitting" 60 FPS has been obtained.

 

20 minutes ago, Kisai said:

Again, if it's not running at 60fps out of the box, that is a problem. Honestly, I would be surprised if ANY new game can do 1080p60 on an x50 part.

And yet the RTX 3050 gets 50/43 at 1080p High and 71/55 at Medium, so both settings are quite adequate.

 

I would say a valid criticism is that the game isn't picking the correct presets for GPUs; an RTX 3050 should be defaulting to 1080p High. That's what I would be targeting if I were the developer/publisher.

 

Please don't be an elitist gatekeeper, it's quite unnecessary.


2 minutes ago, leadeater said:

 

The RTX 3060 meets your absurd requirement anyway, to quote myself in the post you are replying to

 

"Hitting" 60 FPS has been obtained.

[image attachment]

Again: where is the 60fps line? None of the 8GB cards reach it. The card with the lowest amount of VRAM there that does is the 3080.

 

[image attachment: vram.png]

https://www.techpowerup.com/review/the-last-of-us-benchmark-test-performance-analysis/4.html

[image attachment: performance-1920-1080.png]

 

You know what is hilarious?  4K

[image attachment: performance-3840-2160.png]

If you're playing 4K on the highest settings, you're not getting 60fps from ANY card except the highest end ones.

 

Even the conclusion notes that:

Quote

Not only rendering performance requirements are high, but VRAM is also challenging. Even at 1600x900 we measured allocation of 10.5 GB, which of course doesn't mean that every single frame touches all the memory. Still, cards with 8 GB do encounter a small FPS penalty even at 1080p. At 4K, the VRAM usage is 14 GB, which makes things challenging for RTX 3080 10 GB and RTX 4070 Ti 12 GB—both these cards drop well below their AMD counterparts at 4K. 

 

If they somehow manage to push that VRAM allocation down by 2GB at 1080p, then it should be hitting 60fps on all the 8GB cards. 4Kp60? Impossible for the 10GB and 12GB cards. 

 

So my conclusion is unchanged. The game is not designed for 8GB cards. And clearly the target was 10GB or better cards.

 


1 minute ago, leadeater said:

You don't have to run Ultra preset

IT DOESN'T SEEM TO MATTER WHAT SETTINGS YOU PICK

 

The difference between "low" and "Ultra" does not change the texture sizes.


2 minutes ago, Kisai said:

IT DOESN'T SEEM TO MATTER WHAT SETTINGS YOU PICK

YES IT DOES!!!

 

OK, being stupid aside, check the damn 1080p High chart; I'm not going to feed it to you or argue it again. I have literally given you the FPS numbers twice now, so either read what I write or stop replying.


5 minutes ago, Kisai said:

IT DOESN'T SEEM TO MATTER WHAT SETTINGS YOU PICK

WHY ARE WE SCREAMING?

 

And to be fair, it matters a tiny little bit:

5 hours ago, leadeater said:

 

[image attachment]

 


1 minute ago, HenrySalayne said:

And to be fair, it matters a tiny little bit:

It matters a lot; it's the difference between stable and stutter garbage. If you never check the performance numbers and just look at VRAM usage/allocation, then you aren't doing yourself any favors or keeping yourself informed.

 

[image attachment]


11 minutes ago, leadeater said:

You don't have to run Ultra preset

This is true, but the fact remains that with more VRAM you COULD run the Ultra preset just fine, as the limiting factor is not GPU performance but the amount of VRAM. That said, there are only a handful of games where this happens so far; the issue is if it becomes a trend, which will be a problem especially if the lower-end cards yet to release ship with the same amount of VRAM.


Just now, WereCat said:

This is true, but the fact remains that with more VRAM you COULD run the Ultra preset just fine, as the limiting factor is not GPU performance but the amount of VRAM. That said, there are only a handful of games where this happens so far; the issue is if it becomes a trend, which will be a problem especially if the lower-end cards yet to release ship with the same amount of VRAM.

Oh absolutely, and this does not bode well for 8GB GPUs going forward. Games should at least be getting the default preset selection correct for a given GPU and resolution though; nobody should be graced with a stutter mess by default when it's so easily avoided.

 

The great thing about PCs is the vast variance in hardware and graphical settings; railroading everyone into Ultra 1440p/4K is simply unfair to the actual majority. It's like people want to make PCs more and more of a second-class afterthought to consoles because the platform is an unobtainable halo to the "common person". Why bother putting the effort in for only "10%" of the total sales of the game? Welcome to the self-defeating prophecy mindset.


I think we're over-analysing the situation here. The game's stated minimum and recommended requirements are 4GB and 8GB of VRAM respectively. We probably don't have to really worry until 8GB becomes the minimum. We have usable settings that work with 8GB GPUs. This may be a case of "allocated" not equalling "actually used".


A game dev might look at the potential Steam market at each VRAM minimum size. I'm using numbers from the Steam Hardware Survey (Feb 2023):

4GB: 79.3%

6GB: 64.5%

8GB: 45.1%

10GB: 17%

Feels like supporting <8GB will remain for some time unless devs want to do a technical showcase.
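A quick sketch of what those shares mean for reach, reading each figure as the fraction of surveyed users with at least that much VRAM (that reading is my assumption):

```c
/* Turn the Steam survey shares quoted above into "reach vs. excluded" at each
 * possible VRAM floor. Interpreting each share as "at least this much VRAM"
 * is an assumption. */
#include <stdio.h>

int main(void) {
    const char  *floor[] = { "4GB", "6GB", "8GB", "10GB" };
    const double share[] = { 79.3, 64.5, 45.1, 17.0 };

    for (int i = 0; i < 4; i++)
        printf("require >= %-4s -> reach %4.1f%% of users, exclude %4.1f%%\n",
               floor[i], share[i], 100.0 - share[i]);
    return 0;
}
```

Under that reading, an 8GB floor already excludes more than half the surveyed user base.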

 

Also, we're focusing on fps numbers, but has anyone actually looked at what image quality these presets give?

 

12 minutes ago, leadeater said:

Games should at least be getting the default preset selection correct for a given GPU and resolution though; nobody should be graced with a stutter mess by default when it's so easily avoided.

Maybe I missed it, is that the case here? Does the game not default to reasonable settings for the hardware?



These are the posted system requirements; see the picture below.

 

As I said, with a 5800X with 16 GB of RAM and an RX 570 4 GB, the game is unplayable for me; it crashes within minutes.

It uses nearly 5 GB of VRAM with all graphics settings on Low or off, rendering at 1280x720 or thereabouts. With the 500-800 MB of VRAM used by Windows, it's no wonder it's unstable and crashes, as the game has to constantly shuffle stuff between VRAM and RAM.

The recommended preset... I'm not sure 1080p High would work on 8 GB cards, but Low-Medium maybe... it would work, but with lots of stutter and 10-20 fps 1% lows, because the game uses 9-10 GB of VRAM.

 

[image attachment: the-last-of-us-pc-specs.jpg]


17 minutes ago, porina said:

Maybe I missed it, is that the case here? Does the game not default to reasonable settings for the hardware?

Honestly, no idea at all; I'm just assuming it's a problem since everyone seems to be complaining about performance and crashing. Basically the only part that interested me was seeing how high those VRAM numbers were, dragging that good old VRAM capacity debate kicking and screaming back into prominence. It might actually start to matter again. It seems to run in decade-ish cycles: VRAM matters, GPUs bump capacities up, and then it stops mattering for another 10 years.


It's so weird seeing VRAM be the issue. I'm so used to the amount of VRAM just not actually mattering for games in any real way. And it's been that way up until the 3000 series, when Nvidia introduced compressed streaming in RAM where, in a perfect world, 8 GB = 12 GB because of compression, but it just does not seem to have panned out.

I noticed this with the RE Engine, with all their games being able to just yeet through over 12 gigabytes at just 1440p when RE7 dropped if you configured it right, but you could always lower the settings to avoid that.

The last two generations are the first time in forever (as in, I have paid close attention since the 200-series cards) that it's something normal consumers even need to consider.


20 minutes ago, porina said:

I think we're over-analysing the situation here. The game's stated minimum and recommended requirements are 4GB and 8GB of VRAM respectively. We probably don't have to really worry until 8GB becomes the minimum. We have usable settings that work with 8GB GPUs. This may be a case of "allocated" not equalling "actually used".

Allocated means it's allocated. You can malloc 32GB of RAM and only ever write to the first 10 bytes, meaning the rest of it is wasted. Sure, I could believe that they "allocated 10GB" of video memory even at low 720p settings, but the way textures work means those 4K textures still have to be loaded and mipmapped downwards. 
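As an aside on the allocate-vs-use distinction being argued here: on the CPU side at least, address space that is reserved but never written typically isn't backed by physical pages. A minimal Linux-only sketch, purely illustrative and nothing to do with the game's own allocator:

```c
/* Minimal Linux-only sketch: malloc() reserves address space, but physical
 * pages only become resident once they are actually written. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void print_rss(const char *label) {
    /* VmRSS in /proc/self/status = physical memory currently resident */
    FILE *f = fopen("/proc/self/status", "r");
    char line[256];
    while (f && fgets(line, sizeof line, f))
        if (strncmp(line, "VmRSS:", 6) == 0)
            printf("%-14s %s", label, line);
    if (f) fclose(f);
}

int main(void) {
    char *buf = malloc(8ull << 30);     /* "allocate" 8 GB of address space */
    if (!buf) return 1;
    print_rss("after malloc:");         /* resident size stays a few MB */

    memset(buf, 1, 64ull << 20);        /* touch only the first 64 MB */
    print_rss("after memset:");         /* resident grows by ~64 MB, not 8 GB */

    free(buf);
    return 0;
}
```

Whether the game's reported VRAM numbers are closer to "reserved" or "hot" is exactly the part we can't see from the outside.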

 

A theoretical scenario here is that they upload 4GB of 4K textures and mipmap them down to 2K so the GPU can operate reasonably on them at 1080p. The game is unlikely to be shipping with pre-scaled-down textures. But it's also just as likely that they simply use the 4K textures and don't care. We have no insight into this unless someone data-mines the game with debugging tools to actually see what is being sent to the GPU.
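To put hypothetical numbers on that, here's what a single 4096x4096 RGBA8 texture costs with and without its top mip level resident (the format and sizes are assumptions for illustration, not the game's actual assets):

```c
/* Hypothetical illustration: VRAM cost of one 4096x4096 RGBA8 texture with its
 * full mip chain, vs. the same texture streamed in without the top mip level. */
#include <stdio.h>

static double mip_chain_mib(int base_dim, int dropped_top_levels) {
    double bytes = 0.0;
    int level = 0;
    for (int dim = base_dim; dim >= 1; dim /= 2, level++)
        if (level >= dropped_top_levels)
            bytes += (double)dim * dim * 4.0;   /* RGBA8 = 4 bytes per texel */
    return bytes / (1024.0 * 1024.0);
}

int main(void) {
    printf("4096^2 + full mip chain:      %.1f MiB\n", mip_chain_mib(4096, 0)); /* ~85.3 MiB */
    printf("same texture, top mip absent: %.1f MiB\n", mip_chain_mib(4096, 1)); /* ~21.3 MiB */
    return 0;
}
```

If the 4K level of every material stays resident at 1080p, the texture footprint is roughly 4x what mip streaming would need, which is the kind of gap that could explain the allocation numbers people are seeing.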

 

 

 

20 minutes ago, porina said:

 


A game dev might look at the potential Steam market at each VRAM minimum size. I'm using numbers from Steam Hardware Survey (Feb 2023)

4GB: 79.3%

6GB: 64.5%

8GB: 45.1%

10GB: 17%

Feels like supporting <8GB will remain for some time unless devs want to do a technical showcase.

Unreal Engine 5.x is likely the end of <12GB cards. Just from what's been observed, it's more likely that 16/24/32/48GB cards will be the x60/x70/x80/x90 configuration going forward in the arms race for a 4Kp60 baseline, especially seeing as no GPU can do 8K with any capability and no 8K monitors exist (yes, I'm aware that NVIDIA regularly touts 8K, but it has yet to show off a game running at native 8K at any framerate, only video).

 

20 minutes ago, porina said:

Also, we're focusing on fps numbers, but has anyone actually looked at what image quality these presets give?

Doesn't appear to be so.

 

20 minutes ago, porina said:

Maybe I missed it, is that the case here? Does the game not default to reasonable settings for the hardware?

Also doesn't appear to be so, or people wouldn't be screaming about it.

 

Like, keep in mind that the out-of-the-box expectation is always a 1080p60 experience, and has been for at least 10 years (which is how old the PS4 is). If a game can't do it, then people will fundamentally believe it's broken when that $500 last-gen console ends up being a better experience.

 

People excusing the PC experience by going "oh, just turn down the settings" are missing the point, or are willingly ignoring it. Why should you spend all this money on a PC to get an inferior experience to the console version of the exact same game?

 

If you insist on playing games on x50 and x60 parts, you should not expect anything more than a 1080p60 experience in a new game. But you should also expect the out-of-the-box experience to equal a 1080p60 experience on the PS5; otherwise you would have bought a PS5 instead of spending four times as much on a PC.


19 minutes ago, mariushm said:

The recommended preset... I'm not sure 1080p High would work on 8 GB cards, but Low-Medium maybe... it would work, but with lots of stutter and 10-20 fps 1% lows, because the game uses 9-10 GB of VRAM.

HUB testing didn't show stutter at 1080p High on 8GB GPUs. The stutter with 8GB cards happened at Ultra and/or 4K settings.

 

3 minutes ago, Kisai said:

Allocated means it's allocated. You can malloc 32GB of RAM and only ever write to the first 10 bytes, meaning the rest of it is wasted.

As far as game performance goes, if you're not actually using the allocated amount, the unused portion would eventually be swapped out and only the active part kept on hand. This is what we don't know.

 

3 minutes ago, Kisai said:

Unreal Engine 5.x is likely the end of <12GB cards.

If a dev is happy to address what is currently 12% of the active Steam userbase, that is their decision. That proportion will increase over time, but outside of tech demos I struggle to see 8GB going unsupported for at least a gen or two.

 

3 minutes ago, Kisai said:

Like, keep in mind that the out-of-the-box expectation is always a 1080p60 experience, and has been for at least 10 years (which is how old the PS4 is). If a game can't do it, then people will fundamentally believe it's broken when that $500 last-gen console ends up being a better experience.

Not all games are 60fps on console. 30fps is still a thing.


