42 minutes ago, PolishGod said:

I usually play Minecraft and TF2 because I am very good at those and that's where I need high FPS

Competitive gamer, I see, but going past 240 Hz still seems kinda pointless IMO unless you are going for competitions or something.

 

I was already demolishing a lot of PC players with this pile-of-crap Vivo at somewhere around 30 fps back in '21-'23, when I was playing really competitive MC Bedrock. At some point I realized the low fps was screwing me over with worse bridging ability and block placing (which they seem to have fixed somewhat in the newer versions), paired with some inconsistencies in the controls, so I basically just started raging and tryharding even harder a lot of the time, because I knew I was being limited by hardware.

 

It's kind of like getting so pissed that you become a better player to overcome hardware issues.

 

I was considering the Xperia 1 originally, but now I'm more interested in the AQUOS Zero 2 due to the 240 Hz display, though TBH even 120 will be enough for me. If I ever have to upgrade, I'll probably buy another AQUOS, since they are well ahead of everyone else in high-refresh displays, and the AQUOS R9 makes the new ROG 9 look like a complete joke display-wise.

 

6 minutes ago, Blue4130 said:

Practice will make you a better gamer. But in regard to Minecraft, how do you become a "better" player?

Minecraft PvP is a thing; there's also some skill in playing survival, like speedrunning the game and whatnot.


3 minutes ago, Somerandomtechyboi said:

Minecraft PvP is a thing; there's also some skill in playing survival, like speedrunning the game and whatnot.

If you're getting 200 fps already in Minecraft, 500 fps will not really give you any advantage.

Note: Users receive notifications after Mentions & Quotes. 

Feel free to ask any question; no matter what it is, I will try to answer. I know a lot about PCs, but not everything.

current PC:

Ryzen 5 5600 | 16GB DDR4 3200MHz | B450 | GTX 1080 Ti [further details on my profile]

PCs I used before:

  1. Pentium G4500 | 4GB/8GB DDR4 2133MHz | H110 | GTX 1050
  2. Ryzen 3 1200 3.5GHz (OC: 4GHz) | 8GB DDR4 2133MHz / 16GB 3200MHz | B450 | GTX 1050
  3. Ryzen 3 1200 3.5GHz | 16GB 3200MHz | B450 | GTX 1080 Ti

Like others said, you might be CPU bound before you're GPU bound in the games you mentioned as your priority. Especially Minecraft; chunk render distance mostly affects the CPU, IIRC. Or even engine bound: some games can start to have issues above a certain framerate.

 

Your performance goals in the original post are far, far, far into diminishing returns territory. As are the GPUs with any hope of achieving them (the 4090 currently, the 5090 as soon as it releases). Bring them down to something sensible and a reasonable high-end GPU like the 7900 XTX is a solid choice.
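To put rough numbers on the diminishing returns (my own napkin math, not from any review): a frame at a given framerate lasts 1000/fps milliseconds, so the frame-time savings shrink fast as fps climbs:

```python
# Frame time in milliseconds at a given framerate: 1000 / fps.
for fps in (60, 144, 240, 500):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

# Going 60 -> 240 fps saves ~12.5 ms per frame;
# going 240 -> 500 fps saves only ~2.2 ms more.
```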

 

I would not advise trying for 4K high refresh, regular 4K60 is brutal enough. 1440p is 3,686,400 pixels. 4K (3840x2160, the most common 4K resolution) is over double that, at 8,294,400 pixels per frame. I have a 4K60 panel and I do sometimes consider dropping down to a 1440p display as I really don't want to pony up for a GPU that can reliably drive that.
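For anyone who wants to check the pixel math, it's just the resolutions multiplied out (a quick Python sanity check, nothing more):

```python
# Pixels per frame at each resolution.
px_1440p = 2560 * 1440   # 3,686,400
px_4k    = 3840 * 2160   # 8,294,400 (UHD, the most common "4K")

print(f"1440p: {px_1440p:,} px")
print(f"4K:    {px_4k:,} px")
print(f"4K pushes {px_4k / px_1440p:.2f}x the pixels of 1440p")  # 2.25x
```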

 

Don't knock upscaling and frame generation until you try them in each game. Current DLSS, XeSS, and FSR implementations can be very good, especially on the higher quality options, but it does depend on the game. In some it's kinda blurry; in others it'll be totally crisp. Frame generation is similar (I wouldn't recommend it for anything multiplayer, but singleplayer games can be fine). I have it enabled in Horizon Forbidden West and I can't tell it's on, but in Space Marine 2 it feels choppy/laggy even though it hits a solid 60+ fps. So they aren't totally dependable, but they can be useful in games where they're implemented well, especially if you're more sensitive to framerates than to a possible drop in graphical fidelity (thankfully I am not; the only time I could notice the difference between 60Hz and 144Hz+ without looking for it was in Titanfall 2, and I don't play that anymore).

 

What CPU are you using? If it's as old as the 1070, then it'll choke any current high-end GPU.

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i5 12600KF

Cooler: Noctua NH-L12S

Motherboard: ASRock Z690 ITX/ax

RAM: 2x16GB 3600MHz DDR4

GPU: Intel ARC A770 16GB LE

Storage: 1TB MP34 + 2TB P41 + 2x 1TB MX500

ODD: LG WH14NS40

PSU: EVGA 850W GM

Case: Silverstone Sugo 14

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 15" M3 MacBook Air (work) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, flash, running Rockbox), Nintendo Switch

 

Vehicles: 2002 Ford F150, 2022 Kawasaki KLR650


5 hours ago, PolishGod said:
...I need to get at least 500 fps when in 1440p (would like to get 500 fps with a 4K monitor) when recording and streaming, at least for TF2 and Minecraft...
 
...In other games like Hogwarts Legacy, GTA 5-6 and Wii games, I only need maybe 240...
 
...but if I wanted to get one right now, this is the one I would get, it's the closest thing I found to an OLED, low response time, 500 Hz 4K...

Just get a 5090 and stop thinking about it

 

Or go outside and do more meaningful things in life, and watch it all happen at infinite fps with full raytracing for $0.


17 hours ago, Zando_ said:

...Like others said, you might be CPU bound before you're GPU bound in the games you mentioned as your priority...

...Your performance goals in the original post are far, far, far into diminishing returns territory...

...What CPU are you using? If it's as old as the 1070, then it'll choke any current high-end GPU...

I said what components I use in the post; I use an i9-13900K.

"Your performance goals in the original post are far, far, far into diminishing returns territory. As are the GPUs with any hope of achieving them (the 4090 currently, 5090 as soon as it releases). Bring them down to something reasonable and a reasonable high-end GPU like the 7900 XTX is a solid choice."

48 fps while interpolating, and 500 fps in TF2 and Minecraft, are reasonable framerates. I already get maybe 600 fps while playing on mcmanhunt; I'm just going to need a GPU that lets me bump up the settings while keeping the same FPS.

There is no 500 Hz 4K OLED monitor; the one I'm looking at is a 1440p 480 Hz OLED. I will probably not get a new monitor for like five years though, and there will be a better one by then. The lowest resolution I'll go down to is 1080p, but refresh rate is the most important part to me.

"Don't knock upscaling and frame generation until you try them in each game" Have you seen me play TF2? I quickly turn around, very often, isn't it going to look weird if it predicts I'll keep looking forward but then suddenly look back?

You don't recommend it for anything multiplayer? Well, that's Minecraft and TF2; I do not play singleplayer often anymore, unless I want to do something like play against a terminator or beat the game on a timer.

About you only telling 60 from 144 Hz apart without looking for it that one time: I noticed my monitor's refresh rate was set to about half of its normal 242 Hz (+2 Hz from overclocking) because of my cursor, and it's similar in TF2.

 

So, about my 13th-generation CPU: do you think it's strong enough for a card like the RX 7900 XTX? Does that card have good price/performance?


22 minutes ago, PolishGod said:

I said what components I use in the post; I use an i9-13900K.

In my defense, it's a wall of text and hard to read. 

 

I don't know why you mentioned 4K if you mean to play on 1440p for 5 years; any current card will be up for replacement by then anyway.

22 minutes ago, PolishGod said:

You don't recommend it for anything multiplayer? Well, that's Minecraft and TF2; I do not play singleplayer often anymore, unless I want to do something like play against a terminator or beat the game on a timer.

Are "Hogwarts Legacy, GTA 5-6 and wii games" not single-player titles? GTA Online isn't, but it runs the same as GTA5 as it's part of the same game, I assume GTA6 will be set up similarly. 

22 minutes ago, PolishGod said:

About you only telling 60 from 144 Hz apart without looking for it that one time: I noticed my monitor's refresh rate was set to about half of its normal 242 Hz (+2 Hz from overclocking) because of my cursor, and it's similar in TF2.

Yep, that tracks. Some people can just notice the difference easily. I only can if I actually look for it; I think in Linus' blind tests over the years with his employees, some folks also just couldn't see the difference if they weren't told to look for it.

22 minutes ago, PolishGod said:

So, about my 13th-generation CPU: do you think it's strong enough for a card like the RX 7900 XTX?

Yep. Still a top-tier CPU; it should keep up with a 7900 XTX easily.

22 minutes ago, PolishGod said:

Does that card have good price/performance?

For your use case, price/performance is quite good. Grabbing the 1440p summary from Tom's Hardware's launch review of the 7900 XT and XTX (ignore the fps numbers themselves; the % difference between each GPU is what matters in benchmarks):

[Tom's Hardware 1440p gaming benchmark summary chart]

 

And grabbing the cheapest price for each card off US PCPartPicker as that's my frame of reference:

[PCPartPicker cheapest-listing price screenshot]

 

And plugging these into a percentage difference calculator, with the 7900 XTX as the baseline (quick sketch of the math after the list):

7900 XT - 9% slower, 25% cheaper

4090 - 12% faster, 77% more expensive
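If anyone wants to redo this with current numbers, the "calculator" is just a relative difference against the XTX baseline; a minimal Python sketch (the fps and price values below are placeholders I made up, not the figures from the chart or screenshot):

```python
# Percentage difference vs. the 7900 XTX baseline. All numbers here are
# hypothetical placeholders -- substitute the fps from the benchmark chart
# and current prices from PCPartPicker or your retailer of choice.
baseline_fps, baseline_price = 100.0, 1000.0    # stand-in 7900 XTX figures

cards = {
    "7900 XT":  (91.0, 750.0),    # placeholder: ~9% slower, ~25% cheaper
    "RTX 4090": (112.0, 1770.0),  # placeholder: ~12% faster, ~77% pricier
}

for name, (fps, price) in cards.items():
    perf = (fps - baseline_fps) / baseline_fps * 100
    cost = (price - baseline_price) / baseline_price * 100
    print(f"{name}: {perf:+.0f}% performance, {cost:+.0f}% price vs 7900 XTX")
```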

 

If you're willing to drop ~$1000 on a GPU, the 7900 XTX is quite nice. If you'd rather spend less, the XT isn't much slower, and is a good bit cheaper. The 4090 really isn't worth it unless you just gotta have all the frames and money is no object.

 

Of course these numbers are rough estimates (based off AAA singleplayer titles; the difference may be smaller in easier-to-run titles), and obviously the price difference changes depending on what exact model you're buying and from where. But it's good enough to get the point across.

 

 


21 hours ago, Zando_ said:

...In my defense, it's a wall of text and hard to read...

...Yep. Still a top-tier CPU; it should keep up with a 7900 XTX easily...

...If you're willing to drop ~$1000 on a GPU, the 7900 XTX is quite nice. If you'd rather spend less, the XT isn't much slower, and is a good bit cheaper...

" In my defense, it's a wall of text and hard to read. " But I write well 😞

 

" I don't know why you mentioned 4K if you mean to play on 1440p for 5 years, any current card will be up for replacement by then anyways. " 4K is better than 1440p, it would be great if I can find a GPU that handles 4K.

" Writing is broken, I can't make new lines.. "Are "Hogwarts Legacy, GTA 5-6 and wii games" not single-player titles? GTA Online isn't, but it runs the same as GTA5 as it's part of the same game, I assume GTA6 will be set up similarly. " I don't play those games competitively, framerate is less important here. I'm probably getting 7900 XTX or 7900 XT, 7900 XT if it's good enough and more valuable. The 9% slower thing is when gaming, I guess? If it's something like 40% slower at AI-tasks, that's pretty bad. Can I upload that image to ChatGPT and he'll calculate the percentagedifferences? Is it important what brand I buy it from? Because this whole time I've just seen a 7900 XTX as a 7900 XTX regardless of the brand.
