
Why Is the FX-8350 So Underrated for 4K Gaming?

11 minutes ago, avrona said:

Once again, it's a 60Hz monitor, so for that the performance is good; it's not just my standards.

No it's not. Getting an unsteady frame rate on a 60Hz monitor is even worse than on a 144Hz one.

 

It's FINE that you don't want to upgrade. Just stop saying you have proof that it's good. For the vast majority of people it wouldn't be playable. Personally, that kind of choppy frame rate would give me motion sickness.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


6 minutes ago, JoostinOnline said:

No it's not. Getting an unsteady frame rate on a 60Hz monitor is even worse than on a 144Hz one.

 

It's FINE that you don't want to upgrade. Just stop saying you have proof that it's good. For the vast majority of people it wouldn't be playable. Personally, that kind of choppy frame rate would give me motion sickness.

Well I do have proof that it's good though, and it's not choppy whatsoever.


9 minutes ago, JoostinOnline said:

It's FINE that you don't want to upgrade

... and BD (Bulldozer) is not as bad as you claim it is today; you can still live with it.

Better than with a Pentium or a low-grade i3...

And the power consumption is equal to other 32nm Intel CPUs like Sandy Bridge-E...

 

And in a year or two, he can upgrade to a Ryzen 3000 or even a 4000-series chip.

 

I really don't get why you bash someone who's happy with an older AMD system...

"Hell is full of good meanings, but Heaven is full of good works"


15 minutes ago, JoostinOnline said:

No it's not. Getting an unsteady frame rate on a 60Hz monitor is even worse than on a 144Hz one.

 

It's FINE that you don't want to upgrade. Just stop saying you have proof that it's good. For the vast majority of people it wouldn't be playable. Personally, that kind of choppy frame rate would give me motion sickness.

Sensitivity to frametime variance is what matters most here...
And framing it that way makes it easier to explain to those who aren't as sensitive to those variances.
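To put the frametime-variance point in concrete numbers, here's a minimal Python sketch. The frame-time lists are invented for illustration, not measurements from any system in this thread; the point is only that two runs with the same average FPS can feel completely different.

```python
import statistics

def summarize(frametimes_ms):
    """Return (average FPS, 1% low FPS, frametime stdev in ms)."""
    fps_per_frame = [1000.0 / ft for ft in frametimes_ms]
    avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)
    # "1% low": average of the worst 1% of frames, a common stutter metric.
    worst = sorted(fps_per_frame)[: max(1, len(fps_per_frame) // 100)]
    one_pct_low = sum(worst) / len(worst)
    return avg_fps, one_pct_low, statistics.stdev(frametimes_ms)

# A steady 60 FPS run: every frame takes ~16.7 ms.
steady = [16.7] * 1000
# Nearly the same average FPS, but one 100 ms hitch every 100 frames.
stuttery = ([15.8] * 99 + [100.0]) * 10

for name, run in [("steady", steady), ("stuttery", stuttery)]:
    avg, low, sd = summarize(run)
    print(f"{name:9} avg={avg:5.1f} FPS  1% low={low:5.1f} FPS  stdev={sd:4.1f} ms")
```

Both runs report roughly 60 FPS average, but the stuttery one has a 1% low of 10 FPS, which is exactly the kind of choppiness being argued about.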

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


42 minutes ago, avrona said:

Well I do have proof that it's good though, and it's not choppy whatsoever.

To most people that's incredibly choppy. Uneven frametimes don't bother you, which is great. It means you don't need to upgrade or lower your settings. But that doesn't mean the performance is objectively good. I'm not sure why you don't get that.

37 minutes ago, Stefan Payne said:

... and BD (Bulldozer) is not as bad as you claim it is today; you can still live with it.

Better than with a Pentium or a low-grade i3...

And the power consumption is equal to other 32nm Intel CPUs like Sandy Bridge-E...

 

And in a year or two, he can upgrade to a Ryzen 3000 or even a 4000-series chip.

 

I really don't get why you bash someone who's happy with an older AMD system...

I never bashed it or said he had to upgrade. I specifically said it was fine that he didn't want to upgrade. I'm getting sick of you making up nonsense.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


I suggest watching this very good video by Gamers Nexus to understand what I (and, I guess, @JoostinOnline) are saying. Nobody here is saying his system is bad or that he needs to upgrade if he's happy with his gaming experience, but a Siege benchmark result and a Task Manager screenshot aren't proof of good performance when you see a 6.5 FPS minimum (that benchmark can have outliers in the first scene if it's still loading, but not in the pool room scene).

 

(By the way, I had an FX-6300 before the Ryzen 1600; I'm not hating on FX.)
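As a rough illustration of why it matters where a minimum occurs, here's a small sketch that separates a warm-up/loading hitch from the gameplay minimum. The timestamps and FPS values are invented for illustration:

```python
# Frame-rate samples as (timestamp_s, fps); values invented for illustration.
def gameplay_min_fps(samples, warmup_s=5.0):
    """Minimum FPS, ignoring a warm-up window where the level may still be loading."""
    return min(fps for t, fps in samples if t >= warmup_s)

samples = [(0.5, 6.5), (1.0, 22.0)] + [(t / 10.0, 55.0) for t in range(50, 600)]
print("naive min:   ", min(fps for _, fps in samples))  # 6.5 -- a loading hitch
print("gameplay min:", gameplay_min_fps(samples))       # 55.0
```

A 6.5 minimum during loading excuses nothing and proves nothing; the same number in the middle of the pool room scene is a real stutter.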

Tag me @mbox or quote me if you want me to read

CPU Ryzen 5 1600 @3.9GHz  MOBO Asus Prime X370 Pro  GPU XFX RX 580 GTS XXX 8GB

COOLER Noctua NH-U12S  RAM 2x4GB HyperX Predator 3000 MHz  SSD Samsung 850 evo 250GB  CASE NZXT s340

MONITOR 1 AOC G2590PX MONITOR 2 LG 29WN600


On 1/31/2019 at 7:19 AM, JoostinOnline said:

To most people that's incredibly choppy. Uneven frametimes don't bother you, which is great. It means you don't need to upgrade or lower your settings. But that doesn't mean the performance is objectively good. I'm not sure why you don't get that.

I never bashed it or said he had to upgrade. I specifically said it was fine that he didn't want to upgrade. I'm getting sick of you making up nonsense.

I never said uneven framerates don't bother me; they do. Thankfully, I don't have uneven framerates on my current rig.

On 1/31/2019 at 7:33 AM, mbox said:

I suggest watching this very good video by Gamers Nexus to understand what I (and, I guess, @JoostinOnline) are saying. Nobody here is saying his system is bad or that he needs to upgrade if he's happy with his gaming experience, but a Siege benchmark result and a Task Manager screenshot aren't proof of good performance when you see a 6.5 FPS minimum (that benchmark can have outliers in the first scene if it's still loading, but not in the pool room scene).

 

(By the way, I had an FX-6300 before the Ryzen 1600; I'm not hating on FX.)

Well, are there any other specific benchmarks I could post the results of to help prove my point?


My Vega 64 and 2600X get 94 FPS on average at ULTRA settings with the downloaded Ultra texture pack in Rainbow Six. That's better than most 1080 Tis I see on YouTube.

 

 

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


The reason people get up in arms over comments like "it does this... fine by me" is the lack of the in-depth backing that people here would want as a follow-up.

 

Many have seen investigative proof to the contrary on FX IPC vs. draw calls, and on how, depending on what you prefer frametime-wise, performance can fluctuate and fail to hold up in varied cases. #AllGamesAreDifferent

More games exist than any one person can realistically play, but you can look it all up from a huge community, and every game's backend services and features (game engines) behave and perform differently across all our varied hardware.

That's why people love investigative journalism and in-depth knowledge, but it can also make them far more sensitive to the differences between our platforms.

 

It would have been easier to say "frametime variance does not bother me that much," which IMO negates the need for more frequent platform upgrade cycles, especially at 4K, aside from maybe draw-call-heavy games or underutilized threading.

It seems perfect, but small niggles exist and people know about them.

 

Many here will agree with your standpoint, knowing the variance has little effect on your experience and enjoyment.

Online discussion turning into bashing of ideas can usually be traced back to a misunderstanding of information or of the intent behind blanket statements.

 

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


18 minutes ago, SkilledRebuilds said:

The reason people get up in arms over comments like "it does this... fine by me" is the lack of the in-depth backing that people here would want as a follow-up.

 

Many have seen investigative proof to the contrary on FX IPC vs. draw calls, and on how, depending on what you prefer frametime-wise, performance can fluctuate and fail to hold up in varied cases. #AllGamesAreDifferent

More games exist than any one person can realistically play, but you can look it all up from a huge community, and every game's backend services and features (game engines) behave and perform differently across all our varied hardware.

That's why people love investigative journalism and in-depth knowledge, but it can also make them far more sensitive to the differences between our platforms.

 

It would have been easier to say "frametime variance does not bother me that much," which IMO negates the need for more frequent platform upgrade cycles, especially at 4K, aside from maybe draw-call-heavy games or underutilized threading.

It seems perfect, but small niggles exist and people know about them.

 

Many here will agree with your standpoint, knowing the variance has little effect on your experience and enjoyment.

Online discussion turning into bashing of ideas can usually be traced back to a misunderstanding of information or of the intent behind blanket statements.

 

But here's the thing: I never said that frametime variance doesn't bother me. It does, but thankfully my FPS remains consistent with my current setup.


On 1/31/2019 at 2:38 AM, avrona said:

Well again, I am getting good performance, and I do have the benchmark results to prove it.

I didn't even notice that the framerate dropped so low, meaning it must've been just for a second or less. There is literally no way around the fact that the benchmarks all point towards an FX-8350 and GTX 1080 Ti combo working out really well.

If you used an ancient i5-2400 you would see massively improved performance in terms of min/avg/max FPS, and that's slow compared to anything new, really. You just don't know because you've never had anything better. I went from an 8350 to a 4790K. I had my R9 Fury in both systems. In GTA V I could go as low as 60% GPU usage with the 8350, and now I can't get my i7 above 60% usage and the Fury hasn't dropped from 100% since I bought it. Your 1080 Ti is just sitting on its theoretical hands waiting on an 8-year-old chip with the performance of 12-year-old chips to TRY and keep up. Quit being ignorant and pretending a 1080 Ti belongs with an FX-8350. Hell, a stock R5 1600 would probably bottleneck a 1080 Ti a LITTLE bit, and it cleans the clock of an FX that's on liquid nitrogen at 7GHz. And on that note of "good performance": if you wanted FPS like that, why didn't you buy an Xbox One X? You'd get similar performance out of that. lmao

 

On 1/31/2019 at 9:08 AM, ChewToy! said:

My Vega 64 and 2600X get 94 FPS on average at ULTRA settings with the downloaded Ultra texture pack in Rainbow Six. That's better than most 1080 Tis I see on YouTube.

On 1/31/2019 at 9:38 AM, avrona said:

But here's the thing: I never said that frametime variance doesn't bother me. It does, but thankfully my FPS remains consistent with my current setup.

I'm really trying to be as nice as possible here, but there's just no chance your FPS is anywhere near as consistent as you think. Look at the video from the dude I quoted above. His Vega 64 is stomping your 1080 Ti into the ground HARD, and that shouldn't even be a close race at all.

CPU: INTEL Core i7 4790k @ 4.7Ghz - Cooling: NZXT Kraken X61 - Mobo: Gigabyte Z97X SLI - RAM: 16GB G.Skill Ares 2400mhz - GPU: AMD Sapphire Nitro R9 Fury 4G - Case: Phanteks P350X - PSU: EVGA 750GQ - Storage: WD Black 1TB - Fans: 2x Noctua NF-P14s (Push) / 2x Corsair AF140 (Pull) / 3x Corsair AF120 (Exhaust) - Keyboard: Corsair K70 Cherry MX Red - Mouse: Razer Deathadder Chroma

Bit of an AMD fan I suppose. I don't bias my replies to anything however, I just prefer AMD and their products. Buy whatever the H*CK you want. 

---QUOTE ME OR I WILL LIKELY NOT REPLY---

 

 


1 minute ago, Vegetable said:

 

I'm really trying to be as nice as possible here, but there's just no chance your FPS is anywhere near as consistent as you think. Look at the video from the dude I quoted above. His Vega 64 is stomping your 1080 Ti into the ground HARD, and that shouldn't even be a close race at all.

Well, it is consistent though. Not sure how else to say it.


Just now, avrona said:

Well, it is consistent though. Not sure how else to say it.

Consistently at 100% CPU usage, maybe. Those minimum FPS on the single benchmark you have show me otherwise. Choppiness is worse than a consistently low FPS, meaning I'd rather be locked at 30 FPS than go from 45 to 60 to 12 in 3 seconds. If I could shit out a 1080 Ti right now and bench R6S, my min/max/avg FPS would be at least 50% higher than yours WITH THE SAME GPU. I dunno. If you're happy not getting what you paid for in terms of graphics horsepower, then so be it. Have a blast.

CPU: INTEL Core i7 4790k @ 4.7Ghz - Cooling: NZXT Kraken X61 - Mobo: Gigabyte Z97X SLI - RAM: 16GB G.Skill Ares 2400mhz - GPU: AMD Sapphire Nitro R9 Fury 4G - Case: Phanteks P350X - PSU: EVGA 750GQ - Storage: WD Black 1TB - Fans: 2x Noctua NF-P14s (Push) / 2x Corsair AF140 (Pull) / 3x Corsair AF120 (Exhaust) - Keyboard: Corsair K70 Cherry MX Red - Mouse: Razer Deathadder Chroma

Bit of an AMD fan I suppose. I don't bias my replies to anything however, I just prefer AMD and their products. Buy whatever the H*CK you want. 

---QUOTE ME OR I WILL LIKELY NOT REPLY---

 

 


1 minute ago, Vegetable said:

Consistently at 100% CPU usage, maybe. Those minimum FPS on the single benchmark you have show me otherwise. Choppiness is worse than a consistently low FPS, meaning I'd rather be locked at 30 FPS than go from 45 to 60 to 12 in 3 seconds. If I could shit out a 1080 Ti right now and bench R6S, my min/max/avg FPS would be at least 50% higher than yours WITH THE SAME GPU. I dunno. If you're happy not getting what you paid for in terms of graphics horsepower, then so be it. Have a blast.

Like I said, the minimum FPS means nothing if it's so short you can't even see it, and across the entire benchmark I personally only noticed one obvious dip in FPS, and that's coming from someone who really notices that kind of thing. Plus, I don't really need my average FPS to be any higher, as in most games it's maxed out at 60 at all times, with R6 being the only exception, and since the monitor is a 60Hz one, I don't need to go above that.


23 minutes ago, avrona said:

Like I said, the minimum FPS means nothing if it's so short you can't even see it, and across the entire benchmark I personally only noticed one obvious dip in FPS, and that's coming from someone who really notices that kind of thing. Plus, I don't really need my average FPS to be any higher, as in most games it's maxed out at 60 at all times, with R6 being the only exception, and since the monitor is a 60Hz one, I don't need to go above that.

Personally, I had an FX-8320 overclocked to 4.5GHz in the past, and when I was playing Dying Light and Far Cry 3 I was getting FPS dips into the low 30s quite a bit... and GPU load would drop down to as low as 50% (GTX 780 at the time). So yeah, if you search a bit, you'll find games where the good ol' FX really struggles to keep the boat from sinking ;)

It's an awful CPU for gaming, it really is... in your case you are pushing so many pixels it's ridiculous; you're killing the shnit out of the 1080 Ti by doing that... but if you were playing at a reasonable resolution for games, like 1440p, and trying to hold 60+ FPS at all times, you would realize your GPU is capable of 120+ FPS in most games but the CPU can barely keep a steady 50... if that ;)

 

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


46 minutes ago, Vegetable said:

 

I'm really trying to be as nice as possible here, but there's just no chance your FPS is anywhere near as consistent as you think. Look at the video from the dude I quoted above. His Vega 64 is stomping your 1080 Ti into the ground HARD, and that shouldn't even be a close race at all.

To be fair, it beats every 1080 Ti that I saw on YouTube. But yes, his is definitely not outputting what it's capable of. He should easily be in the 80-90 FPS range at 4K on average if his CPU weren't bottlenecking.

 

Funny thing about that game is that it advertises Nvidia right in the intro lol.

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


18 minutes ago, i_build_nanosuits said:

Personally, I had an FX-8320 overclocked to 4.5GHz in the past, and when I was playing Dying Light and Far Cry 3 I was getting FPS dips into the low 30s quite a bit... and GPU load would drop down to as low as 50% (GTX 780 at the time). So yeah, if you search a bit, you'll find games where the good ol' FX really struggles to keep the boat from sinking ;)

It's an awful CPU for gaming, it really is... in your case you are pushing so many pixels it's ridiculous; you're killing the shnit out of the 1080 Ti by doing that... but if you were playing at a reasonable resolution for games, like 1440p, and trying to hold 60+ FPS at all times, you would realize your GPU is capable of 120+ FPS in most games but the CPU can barely keep a steady 50... if that ;)

 

Unless you combine it with a 1080 Ti, that is. Even with the games you mentioned, you only used a 780, meaning the performance will be much better, despite my CPU still being slower, as mine isn't overclocked at all. Yet it can keep a steady 50 FPS in games like Siege, and in every other game I have, it's a solid 60 FPS all the time.

3 minutes ago, ChewToy! said:

To be fair, it beats every 1080 Ti that I saw on YouTube. But yes, his is definitely not outputting what it's capable of. He should easily be in the 80-90 FPS range at 4K on average if his CPU weren't bottlenecking.

 

Funny thing about that game is that it advertises Nvidia right in the intro lol.

Again, I don't really need to be able to produce 80-90 FPS if my monitor is only a 60Hz one.


1 minute ago, avrona said:

Unless you combine it with a 1080 Ti, that is. Even with the games you mentioned, you only used a 780, meaning the performance will be much better, despite my CPU still being slower, as mine isn't overclocked at all. Yet it can keep a steady 50 FPS in games like Siege, and in every other game I have, it's a solid 60 FPS all the time.

Again, I don't really need to be able to produce 80-90 FPS if my monitor is only a 60Hz one.

If you're happy with it then there's no issue. Just use HWiNFO, and when you open it, tick "Sensors-only". It will tell you the minimum, average, and maximum while you're playing. Post the screenshot after you play for a while if you really want to prove what you're saying.

 

Either way, what everyone is saying is that you could have spent less on a GPU and gotten the same performance, i.e. you're not getting what you paid for, whether you're satisfied with it or not.
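Once a monitoring tool has logged the frame rate to a file, summarizing it is trivial. A minimal sketch of the "log it and post the numbers" idea, assuming a hypothetical CSV export with a framerate_fps column (adjust the file name and header to whatever your logging tool actually writes):

```python
import csv

def min_avg_max(path, column="framerate_fps"):
    """Read one FPS column from a CSV log and return (min, avg, max)."""
    values = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            values.append(float(row[column]))
    return min(values), sum(values) / len(values), max(values)

# "session_log.csv" is a placeholder path for your exported sensor log.
lo, avg, hi = min_avg_max("session_log.csv")
print(f"min={lo:.1f}  avg={avg:.1f}  max={hi:.1f} FPS")
```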

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


1 minute ago, ChewToy! said:

If you're happy with it then there's no issue. Just use HWiNFO, and when you open it, tick "Sensors-only". It will tell you the minimum, average, and maximum while you're playing. Post the screenshot after you play for a while if you really want to prove what you're saying.

 

Either way, what everyone is saying is that you could have spent less on a GPU and gotten the same performance, i.e. you're not getting what you paid for, whether you're satisfied with it or not.

I mean I didn't pay for it so it doesn't really matter if I'm fully using it or not.


On 1/31/2019 at 2:15 PM, avrona said:

Unless you combine it with a 1080 Ti, that is. Even with the games you mentioned, you only used a 780, meaning the performance will be much better, despite my CPU still being slower, as mine isn't overclocked at all. Yet it can keep a steady 50 FPS in games like Siege, and in every other game I have, it's a solid 60 FPS all the time.

Again, I don't really need to be able to produce 80-90 FPS if my monitor is only a 60Hz one.

Yeah, you're probably right and the whole internet and review outlets around the world are wrong... FX is an AMAZING chip for gaming, and in no game at no point did it ever drop into the 30 FPS range.

 

BRO, you just happen to play games that run ''okay'' on it, but that's not the case for many, many games out there... and the fact that you are rendering at 4K resolution is your lifesaver, because at that resolution the GTX 1080 Ti performs about the same as a GTX 1050 Ti would at 1080p... which is about the caliber for this CPU.

 

On 1/31/2019 at 2:22 PM, avrona said:

I mean I didn't pay for it so it doesn't really matter if I'm fully using it or not.

You got a free 1080 Ti?!

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


Just now, i_build_nanosuits said:

Yeah, you're probably right and the whole internet and review outlets around the world are wrong... FX is an AMAZING chip for gaming, and in no game at no point did it ever drop into the 30 FPS range.

 

BRO, you just happen to play games that run ''okay'' on it, but that's not the case for many, many games out there... and the fact that you are rendering at 4K resolution is your lifesaver, because at that resolution the GTX 1080 Ti performs about the same as a GTX 1050 Ti would at 1080p... which is about the caliber for this CPU.

I'm not saying it's amazing for gaming, but when combined with a super good graphics card, a 1080 Ti, the performance is still great. And no, they don't run "okay"; they run absolutely fine: constant 60 FPS, and around 50-60 in Siege. Plus, today I got another current-gen game, Anno 1800, a 2019 title, and it also runs absolutely smoothly.

Just now, i_build_nanosuits said:

You got a free 1080 Ti?!

Yes, I did get it for free.


1 minute ago, avrona said:

I'm not saying it's amazing for gaming, but when combined with a super good graphics card, a 1080 Ti, the performance is still great. And no, they don't run "okay"; they run absolutely fine: constant 60 FPS, and around 50-60 in Siege. Plus, today I got another current-gen game, Anno 1800, a 2019 title, and it also runs absolutely smoothly.

Yes, I did get it for free.

NO, that's the point... like I said, you're running at 4K, which means you are still rendering in the 45 to 60 FPS range... and if you dropped the resolution to 1440p or 1080p you would STILL render in the 45 to 60 FPS range, because you would be strongly CPU LIMITED... don't believe me? Go ahead, try it... lower your resolution and monitor your GPU usage in games... you'll see ;)
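The resolution-drop test described above can be expressed as a one-line rule of thumb. A crude sketch; the threshold and FPS numbers are arbitrary illustrations, not measurements from anyone's system:

```python
def looks_cpu_limited(fps_high_res, fps_low_res, tolerance=0.15):
    """CPU-limited if cutting the resolution gains less than ~15% FPS."""
    return (fps_low_res - fps_high_res) / fps_high_res < tolerance

# Invented numbers for illustration:
print(looks_cpu_limited(fps_high_res=52, fps_low_res=55))   # True: the CPU is the wall
print(looks_cpu_limited(fps_high_res=52, fps_low_res=110))  # False: the GPU was the limit
```

Low GPU usage at the lower resolution points the same way: the GPU is waiting on the CPU.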

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


1 minute ago, i_build_nanosuits said:

NO, that's the point... like I said, you're running at 4K, which means you are still rendering in the 45 to 60 FPS range... and if you dropped the resolution to 1440p or 1080p you would STILL render in the 45 to 60 FPS range, because you would be strongly CPU LIMITED... don't believe me? Go ahead, try it... lower your resolution and monitor your GPU usage in games... you'll see ;)

What difference does it make though? It's not like I want to go back to a lower resolution anyway.


2 minutes ago, avrona said:

What difference does it make though? It's not like I want to go back to a lower resolution anyway.

The difference it makes is this:

 

I'm playing on a 27'' 2560x1440 G-Sync 144Hz IPS gaming monitor... which is head and shoulders above any 60Hz 4K monitor when it comes to gaming, and with an 8700K and a 1080 Ti I get 100+ FPS in games, and in many cases I get 120 to 144 FPS... it's smooth and fast like you could never imagine a game could run, and I'm 100% convinced the experience is A LOT better than gaming on a 60Hz 4K monitor at 50 FPS... all day ;)

 

You made weird choices... a bad monitor choice (unless it's a 50-inch+ TV?) and an even worse CPU choice.

 

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


4 minutes ago, i_build_nanosuits said:

The difference it makes is this:

 

I'm playing on a 27'' 2560x1440 G-Sync 144Hz IPS gaming monitor... which is head and shoulders above any 60Hz 4K monitor when it comes to gaming, and with an 8700K and a 1080 Ti I get 100+ FPS in games, and in many cases I get 120 to 144 FPS... it's smooth and fast like you could never imagine a game could run, and I'm 100% convinced the experience is A LOT better than gaming on a 60Hz 4K monitor at 50 FPS... all day ;)

 

You made weird choices... a bad monitor choice (unless it's a 50-inch+ TV?) and an even worse CPU choice.

 

Again, my CPU choice isn't bad, as it does the job really well. And the whole higher-FPS vs. higher-resolution debate is simply a matter of taste. Some prefer one side, some prefer the other, and some can't stand looking at a display unless it's both 4K and 144Hz. I'm in the camp of people who prefer a higher resolution.

