
FX-8350 bottleneck

abdoo

@OP, are you a professional, competitive gamer? Are you playing at 4K? 1080p 144Hz? A surround display setup? Alien things like that? No???

Then you can go with a new graphics card, be it a GTX 970 or an R9 290X.

 

Most people tend to make a drama out of this "bottlenecking" issue... It's not like you're gonna be playing at 20 FPS, or that the frames will drop every second. It's not gonna happen...

For 1080p, it will be more than enough. Would you really start yelling like a little girl if your FPS dropped from 120 to like 80-90? Of course not, because on a 60Hz monitor you're not gonna notice the difference.

DX12 is coming and next-gen games have multithreaded optimizations, so the CPU will be less of an issue. It will still be slower than the i7s or i5s in games (duh, it's almost 3 years old), but it won't make your gaming experience unpleasant.

 

I really facepalm when someone asks about pairing a high-end GPU with an AMD FX CPU and people recommend switching the whole platform. I know people who paired GTX 970s with FX-6300s, and although they admit they're not using the full potential of the card, the boost was still significant.

 

The point is: if you already have the CPU, leave it there and use it. If you don't, then AMD isn't worth taking into consideration. And this comes from an AMD guy.

I'd go with the GTX 970

CPU: AMD FX-8350 @ 4.0GHz | Cooling: AMD stock | Motherboard: ASRock 970 Extreme4 | RAM: 8GB (2x4GB) DDR3 1333MHz | GPU: AMD Sapphire R9 290 Vapor-X | Case: Fractal Design Define R5 Titanium
Storage: Samsung 840 EVO 120GB | PSU: Thermaltake Litepower 600W | OS: Windows 8.1 Pro 64-bit
Upgrading to: Intel i7, new motherboard, Corsair H110i GT AIO water cooler, 1000W PSU



I find those 8350s etc. a rip-off for a gaming system; for the same price you can get much better CPUs, and people deserve that, but AMD is again playing games like this:

[attached image: AMD "Dota 2" performance marketing slide]

"Competitive" leadership, yeah.. Nothing more than vicious marketing, it's almost false advertising to me. You rarely see people regretting their i5 or i7 after they look how worse AMD performs.

 

Developed in 2010? It took them 6 years: youtube.com/watch?v=koepHTdw7ZE @ 1:10 (that guy got fired, btw; he works for Samsung now)

AMD fails to say, "However, when you buy an AMD FX-8350 there is absolutely no way to upgrade this CPU to something better in the future. But please buy it anyway."

CPU: AMD FX-8350 @ 4.0GHz | Cooling: AMD stock | Motherboard: ASRock 970 Extreme4 | RAM: 8GB (2x4GB) DDR3 1333MHz | GPU: AMD Sapphire R9 290 Vapor-X | Case: Fractal Design Define R5 Titanium
Storage: Samsung 840 EVO 120GB | PSU: Thermaltake Litepower 600W | OS: Windows 8.1 Pro 64-bit
Upgrading to: Intel i7, new motherboard, Corsair H110i GT AIO water cooler, 1000W PSU



@Faceman Honestly, I'm too old to deal with your shitposting. I have far better things to do with my time than sit here and argue with an arrogant person who doesn't have a clue.

He's not arrogant or ignorant, he's right. There's a difference.

You're either lying about performance, rarely ever play Skyrim, or haven't even tried modding. My FX-8350 bottlenecked my GTX 780 HEAVILY in Skyrim as soon as I started loading up CPU-intensive mods. The mods I picked made Skyrim unplayable on an 8350, but magically, my i7-4790K can play the same mods without issue. And no, hyperthreading doesn't mean anything in this case; Skyrim is very single-threaded. The performance I achieved with my i7 would match an i5 at a similar clock rate.

if you have to insist you think for yourself, i'm not going to believe you.


Okay guys, I've decided to settle this whole fucking debate once and for all. I bought a GTX 980 to replace my 2x 270X cards and got nearly identical 3DMark Fire Strike results (see the top two results here: http://www.3dmark.com/fs/3682770 versus http://www.3dmark.com/fs/3674556 ).

 

However, I saw massive improvements in quite a few games. (Advanced Warfare in particular is a way better experience, despite similar framerates.) Is it maxing out utilization? No, there are plenty of moments where my utilization does indeed drop. I'll show you that in a minute.

 

But first: 

 

Rome 2

My average framerate doubled, and all stutter or slowdown disappeared.  I went from a min framerate of the mid 40s to not even dipping below 60.  

http://i.imgur.com/QEeXozY.jpg

 

 

Skyrim's framerate now never dips below 60, whereas it did before.

http://i.imgur.com/qLJXEmm.jpg

 

 

World of Warcraft: My framerate from this spot went from 35 to 51.  Still bottlenecked for sure and not perfect, but infinitely more playable.  Even though the CPU was limiting my performance, I still saw huge gains with a more powerful GPU.  Presumably, this is now my new "minimum" but this is probably the most intense scene available to me atm. (Have not done any raiding yet, boosted a character just to test performance.)

http://i.imgur.com/tIvro4V.jpg

 

Now here's what a real bottleneck looks like. My framerate in this scene is identical to what I had with a single 270X, dual 270Xs, and now a GTX 980. Overclocking my CPU from 4.0 GHz to 4.9 GHz only brought the framerate up by ~5.

Arma 3:

http://i.imgur.com/HyraFl1.jpg
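
For a quick sanity check on that overclock result: 4.0 to 4.9 GHz is roughly a 22.5% clock bump, so a purely CPU-limited scene should scale somewhere in that range. Here's the back-of-the-envelope math; the baseline framerate is just an assumed number to show the arithmetic, not one of my measurements:

```python
# Back-of-the-envelope check on clock scaling in a CPU-limited scene.
# The baseline framerate is an assumed number for illustration, not a measurement.
base_clock, oc_clock = 4.0, 4.9            # GHz
base_fps = 30                              # hypothetical CPU-limited baseline

clock_gain = oc_clock / base_clock - 1     # ~0.225, i.e. +22.5% clock
ideal_fps = base_fps * (1 + clock_gain)    # what perfect linear scaling would give

print(f"Clock increase: {clock_gain:.1%}")
print(f"Perfect scaling: {base_fps} -> {ideal_fps:.1f} fps (+{ideal_fps - base_fps:.1f})")
# A reported gain of ~5 fps is below perfect scaling but in the same ballpark,
# which is what you'd expect when the CPU, not the GPU, is the limiting factor.
```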

 

 

 

The tl;dr is that while an FX will bottleneck you in certain circumstances, in the vast majority of games I've tested (I've only been running the GPU for 30 minutes) I noticed huge framerate gains, huge improvements to smoothness, removal of stutter and slowdown (likely due to ditching CrossFire frametimes), and a significant boost in performance overall. The thing here is that this is 1080p, and the vast majority of games will run well over 60 FPS. At 1440p and 4K the CPU bottleneck will disappear almost entirely, except for games like Arma 3.
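
If it helps to picture why: think of it as two caps, a CPU cap that stays roughly fixed and a GPU cap that drops as resolution rises, with the framerate you actually see being roughly the lower of the two. A toy sketch with made-up numbers, purely to illustrate the idea:

```python
# Toy model of a CPU/GPU "bottleneck": the framerate you see is roughly the
# lower of two caps -- how many frames per second the CPU can prepare, and
# how many the GPU can render at a given resolution.
# Every number below is made up purely for illustration.

cpu_caps = {             # hypothetical frames/s the CPU can feed, per game
    "typical game": 70,
    "Arma 3":       35,  # heavily CPU-bound engine
}

gpu_caps = {             # hypothetical frames/s the GPU can render, per resolution
    "1080p": 130,
    "1440p": 65,
    "4K":    35,
}

for game, cpu_cap in cpu_caps.items():
    for res, gpu_cap in gpu_caps.items():
        fps = min(cpu_cap, gpu_cap)
        limiter = "CPU" if cpu_cap <= gpu_cap else "GPU"
        print(f"{game:12} @ {res:5}: ~{fps} fps ({limiter}-limited)")
```

In that picture, 1080p is where the CPU cap can bite, while at 1440p and 4K the GPU cap falls below it in most games, except outliers like Arma 3 where the CPU cap is already very low.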

 

So as per the OP's question, he does not need to replace his CPU and motherboard to get significant performance benefits from upgrading, but he won't win any benchmarks, I suppose.

4K // R5 3600 // RTX2080Ti


The tl;dr is that while an FX will bottleneck you in certain circumstances, in the vast majority of games I've tested (I've only been running the GPU for 30 minutes) I noticed huge framerate gains, huge improvements to smoothness, removal of stutter and slowdown (likely due to ditching CrossFire frametimes), and a significant boost in performance overall. The thing here is that this is 1080p, and the vast majority of games will run well over 60 FPS. At 1440p and 4K the CPU bottleneck will disappear almost entirely, except for games like Arma 3.

 

So as per the OP's question, he does not need to replace his CPU and motherboard to get significant performance benefits from upgrading, but he won't win any benchmarks, I suppose.

Well... I tested the FX extensively with a high-end card (an overclocked GTX 780) for over 3 months, not just 30 minutes, and I'm telling you the FX is no match for such high-end GPUs at 1080p. Plain and simple, otherwise I would still be running my overclocked FX.

In the vast majority of games out there the FX will hit its limit way before my crippled Kepler card would... let alone a powerful, fully enabled, flagship Maxwell-based GPU.

The problem with the FX, and the part where I agree with you, is that it can indeed provide a 60 FPS average in MOST games... yes, definitely... but it suffers from terrible minimums at times. In some games I was dipping into the 30s, whereas now the minimums with my i7 are in the 50s in those games. The FX won't stutter or anything in my experience, but it drops frames constantly and the GPU load is all over the place in MANY games.

Now, would it improve performance to get a high-end card like you did? Of course it would; you are the living proof of this. But is it optimal? Nowhere near. You'll see the full capacity of your card and perfectly smooth gameplay across the board only when you upgrade to a powerful Core i5, or ideally a Core i7 for optimum smoothness.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


Well... I tested the FX extensively with a high-end card (an overclocked GTX 780) for over 3 months, not just 30 minutes, and I'm telling you the FX is no match for such high-end GPUs at 1080p. Plain and simple, otherwise I would still be running my overclocked FX.

In the vast majority of games out there the FX will hit its limit way before my crippled Kepler card would... let alone a powerful, fully enabled, flagship Maxwell-based GPU.

The problem with the FX, and the part where I agree with you, is that it can indeed provide a 60 FPS average in MOST games... yes, definitely... but it suffers from terrible minimums at times. In some games I was dipping into the 30s, whereas now the minimums with my i7 are in the 50s in those games. The FX won't stutter or anything in my experience, but it drops frames constantly and the GPU load is all over the place in MANY games.

Now, would it improve performance to get a high-end card like you did? Of course it would; you are the living proof of this. But is it optimal? Nowhere near. You'll see the full capacity of your card and perfectly smooth gameplay across the board only when you upgrade to a powerful Core i5, or ideally a Core i7 for optimum smoothness.

Quite frankly, this card is already overkill for 1080p anyway.  I'm not concerned about being optimal by any means, and certainly I won't argue that an i5 isn't a better choice if you really want the best performance.  I just contest that an FX is inadequate for gaming with a high-end GPU.  If it truly bottlenecked a high-end card, I wouldn't see framerate improvements (in minimum, average, and max) across the board from upgrading, and more games would look like Arma 3, with completely flatlined performance.

4K // R5 3600 // RTX2080Ti


Quite frankly, this card is already overkill for 1080p anyway.  I'm not concerned about being optimal by any means, and certainly I won't argue that an i5 isn't a better choice if you really want the best performance.  I just contest that an FX is inadequate for gaming with a high-end GPU.  If it truly bottlenecked a high-end card, I wouldn't see framerate improvements (in minimum, average, and max) across the board from upgrading, and more games would look like Arma 3, with completely flatlined performance.

Depends what you think is "adequate", honestly. Personally, having a $650 GPU not being fully utilised and dropping to 45 FPS in a modern game because a $100 CPU is limiting the performance is not acceptable or adequate in any way... hence the reason why I upgraded from the FX processor.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


Depends what you think is "adequate", honestly. Personally, having a $650 GPU not being fully utilised and dropping to 45 FPS in a modern game because a $100 CPU is limiting the performance is not acceptable or adequate in any way... hence the reason why I upgraded from the FX processor.

 

My rule of thumb is that Intel will get you 20% more performance in non-GPU-limited situations (some games like Arma 3 see closer to 50%, others see 5% or less).  Until the i3-4330, the question was whether or not 20% more performance was worth 50% or more money.  Nothing from the 3xxx line shat on the FX 8-cores, but the i3-4330 certainly wrecks them.  For someone building a system today, it's a no-brainer to go Intel unless you're only looking to use a mid-range GPU like a 270X/280X.  You needed at least a $300 i5 to really outperform an FX 8, which could be had for $200 in 2012-2013.
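
To put rough numbers on that trade-off, here's the arithmetic using the street prices and the 20% rule of thumb from above (illustrative only):

```python
# Back-of-the-envelope price/performance for the FX 8 vs i5 question above.
# The 20% figure is the rule of thumb from this post; prices are the rough
# 2012-2013 street prices mentioned, so treat the output as illustrative.
fx_price = 200        # ~FX-8350 price (USD)
i5_price = 300        # ~price of an i5 that clearly outperformed it
perf_uplift = 0.20    # rule-of-thumb Intel advantage when not GPU-limited

extra_cost = (i5_price - fx_price) / fx_price
print(f"{extra_cost:.0%} more money for roughly {perf_uplift:.0%} more performance")
# -> 50% more money for roughly 20% more performance
```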

 

I would never recommend someone running an FX-8350 with a 7870 to upgrade the CPU before the GPU... they'll see massive gains even if they occasionally dip to 45 FPS in certain scenes in some games.  (Remember, the 7870 is a card that usually plays games within 45-60 FPS at best.)  However, most of this thread seemed to think this was necessary.

 

All this talk of FX 8-core bottlenecks misses the entire point.  Most people don't care about a constant 60 FPS; they just want to play games well.  A 7870 will dip down to the mid 30s or lower in newer games.  A 290X will absolutely kill it.  Minimum framerate (which varies so wildly between benchmarks that I suspect everyone uses a different methodology to determine it) is less important than the framerate delta.  A drop to 45 FPS could be removed entirely by slightly reducing a graphical setting or two that you wouldn't even miss.  It's only for pure enthusiasts who want maximum settings that an FX may hold them back.  (And I'm still pretty happy with mine.)

 

Of course upgrading to an i3-4330 with the 290X is better... but that's also another $250 on top of the video card for what, slightly better minimum and maximum framerates? For a lot of people, that's a pretty large pill to swallow. 

4K // R5 3600 // RTX2080Ti


Okay guys, I've decided to settle this whole fucking debate once and for all. I bought a GTX 980 to replace my 2x 270X cards and got nearly identical 3DMark Fire Strike results (see the top two results here: http://www.3dmark.com/fs/3682770 versus http://www.3dmark.com/fs/3674556 ).

 

However, I saw massive improvements in quite a few games. (Advanced Warfare in particular is a way better experience, despite similar framerates.) Is it maxing out utilization? No, there are plenty of moments where my utilization does indeed drop. I'll show you that in a minute.

 

But first: 

 

Rome 2

My average framerate doubled, and all stutter or slowdown disappeared.  I went from a min framerate of the mid 40s to not even dipping below 60.  

http://i.imgur.com/QEeXozY.jpg

 

 

Skyrim's framerate now never dips below 60, whereas it did before.

http://i.imgur.com/qLJXEmm.jpg

 

 

World of Warcraft: My framerate from this spot went from 35 to 51.  Still bottlenecked for sure and not perfect, but infinitely more playable.  Even though the CPU was limiting my performance, I still saw huge gains with a more powerful GPU.  Presumably, this is now my new "minimum" but this is probably the most intense scene available to me atm. (Have not done any raiding yet, boosted a character just to test performance.)

http://i.imgur.com/tIvro4V.jpg

 

Now here's what a real bottleneck looks like. My framerate in this scene is identical to what I had with a single 270X, dual 270Xs, and now a GTX 980. Overclocking my CPU from 4.0 GHz to 4.9 GHz only brought the framerate up by ~5.

Arma 3:

http://i.imgur.com/HyraFl1.jpg

 

 

 

The tl;dr is that while an FX will bottleneck you in certain circumstances, in the vast majority of games I've tested (I've only been running the GPU for 30 minutes) I noticed huge framerate gains, huge improvements to smoothness, removal of stutter and slowdown (likely due to ditching CrossFire frametimes), and a significant boost in performance overall. The thing here is that this is 1080p, and the vast majority of games will run well over 60 FPS. At 1440p and 4K the CPU bottleneck will disappear almost entirely, except for games like Arma 3.

 

So as per the OP's question, he does not need to replace his CPU and motherboard to get significant performance benefits from upgrading, but he won't win any benchmarks, I suppose.

 

exactly :)


At 1440p and 4K the CPU bottleneck will disappear almost entirely, except for games like Arma 3.

If you're going to use an FX-series chip on a high-resolution monitor, you need to reallocate your funds, because even though the game becomes GPU-dependent, it makes almost no sense to buy such a weak CPU with a high-end graphics card and a high-resolution monitor. Honestly, it kind of feels like putting gold plating on an Xbox One or PS4.

 

You can say the FX wins* 1440p benchmarks, but that doesn't change how poorly it actually performs in comparison to other chips.

 

*and by "wins" I mean it doesn't lose as badly as it would at 1080p or lower

if you have to insist you think for yourself, i'm not going to believe you.


If you're going to use an FX-series chip on a high-resolution monitor, you need to reallocate your funds, because even though the game becomes GPU-dependent, it makes almost no sense to buy such a weak CPU with a high-end graphics card and a high-resolution monitor. Honestly, it kind of feels like putting gold plating on an Xbox One or PS4.

 

You can say the FX wins* 1440p benchmarks, but that doesn't change how poorly it actually performs in comparison to other chips.

 

*and by "wins" I mean it doesn't lose as badly as it would at 1080p or lower

 

I think this would be an interesting article for ya.

Before you make some claims:

 

http://www.tweaktown.com/tweakipedia/58/core-i7-4770k-vs-amd-fx-8350-with-gtx-980-vs-gtx-780-sli-at-4k/index.html


If you're going to use an FX-series chip on a high-resolution monitor, you need to reallocate your funds, because even though the game becomes GPU-dependent, it makes almost no sense to buy such a weak CPU with a high-end graphics card and a high-resolution monitor. Honestly, it kind of feels like putting gold plating on an Xbox One or PS4.

 

You can say the FX wins 1440p benchmarks, but that doesn't change how poorly it actually performs in comparison to other chips.

Why?  If I already own the FX and buy a 1440p monitor, my bottleneck disappears as the GPU becomes the limiting factor.

 

I wouldn't recommend anyone build a new PC in 2014 or 2015 with an FX CPU (unless they need the multithreading for video rendering or something).  But for people who built them in 2012, 2013, etc., I certainly wouldn't tell them to swap the platform out entirely instead of just getting a better GPU.  It really doesn't make sense economically... especially at 2560x1440 or 4K resolutions, where there is almost zero difference between Intel and AMD CPUs.

 

For real-world gaming, an FX CPU works.  It doesn't win benchmarks; you CAN have better, and you don't even need to pay the $300-500 you needed to a year or two ago to get it... but the performance you get with an FX CPU is "good enough" for the vast majority of gamers who aren't PC-building enthusiasts.  And again, I think I've definitively proved that while the FX will have weaker performance, most games don't have a true bottleneck like Arma 3 does, and it's still worthwhile to upgrade to a high-end GPU.

 

While you're at it: you'd get a comparable performance upgrade going from an i3-4330 to an i7-5960X as you would going from an FX 8 to an i5, but you wouldn't say that the i3 "bottlenecks" high-end GPUs.  Saying the FX series bottlenecks GPUs is an incorrect usage of the term bottleneck.  Simple as that.

4K // R5 3600 // RTX2080Ti


I think this would be an interesting article for ya.

Before you make some claims:

 

http://www.tweaktown.com/tweakipedia/58/core-i7-4770k-vs-amd-fx-8350-with-gtx-980-vs-gtx-780-sli-at-4k/index.html

Wow... even at 4K the i7-4770K still shows superior results. This is really unexpected and really highlights just how much better it is compared to an FX CPU. Don't you have such benchmarks run at 1080p? That would be a good laugh!

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


I think this would be an interesting article for ya.

Before you make some claims:

 

http://www.tweaktown.com/tweakipedia/58/core-i7-4770k-vs-amd-fx-8350-with-gtx-980-vs-gtx-780-sli-at-4k/index.html

"oh wow the fx-8350 doesn't bottleneck in GPU dependent scenarios I would have never guessed, I have suddenly changed sides because of a single source versus the hundreds of others in favor of Intel"

 

Is that the response you were expecting? Because you're not getting it from posting a link to testing that makes no sense. A budget-conscious 4K gamer? Oh, what a laughable thought. As I said, gold plating on a console.

 

No mention of overclocks, which matters since Intel scales far better when overclocked.

 

"The multi-core nature of the architecture makes the FX-8350 more competitive in our handbrake H.264 video encoding test, but a score of 2,701 points at stock and 3,164 points when overclocked is still slower than Intel’s Core i5-2500K, let alone the Ivy Bridge i5-3570K which scores 3,1160 points at stock and 4,245 when overclocked." Bit-Tech

 

Just the 3570K makes a huge jump in performance with an overclock, whereas the 8350 makes almost none.

if you have to insist you think for yourself, i'm not going to believe you.


Wow... even at 4K the i7-4770K still shows superior results. This is really unexpected and really highlights just how much better it is compared to an FX CPU. Don't you have such benchmarks run at 1080p? That would be a good laugh!

 

$389 versus ~$200 for a handful of frames.
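
Or in cost-per-frame terms (the 4K framerates below are placeholder numbers just to show the arithmetic, not the article's actual figures):

```python
# Dollars per extra frame, back of the envelope. The CPU prices are the ones
# quoted above; the 4K framerates are hypothetical placeholders.
i7_price, fx_price = 389, 200
i7_fps, fx_fps = 60, 55          # assumed results, for illustration only

extra_dollars = i7_price - fx_price
extra_frames = i7_fps - fx_fps
print(f"${extra_dollars} extra for {extra_frames} more fps "
      f"(~${extra_dollars / extra_frames:.0f} per frame)")
# -> $189 extra for 5 more fps (~$38 per frame)
```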

 

Hold on, I have a $400 RAM kit to sell you.

4K // R5 3600 // RTX2080Ti


$389 versus ~$200 for a handful of frames.

 

Hold on, I have a $400 RAM kit to sell you.

 

I'm not paying 400 for ram when I can just download it for free!!!!!


I'm not paying 400 for ram when I can just download it for free!!!!!

LOL, same here!

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


$389 versus ~$200 for a handful of frames.

 

Hold on, I have a $400 RAM kit to sell you.

I hope you do realise that this is unrealistic 4K benchmarking... at 1080p the difference between the two CPUs when paired with dual GTX 980s in SLI would be insanely huge... like almost double.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


wow...even at 4K the i7-4770K still shows superior results...this is really unexcpected and really highlight just how much better it is when compaired to an FX CPU...don't you have such benchmarks but run at 1080p that would be a good laugh!

 

Well, at 4K the CPU doesn't make that much of a difference anymore, as shown in that article.

You will become GPU-limited anyway.

Look at Metro LL, which is CPU-bound; GRID is also a CPU-bound game at 1080p.

The 4770K doesn't show truly superior results anywhere, at least not $150 more worth of results.

But at 1080p things are completely different.


Well, at 4K the CPU doesn't make that much of a difference anymore, as shown in that article.

You will become GPU-limited anyway.

But at 1080p things are completely different.

 

1080P was so 2013 lol


Well, at 4K the CPU doesn't make that much of a difference anymore, as shown in that article.

You will become GPU-limited anyway.

Look at Metro LL, which is CPU-bound; GRID is also a CPU-bound game at 1080p.

But at 1080p things are completely different.

i know babe, i know ;)

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


i know babe, i know ;)

 

But yeah, honestly, GTX 980 SLI for 1080p... meh.

Even a 5960X at 4.5GHz will bottleneck those two cards like crazy at 1080p.

There will always be a CPU bottleneck at a certain point, basically.

But Haswell is simply superior to Piledriver in gaming, whether CPU-bound or CPU + GPU bound.


But yeah, honestly, GTX 980 SLI for 1080p... meh.

Even a 5960X at 4.5GHz will bottleneck those two cards like crazy at 1080p.

Yes, of course it won't make sense... even the 144Hz display itself will in fact bottleneck dual GTX 980s at 1080p :P

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


Long-distance relationship, I see.....

It'll never work. You're a Montague and he is a Capulet.........

she's way too recent a model for me anyway, man ;)

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR

