i7 8700k with a 3080?

Consul

I've been doing some searching over the past hour or so, and I've seen people say that an 8700k will bottleneck a 3080 on a 1080p 144Hz monitor.

Is this true? I haven't seen much of a performance difference in games at 1080p when an 8700k is compared to a 10700k (when a 2080ti is used in the PC).

 


Wait for Ampere to come out and see how the cards perform and how the 8700k holds up with a 3080; right now no one knows how the 3080 performs.

But the 8700k is still a good gaming chip and should handle the 3080 just fine. Again: wait and see.

PC: Motherboard: ASUS B550M TUF-Plus, CPU: Ryzen 3 3100, CPU Cooler: Arctic Freezer 34, GPU: GIGABYTE WindForce GTX1650S, RAM: HyperX Fury RGB 2x8GB 3200 CL16, Case: CoolerMaster MB311L ARGB, Boot Drive: 250GB MX500, Game Drive: WD Blue 1TB 7200RPM HDD.

 

Peripherals: GK61 (Optical Gateron Red) with Mistel White/Orange keycaps, Logitech G102 (Purple), BitWit Ensemble Grey Deskpad. 

 

Audio: Logitech G432, Moondrop Starfield, Mic: Razer Siren Mini (White).

 

Phone: Pixel 3a (Purple-ish).

 

Build Log: 


The 8700K, and any other Intel 6c/12t CPU within the last 3 release generations, is at the baseline for top tier gaming performance. They have the same single threaded performance (so, the fastest) and enough threads for good scaling in any game.

 

You can reasonably put any high end GPU with an 8700K.

I WILL find your ITX build thread, and I WILL recommend the SIlverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec | PSU misconceptions, protections explained | group reg is bad


5 minutes ago, Consul said:

I've been doing some searching over the past hour or so, and I've seen people say that an 8700k will bottleneck a 3080 on a 1080p 144Hz monitor.

Is this true? I haven't seen much of a performance difference in games at 1080p when an 8700k is compared to a 10700k (when a 2080ti is used in the PC).

 

We don't know the performance of the 3080 (no, Nvidia's presentations or controlled reviews aren't proof of anything), but an 8700k is capable of getting 144fps.  It's fine for any GPU.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List | The Real Reason Delidding Improves Temperatures | "2K" does not mean 2560×1440


An 8700K running at 5GHz (or at least close to that) is still among the best gaming chips out there. I wouldn't worry about it.

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


A bottleneck doesn't matter, if it's even there at all. Don't worry about it. It won't affect performance in a meaningful way outside of comparing benchmark scores.
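For what it's worth, the "bottleneck" everyone argues about boils down to a simple rule: the frame rate you actually see is capped by whichever component is slower. A minimal sketch of that idea (the FPS numbers are made up for illustration, not benchmarks):

```python
# Sketch of the bottleneck idea: displayed FPS is limited by whichever
# side (CPU or GPU) produces frames more slowly.
def effective_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """The displayed frame rate is capped by the slower component."""
    return min(cpu_fps_limit, gpu_fps_limit)

# Hypothetical numbers: a CPU that can prepare 180 frames/s feeding a GPU
# that could render 250 frames/s at 1080p -> the CPU is the limit.
print(effective_fps(180, 250))  # -> 180
# At a higher resolution the GPU side drops, and the bottleneck flips:
print(effective_fps(180, 110))  # -> 110
```

Which is why "X bottlenecks Y" only means anything at a specific resolution and game; the limiting side changes as the GPU's workload changes.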

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


I wouldn't use a 3080 with a 1080p 144Hz monitor.

As a wise master once said, 

Put as much effort into your question as you would expect a stranger to add to their answer!

 

Desktop: AMD Ryzen 5 3800XT | Gigabyte X570 Aorus Ultra | Corsair Vengeance LPX 32 GB 4000Mhz CL18 | MSI RX 5700 XT Gaming X | Two WD Black SN750 1TB NVMe SSD | Corsair AX850 Titanium | Fractal Design Define 7 White, Solid

 

Laptop: Apple Macbook Pro 16 | i9 9980H | 16 GB RAM |  1TB SSD | AppleCare+ | Space Grey

Peripherals: RAMA U-80 Lake | Logitech MX Master 3 | Kanto YU2 White + Beyerdynamic DT 177X GO 

Displays: LG GX 55"| Acer XB273K | Dell Ultrasharp U2720Q | LG 32UN650-W 

 


Monitor Hz won't really matter. It comes down to the game.
I know that using a 1080 with mine on a game like Black Ops 4, it pegged all cores at 100%, which is why I didn't use my 1080ti. The stutter was as bad as people claim SLI is.

So if games are bad like that and the new cards are as good as they claim, I can see it being an issue performance-wise. But I guess you can be OK if you're the type to cap frames.

Main RIg Corsair Air 540, I7 9900k, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090FE, EVGA 1000G5, Acer Nitro XZ3 2560 x 1440@240hz 

 

Spare RIg Lian Li O11 AIR MINI, I7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32Gb, EVGA 1080ti, 1080sc 1070sc & 1060 SSC, EVGA 850GA, Acer KG251Q 1920x1080@240hz

 


4 hours ago, Consul said:

I've been doing some searching over the past hour or so, and I've seen people say that an 8700k will bottleneck a 3080 on a 1080p 144Hz monitor.

Is this true?

 

It's probably true, but the same could probably be said of any CPU out there right now. The 8700k is still a valid high-end gaming CPU and should be for another year or two.

 

Your bigger issue is the 144Hz monitor. If you are limited to 144fps then the 3080 will be a waste, because it will likely exceed 144 by a huge margin in most games. TBH I don't see much value in having FPS beyond 140. Even beyond 110 or so I can't really see a difference, which is why I greatly prefer 1440p. It looks so much nicer, and with a decent GPU I can still achieve high enough FPS to have a smooth experience, even in first-person shooters.

CPU: Ryzen 7 3700x,  MOBO: ASUS TUF X570 Gaming Pro wifi, CPU cooler: Noctua U12a RAM: Gskill Ripjaws V @3600mhz,  GPU: Asus Tuf RTX OC 3080 PSU: Seasonic Focus GX850 CASE: Lian Li Lancool 2 Mesh Storage: 500 GB Inland Premium M.2,  Sandisk Ultra Plus II 256 GB & 120 GB


15 minutes ago, maizenblue said:

It's probably true, but the same could probably be said of any CPU out there right now. The 8700k is still a valid high-end gaming CPU and should be for another year or two.

 

Your bigger issue is the 144Hz monitor. If you are limited to 144fps then the 3080 will be a waste, because it will likely exceed 144 by a huge margin in most games. TBH I don't see much value in having FPS beyond 140. Even beyond 110 or so I can't really see a difference, which is why I greatly prefer 1440p. It looks so much nicer, and with a decent GPU I can still achieve high enough FPS to have a smooth experience, even in first-person shooters.

That is true, I totally agree, but if I get a chance to switch to a 1440p monitor it would be great to already have a 3080. So I'm getting it just in case I upgrade to a 1440p monitor (and to have a GPU that lasts longer than what I have now, which is a 1060 6GB).

I essentially really want to future-proof as much as possible.


1 hour ago, Consul said:

That is true, I totally agree, but if I get a chance to switch to a 1440p monitor it would be great to already have a 3080. So I'm getting it just in case I upgrade to a 1440p monitor (and to have a GPU that lasts longer than what I have now, which is a 1060 6GB).

I essentially really want to future-proof as much as possible.

Future-proofing is not a science; it's just educated guessing based on available data to try to predict the future.

 

I think the 8700k will be able to handle a 3080 for the most part compared to most other existing CPUs, as most games currently don't utilize more than 8-12 threads. There are some outliers, but they aren't currently representative of most games.

 

That said, we don't know if future games will be programmed to utilize more cores, if PCIe 4.0 and RTX IO will play a larger part in game development (thereby lifting some load off the CPU), if 4th-generation Ryzen or 11th-generation Intel chips will be a large enough jump over current CPUs, or if future GPUs will increase in potency exponentially.

 

My guess is that for anything current, the 8700k won't bottleneck a 3080 or 3090 any more than other existing CPUs, but it's likely that future CPUs will show that the 3080/3090 have more juice in them than the 8700k and other current CPUs can extract.

 

It makes sense to go 3080, but I would definitely avoid the 3090 because the cost is just too high.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


54 minutes ago, Mister Woof said:

Future-proofing is not a science; it's just educated guessing based on available data to try to predict the future.

 

I think the 8700k will be able to handle a 3080 for the most part compared to most other existing CPUs, as most games currently don't utilize more than 8-12 threads. There are some outliers, but they aren't currently representative of most games.

 

That said, we don't know if future games will be programmed to utilize more cores, if PCIe 4.0 and RTX IO will play a larger part in game development (thereby lifting some load off the CPU), if 4th-generation Ryzen or 11th-generation Intel chips will be a large enough jump over current CPUs, or if future GPUs will increase in potency exponentially.

 

My guess is that for anything current, the 8700k won't bottleneck a 3080 or 3090 any more than other existing CPUs, but it's likely that future CPUs will show that the 3080/3090 have more juice in them than the 8700k and other current CPUs can extract.

 

It makes sense to go 3080, but I would definitely avoid the 3090 because the cost is just too high.

Just the type of answer I was looking for, and yes, a 3090 would be overkill for me. Thanks for the great answer!


6 hours ago, Consul said:

I've been doing some searching over the past hour or so, and I've seen people say that an 8700k will bottleneck a 3080 on a 1080p 144Hz monitor.

Is this true? I haven't seen much of a performance difference in games at 1080p when an 8700k is compared to a 10700k (when a 2080ti is used in the PC).

 

 


Neither Intel nor AMD has managed to bring out a faster gaming CPU yet. More cores/threads are only useful in a very limited number of games.
You would actually have to upgrade to a higher resolution, or wait until something faster comes out, to remove any CPU bottleneck at 1080p.

 

The trend even goes in favor of the 8700k, with more and more DX12 games being developed, lifting the CPU bottleneck. I mean, I have a 2080ti, and even at 720p The Division 2 and Modern Warfare on small maps (deathmatch) are GPU-limited, while gaining about 100fps compared to 1080p (all settings on lowest possible).

 

So my guesstimate is the 8700k will be fine for the next 3 years, because AMD isn't even on par with Intel in gaming yet and won't be with its next gen.

(It's way too far behind; google some 720p benchmarks to see how far. Spoiler: it's above 20% on average, and even for AMD a +20% increase is unrealistic.)

CPU: Ryzen 7 5800x3D | MoBo: MSI MAG B550 Tomahawk | RAM: G.Skill F4-3600C15D-16GTZ @3800CL16 | GPU: RTX 2080Ti | PSU: Corsair HX1200 | 

Case: Lian Li 011D XL | Storage: Samsung 970 EVO M.2 NVMe 500GB, Crucial MX500 500GB | Soundcard: Soundblaster ZXR | Mouse: Razer Viper Mini | Keyboard: Razer Huntsman TE Monitor: DELL AW2521H @360Hz |

 


I don't think you will have a bottleneck with that. I am still running an old 4790K @ 4.8GHz and I'm doing fine with a 2080 Super. Even if you get a little performance hit from the 8th-gen CPU, it will be very small. Also, most games don't take advantage of extra cores anyway.

THE PIT VIPER SPECS

4790k @ 4.7ghz  (EVGA CLC 360MM) | 32GB Patriot Viper Extreme @2133 mhz | Asus Z97-DELUXE Cooler Master H500M.| EVGA RTX 2080 Ti FTW3

 


7 hours ago, maizenblue said:

Your bigger issue is the 144Hz monitor. If you are limited to 144fps then the 3080 will be a waste, because it will likely exceed 144 by a huge margin in most games.

Hi,

I don't really understand that point, as you can manually set the max framerate the GPU has to render in the Nvidia Control Panel.

At least then you won't waste more watts than necessary to achieve 144fps.

From my point of view, this is a nice feature.

Let's say you like to play at 60fps in full HD: if you lock the max framerate, the 3080 will run very, very cool in a lot of games.
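To put rough numbers on the frame-cap point (the values below are illustrative, not measurements): capping FPS gives each frame a fixed time budget, and a GPU that finishes early just idles for the rest of the interval, drawing less power.

```python
# Frame-cap arithmetic: at a given FPS cap, each frame has a fixed time
# budget; a GPU that renders faster than the budget sits idle afterwards.
def frame_budget_ms(fps_cap: float) -> float:
    """Time budget per frame, in milliseconds, at a given FPS cap."""
    return 1000.0 / fps_cap

def gpu_busy_fraction(render_ms: float, fps_cap: float) -> float:
    """Fraction of each frame interval the GPU spends actually rendering."""
    return min(1.0, render_ms / frame_budget_ms(fps_cap))

print(round(frame_budget_ms(144), 2))        # -> 6.94 (ms per frame at 144fps)
# Hypothetical: a card that renders a 1080p frame in 3 ms, capped at 60fps,
# is only busy ~18% of the time:
print(round(gpu_busy_fraction(3.0, 60), 2))  # -> 0.18
```

That idle fraction is roughly why a capped 3080 runs cool: lower GPU utilization means lower power draw and lower temperatures, though the relationship isn't perfectly linear in practice.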


  • 2 weeks later...

So, since the benchmarks are out, what do you guys think? I've seen JayzTwoCents talk about how an 8700k will bottleneck the 3080 on a 1080p monitor. So:

 

Will my 8700k bottleneck a 3080?
Will a new 1440p monitor remove my bottleneck?

I'm legitimately terrified by the idea of upgrading my CPU, and I'm in no way thinking of doing it.


10 minutes ago, Consul said:

So, since the benchmarks are out, what do you guys think? I've seen JayzTwoCents talk about how an 8700k will bottleneck the 3080 on a 1080p monitor. So:

 

Will my 8700k bottleneck a 3080?
Will a new 1440p monitor remove my bottleneck?

I'm legitimately terrified by the idea of upgrading my CPU, and I'm in no way thinking of doing it.

He said that without any benchmarks. 

When clocked the same, the 8700k won't bottleneck the 3080 any worse than the 9900k/10700k/10900k at that resolution (and looking at the Ryzen data at 1080p, the 8700k will beat them all) in the games that don't give a shit about cores.

 

In games that do give a shit about cores, then yeah to a point it will.

 

Keep in mind, ALL cpus are gonna bottleneck a 3080 at 1080p, so that's a stupid angle.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


7 minutes ago, Consul said:

So, since the benchmarks are out, what do you guys think? I've seen JayzTwoCents talk about how an 8700k will bottleneck the 3080 on a 1080p monitor. So:

 

Will my 8700k bottleneck a 3080?
Will a new 1440p monitor remove my bottleneck?

I'm legitimately terrified by the idea of upgrading my CPU, and I'm in no way thinking of doing it.

Quite vague. It depends on the game.
I've had one game peg all cores at 100% and another peg only one at 100%.
1440p helps, but given how much better the new cards are than my old 1080, I'd expect the same potential.

But for me, hitting 100% and causing stutter is the biggest problem. Or using that same old GPU on my older CPU and drastically cutting frames.
So yes, there's a possibility of something.

Main RIg Corsair Air 540, I7 9900k, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090FE, EVGA 1000G5, Acer Nitro XZ3 2560 x 1440@240hz 

 

Spare RIg Lian Li O11 AIR MINI, I7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32Gb, EVGA 1080ti, 1080sc 1070sc & 1060 SSC, EVGA 850GA, Acer KG251Q 1920x1080@240hz

 


5 minutes ago, Mister Woof said:

Keep in mind, ALL cpus are gonna bottleneck a 3080 at 1080p, so that's a stupid angle.

Oh yeah that's an obvious one. I'll definitely upgrade to a 1440p 144hz monitor when I have the money.

 

5 minutes ago, Mister Woof said:

When clocked the same, the 8700k won't bottleneck the 3080 any worse than the 9900k/10700k/10900k at that resolution (and looking at the Ryzen data at 1080p, the 8700k will beat them all) in the games that don't give a shit about cores.

That's a relief to hear. Thanks!


7 minutes ago, Consul said:

Oh yeah that's an obvious one. I'll definitely upgrade to a 1440p 144hz monitor when I have the money.

 

That's a relief to hear. Thanks!

Look at this

 


 

Tom's Hardware 3080 review with CPU scaling.

 

There's a teeeny tiny difference between a Ryzen 5 3600 (6/12) and Ryzen 9 3900x (12/24). Is it from cores or frequency? Can't be sure, but I'm going with frequency. IIRC the 3900x generally has around a 200MHz higher clock when gaming than the 3600.

 

Look at the 9900k (8/16) and 10900k (10/20): no difference. If you take the 3600 and 3900x data (with a 50% core/thread delta) and ASSUME that the difference is based on core count, then even THEN the difference is negligible at 1440p. If you assume it's from frequency, then it's a wash and cores past 6/12 don't matter.

 

Take that data, apply it to an equally clocked 8700k (6/12) vs 9900k (8/16) vs 10900k (10/20), and there's your answer: between teeeny tiny in some games (based on the 3600 to 3900x delta) and virtually none (presuming that aforementioned delta is from frequency, not core count).

 

That doesn't mean ALL of these CPUs aren't bottlenecking the 3080. I believe they are. But it indicates the 8700k probably isn't any more of a bottleneck than anything else.
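The size of delta being eyeballed here is easy to put a number on yourself; a quick sketch (the FPS figures below are placeholders for illustration, not the review's actual numbers):

```python
# Percent difference between two chart results, to judge whether a gap is
# "teeny tiny" or meaningful. The FPS inputs here are made up for illustration.
def pct_delta(fps_base: float, fps_other: float) -> float:
    """Percent gain of fps_other relative to fps_base."""
    return (fps_other - fps_base) / fps_base * 100.0

# e.g. a 3600-style result vs a 3900x-style result on the same GPU:
print(round(pct_delta(140.0, 143.0), 1))  # -> 2.1 (percent)
```

A low single-digit percentage like that is within typical run-to-run variance for game benchmarks, which is the point being made about core count above 6/12.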

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


7 minutes ago, Mister Woof said:

That doesn't mean ALL of these CPUs aren't bottlenecking the 3080. I believe they are. But it indicates the 8700k probably isn't any more of a bottleneck than anything else.

Yeah, it totally makes sense.
 

Pardon my inexperience, but when I look at gaming benchmarks between the 9900k and 8700k with a 2080 (or a 2080ti), there isn't much of a difference either.
Essentially, newer CPUs from Intel don't have a huge performance gap compared to the 8700k, I believe.

So, from what I make of the information you've given, whatever happens to new Intel CPUs that are paired with a 3080 (like bottlenecks) will apply to me at the same scale, not worse.


8 minutes ago, Consul said:

Yeah, it totally makes sense.
 

Pardon my inexperience, but when I look at gaming benchmarks between the 9900k and 8700k with a 2080 (or a 2080ti), there isn't much of a difference either.
Essentially, newer CPUs from Intel don't have a huge performance gap compared to the 8700k, I believe.

So, from what I make of the information you've given, whatever happens to new Intel CPUs (like bottlenecks) will apply to me at the same scale, not worse.

They are all the same design: 6th through 10th generation are all Skylake and its refinements. There are no real IPC differences between them; that is, if you clocked them all the same with the same core configurations, they'd behave the same.

 

What DID change:

 

-Improved fabrication and yields, meaning each generation was capable of more and more frequency. 

-Increased core count, meaning more lateral performance and more cache.

-Improved thermal performance - 9th gen performance chips were soldered again and 10th gen performance chips were soldered + featured improved PCB/IHS designs.

-Integrated security fixes that could potentially reduce overhead of BIOS/OS level security fixes.

 

All that said, if you could get an i7-7700k to 5GHz and an i9-10900k to 5GHz, and your game only needed exactly four cores/threads, and assuming background overhead was excluded, they'd perform very, very close to one another.

 

So an 8700k at 5ghz vs a 10900k will be pretty damn close in games that don't care about cores beyond what the 8700k has.
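The "same architecture, same clock, enough cores" reasoning in this post can be reduced to a toy model (all inputs are hypothetical, and real games scale far less cleanly than this):

```python
# Toy model of game performance: IPC x clock x usable threads, where a game
# only benefits from cores up to its own thread-scaling limit. Numbers are
# hypothetical; real scaling is nowhere near this linear.
def game_perf(ipc: float, ghz: float, cores: int, game_core_limit: int) -> float:
    """Relative performance score under the simplistic linear model."""
    return ipc * ghz * min(cores, game_core_limit)

# 8700k (6c) vs 10900k (10c), same Skylake-class IPC, both at 5 GHz, in a
# game that only scales to 6 threads -> identical score under this model:
print(game_perf(1.0, 5.0, 6, 6) == game_perf(1.0, 5.0, 10, 6))  # -> True
```

The extra cores only show up in the model (and, roughly, in real benchmarks) once the game's own scaling limit exceeds what the smaller chip has.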

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


11 minutes ago, Mister Woof said:

So an 8700k at 5ghz vs a 10900k will be pretty damn close in games that don't care about cores beyond what the 8700k has.

So it comes down to the type of game. Thanks for the huge info dump, I really appreciate it!


On 9/6/2020 at 10:44 AM, Fasauceome said:

The 8700K, and any other Intel 6c/12t CPU within the last 3 release generations, is at the baseline for top tier gaming performance. They have the same single threaded performance (so, the fastest) and enough threads for good scaling in any game.

 

You can reasonably put any high end GPU with an 8700K.

Here's the video the OP references. Jay noticed that when benchmarking the 3080, the 8700k was bottlenecking it by 20fps. Now, this was at 1080p, and he recommended getting a 1440p monitor instead of a new CPU/motherboard for those in this situation.

But what about 1440p? I have an 8086k at 5GHz, but unlike the OP I have a 1440p monitor. And even I'm worried now that I'll have to upgrade my CPU/motherboard too.


On 9/16/2020 at 3:15 PM, Consul said:

So, since the benchmarks are out, what do you guys think? I've seen JayzTwoCents talk about how an 8700k will bottleneck the 3080 on a 1080p monitor. So:

 

Will my 8700k bottleneck a 3080?
Will a new 1440p monitor remove my bottleneck?

I'm legitimately terrified by the idea of upgrading my CPU, and I'm in no way thinking of doing it.

Let's pray together, my friend. When I watched Jay's video and he mentioned that, my heart sank. I have a 1440p monitor, but the idea that I would have to do a build upgrade for the 30 series when I have an OC'd 8086k terrifies me too. I would definitely get a 1440p monitor either way though, as that will be a major improvement for you. Let's hope that 1440p/Coffee Lake is safe with these cards.

