Is there any method to make integrated graphics and discrete GPU work together in a PC?


Posted · Original Poster

I'm looking to build a system with an Intel Core i5 10600K. I like the 10600K because it has a base clock of 4.1GHz, and I don't want to change my CPU. I really like this one. Now the thing is that this one has integrated graphics, which is Intel UHD 630. I will also put a discrete GPU in the system, and I'm looking at the RTX 2080 Super. Will the 10600K bottleneck an RTX 2080 Super? I don't think so. Also, is there any way to make the integrated graphics work with the RTX 2080 Super? I want this CPU because of its high base clock, but it has integrated graphics, and I want to get a discrete GPU for power. Is there any way to make them work together? Thanks to anyone trying to help.


It won't bottleneck. 

 

You can enable them both at once in the BIOS settings. In terms of getting them to work together, it depends on the workload. For games? Not really, at least, not in any way that's worth it. 

 

For some productivity applications that support multi-GPU processing, you can set them up to use both. Still, I'd advise against it; it can cause problems, for example applications using the iGPU when you want them to use the discrete GPU.
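If you do enable both and want to sanity-check which adapters Windows actually sees, a rough Python sketch like the one below would do it. This is just a sketch, assuming Windows 10 with the legacy wmic tool still on PATH; the exact BIOS option name (often something like "iGPU Multi-Monitor" or "Internal Graphics") varies by motherboard.

```python
# Rough sketch (assumption: Windows 10 with the legacy "wmic" tool on PATH).
# Lists every video controller Windows currently exposes, so you can confirm
# both the UHD 630 and the RTX 2080 SUPER show up after enabling the iGPU in BIOS.
import subprocess

def list_video_controllers() -> list[str]:
    result = subprocess.run(
        ["wmic", "path", "Win32_VideoController", "get", "Name"],
        capture_output=True, text=True, check=True,
    )
    # wmic prints a "Name" header followed by one controller per line
    return [line.strip() for line in result.stdout.splitlines()
            if line.strip() and line.strip() != "Name"]

if __name__ == "__main__":
    for name in list_video_controllers():
        print(name)
```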


Shouldn’t be a bottleneck. As for getting them to work together, if you mean will both work in the same system, then yes, both can be present. If you mean both working on the same task at the same time, it’s possible, but I wouldn’t advise it.


4 minutes ago, youngboy said:

I like the 10600K because it has a base clock of 4.1GHz, and I don't want to change my CPU.

Clock speed doesn't make a CPU. Incidentally, the 10600K is a solid chip (assuming you're building a gaming PC), but keep in mind that there's more to it than just the clock speed written on the box.

5 minutes ago, youngboy said:

Will the 10600K bottleneck an RTX 2080 Super?

Not in any current-gen title.

6 minutes ago, youngboy said:

Also, is there any way to make the integrated graphics work with the RTX 2080 Super?

Not in games. It wouldn't have made a noticeable difference anyway, since on its own the UHD 630 isn't really meant for anything more than video playback.

The one use for the iGPU would be QuickSync encoding, but even then it's not that big of a bonus imo.
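If you ever do want to put QuickSync to work, the usual route is handing an encode off to the iGPU through ffmpeg's h264_qsv encoder. Just a sketch: it assumes an ffmpeg build with QSV support, and the file names are placeholders.

```python
# Sketch only: offload an H.264 encode to the iGPU via Intel Quick Sync.
# Assumes ffmpeg was built with QSV support and the iGPU is enabled in BIOS.
# "input.mp4" / "output.mp4" are placeholder file names.
import subprocess

def quicksync_encode(src: str, dst: str) -> None:
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "h264_qsv",       # Quick Sync H.264 encoder (runs on the iGPU)
         "-global_quality", "23",  # quality-based rate control, roughly like CRF
         "-c:a", "copy",           # pass audio through untouched
         dst],
        check=True,
    )

if __name__ == "__main__":
    quicksync_encode("input.mp4", "output.mp4")
```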


Posted · Original Poster
32 minutes ago, Oshino Shinobu said:

You can enable them both at once in the BIOS settings. In terms of getting them to work together, it depends on the workload

This is going to be a gaming system for esports games. What is the BIOS setting for it?

 

33 minutes ago, Oshino Shinobu said:

Still, I'd advise against it; it can cause problems, for example applications using the iGPU when you want them to use the discrete GPU.

How do I switch the GPUs when I switch programs? I will use the 2080 Super during games and the iGPU for web browsing. Do I have to keep unplugging and replugging my monitor? Or should I get a monitor that supports multiple inputs? I kind of like this idea.

 

30 minutes ago, zeusthemoose said:

but I wouldn’t advise it.

Why?

 

30 minutes ago, Mateyyy said:

Incidentally, the 10600K is a solid chip

That's why I chose it

 

31 minutes ago, Mateyyy said:

but keep in mind that there's more to it than just the clock speed written on the box.

I'm not doing overclocking, and I really like the other specs of the CPU, which is why I chose it, but the main spec I liked was the base clock.

 

Question: How do I switch between the iGPU and the 2080 when I game vs. when I web browse? Do I have to keep unplugging and replugging the HDMI on the monitor? Should I get a monitor that supports multiple display inputs?

2 minutes ago, youngboy said:

Why?

From what I've heard, switching will be slow. It might even need a restart; I know that when you switch which GPU is being used on a laptop, it needs a restart. Why do you want to use the iGPU for web browsing?


Posted · Original Poster
1 hour ago, zeusthemoose said:

From what I've heard, switching will be slow. It might even need a restart; I know that when you switch which GPU is being used on a laptop, it needs a restart. Why do you want to use the iGPU for web browsing?

I mean it's enough for web browsing and watching YouTube videos, but gaming will put enough strain on the 2080, so I might as well give it some rest and not use it when I don't need to. At least I'd be making use of the iGPU. Here is what I will do: I will buy a monitor that supports multiple HDMI inputs, plug in an HDMI cable from the motherboard's HDMI port (using the iGPU) and an HDMI cable from the 2080, and switch between them. Will that work? What tasks should I use the iGPU for and what tasks should I use the discrete GPU for? Thanks.

36 minutes ago, youngboy said:

so I might as well give it some rest and not use it when I don't need to.

I get what you mean, but doing simple things like web browsing and watching YouTube uses around 3-4% of the GPU on average (staying mostly around 1% but spiking to 5% every so often). That's not enough to cause any kind of strain or negative effect on your GPU. It would be more hassle than it's worth to switch GPUs.
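If you want to see the numbers for yourself, you can log the 2080's utilisation while you browse with something like the quick sketch below. It assumes the Nvidia driver (and therefore the nvidia-smi tool) is installed and on PATH.

```python
# Quick sketch: poll dGPU utilisation every few seconds via nvidia-smi,
# so you can see how little load browsing/YouTube actually puts on the card.
# Assumes the Nvidia driver (and therefore nvidia-smi) is installed and on PATH.
import subprocess
import time

def gpu_utilisation_percent() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.strip().splitlines()[0])  # first (and only) GPU

if __name__ == "__main__":
    for _ in range(12):
        print(f"GPU utilisation: {gpu_utilisation_percent()}%")
        time.sleep(5)
```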


Posted · Original Poster
13 minutes ago, zeusthemoose said:

I get what you mean, but doing simple things like web browsing and watching YouTube uses around 3-4% of the GPU on average (staying mostly around 1% but spiking to 5% every so often). That's not enough to cause any kind of strain or negative effect on your GPU. It would be more hassle than it's worth to switch GPUs.

So what should I do with the iGPU built into this excellent chip?

6 minutes ago, youngboy said:

So what should I do with the iGPU built into this excellent chip?

I don’t use mine for anything. The chip itself is great, but the integrated GPU is okay at best.

Another thing I forgot to mention earlier: the integrated GPU uses your system memory as its VRAM when it’s being used, so by using it you would have less usable RAM.


Posted · Original Poster
15 minutes ago, zeusthemoose said:

Another thing I forgot to mention earlier: the integrated GPU uses your system memory as its VRAM when it’s being used, so by using it you would have less usable RAM.

I know, but I'm going for 16GB of RAM, so even if it uses some of my system memory, I don't care since I have plenty of it. I'm just not going to use the integrated graphics at all and always use the 2080. I think that's what I will do. Thanks.


In theory, an iGPU is normally in use on the desktop or in power-saving modes, especially on laptops. ;)


7 hours ago, youngboy said:

I'm not doing overclocking, and I really like the other specs of the CPU, which is why I chose it, but the main spec I liked was the base clock.

Here's an example: a 9900K has a base clock of 3.6GHz, while a 10600K has a base clock of 4.1GHz. That being said, these CPUs won't run at their base clock during load; they'll instead run at their all-core boost, which is 4.7GHz on the 9900K and 4.5GHz on the 10600K.

Base clock doesn't mean a whole lot nowadays, that's basically my point. Don't make your decision just based on that.

 

7 hours ago, youngboy said:

I mean it's enough for web browsing and watching YouTube videos, but gaming will put enough strain on the 2080, so I might as well give it some rest and not use it when I don't need to. At least I'd be making use of the iGPU. Here is what I will do: I will buy a monitor that supports multiple HDMI inputs, plug in an HDMI cable from the motherboard's HDMI port (using the iGPU) and an HDMI cable from the 2080, and switch between them. Will that work? What tasks should I use the iGPU for and what tasks should I use the discrete GPU for? Thanks.

I think you're worrying too much about what a modern GPU can handle. It would be pointless to take the display cable out of the graphics card, plug it into the motherboard, restart, and rinse and repeat every time you open or close a game. It really doesn't make any sort of sense.

Switching over from the iGPU to the dGPU does make sense in laptops, for power-saving reasons, but that's also implemented differently in those cases (Nvidia Optimus, for instance, is only featured in laptops, because that's where it actually has a use).


Posted · Original Poster
7 hours ago, Mateyyy said:

Here's an example: a 9900K has a base clock of 3.6GHz, while a 10600K has a base clock of 4.1GHz. That being said, these CPUs won't run at their base clock during load; they'll instead run at their all-core boost, which is 4.7GHz on the 9900K and 4.5GHz on the 10600K.

Base clock doesn't mean a whole lot nowadays, that's basically my point. Don't make your decision just based on that.

 

I have a laptop with a dual-core Celeron whose base clock is 1.1GHz. It always runs at 1.1GHz, even when it's under load, so that's why I thought the base clock is the normal max speed and the boost clock is the max speed from overclocking. Anyways, I still really like the 10600K.

 

8 hours ago, Nena Trinity said:

In theory, an iGPU is normally in use on the desktop or in power-saving modes, especially on laptops. ;)

I'm just going to forget about the iGPU and always use the 2080 Super.

1 minute ago, youngboy said:

I have a laptop with a dual-core Celeron whose base clock is 1.1GHz. It always runs at 1.1GHz, even when it's under load, so that's why I thought the base clock is the normal max speed and the boost clock is the max speed from overclocking. Anyways, I still really like the 10600K.

I know, it was that way on my old i5 too, but recent CPU generations have made the base clock kind of irrelevant with Turbo Boost.


Posted · Original Poster
Just now, Mateyyy said:

I know, it was that way on my old i5 too, but recent CPU generations have made the base clock kind of irrelevant with Turbo Boost.

So what does base and boost clock even mean?

3 minutes ago, youngboy said:

So what does base and boost clock even mean?

You'll probably only see the base clock if you run into thermal or power limitations.

 

The boost clock, or at least the actual advertised boost clock, refers to the speed your CPU (or in this case, one core) will run at during a single-core load, and this is the case for both Intel and AMD. During an all-core load, you'll see something around 300MHz less than that, depending on the CPU.

In the case of the 10600K, according to multiple reviews, it has a 4.5GHz all-core boost clock. So basically, during most workloads, your CPU will run at 4.5GHz by default.
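The easiest way to convince yourself is to just watch the clocks while something heavy runs. Below is a quick sketch using the third-party psutil package (pip install psutil); note that on some platforms cpu_freq() only reports a nominal value rather than the live clock.

```python
# Quick sketch: print the CPU clock a few times while you run a game or a
# benchmark, to see boost behaviour in practice instead of the box spec.
# Assumes the third-party psutil package; cpu_freq() may report only a
# nominal frequency on some platforms/drivers.
import time
import psutil

if __name__ == "__main__":
    for _ in range(10):
        freq = psutil.cpu_freq()  # MHz: current, min, max
        print(f"current: {freq.current:.0f} MHz "
              f"(min {freq.min:.0f}, max {freq.max:.0f})")
        time.sleep(2)
```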


Posted · Original Poster
12 minutes ago, Mateyyy said:

In the case of the 10600K, according to multiple reviews, it has a 4.5GHz all-core boost clock. So basically, during most workloads, your CPU will run at 4.5GHz by default.

So it can reach the boost clock without overclocking? Why don't laptop CPUs do that?

17 minutes ago, youngboy said:

So it can reach the boost clock without overclocking? Why don't laptop CPUs do that?

Yes.

Most do; Celerons don't, though, probably because they're lower-end.


Just now, youngboy said:

Will the i5 reach its boost clock?

As I said previously, the 10600K should run at 4.5GHz in most workloads by default.


20 minutes ago, youngboy said:

Will the i5 (the chip I want) reach its boost clock?

Without overclocking it will reach the boost clock on one core, but will run at 4.5GHz on all cores (like @Mateyyy said). If you do overclock, you might be able to get it on all cores, but that depends on your overclocking ability, cooling solution, and silicon quality.


Posted · Original Poster
1 minute ago, zeusthemoose said:

Without overclocking it will reach the boost clock on one core, but will run at 4.5GHz on all cores (like @Mateyyy said). If you do overclock, you might be able to get it on all cores, but that depends on your overclocking ability, cooling solution, and silicon quality.

Okay thanks
