
Intel 10-Core Comet Lake-S CPU Could Suck Up To 300W

https://www.tomshardware.com/news/intels-new-desktop-processor-draws-too-much-power

Quote

Intel might have finally hit a wall with Comet Lake-S.

Quote

As we've witnessed, the Core i9-9900K can draw over 200W of power when pushed to the edge. The Core i9-9900K only sports eight cores and a 4.7 GHz all-core boost though. The Core i9-10900K, which is the rumored flagship model in Intel's 10th-generation Comet Lake-S family, comes with two additional cores. The chip seemingly features a 4.9 GHz all-core boost. ComputerBase's sources claim that the forthcoming 10-core Comet Lake-S part can pull over 300W at maximum load. This information is plausible considering that the processor has a PL2 (Power Limit 2) of 250W.

Quote

Obviously, motherboard vendors are very upset with Intel at the moment. Apparently, the new Intel 400-series motherboards are ready to go, but Intel hasn't been able to optimize the 10-core processor's power draw for smooth operation. Initial speculation put Comet Lake-S's release date somewhere in February. As things stand right now, though, the launch might be pushed to April or May.

Quote

In actuality, the excessive power consumption could explain why Intel might not launch any 10-core mobile Comet Lake parts.

AMD cpu suck up to 300watts


 

Intel cpu suck up 300watts


 

300 watts of power consumption on 14nm comes with an egg-cooking feature, just like the GTX 480 back in the day. But wait, our CPU can cook up to 25% faster than the GTX 480! In addition, our 10-core CPU can double as a portable heater too. 


Watt the fuck

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


Granted, Skylake and its many, many derivatives have been good architectures for a while, but this is really starting to look pathetic. Not nearly as bad, but comparable to AMD's shitty FX refreshes that inched performance upward in exchange for workstation-class power draw.

Ryzen 5 1600 @ 3.8Ghz w/ Arctic Freezer 33 Tower Cooler | MSI B450 Tomahawk |  32GB Crucial Ballistix DDR4 3200MHz CAS 14 

Sapphire RX 5700XT Pulse | EVGA 650w GQ 80+ Gold Semi-Modular  |  XPG SX6000 512GB Nvme SSD | NZXT H500

Acer XF270HU - 1440p 144Hz FreeSync IPS | Corsair Strafe - Cherry MX Red  |  Logitech G502


I... can't say I care. If it performs better, I wouldn't care if it uses a little more power, provided the boards and coolers are available to sustain it. People are drawing comparisons to the 9590, but IIRC that chip consumed tons of power AND underperformed in every aspect. 

 

Watching a bunch more motherboards catch fire and melt as people put CPUs in boards they shouldn't will bring me some entertainment though lol

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


1 minute ago, dizmo said:

I... can't say I care. If it performs better, I wouldn't care if it uses a little more power, provided the boards and coolers are available to sustain it.

^^^ I'd care even less; my CPU already eats 300W or more lol. I don't think many enthusiasts buying the top-end SKU really care all that much about power consumption.

 

3 minutes ago, dizmo said:

Watching a bunch more motherboards catch fire and melt as people put CPUs in boards they shouldn't will bring me some entertainment though lol

Top 10 reasons for Intel not to have backwards compatibility with older mobos: it narrows the range of fire hazards that uninformed or idiotic people have access to. 

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM: 4x8GB HyperX Predator DDR4 @3200MHz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


6 minutes ago, dizmo said:

I... can't say I care. If it performs better, I wouldn't care if it uses a little more power, provided the boards and coolers are available to sustain it. People are drawing comparisons to the 9590, but IIRC that chip consumed tons of power AND underperformed in every aspect. 

 

Watching a bunch more motherboards catch fire and melt as people put CPUs in boards they shouldn't will bring me some entertainment though lol

idk how i feel about having to buy a Noctua NH-D15 cooler for a consumer CPU


1 minute ago, steelo said:

Is Intel even in the top 20 for best price to performance ratio anymore?

Would depend on where you're looking at performance numbers. In gaming they'd probably have at least one spot somewhere; outside of gaming, eeeeh... very few workloads. AVX-512 or something is an Intel specialty AFAIK; other than that, not really much. 

Although there's often more to the argument than sheer fps/$. 



With great power comes great thermal responsibility. 

 

 

Ryzen 7 2700x | MSI B450 Tomahawk | GTX 780 Windforce | 16GB 3200
Dell 3007WFP | 2xDell 2001FP | Logitech G710 | Logitech G710 | Team Wolf Void Ray | Strafe RGB MX Silent
iPhone 8 Plus | ZTE Axon 7 | iPad Air 2 | Nvidia Shield Tablet 32gig LTE | Lenovo W700DS


5 minutes ago, Zando Bob said:

Would depend on where you're looking at performance numbers. In gaming they'd probably have at least one spot somewhere; outside of gaming, eeeeh... very few workloads. AVX-512 or something is an Intel specialty AFAIK; other than that, not really much. 

Although there's often more to the argument than sheer fps/$. 

True, they do ever so slightly hold the gaming edge. But as far as consumer-grade CPUs go, they've got to be one of the worst in value.


16 minutes ago, dizmo said:

I... can't say I care. If it performs better, I wouldn't care if it uses a little more power, provided the boards and coolers are available to sustain it. People are drawing comparisons to the 9590, but IIRC that chip consumed tons of power AND underperformed in every aspect. 

 

Watching a bunch more motherboards catch fire and melt as people put CPUs in boards they shouldn't will bring me some entertainment though lol

Give me a 600W TDP...fuck if I care with a waterblock and car-sized radiator.

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


For context, my 8086K (6 cores) running Prime95 6x128K FFT with uncapped power drew about 120W. Scaling that to 10 cores (assuming the same clocks) would give 200W. The 8086K's all-core clock is 4.3 GHz; if Intel has increased the all-core boost even at 10 cores, that could account for the higher claimed consumption.
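That estimate can be sketched as a quick back-of-the-envelope calculation. The 120W measurement, core counts, and clocks come from the post; the voltage figure used for the boosted case is purely an assumption for illustration:

```python
# Back-of-the-envelope scaling of the 8086K measurement above.
measured_watts = 120.0    # 6-core 8086K, Prime95 6x128K FFT, power uncapped
measured_cores = 6
target_cores = 10

# Linear core scaling at the same clocks, as in the post: ~200 W.
same_clock_estimate = measured_watts * target_cores / measured_cores

# Dynamic power scales roughly with f * V^2. If the 10-core part runs a
# 4.9 GHz all-core boost instead of 4.3 GHz and needs ~8% more voltage to
# get there (the 8% is a guess, not a measured value), the estimate grows:
f_ratio = 4.9 / 4.3
v_ratio = 1.08
boosted_estimate = same_clock_estimate * f_ratio * v_ratio ** 2

print(f"{same_clock_estimate:.0f} W at the same clocks")
print(f"{boosted_estimate:.0f} W with the higher boost (assumed voltage)")
```

Even this crude sketch shows why a 300W-class figure isn't implausible once clocks and voltage climb past the efficiency sweet spot.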

 

Historically, Intel mainstream enthusiast systems (anyone pairing a K CPU with a Z-chipset board) would find mobos defaulting to unlimited turbo power even with MCE off. Intel doesn't consider this an overclock, since aside from removing the power limit it doesn't directly modify clock behaviour. In that scenario, with the right workload (Prime95 with smaller FFTs), you can pull quite a lot of power. People saying Prime95 isn't representative of most workloads: this is your time to shine! MCE on is a mobo-enabled overvolt/overclock, so it isn't considered here.
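The stock limiting behaviour being bypassed here can be illustrated with a toy simulation. PL1/PL2/Tau is Intel's documented turbo scheme, but the numbers and the simplified moving average below are illustrative, not taken from any real board:

```python
# Toy model of Intel's PL1/PL2/Tau turbo power limiting.
# Real firmware compares an exponentially-weighted moving average of
# package power against PL1; this is a simplified version of that idea.
def simulate(demand_w, pl1=125.0, pl2=250.0, tau=56.0, steps=120):
    avg = 0.0
    alpha = 1.0 / tau                     # EWMA weight per 1-second step
    delivered = []
    for _ in range(steps):
        cap = pl2 if avg < pl1 else pl1   # burst to PL2 until the average hits PL1
        p = min(demand_w, cap)
        avg += alpha * (p - avg)
        delivered.append(p)
    return delivered

trace = simulate(300.0)   # a constant 300 W all-core demand
# The chip bursts at PL2 first, then gets clamped back toward PL1.
```

A board that "defaults to unlimited turbo power" effectively sets PL1 = PL2 = a very large number, so the clamp never engages and the CPU just draws whatever the workload demands.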

 

AMD's behaviour is different, in that the PPT limit is honoured by the mobo by default. While there isn't a 10-core CPU to compare against, 10 is more than the 8 cores a single CCD supports, so let's assume the nearest comparison is a 3900X. Although it is a 105W-TDP part, the actual PPT limit is 142W. By default, it won't take more than 142W. Enabling PBO removes the power limit and is more comparable in operating state to Intel's behaviour. I have no idea what power you can draw under that condition. My attempts at testing with the 6-core 3600 were thermally limited, since the thermal density is higher than on Intel, so I couldn't hit the power limiter before the thermal limiter kicked in. One more difference to note: I believe AMD considers PBO an overclock. Please correct me if necessary. As another data point, when I was overclocking a 1700 (8 cores), it drew 180W at 4.0 GHz. Higher power draw isn't surprising if you push these CPUs hard enough.
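For reference, the 105W/142W pair quoted above follows AMD's usual AM4 ratio, where the stock socket power limit (PPT) is about 1.35x the advertised TDP:

```python
# AM4 stock power limits: PPT is roughly 1.35x TDP
# (the 1.35 factor matches the 105 W -> 142 W example above).
def ppt_from_tdp(tdp_w):
    """Estimate the stock AM4 package power limit from the advertised TDP."""
    return round(tdp_w * 1.35)

print(ppt_from_tdp(105))   # the 3900X case from the post
print(ppt_from_tdp(65))    # a 65 W part, e.g. the 3600 mentioned above
```

The same ratio gives 88W for 65W-TDP parts, which lines up with why the 3600 hit thermal limits well before any power limit.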

 

I have to wonder if a solution would be for mobo manufacturers to enforce a PL2 value suited to their board's power-delivery capacity. PL2 is the short-term boost power limit that enthusiast boards traditionally ignore. Overclockers who want to go all the way could then pick the boards supporting higher PL2 values.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


5 minutes ago, steelo said:

True, they do ever so slightly hold the gaming edge. But as far as consumer-grade CPUs go, they've got to be one of the worst in value.

Depends on your aims with the hardware. I paid $99 for my X99 Classy and $290 for my 5960X, it's both more fun to OC and cost me less than the $330 2700X and $290 Crosshair VII combo I had before. Since I like to OC and don't mind the power consumption or the fact that the CPU is used and the mobo is refurbished, it's a better value for me. 

Now if I were to take that same amount of money and put it into a Zen 2 system... well I wouldn't really get much better value. Could get... a 3700X on a B450 board? No quad channel RAM, cutting my PCIe lanes down to 20-24 IIRC (5960X is 40), lower quality board overall. The CPU would perform better but not really all that much better (they're almost equal with a 9900K right? My 5960X isn't too far off from a stock 9900K in the first place, and I only play at 75Hz). Also it wouldn't be all that fun to OC, the headroom is way lower. You can tinker with them in different ways though, but eeh I'm comfortable with how Intel works. Couldn't push much on an $80 or so board anyways I imagine, and IDK what used/refurbished higher end boards there are available around that price point (3700X is a bit more than what my 5960X cost me, so I'd only have $70-80 to spend on a mobo). 



13 minutes ago, porina said:

For context, my 8086K (6 cores) running Prime95 6x128K FFT with uncapped power drew about 120W. Scaling that to 10 cores (assuming the same clocks) would give 200W. The 8086K's all-core clock is 4.3 GHz; if Intel has increased the all-core boost even at 10 cores, that could account for the higher claimed consumption.

Curious as to how much power your particular sample uses with a 4.9 GHz all-core OC? That would paint a better apples-to-apples picture of how per-core power consumption scales.


What's shocking to me is that they're releasing a 10-core to combat a 16-core. Have fun hanging on at the back for the next 2 years!

 

Thought I'd never say this about Intel in my whole life, but stop selling second-grade, cut-down products when your competition is nearly two steps ahead in performance.

Details separate people.


Bulldozer 2: Electric Boogaloo.

AMD Ryzen 7 3700X | Thermalright Le Grand Macho RT | ASUS ROG Strix X470-F | 16GB G.Skill Trident Z RGB @3400MHz | EVGA RTX 2080S XC Ultra | EVGA GQ 650 | HP EX920 1TB / Crucial MX500 500GB / Samsung Spinpoint 1TB | Cooler Master H500M


33 minutes ago, porina said:

...

 

AMD's behaviour is different, in that the PPT limit is honoured by the mobo by default. While there isn't a 10-core CPU to compare against, 10 is more than the 8 cores a single CCD supports, so let's assume the nearest comparison is a 3900X. Although it is a 105W-TDP part, the actual PPT limit is 142W. By default, it won't take more than 142W. Enabling PBO removes the power limit and is more comparable in operating state to Intel's behaviour. I have no idea what power you can draw under that condition.

...

With PBO on motherboard limits, my 3900X has never gone beyond 165W PPT under stress-testing conditions, as far as I remember.

CPU: Ryzen 9 3900X | Cooler: Noctua NH-D15S | MB: Gigabyte X570 Aorus Elite | RAM: G.SKILL Ripjaws V 32GB 3600MHz | GPU: EVGA RTX 3080 FTW3 Ultra | Case: Fractal Design Define R6 Blackout | SSD1: Samsung 840 Pro 256GB | SSD2: Samsung 840 EVO 500GB | HDD1: Seagate Barracuda 2TB | HDD2: Seagate Barracuda 4TB | Monitors: Dell S2716DG + Asus MX259H  | Keyboard: Ducky Shine 5 (Cherry MX Brown) | PSU: Corsair RMx 850W


My next gaming laptop/PC will definitely NOT have an Intel CPU if this is the case.  My battery life is already mediocre; this would just make it worse.  And that fucking heat output!  UGH!

 

Sorry for the mess!  My laptop just went ROG!

"THE ROGUE":  ASUS ROG Zephyrus G15 GA503QR (2021)

  • Ryzen 9 5900HS
  • RTX 3070 Laptop GPU (80W)
  • 24GB DDR4-3200 (8+16)
  • 2TB SK Hynix NVMe (boot) + 2TB Crucial P2 NVMe (games)
  • 90Wh battery + 200W power brick
  • 15.6" 1440p 165Hz IPS Pantone display
  • Logitech G603 mouse + Logitech G733 headset

"Hex": Dell G7 7588 (2018)

  • i7-8750H
  • GTX 1060 Max-Q
  • 16GB DDR4-2666
  • 1TB SK Hynix NVMe (boot) + 2TB Crucial MX500 SATA (games)
  • 56Wh battery + 180W power brick
  • 15.6" 1080p 60Hz IPS display
  • Corsair Harpoon Wireless mouse + Corsair HS70 headset

"Mishiimin": Apple iMac 5K 27" (2017)

  • i7-7700K
  • Radeon Pro 580 8GB (basically a desktop R9 390)
  • 16GB DDR4-2400
  • 2TB SSHD
  • 400W power supply (I think?)
  • 27" 5K 75Hz Retina display
  • Logitech G213 keyboard + Logitech G203 Prodigy mouse

Other tech: Apple iPhone 14 Pro Max 256GB in White, Sennheiser PXC 550-II, Razer Hammerhead earbuds, JBL Tune Flex earbuds, OontZ Angle 3 Ultra, Raspberry Pi 400, Logitech M510 mouse, Redragon S113 keyboard & mouse, Cherry MX Silent Red keyboard, Cooler Master Devastator II keyboard (not in use), Sennheiser HD4.40BT (not in use)

Retired tech: Apple iPhone XR 256GB in Product(RED), Apple iPhone SE 64GB in Space Grey (2016), iPod Nano 7th Gen in Product(RED), Logitech G533 headset, Logitech G930 headset, Apple AirPods Gen 2 and Gen 3

Trash bin (do not buy): Logitech G935 headset, Logitech G933 headset, Cooler Master Devastator II mouse, Razer Atheris mouse, Chinese off-brand earbuds, anything made by Skullcandy


22 minutes ago, Tech_Dreamer said:

What's shocking to me is that they're releasing a 10-core to combat a 16-core. Have fun hanging on at the back for the next 2 years!

 

Thought I'd never say this about Intel in my whole life, but stop selling second-grade, cut-down products when your competition is nearly two steps ahead in performance.

you want 480w power draw? :P


The last AMD CPU I used was the Athlon II X4 635 Black Edition; maybe this year I should try AMD again?

AMD Ryzen 9 5900X - Nvidia RTX 3090 FE - Corsair Vengeance Pro RGB 32GB DDR4 3200MHz - Samsung 980 Pro 250GB NVMe m.2 PCIE 4.0 - 970 Evo 1TB NVMe m.2 - T5 500GB External SSD - Asus ROG Strix B550-F Gaming (Wi-Fi 6) - Corsair H150i Pro RGB 360mm - 3 x 120mm Corsair AF120 Quiet Edition - 3 x 120mm Corsair ML120 - Corsair RM850X - Corsair Carbide 275R - Asus ROG PG279Q IPS 1440p 165hz G-Sync - Logitech G513 Linear - Logitech G502 Lightsync Wireless - Steelseries Arctic 7 Wireless


3 hours ago, Harry P. Ness said:

AMD cpu suck up to 300watts

 

Intel cpu suck up 300watts

 

300 watts of power consumption on 14nm comes with an egg-cooking feature, just like the GTX 480 back in the day. But wait, our CPU can cook up to 25% faster than the GTX 480! In addition, our 10-core CPU can double as a portable heater too. 

300 watts on an AMD chip would produce the exact same amount of heat as 300 watts on an Intel chip.

MacBook Pro 16 i9-9980HK - Radeon Pro 5500m 8GB - 32GB DDR4 - 2TB NVME

iPhone 12 Mini / Sony WH-1000XM4 / Bose Companion 20

