
Intel to use solder TIM on some of their 9th gen CPUs

NumLock21
6 minutes ago, Stefan Payne said:

So why is it OK if Intel makes a CPU that consumes as much power as an FX-9590, but when AMD does it, it's not??

 

That is the problem I have right now: on one hand you defend 250W of power for the CPU, on the other hand you do not. Either you care about it or you don't...

 

I'm talking about HARDWARE fixes, not software mitigations/workarounds.


The FX series was a fucking flop and we all know it, that's why. The CPU doesn't pull anywhere near 250W, even with a mild OC. I care about the heat it dissipates, not the power consumption. Most of us own sufficiently powerful PSUs to power the CPU, so that's a non-issue for most people. Cooling is more of an issue, but it won't be with my customized liquid cooling setup.

Intel compromised performance for security when they plowed their way through the recent generations, but that is slowly getting worked on.


26 minutes ago, Stefan Payne said:

What about Power Consumption?
What About TDP??
The Intel CPUs we have right now already break the TDP, that isn't even worth the paper it is written on...

At least they should offer two values: One base value, one max. Turbo Value...

 


And their biggest Problem is still Meltdown and Spectre.


And _THAT_ is the most interesting point of that: Will those CPUs have anything done against Spectre/Meltdown or not?

Power consumption will be highly dependent on the load, the voltage, and the efficiency of the components that actually supply power to the CPU itself. This is true for Intel, AMD, IBM, VIA, Nvidia, Qualcomm, or any IC.

 

The TDP is 95W at base clock, as their own white papers specify. It doesn't break TDP as they define it, which is based on base clock. So according to their specs, they will be 95W at 3.6, 3.6, and 3.7GHz for the 9900K, 9700K, and 9600K respectively. Any turbo boost past those specs will be entirely dependent on the cooling.
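That "TDP is pinned to base clock" definition can be jotted down as a tiny lookup. The base clocks and 95W figure are the reported specs from this thread; the 4.7 GHz all-core turbo value is an illustrative assumption, not an Intel guarantee:

```python
# Intel defines TDP at base clock only; anything the Turbo does above base
# sits outside that envelope. Figures below are the reported specs from the
# announcement (treat as assumptions until confirmed on Intel's ARK pages).

SPECS = {  # model: (base_clock_ghz, tdp_w)
    "i9-9900K": (3.6, 95),
    "i7-9700K": (3.6, 95),
    "i5-9600K": (3.7, 95),
}

def tdp_applies(model: str, current_clock_ghz: float) -> bool:
    """The TDP number is only guaranteed at or below base clock."""
    base_clock, _tdp = SPECS[model]
    return current_clock_ghz <= base_clock

print(tdp_applies("i9-9900K", 3.6))  # True: at base clock, 95 W holds
print(tdp_applies("i9-9900K", 4.7))  # False: under turbo, no guarantee
```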


8 minutes ago, Dylanc1500 said:

Power consumption will be highly dependent on the load, the voltage, and the efficiency of the components that actually supply power to the CPU itself. This is true for Intel, AMD, IBM, VIA, Nvidia, Qualcomm, or any IC.

 

The TDP is 95W at base clock, as their own white papers specify. It doesn't break TDP as they define it, which is based on base clock. So according to their specs, they will be 95W at 3.6, 3.6, and 3.7GHz for the 9900K, 9700K, and 9600K respectively. Any turbo boost past those specs will be entirely dependent on the cooling.

Power consumption and TDP are different anyways. If they wanted a more realistic number, they would have to have a different set of standards. I mean, what would the standard be? Test all the CPUs at the max all-core turbo speed? I would suspect there is much higher variance in thermals at max all-core boost, based on the silicon lottery, whereas at base clock it likely doesn't matter as much.


I think the i7 9700 line will be monster gaming chips.

 

Bleigh!  Ever hear of AC series? 


1 minute ago, Brooksie359 said:

Power consumption and TDP are different anyways. If they wanted a more realistic number, they would have to have a different set of standards. I mean, what would the standard be? Test all the CPUs at the max all-core turbo speed? I would suspect there is much higher variance in thermals at max all-core boost, based on the silicon lottery, whereas at base clock it likely doesn't matter as much.

I know, that was why I had entirely separated the points. I won't speak on the other points as it gets real complicated real quick, and is out of the scope of this topic.


1 hour ago, Stefan Payne said:

So why is it OK if Intel makes a CPU that consumes as much power as an FX-9590, but when AMD does it, it's not??

To be fair, it was more because the FX-9590 initially released with a price over $800 AND consumed far more power than the competition while still losing in many benchmarks.

 

FX just wasn't all that competitive.

 

Nowadays with Threadripper, though, it's an entirely different story. They're efficient and extremely powerful, so no one complains.

Current Build:

CPU: Ryzen 7 5800X3D

GPU: RTX 3080 Ti FE

RAM: 32GB G.Skill Trident Z CL16 3200 MHz

Mobo: Asus Tuf X570 Plus Wifi

CPU Cooler: NZXT Kraken X53

PSU: EVGA G6 Supernova 850

Case: NZXT S340 Elite

 

Current Laptop:

Model: Asus ROG Zephyrus G14

CPU: Ryzen 9 5900HS

GPU: RTX 3060

RAM: 16GB @3200 MHz

 

Old PC:

CPU: Intel i7 8700K @4.9 GHz/1.315v

RAM: 32GB G.Skill Trident Z CL16 3200 MHz

Mobo: Asus Prime Z370-A


1 hour ago, Motifator said:

The CPU doesn't pull anywhere near 250W, even with a mild OC.

Sadly I can't link to the "Power Consumption vs. Overclocking" article that XBitlabs did a while ago. Although it's old, it proves you wrong.

 

And the modern i7-8700 is already at 160W max load.

https://www.tomshardware.com/reviews/amd-ryzen-7-2700x-review,5571-12.html

 

So with that, let's assume 160/6*8 -> ~213W

And that is at default....
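That 160/6*8 figure is a linear per-core extrapolation. A minimal sketch of the same arithmetic, assuming power scales linearly with core count at fixed clocks and voltage (a rough assumption that ignores binning and the uncore's fixed share of the budget):

```python
# Back-of-the-envelope scaling of the i7-8700's measured ~160 W max load
# from 6 cores to 8 cores, assuming power is proportional to core count.

def scale_power(measured_watts: float, measured_cores: int, target_cores: int) -> float:
    """Linear per-core extrapolation of package power."""
    per_core = measured_watts / measured_cores
    return per_core * target_cores

estimate = scale_power(160, 6, 8)
print(f"Estimated 8-core load power: ~{estimate:.0f} W")  # ~213 W
```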

1 hour ago, Motifator said:

I care about the heat it dissipates, not the power consumption.

Though both are the same, you know...

As it's impossible for energy to get lost, it has to be converted into another form.

And since there is no kinetic energy or light that the CPU can produce, almost 100% of the power it consumes is converted into heat.

 

1 hour ago, Motifator said:

Most of us own sufficiently powerful PSUs to power the CPU, so that's a non-issue for most people. Cooling is more of an issue, but it won't be with my customized liquid cooling setup.


Intel compromised performance for security when they plowed their way through the recent generations, but that is slowly getting worked on.

So you're saying that it's only an issue when one does it but not the other, right??

1 hour ago, Dylanc1500 said:

Power consumption will be highly dependent on the load, the voltage, and the efficiency of the components that actually supply power to the CPU itself. This is true for Intel, AMD, IBM, VIA, Nvidia, Qualcomm, or any IC.

Yes, obviously...

 

1 hour ago, Dylanc1500 said:

The TDP is 95W at base clock as their own white papers specify. It doesn't break TDP as they define it.

But you agree that the "Intel TDP" these days is just bullshit that isn't worth the paper it is written on, as it doesn't really define anything.

 

Because the base clock is almost never what the CPU actually runs at.

 

And if I were mean, I'd link to the TDP specifications of the Noctua NH-L9i and compare them to the AMD variant. THAT would be really mean.

You know why? 

Because Noctua states that you have to disable Turbo on the 8xW Haswells for it to operate, and that the 95W TDP CPUs don't work with the heatsink. While on the AMD side, they state that even 105W TDP CPUs (2700X) are possible...

1 hour ago, Dylanc1500 said:

Which is based on base clock. So according to their specs, they will be 95W at 3.6, 3.6, and 3.7GHz for the 9900K, 9700K, and 9600K respectively. Any turbo boost past those specs will be entirely dependent on the cooling.

...and that's why it's bullshit: nobody uses the CPU at base clock, because the Turbo is always on and boosting to something else.

 

So that means it's a bullshit specification with no value, as it doesn't really specify anything.


It's like specifying a Pentium D 840 at 65W TDP...

"Hell is full of good meanings, but Heaven is full of good works"


3 hours ago, seoz said:

AMD is really making Intel their b*tch right now, and I am admittedly enjoying this show.

Yep, me too

 

intel >> aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

amd >> 32 cores

intel >> aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

CPU: Core i9 12900K || CPU COOLER : Corsair H100i Pro XT || MOBO : ASUS Prime Z690 PLUS D4 || GPU: PowerColor RX 6800XT Red Dragon || RAM: 4x8GB Corsair Vengeance (3200) || SSDs: Samsung 970 Evo 250GB (Boot), Crucial P2 1TB, Crucial MX500 1TB (x2), Samsung 850 EVO 1TB || PSU: Corsair RM850 || CASE: Fractal Design Meshify C Mini || MONITOR: Acer Predator X34A (1440p 100hz), HP 27yh (1080p 60hz) || KEYBOARD: GameSir GK300 || MOUSE: Logitech G502 Hero || AUDIO: Bose QC35 II || CASE FANS : 2x Corsair ML140, 1x BeQuiet SilentWings 3 120 ||

 

LAPTOP: Dell XPS 15 7590

TABLET: iPad Pro

PHONE: Galaxy S9

She/they 


11 hours ago, Motifator said:


If you could buy a 9900K, you can cope with its power consumption. Heat will be handled better this time around due to the solder and the more refined architecture compared to Skylake-X.

Yeah, no. People are going to buy chips and stick them in SFF builds or prebuilts based upon the TDP rating. The solder and 'process refinements' may mean the heat ramps up slightly slower, but if the cooler doesn't have the capacity, it won't make a difference long-term.


Does anyone have any knowledge about the CPU caches here?

 

I'm curious how the i7 and i5 have 1.5MB per thread/core, but the i9 has only 1MB per thread (2MB per core). Should the i9 really be rocking a bit more cache, or does the amount really scale with the cores rather than the threads?
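For reference, the arithmetic behind those per-core/per-thread figures, using the commonly reported L3 sizes (16 MB for the 9900K, 12 MB for the 9700K, 9 MB for the 9600K — treat these as reported specs, not confirmed numbers):

```python
# Per-core and per-thread L3 cache for the announced 9th-gen parts.
# Cache sizes and core/thread counts are the commonly reported specs.

parts = {
    "i9-9900K": {"l3_mb": 16, "cores": 8, "threads": 16},
    "i7-9700K": {"l3_mb": 12, "cores": 8, "threads": 8},
    "i5-9600K": {"l3_mb": 9,  "cores": 6, "threads": 6},
}

for name, p in parts.items():
    per_core = p["l3_mb"] / p["cores"]
    per_thread = p["l3_mb"] / p["threads"]
    print(f"{name}: {per_core:.2f} MB/core, {per_thread:.2f} MB/thread")
```

With HT on the i9, the same 2 MB/core works out to 1 MB/thread, which is where the apparent "less cache" comes from.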


I love how they promote things, lol. Getting STIM shouldn't be this big ol' feature, and oh boy, it's an i9!? Did you call it an i9 just so you can throw another "first" in the list? Why did it have to be an i9? I need a refresher on how they name chips. xP

- Fresher than a fruit salad.


Good, maybe finally a reason to replace my 2600K xD

Desktop:     Core i7-9700K @ 5.1GHz all-core = ASRock Z390 Taichi Ultimate = 16GB HyperX Predator DDR4 @ 3600MHz = Asus ROG Strix 3060ti (non LHR) = Samsung 970 EVO 500GB M.2 SSD = ASUS PG279Q

 

Notebook:  Clevo P651RG-G = Core i7 6820HK = 16GB HyperX Impact DDR4 2133MHz = GTX 980M = 1080p IPS G-Sync = Samsung SM951 256GB M.2 SSD + Samsung 850 Pro 256GB SSD


i7 without hyperthreading? So we're taking two steps backward with the i7 line to take one step forward with the i9 line?

 

NEAT!

9900K  / Noctua NH-D15S / Z390 Aorus Master / 32GB DDR4 Vengeance Pro 3200Mhz / eVGA 2080 Ti Black Ed / Morpheus II Core / Meshify C / LG 27UK650-W / PS4 Pro / XBox One X


23 hours ago, OriAr said:

40 PCI Lanes and native TB3 support intrigue me to say the least.

 

23 hours ago, TVwazhere said:

Uhm, hold on...

 

40 PCI-E lanes?

 

22 hours ago, NumLock21 said:

It's 40 Platform PCIe Lanes


 

Yeah, 24 from the chipset, so it only has 16 from the CPU. What a letdown.

if you want to annoy me, then join my teamspeak server ts.benja.cc


22 hours ago, Stefan Payne said:

So why is it OK if Intel makes a CPU that consumes as much power as an FX-9590, but when AMD does it, it's not??

 

That is the problem I have right now: on one hand you defend 250W of power for the CPU, on the other hand you do not. Either you care about it or you don't...

 

I'm talking about HARDWARE fixes, not software mitigations/workarounds.

You do know the new 2990WX uses over 350 watts WHEN IDLE if you run 3.8+ GHz on all cores, right? They have shown that when overclocked to 4GHz on all cores, it is capable of pulling a little over 900 watts from the wall, on a chip rated for only 250 watts. I mean, hell, it pulls more wattage at idle than an overclocked i7 or R7.

 

So both sides are guilty of fudging their TDP, and chances are if we ran these stock, without any BIOS settings adding MCE or the like... then they would fall pretty close to those numbers. The problem is, with how motherboards are configured from the factory, this rarely happens on these chips.

 

Anyways, my other point is that people aren't really giving either of them a hard time atm. The FX-9590 was hot, power hungry, and offered performance that was not proportional to those numbers. I think that is the key difference.


23 hours ago, NumLock21 said:

...The 9900K will be the very first mainstream i9 CPU for the desktop...

Which begs the question: what does i9 even mean?  For a long time there, we had a decent system: i3 was 2/4, i5 was 4/4, and i7 was 4/8... except when it wasn't.  It was 4/8 on the mainstream, but enthusiast chips were anything higher.  To me, the most obvious thing to do is to throw all those enthusiast chips under the i9 branding, but when it came out, that's not what it was, and to be honest I still don't understand what differentiates an i7 from an i9.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


2 minutes ago, AngryBeaver said:

You do know the new 2990WX uses over 350 watts WHEN IDLE if you run 3.8+ GHz on all cores, right? They have shown that when overclocked to 4GHz on all cores, it is capable of pulling a little over 900 watts from the wall, on a chip rated for only 250 watts. I mean, hell, it pulls more wattage at idle than an overclocked i7 or R7.

 

So both sides are guilty of fudging their TDP, and chances are if we ran these stock, without any BIOS settings adding MCE or the like... then they would fall pretty close to those numbers. The problem is, with how motherboards are configured from the factory, this rarely happens on these chips.

 

Anyways, my other point is that people aren't really giving either of them a hard time atm. The FX-9590 was hot, power hungry, and offered performance that was not proportional to those numbers. I think that is the key difference.

That TDP, like any chip's, is given for the stock configuration.  If and when you overclock, obviously that goes out the window.  Couple that with the fact that different companies rate TDP in different ways, and it's not surprising in the least that the 250 figure doesn't mean much.  Anyone paying attention would have expected ~600 W overclocked and already knew it was going to be meaningless.



4 minutes ago, Ryan_Vickers said:

Which begs the question: what does i9 even mean?  For a long time there, we had a decent system: i3 was 2/4, i5 was 4/4, and i7 was 4/8... except when it wasn't.  It was 4/8 on the mainstream, but enthusiast chips were anything higher.  To me, the most obvious thing to do is to throw all those enthusiast chips under the i9 branding, but when it came out, that's not what it was, and to be honest I still don't understand what differentiates an i7 from an i9.


It's a moniker, but generally i9 tends to mean more muscle as far as core count goes. It has been a thing since Skylake-X; people talked about this when the 7900X came out as well.


1 minute ago, Ryan_Vickers said:

That TDP, like any chip's, is given for the stock configuration.  If and when you overclock, obviously that goes out the window.  Couple that with the fact that different companies rate TDP in different ways, and it's not surprising in the least that the 250 figure doesn't mean much.  Anyone paying attention would have expected ~600 W overclocked and already knew it was going to be meaningless.

I agree. Apparently, though, some people in this thread expect a 90W chip to never break 90 watts, when the truth is those 90W chips tossed into a motherboard with MCE on by default will pull around 130W+ under load right off the bat.

 

I think wattage ratings are only good for giving you an idea of how power hungry a chip is going to be. Obviously a 90W CPU is going to have higher clocks than, say, a 15W chip.


23 hours ago, Motifator said:


The FX series was a fucking flop and we all know it, that's why. The CPU doesn't pull anywhere near 250W, even with a mild OC. I care about the heat it dissipates, not the power consumption. Most of us own sufficiently powerful PSUs to power the CPU, so that's a non-issue for most people. Cooling is more of an issue, but it won't be with my customized liquid cooling setup.

Intel compromised performance for security when they plowed their way through the recent generations, but that is slowly getting worked on.

FX are alright chips, it just depends what you're using them for. People complain about heat issues with the FX series, but check this shit out:

[attached CoreTemp screenshot]

No heat issues for me, and I'm only using a Zalman CNPS5X Performa with Thermal Grizzly Kryonaut paste. I kept it at stock speeds because otherwise it wouldn't represent the normal temperatures of an FX-6300 out of the box.

There are 10 types of people in this world. Those that understand binary and those that don't.

Current Rig (Dominator II): 8GB Corsair Vengeance LPX DDR4 3133 C15, AMD Ryzen 3 1200 at 4GHz, Coolermaster MasterLiquid Lite 120, ASRock B450M Pro4, AMD R9 280X, 120GB TCSunBow SSD, 3TB Seagate ST3000DM001-9YN166 HSD, Corsair CX750M Grey Label, Windows 10 Pro, 2x CoolerMaster MasterFan Pro 120, Thermaltake Versa H18 Tempered Glass.

 

Previous Rig (Black Magic): 8GB DDR3 1600, AMD FX6300 OC'd to 4.5GHz, Zalman CNPS5X Performa, Asus M5A78L-M PLUS /USB3, GTX 950 SC (former, it blew my PCIe lane so now on mobo graphics which is Radeon HD 3000 Series), 1TB Samsung Spinpoint F3 7200RPM HDD, 3TB Seagate ST3000DM001-9YN166 HDD (secondary), Corsair CX750M, Windows 8.1 Pro, 2x 120mm Red LED fans, Deepcool SMARTER case

 

My secondary rig (The Oldie): 4GB DDR2 800, Intel Core 2 Duo E8400 @ 3GHz, Stock Dell Cooler, Foxconn 0RY007, AMD Radeon HD 5450, 250GB Samsung Spinpoint 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management. UPDATE: SPECS UPGRADED DUE TO CASEMOD, 8GB DDR2 800, AMD Phenom X4 9650, Zalman CNPS5X Performa, Biostar GF8200C M2+, AMD Radeon HD 7450 GDDR5 edition, Samsung Spinpoint 250GB 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management and support for non Dell boards.

 

Retired/Dead Rigs: The OG (retired) (First ever PC I used at 3 years old back in 2005) Current Specs: 2GB DDR2, Pentium M 770 @ 2.13GHz, 60GB IDE laptop HDD, ZorinOS 12 Ultimate x86. Originally 512mb DDR2, Pentium M 740 @ 1.73GHzm 60GB IDE laptop HDD and single boot XP Pro. The Craptop (dead), 2gb DDR3, Celeron n2840 @ 2.1GHz, 50GB eMMC chip, Windows 10 Pro. Nightrider (dead and cannibalized for Dominator II): Ryzen 3 1200, Gigabyte A320M HD2, 8GB DDR4, XFX Ghost Core Radeon HD 7770, 1TB Samsung Spinpoint F3 (2010), 3TB Seagate Barracuda, Corsair CX750M Green, Deepcool SMARTER, Windows 10 Home.


11 hours ago, Aetheria said:

Yeah, no. People are going to buy chips and stick them in SFF builds or prebuilts based upon the TDP rating. The solder and 'process refinements' may mean the heat ramps up slightly slower, but if the cooler doesn't have the capacity, it won't make a difference long-term.

Exactly!

And that's also what you see when you go to the Noctua website and look up which CPUs each heatsink is approved for. You can choose any low-profile Noctua heatsink.

https://noctua.at/en/tdp-guide

 

I've mentioned the L9i and L9a (yes, they are a bit different, but that doesn't affect the cooling performance much).

 

 

8 hours ago, Aetheria said:

No, it is Intel's fault for falsely advertising it as something suitable for SFF builds and low-TDP coolers.

Well, technically it's not false advertising, as Intel specifies the TDP at base clock.

That only makes the specification useless or bullshit, as it only looks at part of the CPU's consumption.

 

So they shouldn't have called it TDP. It's the same shit AMD did with the Socket F Opterons in the past when they invented ACP. The way Intel specifies the TDP is similar to that...

 

And that means that with Turbo they can violate the TDP however they want, and we should criticize and call that out, NOT defend it!

 

46 minutes ago, Ryan_Vickers said:

That TDP, like any chip, is given for stock configuration.  If and when you overclock, obviously that goes out the window.

And that is the problem with the Intel TDP: it doesn't even apply to the stock configuration. It applies to the stock configuration only if you disable Turbo. But Turbo is part of the stock configuration, and at least the power consumption with Turbo should have been mentioned somewhere in the specification of the CPU! But it's not.

 

Um, what the?! Srsly?!

*ARGH*....

 

 

So what Intel is saying is that the CPU, with Turbo deactivated, consumes around the specified TDP mark.

With Turbo anything goes. And that makes this TDP pretty much useless and problematic...

 

While the other side seems to take power consumption with Turbo somewhat into account...



1 hour ago, AngryBeaver said:

I agree. Apparently, though, some people in this thread expect a 90W chip to never break 90 watts.

No, but it should stay somewhere in that ballpark.

And that is what doesn't happen.

 

And we're talking more about 65W TDP than 90W, and even then it should be somewhere in the ballpark, not off by 75% or more...

And that is what should be criticized, because you should be able to use the TDP for something, but right now you can't really do that...

 

 

1 hour ago, AngryBeaver said:

When the truth is those 90W chips tossed into a motherboard with MCE on by default will pull around 130W+ under load right off the bat.

And that is why I'm saying Intel should mention the power consumption at max all-core Turbo, or while turboing, and not just the TDP at the base frequency.

 

But that might look bad on paper if you write 65W/150W TDP...

Because back in the olden days you'd specify the higher wattage or something in the middle but closer to the max.

 

1 hour ago, AngryBeaver said:

I think wattage ratings are only good for giving you an idea of how power hungry a chip is going to be. Obviously a 90W CPU is going to have higher clocks than, say, a 15W chip.

Yes of course.

And the problem begins when you really are power limited, have to take the power consumption into account, and need the CPU to not break the TDP - like in an Antec ISK 110, for example.

 

There is a reason why I'm always talking about the NH-L9i and L9a versions: there are rigs where you can only use those and nothing else, because that is the biggest heatsink you can fit in the case. In those cases you have a power supply that only does 5A on +12V (6A on 5V)...
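The SFF constraint is easy to put in numbers: 5 A on +12 V gives only 60 W. A quick check, assuming for illustration that the CPU draws essentially all its power from the +12 V rail and ignoring VRM efficiency losses:

```python
# Can a given CPU power draw fit an Antec ISK 110-class PSU's +12 V rail?
# Simplifying assumptions: CPU fed entirely from +12 V, no VRM losses.

def rail_budget_w(amps: float, volts: float = 12.0) -> float:
    """Wattage available on a rail rated for `amps` at `volts`."""
    return amps * volts

def fits(cpu_draw_w: float, amps: float = 5.0) -> bool:
    """True if the CPU's draw stays within the rail budget."""
    return cpu_draw_w <= rail_budget_w(amps)

print(rail_budget_w(5.0))  # 60.0 W available
print(fits(65))            # False: even a "65 W TDP" figure is over budget
print(fits(150))           # False: turbo-level draw is far over
```

Which is exactly why a TDP number that only covers base clock is dangerous for builds like this.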



4 minutes ago, Ryan_Vickers said:

Which begs the question: what does i9 even mean?  For a long time there, we had a decent system: i3 was 2/4, i5 was 4/4, and i7 was 4/8... except when it wasn't.  It was 4/8 on the mainstream, but enthusiast chips were anything higher.  To me, the most obvious thing to do is to throw all those enthusiast chips under the i9 branding, but when it came out, that's not what it was, and to be honest I still don't understand what differentiates an i7 from an i9.

The difference between a mainstream 9th gen Core i7 and Core i9 is that the Core i9 has HT and the Core i7 does not. What you have to do is forget about all of the Core i7s that came out before, including 8th gen, and remember that the HEDT lineup will never cross paths with the mainstream; that way you won't confuse yourself.

 

e.g. #1

Intel's 9th gen CPUs have a Core i5, Core i7, and Core i9. Both the Core i9 and Core i7 have 8 cores, and the Core i5 only has 6 cores. The Core i9 supports HT, while the Core i7 and Core i5 do not.

In Intel's HEDT lineup, both the Core i7 and Core i9 have HT. The Core i9 has more cores than the Core i7, and it also has more PCIe lanes.

 

e.g. #2

Intel 9th gen CPUs have Core i5, i7 and i9. The Core i7 and i9 both have 8 cores; the 8th gen i7 only has 6 cores. The 9th gen i9 has HT and the i7 does not, but the 8th gen i7 has HT too.

8th gen i7 has less core, but has HT. 9th gen i7 has more cores but no HT. Core i7 8700K faster than Core i7 9700K. Core i9 9900K faster than Core i7 8700K. Core i7 7700k 6700k all have HT so they must be equal to Core i7 9700K. Core i9 9900K is the same as Core i7 7820X, but core i9 9900K has to be better than Core i7 7820X because 9 is higher than 7 and higher number is always better.

 

When it comes to Intel cpus, if your mind is thinking like #2, then it will always be confusing.

7 minutes ago, The Benjamins said:

Ya, 24 from the chipset, so it only has 16 from the CPU what a let down.

Meh, what can you do, it's marketing. At least they didn't make up some cringy name for the PCIe lanes.

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 

