
Intel 12th Gen Core Alder Lake for Desktops: Top SKUs Only, Coming November 4th +Z690 Chipset

Lightwreather

Summary

Intel 12th Gen Core Alder Lake for Desktops: Top SKUs Only, Coming November 4th

 

Quotes

Quote

The first things we’ll go into are the new CPUs that Intel is announcing today: the overclockable models of Intel 12th Gen Core. As with previous launches, we have Core i9, Core i7, and Core i5, with the key highlights including new support for DDR5, PCIe Gen 5, new overclocking features, and a change in how Intel is promoting its Thermal Design Power (TDP).


All processors will come with 16 lanes of PCIe 5.0 from the processor, and an additional 4 lanes of PCIe 4.0 for storage. Memory support is listed as both DDR5-4800 and DDR4-3200, although systems will only support one or the other, for a maximum of 128 GB. The K processors also feature 32 EUs of Intel’s Xe-LP graphics, designated as UHD Graphics 770. Prices will start at $264 for the base Core i5 model, up to $589 for the top Core i9 model.

Inside each processor, alongside the 16x PCIe 5.0 lanes for add-in cards and 4x PCIe 4.0 lanes for storage, is an additional link to the chipset. Intel lists this as a DMI 4.0 x8 link, as they use a custom protocol over an effective PCIe physical connection – we asked Intel, and they said the link is rated for 15.76 GB/s, which means the chipset can take two PCIe 4.0 x4 drives at peak before getting near to that limit. This is doubled compared to Z590, which was only 7.88 GB/s.

Today Intel is only announcing its Z690 chipset, built on Intel’s 14nm, and the motherboard manufacturers have 60+ models to launch in the upcoming week. The processors use a new LGA1700 socket, which means everyone buying the new CPUs also needs a new motherboard.
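For anyone who wants to sanity-check the DMI figure quoted above, it falls straight out of the per-lane PCIe rates. A minimal sketch in Python, assuming the nominal 16 GT/s Gen4 signalling rate and 128b/130b encoding (the small gap to Intel's 15.76 GB/s figure is just rounding):

def lane_gbps(gt_per_s, encoding=128 / 130):
    # Usable GB/s per PCIe lane: signalling rate x encoding efficiency / 8 bits per byte
    return gt_per_s * encoding / 8

dmi_z690 = 8 * lane_gbps(16.0)  # DMI 4.0 x8 link      -> ~15.75 GB/s
dmi_z590 = 8 * lane_gbps(8.0)   # DMI 3.0 x8 link      -> ~7.88 GB/s
gen4_ssd = 4 * lane_gbps(16.0)  # one PCIe 4.0 x4 SSD  -> ~7.88 GB/s

print(f"Z690 DMI {dmi_z690:.2f} GB/s, Z590 DMI {dmi_z590:.2f} GB/s, Gen4 x4 SSD {gen4_ssd:.2f} GB/s")
# Two Gen4 x4 SSDs (~15.75 GB/s combined) sit right at the Z690 DMI ceiling, matching the quote.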

Quote

In the past, Intel promoted its processor power as a single number: TDP (Thermal Design Power*). The issue wasn’t so much that this number was wrong, it was that it lacked a massive amount of context that wasn’t communicated to anyone. Arguably it took us several years to find out what it really meant, especially in relation to its turbo.

That changes with Alder Lake. Intel is now acknowledging that its turbo mode does indeed have a power increase, and is providing that number alongside the regular set of numbers. To that end, the base ‘TDP’ number of previous generations is gone, and we get two numbers to talk about:

  • Processor Base Power (Base): Guaranteed Peak Power at Base Frequency
  • Maximum Turbo Power (Turbo): The Maximum Power at full turbo mode that is in spec

So for example, the Processor Base Power (Base) for the Core i9-12900K is set at 125 W. The Maximum Turbo Power is 241 W. This means that systems using this processor will be able to boost up to 241 W if the system is set up to do so, and that is within specification.
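For context, Processor Base Power and Maximum Turbo Power map onto the PL1/PL2 limits discussed later in the thread. A toy sketch of how the classic PL1/PL2/tau mechanism behaves, purely as an illustration of the concept rather than Intel's actual governor (the tau value is made up, and whether boards ship the 12900K with PL1 raised all the way to 241 W is firmware-dependent):

# Toy model of the long-standing PL1/PL2/tau behaviour, not Intel's real algorithm.
PL1 = 125.0  # W, "Processor Base Power" for the i9-12900K
PL2 = 241.0  # W, "Maximum Turbo Power"
TAU = 56.0   # s, turbo time window -- illustrative only, firmware can change it

def allowed_package_power(avg_power_w, seconds_above_pl1):
    # Short bursts may draw up to PL2; once the averaged power has exceeded PL1
    # for longer than tau, the CPU has to fall back toward PL1.
    if avg_power_w > PL1 and seconds_above_pl1 > TAU:
        return PL1
    return PL2

print(allowed_package_power(200.0, 10.0))   # early in a heavy load -> 241.0
print(allowed_package_power(200.0, 120.0))  # sustained heavy load  -> 125.0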

My thoughts

Hey, it's finally here. Intel has finally announced its Alder Lake CPUs, and it seems the leakers were spot on with their date prediction. Now onto the CPUs themselves. They are configured in a rather interesting way: all of the announced SKUs have 4 or 8 E cores paired with 6 or 8 P cores. Whether these processors will stand up against AMD seems likely based on leaked performance numbers, but we'll have to wait for independent reviews to know for sure. All in all, I wish you the best, Intel; let the flame of competition grow. Now, all of this is great, but what really caught my attention is Intel changing how they communicate power consumption: in a commendable move, they've started listing base and maximum turbo power. This is much better than the TDP figure they used before, imo.

Sources

AnandTech

"A high ideal missed by a little, is far better than low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; Being wrong helps you learn what's right.


15 minutes ago, J-from-Nucleon said:

although systems will only support one or the other, for a maximum of 128 GB

Bummer, I see no reason to go with DDR5 for a desktop if not to take advantage of the higher density DIMMs.
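Worth noting that with both memory types capped at the same 128 GB, the headline difference on this platform is peak bandwidth rather than capacity. A quick back-of-the-envelope comparison in Python (theoretical dual-channel peaks; real-world numbers will be lower and depend heavily on timings):

def dual_channel_gb_s(mt_per_s, channels=2, bytes_per_transfer=8):
    # Peak theoretical bandwidth: transfers/s x 8 bytes per 64-bit transfer x channels
    return mt_per_s * bytes_per_transfer * channels / 1000

print(dual_channel_gb_s(4800))  # DDR5-4800 -> 76.8 GB/s
print(dual_channel_gb_s(3200))  # DDR4-3200 -> 51.2 GB/s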



6 minutes ago, igormp said:

Bummer, I see no reason to go with DDR5 for a desktop if not to take advantage of the higher density DIMMs.

Might be due to them trying to keep the corporate and home markets separate. Really hope Intel can grow out of this phase.

 

 

 


32 minutes ago, J-from-Nucleon said:

So for example, the Processor Base Power (Base) for the Core i9-12900K is set at 125 W. The Maximum Turbo Power is 241 W

lol RIP, so they still haven't solved the massive peak power issue to get maximum performance. Really hope someone does some PL2 performance scaling tests from 125W through to that 240W, at 10W or 15W increments.


2 minutes ago, leadeater said:

 

Thanks Steve

Love the description.

Quote

Intel today announced its official specs, pricing, and release date for the Intel Core i9-12900K, Core i7-12700K, and Core i5-12600K. As of now, the company hasn’t officially announced the i5-12400 or other variations (aside from the 12900KF, 12700KF, and 12600KF without the IGP). The core counts are the most interesting, since Intel is splitting between performance cores and efficient cores, but frequency also gets light detailing. The most obvious competitors will be the 12900K vs. the 5900X (et al.), the 12600K vs. the R5 5600X, and the 12700K versus a pile of sand (the 11700K and 5800X).

 

 

 

 

 

 

I really can't wait to see real performance numbers. They look good, but with them being tested on Windows 11 there could be a big performance difference between the quoted numbers and the real ones...



25 minutes ago, Forbidden Wafer said:

Uh, 20 PCI-E lanes?

it's 20 user-accessible PCIe lanes (the x16 5.0 slot plus x4 4.0 for M.2), and 28 total once you count the DMI 4.0 x8 link to the chipset

 

So they finally stopped lying about the TDPs, but... it's still 240W,

that's way too much. Seems close to a Bulldozer moment if people were consistent, but who are we kidding, they won't.


Not about the CPU, but what happened to ATX12VO? 🤔

 

This is from Short Circuit showing Z690


 

Looks like current PSU to me



3 hours ago, cj09beira said:

So they finally stopped lying about the TDPs, but... it's still 240W,

that's way too much. Seems close to a Bulldozer moment if people were consistent, but who are we kidding, they won't.

Bulldozer wasn't bad because it had high power; it was bad because it didn't have the performance to match the power usage. In fact, it just didn't have any good performance outright.

 

It would have been better if Intel had been able to get whatever performance they have under 200W, and then we could OC it ourselves to way higher, which we will be able to do anyway. I could cool a 500W CPU no problem so long as the transfer through the silicon to the IHS and then to the water block is good enough, and quite honestly, if the CPU was completely stable at 500W and gave WAY more performance, that's exactly how I'd run it lol.
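The "so long as the transfer is good enough" part really is the whole game. A rough sketch of the arithmetic in Python; the thermal-resistance numbers below are invented for illustration, not measurements of any real package or cooler:

# Back-of-the-envelope: temperature rise = power x total thermal resistance (C/W).
power_w = 500.0
r_die_to_ihs   = 0.05  # C/W, die + solder TIM + IHS (illustrative)
r_ihs_to_block = 0.03  # C/W, paste + cold plate (illustrative)
r_block_to_air = 0.04  # C/W, loop + radiators to room air (illustrative)

delta_t = power_w * (r_die_to_ihs + r_ihs_to_block + r_block_to_air)
print(f"Die would sit roughly {delta_t:.0f} C above ambient")  # ~60 C with these numbers
# Bigger radiators shrink the last term, but the die-to-IHS term is fixed by the package,
# which is why the transfer through the silicon to the IHS is the limiting factor.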


14 minutes ago, Moonzy said:

Not about the CPU, but what happened to ATX12VO? 🤔

 

This is from Short Circuit showing Z690


 

Looks like current PSU to me

Likely the OEM (ASUS, MSI, etc.) guys said to Intel: Shove it! / Kiss my ass.


4 minutes ago, cj09beira said:

Likely the OEM guys said to Intel: Shove it! / Kiss my ass.

Nah, the OEMs more than likely welcomed it; they already do non-standard 12V stuff anyway, which they have to design themselves. Not having to custom-order PSU designs and make your own motherboard 12V VRM standards is more likely something Dell and HP would welcome, not turn away. It's all of us DIY builders and other ODMs/system integrators that use standard ATX who would say "No thanks, my 500W-800W PSU is perfectly fine and I never intend to replace it unless it dies".

 

Good luck introducing a new PSU standard on to the market, our relevant market, when zero people are willing to replace a working component they already barely care about.


5 minutes ago, leadeater said:

Nah, the OEMs more than likely welcomed it; they already do non-standard 12V stuff anyway, which they have to design themselves. Not having to custom-order PSU designs and make your own motherboard 12V VRM standards is more likely something Dell and HP would welcome, not turn away. It's all of us DIY builders and other ODMs/system integrators that use standard ATX who would say "No thanks, my 500W-800W PSU is perfectly fine and I never intend to replace it unless it dies".

 

Good luck introducing a new PSU standard on to the market, our relevant market, when zero people are willing to replace a working component they already barely care about.

I meant the desktop guys; I agree that guys like Dell and HP wouldn't mind.


Are we forced to use Windows 11 on these chips, or will it work on Windows 10?

 

I don't really mind going to Windows 11 if it's already stable, just wanna give it some time.



15 minutes ago, leadeater said:

Bulldozer wasn't bad because it had high power; it was bad because it didn't have the performance to match the power usage. In fact, it just didn't have any good performance outright.

 

It would have been better if Intel had been able to get whatever performance they have under 200W, and then we could OC it ourselves to way higher, which we will be able to do anyway. I could cool a 500W CPU no problem so long as the transfer through the silicon to the IHS and then to the water block is good enough, and quite honestly, if the CPU was completely stable at 500W and gave WAY more performance, that's exactly how I'd run it lol.

nearly 2x the power for the same performance seems like a pretty bad showing to me...

 

Even at 200W one can start to feel the room warming up; now add a GPU to that and it's going to get toasty.


2 minutes ago, cj09beira said:

Even at 200W one can start to feel the room warming up; now add a GPU to that and it's going to get toasty.

I've run my systems for probably a decade now at 800W+ actual in-game power draw heh


"German Engineered Perfection" - I killed my 12900K, oops.

 

And he had his hands on the CPU at the beginning of October, 3-4 weeks ago, bloody.


2 hours ago, J-from-Nucleon said:

Now, all of this is great, but what really caught my attention is Intel changing how they communicate power consumption: in a commendable move, they've started listing base and maximum turbo power. This is much better than the TDP figure they used before, imo.

So they rebranded TDP to base power, presumably because of all the idiots who failed to understand what TDP was or was not, trying to make it into something it never was. I don't feel anything changes in practice other than the new name.

 

However, the move to Maximum Turbo Power does kinda better match the reality of what mobo manufacturers have been doing forever. I need to look further into this: are Intel enforcing Maximum Turbo Power now (for stock)? There is still technically a difference between setting PL2 to MTP and leaving it unlimited.
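One way to see what a given board actually programs for PL1/PL2 at stock, at least under Linux, is the intel_rapl powercap interface. A minimal sketch, assuming the intel_rapl driver is loaded and that constraints 0 and 1 are the long-term/short-term package limits as on typical systems (some distros need root to read these files):

from pathlib import Path

pkg = Path("/sys/class/powercap/intel-rapl:0")  # package 0 power domain
for n in (0, 1):
    name = (pkg / f"constraint_{n}_name").read_text().strip()             # long_term / short_term
    limit_uw = int((pkg / f"constraint_{n}_power_limit_uw").read_text())  # microwatts
    print(f"{name}: {limit_uw / 1_000_000:.0f} W")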

 

52 minutes ago, cj09beira said:

So they finally stopped lying about the TDPs, but... it's still 240W,

Just remember it is a power limit, not a statement that it will run at that level at any particular time. When running power-unconstrained, it tends to be AVX/FP64 workloads (e.g. Prime95) that draw the most power. Even at 100% utilisation, lighter workloads like Cinebench would likely not reach that limit.

 

The above applies to AMD also, but you don't see it because they do enforce a much lower power limit. So instead of seeing power go up, you see core clocks go down when harder workloads are applied, unless you bypass it.

 

24 minutes ago, Moonzy said:

Are we forced to use Windows 11 on these chips, or will it work on Windows 10?

 

I don't really mind going to Windows 11 if it's already stable, just wanna give it some time.

The understanding is you can run Win10 on Alder Lake, but Win10 will not understand the difference between P and E cores, so performance may be unpredictable. In that case, disabling E cores may give a better experience.



I don't get the hate for Windows 11. I've been on it since August without any issues.

 

They fixed the AMD problem, didn't they?

 

About the only thing I don't like is how I don't have a clock on my secondary display.



One thing about going from a 4790K to a 3700X: there is no way to justify upgrading.

 

With the changes in architecture, I fully expect Linux to take some time before the kernel has an update that addresses full feature support for Alder Lake.

2 hours ago, Moonzy said:

Are we forced to use Windows 11 on these chips, or will it work on Windows 10?

 

I don't really mind going to Windows 11 if it's already stable, just wanna give it some time.

Windows 10 won't have the scheduler improvements needed to handle them, so you will have to use Windows 11 if you want the CPU to perform as it was designed.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


But why E cores on desktop CPUs?



5 minutes ago, suicidalfranco said:

But why E cores on desktop CPUs?

To boost those core counts and MT scores lol


40 minutes ago, Dabombinable said:

With the changes in architecture, I fully expect Linux to take some time before the kernel has an update that addresses full feature support for Alder Lake.

Linux has already had support for hybrid designs (see ARM CPUs), and Intel has already sent many Alder Lake-related patches that are already upstream.
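On the detection side, even before any new tooling you can get a rough P-vs-E picture from sysfs, since the two core types advertise different maximum frequencies. A quick heuristic sketch in Python (grouping by cpuinfo_max_freq is not the kernel's official hybrid-topology interface, just a convenient tell):

from collections import defaultdict
from pathlib import Path

groups = defaultdict(list)
for cpu in Path("/sys/devices/system/cpu").glob("cpu[0-9]*"):
    freq_file = cpu / "cpufreq" / "cpuinfo_max_freq"
    if freq_file.exists():
        groups[int(freq_file.read_text())].append(cpu.name)

# On Alder Lake the higher-frequency group should be the P cores, the lower the E cores.
for khz, cpus in sorted(groups.items(), reverse=True):
    print(f"{khz / 1_000_000:.1f} GHz max: {', '.join(sorted(cpus))}")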



16 minutes ago, suicidalfranco said:

But why E cores on desktop CPUs?

 

Better yet, why so many?

