
Intel 12th Gen Core Alder Lake for Desktops: Top SKUs Only, Coming November 4th +Z690 Chipset

Lightwreather

I'm glad that Linus addressed all the AMD fanboys in the latest WAN-show as well.

It is so painfully obvious when someone is a fanboy as soon as they start talking about how it "isn't fair" to use DDR5 for Intel and DDR4 for AMD.

 

I guess the next bit of mental gymnastics AMD fanboys will pull is "just wait for Zen 4", because that's the typical go-to argument for fanboys when they have been beaten:

Quote

Yeah, the products you can buy today from company X are better than company Y's, but at some undisclosed time my favorite company will release an undisclosed product whose price and performance we don't know, but it will surely beat what company X offers today, so my favorite company is still the best!

-A fanboy

 

It's like reading fanfic.


33 minutes ago, LAwLz said:

Or we can stop paying so much attention to halo products like the i9 and Ryzen 9 and look at the more sane segment, the i5 and i7.

If you ask me, very few people should be looking at the i9 and Ryzen 9, especially people on this forum. Complete waste of money. I bet that 9 out of 10 people buying these extremely high end consumer chips are just doing it for bragging rights, not because it is actually a good buy.

It is and it isn't. I grabbed a 5800X because I already had a 6-core, 12-thread CPU before. I felt 8 cores and 16 threads would be a good upgrade if I was already buying new. And the 5800X has a single CCX, which kind of gives it immunity to certain inter-CCX communication issues. The reason I buy relatively high-end models is that I keep platforms as a whole for 5-6 years, and I pick well at the beginning since I don't upgrade anymore down the line. It's why I went with 32GB of RAM too. I'll just have a great experience the entire time.


3 minutes ago, RejZoR said:

It is and it isn't. I grabbed a 5800X because I already had a 6-core, 12-thread CPU before. I felt 8 cores and 16 threads would be a good upgrade if I was already buying new. And the 5800X has a single CCX, which kind of gives it immunity to certain inter-CCX communication issues. The reason I buy relatively high-end models is that I keep platforms as a whole for 5-6 years, and I pick well at the beginning since I don't upgrade anymore down the line. It's why I went with 32GB of RAM too. I'll just have a great experience the entire time.

it is and it is

[image: benchmark chart]

How often do you actually perform tasks that use 16 cores, and for how many of those tasks is it actually important to have 16 cores over 12? (Answer: probably zero.)

 

And as you can see, the 12600K is clearly not a joke like 11th gen was.



I am glad Intel is back in the game. I was not too happy putting down the money for my AMD system; the 5000 series is not that budget-friendly, but they certainly are very good CPUs.

The only thing disheartening about Intel's lineup is power draw and the fact that DDR5 is a long way off with current prices.



6 hours ago, leadeater said:

Yep, I remember that. It was my understanding that it was going to be completely disabled, to the point of also being so in the microcode and impossible to enable. I don't know where things got missed or mixed up in the information flow, or whether it's actually a mistake and it's not supposed to be possible; who knows at this point, and we likely never will.

 

Worst case, a new stepping is released that actually does outright disable it, in which case the value of the old-stepping CPUs will go up (no, I'm not saying to investment-buy the bloody things).

It's now the first *real* post-launch window for the main NA offices to start making decisions on what's next for AVX-512. Thursday having been launch and Friday being PR day, now's the time when real discussions will start happening (beyond the impromptu ones over the weekend, of course).

 

It isn't a very big deal for the vast majority of users, but it's very strange for this to happen at Intel. It seems like a left-hand-not-talking-to-the-right-hand sort of situation.

 

New early coverage on the performance is starting to come in from Phoronix:

 

https://www.phoronix.com/scan.php?page=article&item=alder-lake-avx512

 

Seems Dr. Kinghorn of Puget has also started his AVX testing on Linux with the MKL, but only up to AVX2 so far. Not sure if he knows about the AVX-512 enabling yet. Then again, since it's not POR, Puget may not want to get involved with it.

 

 


23 minutes ago, RejZoR said:

The reason I buy relatively high-end models is that I keep platforms as a whole for 5-6 years, and I pick well at the beginning since I don't upgrade anymore down the line.

1) Chances are you will be fine with a mid-range chip for 5-6 years as well, so buying more performance than you need today will most likely not translate into your PC lasting longer without an upgrade.

2) Buying a really expensive high-end computer and keeping it for, let's say, 10 years often ends up costing more than two mid-range computers over the same 10 years, and you often get higher performance in the end as well.

 

Stop trying to "future proof" PCs by overpaying for them today. In 99,9% of cases it's a waste of money.

 

 

 

19 minutes ago, OneOfYas said:

The only thing disheartening about Intel's lineup is power draw and the fact that DDR5 is a long way off with current prices.

I keep hearing people say DDR5 is expensive but I don't really see it. If you are only looking at capacity then yes, DDR5 is way more expensive than DDR4. But that's like saying "DDR4 from Crucial sure is expensive. 32GB of 4400MHz DDR4 from Crucial is 405 dollars. This 2400MHz kit from G.Skill is only 95 dollars for the same capacity".

 

Capacity is only one out of several factors when looking at RAM.

If you are someone who only cares about capacity and doesn't want high-speed RAM then sure, DDR5 sucks for you. But it doesn't suck because DDR5 is inherently expensive. It sucks because you are basically being forced to buy the equivalent of "high-end DDR4".

It's kind of like saying SSDs are expensive because a 256GB SSD might cost as much as a 1TB HDD. Saying that "SSDs are 4 times as expensive as HDDs" doesn't mean much since we are ignoring the speed benefit.
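To put some numbers on that, here's a quick price-per-GB calculation using the Crucial and G.Skill figures quoted above. The second metric is a deliberately crude "dollars per GB per 1000 MT/s" value just to show how crediting speed changes the picture; real value depends on timings and the workload:

# Kits quoted in the post above (USD, 32 GB each).
kits = {
    "Crucial DDR4-4400": {"price": 405, "capacity_gb": 32, "speed_mts": 4400},
    "G.Skill DDR4-2400": {"price": 95, "capacity_gb": 32, "speed_mts": 2400},
}

for name, kit in kits.items():
    per_gb = kit["price"] / kit["capacity_gb"]
    # Crude illustration only: dollars per GB per 1000 MT/s of rated speed.
    per_gb_per_speed = per_gb / (kit["speed_mts"] / 1000)
    print(f"{name}: ${per_gb:.2f}/GB, ${per_gb_per_speed:.2f} per GB per 1000 MT/s")

Judged on capacity alone the fast kit looks ~4x as expensive; once speed is factored in the gap shrinks considerably, which is the whole point of the comparison.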

 

Also, Alder Lake can work with DDR4, so it's not like you must buy DDR5. Judging by some benchmarks I've seen, you won't really lose much performance anyway. It seems like Alder Lake with DDR4 is the best option for most buyers looking for a new CPU, unless they can find some Ryzen chip on sale for way below MSRP.

 

 

And as for power consumption, it seems like only the i9 uses ridiculous amounts of power. The i5 and i7 seem fine and are roughly the same as the AMD R5 and R7 for most common workloads.


1 hour ago, LAwLz said:

Or we can stop paying so much attention to halo products like the i9 and Ryzen 9 and look at the more sane segment, the i5 and i7.

If you ask me, very few people should be looking at the i9 and Ryzen 9, especially people on this forum. Complete waste of money. I bet that 9 out of 10 people buying these extremely high end consumer chips are just doing it for bragging rights, not because it is actually a good buy.

I bought a 5950X because I didn't want to do a custom copper-pipe loop and get meme'd on for cooling a 5600X with 4 radiators:

Spoiler

[image]

Also, it was the only processor in stock at our local Micro Center store; people went bonkers buying the 5600Xs and 5900Xs.

 

Though truth be told, I play old MMOs and turn-based RPGs like Divinity: Original Sin, so it's most definitely overkill and, for most people, not worth the $800 I paid. Still, I am happy with it and I'd buy it again if I were in the same situation.

 

The next build is going to be an SFF custom loop in a Conswole case and will likely be a mid-range Intel or AMD chip with a mid-range GPU as well, mostly because I want to see if I can do it.



11 minutes ago, MageTank said:

I bought a 5950X because I didn't want to do a custom copper-pipe loop and get meme'd on for cooling a 5600X with 4 radiators:

Spoiler

[image]

Also, it was the only processor in stock at our local Micro Center store; people went bonkers buying the 5600Xs and 5900Xs.

 

Though truth be told, I play old MMOs and turn-based RPGs like Divinity: Original Sin, so it's most definitely overkill and, for most people, not worth the $800 I paid. Still, I am happy with it and I'd buy it again if I were in the same situation.

 

The next build is going to be an SFF custom loop in a Conswole case and will likely be a mid-range Intel or AMD chip with a mid-range GPU as well, mostly because I want to see if I can do it.

I have a slightly more ambitious plan for the future:

https://pcpartpicker.com/list/NMRBNP

 

 

Either way, if you ask someone in the future whether you made a good choice, they'd say "no; hardware followed the trend of the past 50 years and doubled in performance after 2 years, so your PC is considered mid-tier by 2023".



1 hour ago, MageTank said:
Spoiler

[image]

 

Hmm yea, that's a pretty great PC you got there:

[screenshot: the quoted image doesn't load]

/s

"The most important step a man can take. It’s not the first one, is it?
It’s the next one. Always the next step, Dalinar."
–Chapter 118, Oathbringer, Stormlight Archive #3 by Brandon Sanderson

 

 

Older stuff:

Spoiler

"A high ideal missed by a little, is far better than low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; Being wrong helps you learn what's right.

 


28 minutes ago, J-from-Nucleon said:

Hmm yea, that's a pretty great PC you got there:

[screenshot: the quoted image doesn't load]

/s

I uh... I don't know what to tell you, lol.

[screenshot]

Shows up for me. Clearly this is the work of space goblins.



1 hour ago, MageTank said:

Shows up for me. Clearly this is the work of space goblins.

Doesn't work for me either, so goblins it is.



3 hours ago, MageTank said:

I bought a 5950X because I didn't want to do a custom copper-pipe loop and get meme'd on for cooling a 5600X with 4 radiators:

Spoiler

[image]

Also, it was the only processor in stock at our local Micro Center store; people went bonkers buying the 5600Xs and 5900Xs.

 

Though truth be told, I play old MMOs and turn-based RPGs like Divinity: Original Sin, so it's most definitely overkill and, for most people, not worth the $800 I paid. Still, I am happy with it and I'd buy it again if I were in the same situation.

 

The next build is going to be an SFF custom loop in a Conswole case and will likely be a mid-range Intel or AMD chip with a mid-range GPU as well, mostly because I want to see if I can do it.

 

Spoiler

[screenshot: the quoted image doesn't load]

Same on my end. It's Google user content for the URL, probably a permissions issue.

 

 

 

 


19 hours ago, yesyes said:

it is and it is

[image: benchmark chart]

How often do you actually perform tasks that use 16 cores, and for how many of those tasks is it actually important to have 16 cores over 12? (Answer: probably zero.)

 

And as you can see, the 12600K is clearly not a joke like 11th gen was.

Straying slightly off-topic, I have a problem with the gaming benchmarks for CPUs. Using a high-end GPU at 1080p isn't something many people are doing. Even in esports titles it would make little sense, as you can't buy a monitor whose refresh rate can match the FPS an RTX 3080 Ti or 6900 XT can put out. If you're gaming at 1080p, you're probably using something around RX 6600/RTX 3060 performance and are GPU-bound; if you are using a 3080 or above, you're probably playing at 1440p or higher and are still GPU-bound. If you're GPU-bound, either you have enough CPU performance to avoid stuttering, with little to gain in frame rate by exceeding that, or you don't, and you're having a sad time.

 

I'd like to see the CPU benchmarks include more cards and more resolutions. More CPU, GPU and resolution combinations that people actually game with.


3 minutes ago, Monkey Dust said:

Straying slightly off-topic, I have a problem with the gaming benchmarks for CPUs. Using a high-end GPU at 1080p isn't something many people are doing

The point is to put as much load on the CPU as possible so you can see just how much additional work each tier of CPU can manage. It's the best way to show differences in gaming benchmarks when you are testing the CPU's performance specifically.



1080p gaming is not that uncommon given such monitors are still super popular, and a 1080p 240Hz monitor can be a hugely better experience than a 4K 60Hz one. Not only will it actually run at stupidly high framerates to match the refresh, it'll look smoother while doing it. The only reason I went with 1440p was that I bought a larger monitor and was concerned 1080p might look a bit pixelated. 4K just seems pointless to me because I value framerate over crispness of an image you can't see anyway in fast-paced games.


On 11/8/2021 at 2:45 AM, leadeater said:

Yep, I remember that. It was my understanding that it was going to be completely disabled, to the point of also being so in the microcode and impossible to enable. I don't know where things got missed or mixed up in the information flow, or whether it's actually a mistake and it's not supposed to be possible; who knows at this point, and we likely never will.

 

Worst case, a new stepping is released that actually does outright disable it, in which case the value of the old-stepping CPUs will go up (no, I'm not saying to investment-buy the bloody things).

First patch from Intel acknowledging AVX-512 on Alder Lake, but marking it as unsupported.

Phoronix article here: https://phoronix.com/scan.php?page=news_item&px=Intel-Alder-Lake-Tuning-GCC

Actual code for the compiler here: https://gcc.gnu.org/pipermail/gcc-patches/2021-November/583958.html

 


  • 4 weeks later...

Older thread, so apologies for the bump, everyone.

Wanted to let people know: I've since gotten my hands on the 12700K and begun testing AVX-512 performance in different HPC applications.

Some quick data can be found here: https://openbenchmarking.org/result/2112040-TJ-2111077TJ72&hgv=i7-12700K+P-Cores+%2B+AVX-512+DDR4&ppt=D

 

 

As of now I'm working on testing how different cache sizes scale per core, as a way of previewing the Golden Cove cores that will be in Sapphire Rapids (same basic topology, and by disabling a given number of cores we can approximate a given amount of shared L3 per core).

The only thing this doesn't allow testing directly is the new AMX instructions, but they seem to have an AVX fallback option (at a corresponding performance hit).

Feel free to search for #avx512 on Twitter and you should be able to find any experiments that I or other collaborators are working on!

An interesting part is that the i7 in AVX-512 mode can obliterate the i9 in all-core mode for AVX-512 workloads like CFD and other engineering workloads.
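For anyone who wants to sanity-check whether AVX-512 is actually exposed on their own chip before running this kind of test, a minimal sketch on Linux (assuming a standard /proc/cpuinfo layout; this is a generic check, not part of the benchmark setup above) is just to look at the CPU feature flags:

# Minimal Linux check: list the AVX-512 feature flags the kernel reports.
# On Alder Lake these only show up if the BIOS/microcode actually exposes
# AVX-512 (typically with the E-cores disabled).
def avx512_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return sorted({t for t in line.split() if t.startswith("avx512")})
    return []

if __name__ == "__main__":
    flags = avx512_flags()
    print("AVX-512 flags:", ", ".join(flags) if flags else "none reported")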


Not interested until they get rid of the E-cores. It doesn't make any sense for desktop; just give us a real core instead of two useless cores.



2 hours ago, kuddlesworth9419 said:

Not interested until they get rid of the E-cores. It doesn't make any sense for desktop; just give us a real core instead of two useless cores.

I think you're underestimating the E-cores. They are quite powerful and require way less area and power.

 

I am fairly sure Intel said they could fit roughly four E-cores in the same area as one P-core, while an E-core has ~60% of the performance of a P-core.

 

Let's say you're running a program that scales well across multiple cores. Which CPU would you rather have?

1) A single P-core that gets a score of 100 points.

2) Four E-cores that each get 60 points, for a total score of 240 points.

 

Having four E-cores is clearly the better use of die area and gives significantly better multi-core performance.

Besides, if you need high single-core performance then you can just put that load on the P-cores. You get the best of both worlds.
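The rough math behind that, using the approximate figures above (ballpark numbers from Intel's messaging, not measurements), looks like this:

# Back-of-the-envelope throughput-per-area comparison for a workload that
# scales across cores, using the rough figures quoted above.
P_CORE_SCORE = 100                     # arbitrary score for one P-core
E_CORE_SCORE = 0.6 * P_CORE_SCORE      # an E-core at ~60% of a P-core
E_CORES_PER_P_CORE_AREA = 4            # ~4 E-cores fit in one P-core's area

same_area_e_cores = E_CORES_PER_P_CORE_AREA * E_CORE_SCORE

print(f"1 P-core:              {P_CORE_SCORE:.0f} points")
print(f"4 E-cores (same area): {same_area_e_cores:.0f} points")
# Multi-threaded throughput favours the E-cores; single-threaded latency
# still favours the P-core, which is the point of the hybrid layout.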


Also, this thread is about a month old...

ADL has been out for quite some time now, with reviews as well.

"The most important step a man can take. It’s not the first one, is it?
It’s the next one. Always the next step, Dalinar."
–Chapter 118, Oathbringer, Stormlight Archive #3 by Brandon Sanderson

 

 

Older stuff:

Spoiler

"A high ideal missed by a little, is far better than low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; Being wrong helps you learn what's right.

 


12 hours ago, kuddlesworth9419 said:

Not interested until they get rid of the E-cores. It doesn't make any sense for desktop; just give us a real core instead of two useless cores.

They're not getting rid of the E-cores, and leaks online suggest they're actually going to make configurations with more E-cores than P-cores.

https://www.techradar.com/news/intel-raptor-lake-cpu-spotted-in-first-benchmark-leak

Quote

According to the details given here, the spec that the chip runs with is eight performance cores (new Raptor Cove cores) and 16 efficiency cores (Gracemont cores – the same as with Alder Lake). This is what was previously rumored, and gives the processor a theoretical 32 threads (as efficiency cores don't have hyper-threading).

 

Keep in mind that:

E-cores are not replacements for P-cores; they're Intel Atom cores. If you arrange a CPU a certain way so these ~10W cores space out the P-cores, you can get a higher TDP while advertising more cores. But they're not like Arm big.LITTLE. The OS has to explicitly support the CPU core configuration, which Windows 10 does not. So benchmarks on Windows 10, and in games not designed for the core configuration, will result in completely meaningless values, because the only way to compare the CPUs is to turn the E-cores off.

 

 

 

 

