
New Details of Intel Rocket Lake Officially Revealed

Random_Person1234
Solved by Random_Person1234,
4 hours ago, spartaman64 said:

I always get baited by this. I'm like, "18% improvement, that's pretty decent," and then I'm like, "why are they still comparing to Skylake?"

Because Skylake is the latest architecture from them on desktop processors...?

I mean, it makes sense to compare your new product to your last product, right?


On 10/29/2020 at 8:06 PM, AlexGoesHigh said:

Also, even if that's the case, the 10-core CPU should still be better at everything except gaming and Adobe, which is ridiculous.

They will need some serious clock speed improvements to justify cutting 2 cores from their i9 and still outperform the previous generation in productivity workloads.
Of course IPC is a point of improvement, but could 11th gen really be that much of an improvement considering it's still 14nm?


12 minutes ago, JouanDeag said:

They will need some serious clock speed improvements to justify cutting 2 cores from their i9 and still outperform the previous generation in productivity workloads.
Of course IPC is a point of improvement, but could 11th gen really be that much of an improvement considering it's still 14nm?

I vaguely remember hearing there are. The problem is, IIRC, SOI just can't really do much over 5GHz at temperatures humans can put up with.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Just now, Bombastinator said:

I vaguely remember hearing there are. The problem is, IIRC, SOI just can't really do much over 5GHz at temperatures humans can put up with.

Still, I can boil water for my tea on the current-gen i9, and 11th gen is going to outperform that without me selling my stove and replacing it with this.


On 10/29/2020 at 2:59 PM, porina said:

8 cores I think is the sweet spot for the mainstream, and also it is probably the practical limit due to still being on 14nm. These cores/GPU were designed to fit on 10nm, so they're going to take a lot more space at 14nm. IMO AMD went too far in offering more cores. I'm sure there are some who welcome higher core counts, but to me it seems more fitting for them to go on a HEDT platform without the limitations of a consumer platform. I'd like to see HEDT costs come down in general.

 

Alder Lake desktop will complicate matters with its mix of big and little cores, but don't expect more than 8 big cores on the mainstream.

 

Intel could still attack more cores on desktop through HEDT, but Ice Lake server is seemingly delayed some more so I'm not holding my breath for that one.

 

 

People said the same about 4 cores a long time ago, and now people are saying the same thing about 8 cores. Personally I think 8 cores is enough for most people, but having more doesn't hurt, and if companies keep increasing core counts it will change how software is developed and change the way we are able to utilize our PCs, so I think increasing core count is a very good thing. While I understand the idea of wanting HEDT to come down in price instead of adding more cores to the mainstream platform, I personally disagree. Most people who used to need to go to an HEDT platform can now do the same work on the mainstream platform for cheaper, rather than having to go HEDT for the extra cores. Those who need the other aspects of the HEDT platform can still buy HEDT CPUs for cheaper from Intel, due to competition from AMD adding more cores.


1 hour ago, Brooksie359 said:

People said the same about 4 cores a long time ago, and now people are saying the same thing about 8 cores. Personally I think 8 cores is enough for most people, but having more doesn't hurt, and if companies keep increasing core counts it will change how software is developed and change the way we are able to utilize our PCs, so I think increasing core count is a very good thing. While I understand the idea of wanting HEDT to come down in price instead of adding more cores to the mainstream platform, I personally disagree. Most people who used to need to go to an HEDT platform can now do the same work on the mainstream platform for cheaper, rather than having to go HEDT for the extra cores. Those who need the other aspects of the HEDT platform can still buy HEDT CPUs for cheaper from Intel, due to competition from AMD adding more cores.

They also said it about two cores. There's more stuff like that too. First there was 4-bit, then 8-bit, then 16-bit, then 32-bit, then 64-bit. I was a kid for the 8- to 16-bit switch, but I remember the 16- to 32-bit one pretty well. Funny thing: people are apparently saying the same things now as they were then. "It's wasteful," "no one will need addresses that big," "we do fine with 16-bit." Yet now 32-bit is dead and we're on 64. This "we'll never need more than X" has never held up in computing that I've seen. Not once. It happens every time, though.
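To put numbers on the jumps described above, here is the addressable-memory arithmetic behind each bit-width (a minimal sketch; it assumes a flat address space of 2^bits locations, which glosses over segmentation and other real-world schemes):

```python
# Addressable locations for each register/address width,
# assuming a flat address space of 2**bits locations.
def address_space(bits: int) -> int:
    return 2 ** bits

for bits in (4, 8, 16, 32, 64):
    print(f"{bits:>2}-bit: {address_space(bits):,} addresses")
```

With byte addressing, 16-bit gives only 64KB and 32-bit tops out at 4GB, which is exactly why each "we'll never need more" prediction fell over.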

Edited by Bombastinator



29 minutes ago, Bombastinator said:

They also said it about two cores. There's more stuff like that too. First there was 4-bit, then 8-bit, then 16-bit, then 32-bit, then 64-bit. I was a kid for the 8- to 16-bit switch, but I remember the 16- to 32-bit one pretty well. Funny thing: people are apparently saying the same things now as they were then. "It's wasteful," "no one will need addresses that big," "we do fine with 16-bit." Yet now 32-bit is dead and we're on 64. This "we'll never need more than X" has never held up in computing that I've seen. Not once. It happens every time, though.

 

Very true. People forget that software does eventually catch up, if it isn't already pushing the current hardware.

 


8 hours ago, Brooksie359 said:

People said the same about 4 cores a long time ago, and now people are saying the same thing about 8 cores. Personally I think 8 cores is enough for most people, but having more doesn't hurt, and if companies keep increasing core counts it will change how software is developed and change the way we are able to utilize our PCs, so I think increasing core count is a very good thing.

I never said we should stop at 8 cores. I'm saying 8 cores is the current performance sweet spot. Many more cores than that will be much more niche in application.

 

BTW, for a lot of uses 4 cores are still enough. Maybe dual core is only starting to struggle a bit in light workloads, but that's more a Win10-era problem than a hardware problem.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


On 10/29/2020 at 7:30 PM, Blademaster91 said:

The 11th gen Rocket Lake is their last 14nm CPU; next is Alder Lake on 10nm using LGA 1700.

 

Shit.

Unless 11th gen beats Ryzen 5000 in single thread, I really bought LGA 1200 for nothing.

On a side note, I can see LGA 1700 will use DDR5.

Is this the first time Intel will do something next-gen before AMD?

(In case my question is confusing, here's an example: AMD did PCIe Gen 4 before Intel.)


2 hours ago, Lord Szechenyi said:

On a side note, I can see LGA 1700 will use DDR5.

Is this the first time Intel will do something next-gen before AMD?

(In case my question is confusing, here's an example: AMD did PCIe Gen 4 before Intel.)

I suppose one part of your question is what counts as "something next gen"?

 

If we're looking at ram, with DDR4 it was kinda consumer accessible with Haswell-E X99 platform in 2014, with mainstream consumer availability with Skylake in 2015. AMD didn't reach DDR4 until 2017 with Ryzen's launch.

 

With DDR5 I don't think it is exactly a race. It will be more a case of whoever happens to refresh their product cycle at the time. I don't think the gap between Intel's and AMD's DDR5 support will be meaningful, regardless of who comes out ahead.

 

I feel overall AMD is still lagging when it comes to introducing new features or catching up on competitive features. Intel was first with SIMD instructions and continues to extend them; AMD just kinda follows on eventually. Nvidia introduced variable refresh rate technology, as well as raytracing and other things. AMD's model has been to do the basics, but cheaper/faster. Obviously it works, to a point.



50 minutes ago, porina said:

I feel overall AMD is still lagging when it comes to introducing new features or catching up on competitive features. Intel was first with SIMD instructions and continues to extend them; AMD just kinda follows on eventually. Nvidia introduced variable refresh rate technology, as well as raytracing and other things. AMD's model has been to do the basics, but cheaper/faster. Obviously it works, to a point.

In all fairness to AMD though, it's not like they could really risk putting money into something until it was proven for a good while; verge of bankruptcy and all that. Hopefully they start trying to be first in introducing cool new features now that there's no chance it could kill them. But that's for future us to see and present AMD to think about.

My system is the Dell Inspiron 15 5559 Microsoft Signature Edition

The Australian king of LTT said that I'm awesome and a funny guy. the greatest psu list known to man DDR3 ram guide

i got 477 posts in my first 30 days on LinusTechTips.com

 


46 minutes ago, porina said:

I suppose one part of your question is what counts as "something next gen"?

 

If we're looking at ram, with DDR4 it was kinda consumer accessible with Haswell-E X99 platform in 2014, with mainstream consumer availability with Skylake in 2015. AMD didn't reach DDR4 until 2017 with Ryzen's launch.

If we're talking Bulldozer-era AMD, they could barely hold onto solvency, let alone introduce DDR4-capable processors. Can you imagine a DDR4 Bulldozer? That might have failed financially even harder than the original. The only thing AMD could have done, and did do, was wait until their brand-new arch was done and put the latest features (DDR4) into it. If we look at pre- and post-Bulldozer, we can see AMD pushing lots of next-gen features, but the Bulldozer era is a black stain on AMD's history.


11 hours ago, Bombastinator said:

They also said it about two cores. There's more stuff like that too. First there was 4-bit, then 8-bit, then 16-bit, then 32-bit, then 64-bit. I was a kid for the 8- to 16-bit switch, but I remember the 16- to 32-bit one pretty well. Funny thing: people are apparently saying the same things now as they were then. "It's wasteful," "no one will need addresses that big," "we do fine with 16-bit." Yet now 32-bit is dead and we're on 64. This "we'll never need more than X" has never held up in computing that I've seen. Not once. It happens every time, though.

Yeah, that's why I usually want to pick up the highest model. Remember the days when people said "don't get the i7, get the i5, if you want gaming"? Well, the joke's on them, as old i7s are better than today's i5s in some games. My 4770 is almost 7 years old! Of course it won't be good for newer games anymore, but I'm surprised it still works fine. Sometimes it's better to go overkill than just barely enough, especially if you plan on using it for more than 5 years.

It's the same bullshit as "no one will need more than 640KB".

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


47 minutes ago, CTR640 said:

Yeah, that's why I usually want to pick up the highest model. Remember the days when people said "don't get the i7, get the i5, if you want gaming"? Well, the joke's on them, as old i7s are better than today's i5s in some games. My 4770 is almost 7 years old! Of course it won't be good for newer games anymore, but I'm surprised it still works fine. Sometimes it's better to go overkill than just barely enough, especially if you plan on using it for more than 5 years.

It's the same bullshit as "no one will need more than 640KB".

The jury is still out on how "not good." The one game considered new enough to matter is Flight Sim 2020, but it's written in such a weird way that it's hard to get anything useful from it. It uses a lot of really ancient stuff, like DX11 instead of DX12, and runs almost entirely on 2 cores. Without actual testable data, all that's available is the "anything you can do I can do better" model. By that metric it fails, but so does any 6-core/12-thread processor. Exactly what will happen is to some degree unknown.



3 minutes ago, Bombastinator said:

The jury is still out on how "not good." The one game considered new enough to matter is Flight Sim 2020, but it's written in such a weird way that it's hard to get anything useful from it. It uses a lot of really ancient stuff, like DX11 instead of DX12, and runs almost entirely on 2 cores. Without actual testable data, all that's available is the "anything you can do I can do better" model. By that metric it fails, but so does any 6-core/12-thread processor. Exactly what will happen is to some degree unknown.

Is Flight Sim 2020 really horribly optimized? Anyway, I'm quite sure I'm not into flight sims; I'm more into open-world games like GTA.

And Microsoft, the inventor of DX12, is using ancient shit like DX11?

But I also think it's a matter of what the usage and goals are for each individual person.



23 minutes ago, CTR640 said:

Is Flight Sim 2020 really horribly optimized? Anyway, I'm quite sure I'm not into flight sims; I'm more into open-world games like GTA.

And Microsoft, the inventor of DX12, is using ancient shit like DX11?

But I also think it's a matter of what the usage and goals are for each individual person.

I don't have the game. "Horribly optimized," with the exception of having no DX12, is not a term I have heard used myself. There was a good bit written about it and a lot of side comments made about it. The assumption seems to be that it was written to run not so much on current PCs as on the upcoming Xboxes, so "horribly optimized" can't really be judged until the Xboxes are out. It was looked at as a possible indicator of what devs might do with such a machine. The consensus seems to be that the thing is so odd it's not particularly useful for that purpose.



1 hour ago, CTR640 said:

Is Flight Sim 2020 really horribly optimized? Anyway, I'm quite sure I'm not into flight sims; I'm more into open-world games like GTA.

And Microsoft, the inventor of DX12, is using ancient shit like DX11?

But I also think it's a matter of what the usage and goals are for each individual person.

Actual flight simulators are insanely taxing on a system. If you look at today's actual plane simulators from large companies, they have the graphics of 10 years ago.

X-Plane is the closest one with the best visuals, which is pretty amazing by itself. Microsoft Flight Sim really pushes things further. So, in my opinion, what they were able to achieve is really spectacular, and I don't think it is poorly optimized.

 

Microsoft Flight Simulator is actually developed by Asobo Studio. They license the game IP from Microsoft, and Microsoft is the publisher.

 

DirectX 12 was not out when Microsoft Flight Simulator started its development.

Also, the DX12 GPUs that deliver good performance are this new generation (Radeon 6000 series and GeForce 30 series). While the previous gen can be argued to be fine, I don't count the high-end chips, as those don't reflect what most people have in their PCs. Most people get mid- and mid-low-range chips, and in my opinion this new generation of GPUs should be able to deliver on that.

 


7 hours ago, porina said:

I never said we should stop at 8 cores. I'm saying 8 cores is the current performance sweet spot. Many more cores than that will be much more niche in application.

 

BTW, for a lot of uses 4 cores are still enough. Maybe dual core is only starting to struggle a bit in light workloads, but that's more a Win10-era problem than a hardware problem.

I guess I see your point to an extent, but I'd also say that more than 8 cores is still an improvement in quite a few applications that are becoming more mainstream, like video editing and streaming. Also, while more than 8 cores doesn't always translate to higher performance in all games, it does in quite a few, and often results in better 1% and 0.1% lows, making the experience better overall. I think we will find that many more games start to really leverage more cores in the coming years, as we have already seen. Also, as someone who often does multiple things at once on my computer, having more cores is always nice. I guess I still see your point to an extent, as 8 cores is probably where diminishing returns start, but if they're priced right, getting more cores would probably still be worth it.


9 hours ago, Lord Szechenyi said:

Is this the first time Intel will do something next-gen before AMD?

(In case my question is confusing, here's an example: AMD did PCIe Gen 4 before Intel.)

You've got to be kidding. AMD has been ahead of Intel for like 2 years, and all of a sudden people question whether Intel has ever been ahead of AMD at anything?

Intel was consistently ahead of AMD on a ton of stuff if we go back more than 2 years. If my answer is confusing, here's an example: Intel did PCIe 3.0 before AMD.

If you had asked "is this the first time Intel will do something next-gen before AMD, in the last 2 years, and only limiting us to consumer electronics stuff that isn't too complicated," then the answer would probably be yes. But as the question is phrased right now, the answer is an obvious no. This is not the first time.


4 hours ago, LAwLz said:

You've got to be kidding.

People have short memories, especially the youngsters.


12 hours ago, Orange1 said:

People have short memories, especially the youngsters.

Not exactly; it's just that I wasn't an enthusiast 2 years ago.

(Well, at least not an enthusiast for modern hardware, because I know all about Zilog vs. Motorola.)


On 10/29/2020 at 2:30 PM, Blademaster91 said:

The 11th gen Rocket Lake is their last 14nm CPU; next is Alder Lake on 10nm using LGA 1700.

 

Making Rocket Lake a second-gen Pointless Lake, right up there with its spiritual brother, the 7000 series.

Aerocool DS are the best fans you've never tried.


3 hours ago, Lord Szechenyi said:

Not exactly; it's just that I wasn't an enthusiast 2 years ago.

(Well, at least not an enthusiast for modern hardware, because I know all about Zilog vs. Motorola.)

Intel stagnated its CPU division for many years, and as a result we were stuck with the fucking 4-core bullshit.

It was inevitable that 4-core CPUs would become problematic some day, especially for games; games had to be scaled back because of it. Want 6 cores? Pay up 600 bucks or more! 6 cores and above were HEDT territory, like the i7-5930K and i7-5960X. And don't forget the yearly 3-5% performance gains for the same prices.

And most of us thought Intel would have something more powerful and better up their sleeve if AMD fought back, and guess what?! They didn't! It's obvious Intel had been sleeping all those years.

Shit like that is what has kept me on my i7-4770 since 2014.



3 hours ago, Lord Szechenyi said:

Not exactly; it's just that I wasn't an enthusiast 2 years ago.

(Well, at least not an enthusiast for modern hardware, because I know all about Zilog vs. Motorola.)

Heh. Zilog vs. Motorola sounds like the title of a bad black-and-white 1960s B-grade Japanese monster movie to me. Maybe that's because Motorola sounds a little like Mothra.



44 minutes ago, CTR640 said:

Intel stagnated its CPU division for many years, and as a result we were stuck with the fucking 4-core bullshit.

It was inevitable that 4-core CPUs would become problematic some day, especially for games; games had to be scaled back because of it. Want 6 cores? Pay up 600 bucks or more! 6 cores and above were HEDT territory, like the i7-5930K and i7-5960X. And don't forget the yearly 3-5% performance gains for the same prices.

And most of us thought Intel would have something more powerful and better up their sleeve if AMD fought back, and guess what?! They didn't! It's obvious Intel had been sleeping all those years.

Shit like that is what has kept me on my i7-4770 since 2014.

The number of old games that won't run because I have a multi-core CPU...

Also, look at the PS2: just because you have more cores, you still need to know how to exploit them well.

(And believe me, it's not easy.)
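One classic way to formalize the point about exploiting cores is Amdahl's law (my illustration, not the poster's; the fractions below are made-up example values): the speedup from N cores is capped by whatever fraction of the work stays serial.

```python
# Amdahl's law: speedup from n cores when a fraction p of the work
# can run in parallel and the rest (1 - p) stays serial.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# A workload that is only 50% parallelizable barely doubles even on
# 16 cores, while a 95% parallelizable one gets close to 10x.
print(speedup(0.5, 16))   # ~1.88x
print(speedup(0.95, 16))  # ~9.14x
```

Which is roughly why extra cores sat idle in so many games: the serial portion of a game loop puts a hard ceiling on what adding cores can buy.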

