
Intel Preparing Broadwell Launch For CES January 2015, Skylake Launch To Follow

You may not care, but there is a huge market for mobile chips: laptops of all sorts, phones, and tablets, including iPads. Intel stands to gain enormous revenue if it captures those markets, which would help fund the desktop side of things years down the road.


Is Skylake 14nm or 10nm? Why would anybody want Broadwell with Skylake right behind it? It doesn't make sense until we get full official info about each chip. I'm talking about desktop chips; I don't care about mobile.

 

Skylake is on the same 14nm node as Broadwell; that's Intel's tick-tock cadence, with Broadwell as the die-shrink "tick" and Skylake as the new-architecture "tock".

 

And remember, all these rumors may well be blurring the line between releases for desktops, laptops, and ultrabooks/tablets. Maybe Skylake for ultrabooks/tablets will arrive right after Broadwell for desktops, with Skylake only reaching desktops later on.


I said this a while ago.

Computing enthusiast. 
I used to be able to input a cheat code; now I've got to input a credit card. - Total Biscuit
 


It's unfortunate, but it looks like Intel is shifting strategies and unlocked processors will be a generation behind from this point forward.


I'm sorry, but this has to be done. I don't see the reasoning behind Intel releasing Skylake and Broadwell so close together; they'll just end up losing money on Broadwell, and Skylake will be rushed out the door.
This is the roadmap that has been floating around for a few months, and it's legit. The catch is simple: there won't be any locked Broadwell CPUs on the desktop.


Fine by me. Overclocking headroom is diminished with every process shrink anyway.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Why do I now feel as though my 4690k is garbage...

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs



 

If we take a lesson from the massive overclocking headroom unlocked by the power efficiency of NVIDIA's 970/980, the new Intel chips may keep good overclocking potential thanks to their own significant gains in power efficiency. Maybe, maybe not; we don't have any real-world test data to go on yet. I'm not overly concerned about losing overclocking headroom on the smaller-node Broadwell chips, simply because power efficiency has been the name of the game for a while now. Food for thought, I guess.

 

5+ GHz on AIO cooling yet?

 

Hopefully! But maybe not... we'll see.

Fractal Design Arc Midi R2 | ASUS Z97-A | Intel Core i5 4690K | G.Skill Ripjaw 4 GB (desperately needs upgrade) | MSI Radeon R9 290 | EVGA SuperNOVA 750 G2 | Samsung 840 EVO 250 GB SSD | Seagate Constellation 1 TB HDD (also needs upgrade) | NZXT Kraken X61 for CPU | NZXT Kraken X61 + G10 for GPU | Corsair M45 | CM Storm Quickfire TK | ASUS Xonar DSX | Denon AD-H2000



The complexity of CPU circuitry vs. a GPU isn't even a contest. A GPU doesn't have to deal with the control logic, out-of-order execution, and all the other machinery that makes CPUs as powerful as they are; in a GPU, all you do is send floating-point numbers down a pipeline. It's much easier to overclock a GPU and keep it stable.

In the future, please avoid false equivalences.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


The thing is, Intel did a Haswell Refresh instead of releasing Broadwell this year, so my guess is they are just trying to catch up to where they would have been. Broadwell is just the Haswell die shrink; Skylake brings to the mainstream the things that are now in Haswell-E, such as DDR4 memory. No clue where that leaves everyone who already bought Haswell-E. I wonder if they will release Broadwell-E or go straight to Skylake-E.

LTT conglomerate can use your support. Join us at https://robertsspaceindustries.com/orgs/UOLTT .   See you in the verse.



 

Worst-case scenario, they let the enthusiast platform fall a full generation behind the mainstream platform again. :(



I hope not. I have been saving up to buy Haswell-E around the holidays, but if they are going to push out new chips that fast next year, I will probably wait.

LTT conglomerate can use your support. Join us at https://robertsspaceindustries.com/orgs/UOLTT .   See you in the verse.



 

By no means did I intend to make CPU and GPU overclocking seem equivalent. I simply meant that the increase in power efficiency MIGHT help for overclocking purposes, as it did for the 980. This has nothing to do with the specific difficulties of CPU or GPU overclocking, only with the easing of thermal limits that comes with better power efficiency. There isn't any reason to dismiss that as a false equivalence, and I don't really see your point, since your remark applies to any CPU/GPU comparison going back quite a way.

 

Also, I don't mean for any of this to sound rude. I am genuinely confused as to why you called my comment out as a false equivalence. As far as I can tell, the (speculative) difficulty of overclocking CPUs has not increased so dramatically that improved power efficiency would be a non-factor.

Fractal Design Arc Midi R2 | ASUS Z97-A | Intel Core i5 4690K | G.Skill Ripjaw 4 GB (desperately needs upgrade) | MSI Radeon R9 290 | EVGA SuperNOVA 750 G2 | Samsung 840 EVO 250 GB SSD | Seagate Constellation 1 TB HDD (also needs upgrade) | NZXT Kraken X61 for CPU | NZXT Kraken X61 + G10 for GPU | Corsair M45 | CM Storm Quickfire TK | ASUS Xonar DSX | Denon AD-H2000



Put another way, silicon can only switch so fast, and when it does, electricity goes screaming through it, generating a lot of heat. The smaller the wire (transistor), the more resistance it has per unit length; the upside is that all the electrical paths get shorter. The problem? The decreased cross-section of the wires makes the heat density increase, which makes cooling more difficult. Furthermore, we're past the point of being able to use the high-leakage silicon of the AMD FX Vishera days. You can't sacrifice electrical efficiency, because that also sacrifices thermal efficiency, which makes cooling an overclocked chip many times more difficult. Now extrapolate all of this over all the advanced control/timing logic in a CPU, the very stuff that makes Haswell run hot in the first place...
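As a rough back-of-the-envelope illustration of that heat-density point (every number below is a made-up example, not a real Intel spec): CMOS switching power scales roughly with C·V²·f, and the heat flux the cooler has to handle is that power divided by die area, so a shrink can cut total power while leaving watts per square millimetre flat or even worse.

```python
# Back-of-the-envelope CMOS power-density sketch for a hypothetical die shrink.
# Every figure here is an illustrative assumption, not a measured Intel number.

def dynamic_power_w(c_eff_farads: float, volts: float, freq_hz: float, activity: float = 1.0) -> float:
    """Approximate switching power: P ~ activity * C_eff * V^2 * f."""
    return activity * c_eff_farads * volts ** 2 * freq_hz

def power_density(power_w: float, die_area_mm2: float) -> float:
    """Heat flux the cooler has to pull out of the die, in W/mm^2."""
    return power_w / die_area_mm2

# Hypothetical "old" node: a 22 nm-class quad core overclocked to 4.5 GHz.
old_power = dynamic_power_w(c_eff_farads=17e-9, volts=1.25, freq_hz=4.5e9)
old_density = power_density(old_power, die_area_mm2=180.0)

# Hypothetical shrink: switched capacitance and voltage both drop, so total power
# falls, but the die area shrinks faster, concentrating the heat into less silicon.
new_power = dynamic_power_w(c_eff_farads=0.7 * 17e-9, volts=1.15, freq_hz=4.5e9)
new_density = power_density(new_power, die_area_mm2=100.0)

print(f"old node: {old_power:5.1f} W over 180 mm^2 -> {old_density:.2f} W/mm^2")
print(f"new node: {new_power:5.1f} W over 100 mm^2 -> {new_density:.2f} W/mm^2")
```

On those made-up numbers, total power drops by roughly 40% while watts per square millimetre actually creep up slightly, which is exactly the cooling problem described above.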

We're at the end of 5GHz CPUs until we abandon silicon, mark my words.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd



 

Ah, we're definitely talking about different things, and I agree with you completely. I was really talking about the potentially lower heat output of the more power-efficient Intel chips making it easier for general users (those on air cooling, AIO water cooling, or even simple custom loops) to reach higher stable overclocks. You may be absolutely right that the higher heat density intrinsic to a smaller node could cancel out the efficiency gain, or even reduce overall overclocking potential (I think that word was causing some confusion earlier; here I mean the highest clock speeds that can be achieved). I guess some real-world testing will be needed to figure out which effect matters more.

 

On a somewhat unrelated note, now I'm curious what the future material for processors will be. Also, I love your avatar.

Fractal Design Arc Midi R2 | ASUS Z97-A | Intel Core i5 4690K | G.Skill Ripjaw 4 GB (desperately needs upgrade) | MSI Radeon R9 290 | EVGA SuperNOVA 750 G2 | Samsung 840 EVO 250 GB SSD | Seagate Constellation 1 TB HDD (also needs upgrade) | NZXT Kraken X61 for CPU | NZXT Kraken X61 + G10 for GPU | Corsair M45 | CM Storm Quickfire TK | ASUS Xonar DSX | Denon AD-H2000



Hehe, thanks, and it was partly my fault for not expounding in the first place. After a couple of physics classes I feel like that should be common sense, but I have to remind myself it's not.

 

We'll be moving to carbon nanotubes (CNTs) after 2019, when Intel is scheduled to move on from its 3nm node (no freaking clue how the company intends to get there in the first place). Graphene and stanene are, in industry experts' view, still nowhere near ready for mass production on CMOS and/or FinFET processes by then.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd



 

Well, though I was an English and History double major, I've always had a strange desire to understand the more complex workings of computers and other electronics (especially audio reproduction, as I'm also a musician). While I certainly don't get everything on as deep a level as I could, it just means there's more for me to read and learn :)

 

I'll have to look into those materials and manufacturing processes, sounds fascinating!

Fractal Design Arc Midi R2 | ASUS Z97-A | Intel Core i5 4690K | G.Skill Ripjaw 4 GB (desperately needs upgrade) | MSI Radeon R9 290 | EVGA SuperNOVA 750 G2 | Samsung 840 EVO 250 GB SSD | Seagate Constellation 1 TB HDD (also needs upgrade) | NZXT Kraken X61 for CPU | NZXT Kraken X61 + G10 for GPU | Corsair M45 | CM Storm Quickfire TK | ASUS Xonar DSX | Denon AD-H2000


  • 2 weeks later...

I hate to rehash an old topic, but it is good to know that I am not the only one thinking about the future. Aside from the increased PCIe lane count on Haswell-E, Haswell seems like a total waste of time to me, and Broadwell does not sound much better.

 

Intel is primarily focused on tablets and notebooks, since they represent the majority of the market. This is why we keep getting die shrinks that are less overclocking-friendly every time, forcing PC gaming and overclocking enthusiasts to either wait or get pulled onto platforms we might not be 100% happy with. I am actually thinking of trading my i7-3770K for an i7-2600K so I can get to 5GHz without any problem; currently I can only get 4.8GHz stable. Intel should have had newer silicon that is more 5GHz-friendly a long time ago; tablets and notebooks are to blame. Intel will eventually get to 0nm, and then no more die shrinks! I have heard rumors of 3D cores and other things, but that is so far ahead that it is impossible to know anything with certainty.

 

I am looking forward to Skylake-E, but if they skip it for some reason then I will be waiting for Cannonlake-E. I have also heard rumors that Cannonlake will be the last desktop-compatible chip; I hope that is not true. Intel has really been pushing its 2-in-1s. PC gamers and OC enthusiasts may represent a small percentage of the market, but we are a percentage that will never go away. I am sure AMD would be absolutely delighted to take over completely.

 

When Intel ships a new CPU architecture that delivers better performance per watt, that is when I will want to upgrade. Many think I am crazy for thinking so far ahead and wanting to wait so long, but if you look at the Skylake Wikipedia page, there are quite a few perks that would make it worth the wait. Configurable TDP is pretty interesting, especially if it is available on the unlocked variants. Plus, I get to focus on perfecting the rest of my build and getting it ready for the future. The real changes will come when Intel no longer uses silicon:

 

http://www.extremetech.com/computing/162376-7nm-5nm-3nm-the-new-materials-and-transistors-that-will-take-us-to-the-limits-of-moores-law

 

The last thing I wanted to share has to do with a common misunderstanding about DDR4. DDR3-2133 and DDR4-2133 are two entirely different things. DDR4's latency numbers are higher, but it runs at a lower voltage (1.2 V vs. 1.5 V), and the standard scales to roughly double the transfer rates of typical DDR3 kits. To me that sounds good!
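For anyone curious about the arithmetic behind "transfer rate", here is a quick sketch of the theoretical peak bandwidth per standard 64-bit channel (the speed grades below are just example kits, not a claim about any particular platform):

```python
# Theoretical peak bandwidth of a standard 64-bit memory channel: transfers/s * 8 bytes.
# The speed grades below are just example configurations.

def peak_bandwidth_gbs(transfers_per_sec: float, bus_width_bits: int = 64) -> float:
    """Peak bandwidth in GB/s for one memory channel."""
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

examples = {
    "DDR3-1600": 1600e6,
    "DDR3-2133": 2133e6,
    "DDR4-2133": 2133e6,  # same peak rate as DDR3-2133, but at 1.2 V instead of 1.5 V
    "DDR4-3200": 3200e6,  # top of the DDR4 spec, about double a typical DDR3-1600 kit
}

for name, rate in examples.items():
    print(f"{name}: {peak_bandwidth_gbs(rate):5.1f} GB/s per channel")
```

So DDR4-2133 matches DDR3-2133 on peak rate but does it at a lower voltage; the "roughly double" only shows up once you compare the top DDR4 speed grades against a typical DDR3-1600 kit.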

i7-3770K @ 4.5GHz, ASRock Z77 Extreme4, G.Skill Sniper 8GB DDR3 1866 @ CL9, ASUS GTX 780, CM HAF XM, Samsung 850 Pro 256GB, WD Black 1TB x 2, EVGA SuperNOVA G2 850W, BenQ XL2420TE 24" 144Hz @ 1080p, CM Nepton 280L, Noctua Industrial IP67 2000RPM 140mm PWM Fan x 6



... Why? Is having a bigger number really that important to you? Why not just get some LN2 going if the number on the overclock is all that matters? There is no benefit, and possibly a 1-2% performance loss, going back to a 2600K, assuming you don't end up with a crap chip.


Isn't Broadwell being released on laptops?

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650

