
Intel Cannonlake 10nm Preliminary Production Begun, Comes at Cost of Money and Production Time

patrickjp93

I never said they were. Intel made the announcement only days after Microsoft provided some information regarding "Catapult".

 

You mean having both an x86 and an ARM processor within the same server? No, it really hasn't.

 

We are talking about incredibly big companies. They will settle as long as there is a profit.

They are also aware of Intel's grip on the server market.

Everything points to them going after certain smaller markets within the larger server market.

Nothing points to them going for the entire market, as you are suggesting.

These companies know what is possible and what is not.

 

By all definitions it is. I should have been more specific: I meant using heterogeneous computing on a single chip.

Those who have the software to utilize heterogeneous hardware will most likely have the money. The software alone can be quite expensive.

 

By this definition every company is under constant threat.

Every living person is under constant threat.

 

What are you talking about? Sometimes I wonder where you get your ideas from.

Intel's iGP effort is about improving the graphical performance of their processors. All their consumer products feature an iGP; however, most of their server processors don't.

Intel really doesn't care about office use when it comes to graphical performance. The focus is mobile, and mobile gaming is a big one.

As I predicted earlier, most of the consumer market will rely on a single SoC instead of having a dedicated CPU and GPU.

The gaming market is only going to grow. Intel's iGP has nothing that competes with AMD's and Nvidia's server offerings.

Their Socket 2011 chips don't, but that's because the socket is outdated and they have to provide a new one. With the 115x Xeons you have options. Intel is not after tiny markets that need graphics power. They want to stay on top of HSA's performance in direct response to the compute power of the Berlin APUs. You're naive. Most consumers don't buy an Intel CPU just to use integrated graphics unless they work in an office. You buy an Intel CPU for the performance, paired with a dGPU. Intel is not after gamers because it already has them, and while Skylake will be interesting vs. Carrizo, Intel is all about the compute power where it stands to lose money in the scientific computing markets. You really need to step back and look at the whole picture.

 

Intel doesn't care much about mobile gaming. 90% of laptops are built on its chips anyway, and most use dGPUs paired with its CPUs. Chromebooks and ultrathins aren't meant for gaming, which is where Intel went after Qualcomm and AMD with Core M. It has the TDP and power-usage advantage most consumers care about and can already run dual graphics with any dGPU. It has no reason to improve mobile graphics until AMD can bring its TDP down without losing performance, which it can't do because it's stuck on 28nm until 2016.

 

Evolution only occurs when organisms are under threat, and the ones who best handle it survive and reproduce. The same dynamic applies in the corporate world. Intel is only threatened in the phone market and in scientific computing. Laptops/tablets/ultrabooks it has locked up tight.

 

You really miss the big picture. If Intel wanted better graphics it would implement tessellation and polymorph engines AS A START. But no, it's improving raw compute power (the top Broadwell SKU will have a 2 TFLOP GPU, for crying out loud!). You're completely wrong on all these fronts.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


You've yet to try cooling them.

IBM and 3M found a solution.

Also, let's say they stack 4 single cores together: the transistor count will be ~4x, but the clock rate doesn't need to be very high.

 

From Samsung's stacked NAND it has been shown that power usage and heat go down while performance goes up.
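The trade-off claimed above (more stacked cores, lower clocks, less power) can be sketched with the standard first-order dynamic-power model, P ≈ C·V²·f. All the voltages and frequencies below are assumed numbers chosen for illustration, not measurements of any real chip:

```python
# Illustrative sketch (not vendor data): dynamic CPU power scales roughly as
# P ~ C * V^2 * f, and lowering frequency usually permits a lower voltage.
# Compare one fast core at 4.0 GHz with four stacked cores at 1.2 GHz each.

def dynamic_power(c_eff, voltage, freq_ghz):
    """Simplified dynamic-power model: P = C * V^2 * f (arbitrary units)."""
    return c_eff * voltage ** 2 * freq_ghz

# Hypothetical operating points (assumed numbers, for illustration only).
single = dynamic_power(c_eff=1.0, voltage=1.25, freq_ghz=4.0)
stacked = 4 * dynamic_power(c_eff=1.0, voltage=0.85, freq_ghz=1.2)

print(f"single fast core: {single:.2f} units, 4.0 GHz of throughput")
print(f"4 stacked cores : {stacked:.2f} units, {4 * 1.2:.1f} GHz aggregate")
# With these assumed voltages, the four slower cores deliver comparable
# aggregate clock throughput at roughly half the dynamic power.
```

This ignores leakage, per-core overheads, and how well software parallelizes, but it shows why stacking slower cores is attractive from a power perspective.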

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


If the WCCFTech article also floating around right now is true...

Never start a post like this, people...

We'll see if this was a good investment. I have a feeling lots of people won't care about the price hike, and that is probably the scariest thing in the world to me right now when it comes to processors.


IBM and 3M found a solution.

Also, let's say they stack 4 single cores together: the transistor count will be ~4x, but the clock rate doesn't need to be very high.

 

From Samsung's stacked NAND it has been shown that power usage and heat go down while performance goes up.

Vapourware and unproven. I know it works in the 200Hz brain chip IBM built, but on a 4GHz Intel chip? Nothing has been demonstrated in real-world performance. Even the article admits there is uncertainty in real applications, and even then it still requires a 3D surrounding heatspreader. It's not as simple as IBM putting together a theoretical solution and applying it to such a scenario.

 

NAND is easy by comparison, and it isn't accessed nearly as often as a transistor flicks on and off.


Their Socket 2011 chips don't, but that's because the socket is outdated and they have to provide a new one.

With the 115x Xeons you have options.

Intel is not after tiny markets that need graphics power.

They want to stay on top of HSA's performance in direct response to the compute power of the Berlin APUs.

You're naive.

Most consumers don't buy an Intel CPU just to use integrated graphics unless they work in an office. You buy an Intel CPU for the performance, paired with a dGPU.

Intel is not after gamers because it already has them, and while Skylake will be interesting vs. Carrizo, Intel is all about the compute power where it stands to lose money in the scientific computing markets.

You really need to step back and look at the whole picture.

Intel doesn't care much about mobile gaming. 90% of laptops are built on its chips anyway, and most use dGPUs paired with its CPUs.

Chromebooks and ultrathins aren't meant for gaming, which is where Intel went after Qualcomm and AMD with Core M.

It has the TDP and power usage advantage most consumers care about and can already run dual graphics with any dGPU.

It has no reason to improve mobile graphics until AMD can bring its TDP down without losing performance, which it can't do because it's stuck on 28nm until 2016.

Evolution only occurs when organisms are under threat, and the ones who best handle it survive and reproduce.

The same dynamic applies in the corporate world. Intel is only threatened in the phone market and in scientific computing.

Laptops/tablets/ultrabooks it has locked up tight.

You really miss the big picture.

You're completely wrong on all these fronts.

I would argue that the lack of die space is the reason why 2011 CPUs don't have iGPs.

Only a few lower-end ones do.

You are saying that the general consumer market is tiny?

Intel is securing a stronger position in the future.

The gaming market will at some point be dominated by integrated graphics (however this is 7-10+ years away).

Look at the effort Intel is putting into Iris Pro graphics.

Why are they suddenly so interested in HSA alone? You are constantly overrating the potential of HSA in the server and consumer environments, especially given the lack of software support.

Why?

Intel has way too many product lines; you cannot even justify why one would buy a particular Intel product.

After Intel's marketing push around Iris Pro? Intel is investing in the future. There is a difference between theoretical throughput and the actual throughput of running software.

I am, but are you?

Intel dominates the laptop market today. They are investing in the future. Laptops with dedicated GPUs will die out even faster than desktops.

Again, Intel is a huge company with a ton of differentiated product lines. Some product lines are dedicated to certain markets.

Because TDP and power usage are what consumers care about?

Most people don't even know what TDP is, and even more don't know the actual power usage of their device.

So because it is not an issue now, they cannot prepare for it? That is most likely the dumbest statement of the day.

HSA won't be the difference between Intel surviving or not. They can, however, see the logical improvement it brings. Intel is not afraid of dying out, but is investing in growing.

Are you saying that Intel only feels threatened in two markets? That is obviously wrong.

Not really. Things can and will change.

And you miss the picture of the future.

Clearly.


Vapourware and unproven. I know it works in the 200Hz brain chip IBM built, but on a 4GHz Intel chip? Nothing has been demonstrated in real-world performance. Even the article admits there is uncertainty in real applications, and even then it still requires a 3D surrounding heatspreader. It's not as simple as IBM putting together a theoretical solution and applying it to such a scenario.

 

NAND is easy by comparison, and it isn't accessed nearly as often as a transistor flicks on and off.

 

That was for 100 layers, but it's feasible for ~4 layers.

TSV is a bit complicated to do, but Intel is doing 2.5D.


That was for 100 layers, but it's feasible for ~4 layers.

TSV is a bit complicated to do, but Intel is doing 2.5D.

Those 100 layers were running at less than 1GHz. Intel's new interconnect was a power-shrinking move.


I would argue that the lack of die space is the reason why 2011 CPUs don't have iGPs.

Only a few lower-end ones do.

You are saying that the general consumer market is tiny?

Intel is securing a stronger position in the future.

The gaming market will at some point be dominated by integrated graphics (however this is 7-10+ years away).

Look at the effort Intel is putting into Iris Pro graphics.

Why are they suddenly so interested in HSA alone? You are constantly overrating the potential of HSA in the server and consumer environments, especially given the lack of software support.

Why?

Intel has way too many product lines; you cannot even justify why one would buy a particular Intel product.

After Intel's marketing push around Iris Pro? Intel is investing in the future. There is a difference between theoretical throughput and the actual throughput of running software.

I am, but are you?

Intel dominates the laptop market today. They are investing in the future. Laptops with dedicated GPUs will die out even faster than desktops.

Again, Intel is a huge company with a ton of differentiated product lines. Some product lines are dedicated to certain markets.

Because TDP and power usage are what consumers care about?

Most people don't even know what TDP is, and even more don't know the actual power usage of their device.

So because it is not an issue now, they cannot prepare for it? That is most likely the dumbest statement of the day.

HSA won't be the difference between Intel surviving or not. They can, however, see the logical improvement it brings. Intel is not afraid of dying out, but is investing in growing.

Are you saying that Intel only feels threatened in two markets? That is obviously wrong.

Not really. Things can and will change.

And you miss the picture of the future.

Clearly.

Socket 2011 has PLENTY of die space, but all the motherboard pins are used up, and there need to be new ones allocated for graphics output.

 

Intel's iGP Xeons are at the same level as their non-iGP ones on the Zx7 platform. All that's missing are the E5 and E7 on the 2011 platform.

 

The high-end gaming market (desktop and mobile) << the general consumer market (desktop and mobile) << the corporate market, which contains servers, supercomputers, desktops, and laptops/tablets. Intel has no reason to pay gamers or general consumers much mind. They get the reject bins of higher-end chips, hence Intel's market segmentation, as it always has been.

 

Intel put effort into Iris Pro solely at Apple's request (the one and only mass purchaser of Iris Pro until recently) due to Apple's user base and the premiums both the company and the consumers of its products are willing to pay.

 

HSA's potential is huge. Software support is the only piece missing now. You're the one who seriously underestimates how much time is wasted on feeding a discrete GPU tasks instead of letting an onboard one take the task and run. With Iris 6200 boasting 2 teraflops of compute performance, Nvidia is going to have to more than double the performance of its Teslas every 2 years to make them worthwhile. And we know Nvidia is up against a brick wall right now on that front, hence putting ARM chips onboard the Volta chips (where we see unified memory appear for the first time on a discrete GPU).
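The time "wasted on feeding a GPU tasks" is largely bus-transfer time, which an on-die iGPU sharing system memory largely avoids. A back-of-envelope sketch (all bandwidth and compute figures below are assumptions for illustration, not measurements) shows why transfers can dominate for low-arithmetic-intensity work:

```python
# Back-of-envelope sketch (assumed figures, for illustration only):
# estimate how much of a discrete GPU's time goes to PCIe transfers.

PCIE3_X16_GBPS = 15.75      # approx. theoretical PCIe 3.0 x16 bandwidth, GB/s
GPU_TFLOPS = 2.0            # assumed GPU compute rate, TFLOP/s

def offload_times(data_gb, flops_per_byte):
    """Return (transfer_seconds, compute_seconds) for one offloaded task."""
    transfer = data_gb / PCIE3_X16_GBPS
    compute = (data_gb * 1e9 * flops_per_byte) / (GPU_TFLOPS * 1e12)
    return transfer, compute

# A low-arithmetic-intensity kernel (1 FLOP per byte) on 1 GB of data:
t, c = offload_times(data_gb=1.0, flops_per_byte=1.0)
print(f"transfer {t * 1e3:.1f} ms vs compute {c * 1e3:.1f} ms")
# Transfer dominates by two orders of magnitude: the bus, not the GPU,
# is the bottleneck, which is the case for integrated GPUs with shared
# memory on such workloads.
```

For high-arithmetic-intensity kernels the balance flips, which is why discrete GPUs still win for dense compute; the argument above is specifically about small, frequent offloads.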

 

Intel has so many products in so many price ranges to cater to all levels of consumption, from the most barebones office PC or Chromebook to the highest-end desktop to world-class supercomputers. It's not called a chip giant frivolously.

 

There's a massive difference between theoretical throughput and software throughput. The latter is always much lower than the former.
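That gap can be made concrete with simple arithmetic. The sketch below uses hypothetical unit counts and an assumed "measured" figure; it only illustrates how a theoretical peak is computed and why achieved throughput sits well below it:

```python
# Sketch of the gap between theoretical and achieved throughput.
# Peak FLOPS = execution units * FLOPs per unit per cycle * clock.
# All numbers below are assumptions for illustration, not real specs.

def peak_gflops(units, flops_per_unit_per_cycle, clock_ghz):
    """Theoretical peak in GFLOPS from unit count, issue width and clock."""
    return units * flops_per_unit_per_cycle * clock_ghz

peak = peak_gflops(units=48, flops_per_unit_per_cycle=16, clock_ghz=1.1)
achieved = 320.0   # hypothetical sustained GFLOPS from a real kernel

print(f"theoretical peak: {peak:.0f} GFLOPS")
print(f"achieved        : {achieved:.0f} GFLOPS "
      f"({100 * achieved / peak:.0f}% of peak)")
# Real software rarely sustains peak: memory stalls, divergence and
# scheduling overhead keep achieved throughput well below the headline.
```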

 

dGPUs in laptops will die when Intel delivers drivers as well as tessellation and polymorph engines, in addition to the 10+ other hardware engines most high-end GPUs provide, making its iGPUs actually work well in gaming. A $650 Iris Pro 5200 vs. Kaveri: Kaveri wins by a mile on gaming performance. Intel so far has not delivered anything it needs to in order to play games well on its iGPUs.

 

Most laptop owners care whether or not their machine bakes their legs, how quiet it is, and how long the battery lasts. Get a clue.

 

Intel CAN prepare for it, but without seeing a reason it's an unnecessary expense, so it won't. Business 101: if you don't need it, you don't pay for it.

 

Intel will never die out unless IBM returns to the chip market. However, it wants to keep ALL of its cash cows, which is why it's investing in GPGPU compute performance rather than gaming graphics.

