
Intel: Tick-Tock cycle dead, replaced by "Process-Architecture-Optimization"

Just now, Coaxialgamer said:

Wait. They have quad-SMT in the works?

But doing that would mean making major changes to the core, and they aren't going to.

Patrick can tell you that.  They've had quad-SMT and beyond for a while, I believe, but it'll be coming to consumers with full AVX2 support around Kaby Lake / Skylake-E.  Right, @patrickjp93?


2 hours ago, patrickjp93 said:

And in that case it just comes down to a price war if Intel wants to keep an iron grip on the market.

That's only a good thing for the end consumer. We need some competition; I'm tired of paying $300+ for quad-cores.


3 minutes ago, SurvivorNVL said:

I would prefer they keep the 4 cores and instead introduce their 4-way multithreading, plus inverse multithreading for specific applications.  4c/16t, with inverse multithreading where it fits.  That would be wonderful.

Without a massive change to the microarchitecture, you are not going to see any performance increase from expanding to 4-way SMT.

The fact that an x86 4-way SMT Frankenstein chip exists, designed for a few specific workloads, doesn't mean you'd get the same gains in normal applications.

 

2 minutes ago, Coaxialgamer said:

Wait. They have quad-SMT in the works?

But doing that would mean making major changes to the core, and they aren't going to.

In a Frankenstein Atom chip designed for the HPC market. 

 

6 minutes ago, SurvivorNVL said:

Patrick can tell you that.  They've had quad-SMT and beyond for a while, I believe, but it'll be coming to consumers with full AVX2 support around Kaby Lake / Skylake-E.  Right, @patrickjp93?

You are not going to see 4-way SMT on consumer products, at least not for a while.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Wait, so does that mean it'd be better to get Skylake right now instead of waiting for Kaby Lake?

Motherboard - ROG Maximus VIII Formula
CPU - Intel Core i5 6600 @ 3.9 GHz
RAM - Corsair Vengeance LPX 16GB 2400MHz
GPU -
Storage - Samsung 850 EVO 250GB
PSU - Corsair RM750i
Case - Corsair Carbide 400C
Keyboard - Razer BlackWidow X Tournament Edition Chroma
Mouse - Razer Naga Chroma

Headset - Razer Kraken 7.1 v2 Chroma


19 hours ago, Coaxialgamer said:

NEVER underestimate Intel's engineers.

Companies have gone under because they did just that.

AMD: I'm not dead yet!

AMD: No really, I think I'm getting better!

/montypython

 

 

Ryzen 7 2700x | MSI B450 Tomahawk | GTX 780 Windforce | 16GB 3200
Dell 3007WFP | 2xDell 2001FP | Logitech G710 | Logitech G710 | Team Wolf Void Ray | Strafe RGB MX Silent
iPhone 8 Plus ZTE Axon 7 | iPad Air 2 | Nvidia Shield Tablet 32gig LTE | Lenovo W700DS


If Intel is truly trying to minimize R&D spending because of its commanding market share, then this is the starting point of how all giants fall.


2 hours ago, Reaver Seijuuro said:

Wait, so does that mean it'd be better to get Skylake right now instead of waiting for Kaby Lake?

The best thing to get would be Haswell-E. Skylake is still too expensive for what it offers right now, especially with the 5820K going for $349 on Newegg.


22 hours ago, i_build_nanosuits said:

Seriously... an old DVD cut into halves for a graph... Intel, that's not serious, right?

That's supposed to be a CPU wafer. You were joking, right?

 

[Attached image: Chips-big.jpg]


23 hours ago, Maybach123 said:

that's a given rule, like never try to invade Russia xD 

Nononono, invading Russia is fine. Many have done it, and quite successfully. Hitler made amazing progress, and likely would have taken them out had he listened to his generals and attacked Moscow instead of targeting Stalingrad (a city of zero strategic importance and the turning point of the Russian campaign for Germany; Hitler was basically trying to wave his dick in Stalin's face, and it was bitten off). Napoleon also made it to the doors of Moscow, but in perhaps the ballsiest defense in all of history, the Russians (not the French) razed most of Moscow to the ground, depleting the city's food supplies and leaving Napoleon's army stranded in the middle of winter with very little food, shelter, or supplies. Both failed because they sought to stay the winter and were utterly unprepared for it. So essentially what one should NOT do is wage a winter campaign in Russia.

The Russian Winter has been her greatest ally, and in all of history only one outsider has successfully conquered Russia: the great Mongol general Chinggis (westernized as Genghis) Khan. Because the Mongols came from an area with winters just as harsh as Russia's, they were well prepared to deal with the cold. Perhaps even more important, Mongolian horses did not need to eat grass as most other horses do; able to survive off roots and food sources inedible to other horses, they spared the Mongols from hauling extra fodder and let their armies stay in the field much longer. In fact, his son Ogedei Khan marched the Mongol armies all the way to the gates of Vienna, Austria, and had it not been for Ogedei's untimely death (and the subsequent Mongol civil war) there's no telling how far they could have kept going, for in the 13th and 14th centuries theirs was truly the greatest military might.


47 minutes ago, JoeyDM said:

That's supposed to be a CPU wafer. You were joking, right?

(Sorry if links are prohibited, but holy hell, WTF is that thing? It is ugly.)

 

Can someone send me a chronological list of not only Intel CPU names, but also NVIDIA and AMD GPU code names, etc.? Please and thank you.

 

P.S. I swear Pascal is probably not only going to kill my wallet; it'd take a new mobo to even use the bloody thing. I am using a Gigabyte Z97X Gaming 3 MB, cuz why not?


7 hours ago, Tomsen said:

Intel's tick-tock model has long been dead and buried. The tick-tock-toe model is already in place; we saw it with Haswell and the "Devil's Canyon" refresh that followed.

 

Regarding your remark on optimization, you're not going to see changes in instruction latency. The core is the same (the only changes would be bug fixes). I don't expect any new extensions that aren't already supported. If you are thinking of individual SKUs having more "unlocked" features, like we saw with Devil's Canyon, then sure.

 

The changes will most likely be in the uncore, along with process and cell-library optimizations. Intel will be fabbing bigger dies and will also be slightly more aggressive with clock speeds.

That means higher core counts and clock rates for servers, and higher clock rates for consumers (and perhaps more "unlocked" features). 

I'll be vindicated on that with the release of Kaby Lake. Why bother to release a third generation? Clock speed alone? Platform alone? No, there will be a few minor architectural improvements. Since Kaby Lake supports AVX-512 when (consumer) Skylake doesn't, there goes the theory about no new extensions.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


7 hours ago, Coaxialgamer said:

Wait. They have quad-SMT in the works?

But doing that would mean making major changes to the core, and they aren't going to.

It's already implemented on Knights Corner and Knights Landing.
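For a sense of how that surfaces to software: each Knights Landing core presents four logical processors, so a 64-core part shows up as 256 hardware threads. A minimal C++ sketch (the counts in the comments are illustrative, not tied to any specific SKU):

```cpp
#include <iostream>
#include <thread>

int main() {
    // On a 4-way SMT part like Knights Landing this reports logical
    // processors: e.g. 64 cores x 4 threads = 256. A 2-way Hyper-Threaded
    // desktop quad-core reports 8 instead.
    std::cout << "Logical CPUs: " << std::thread::hardware_concurrency() << '\n';
    return 0;
}
```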


12 minutes ago, patrickjp93 said:

I'll be vindicated on that with the release of Kaby Lake. Why bother to release a 3rd generation? Clock speed alone? Platform alone? No, there will be minor and few architectural improvements. Since Kaby Lake is supporting AVX 512 when (consumer) Skylake doesn't, there goes the theory about no new extensions.

I couldn't care less about AVX-512, because there's little incentive to use AVX extensions as-is in consumer software. It's not like Intel is helping matters by relegating it purely to Core i-series processors, once again placing a premium on features that should just be there.

 

It'll be more interesting to see what kind of optimizations Kaby Lake brings to the table; maybe we'll see something a la Maxwell.


1 minute ago, shdowhunt60 said:

I couldn't care less about AVX-512, because there's little incentive to use AVX extensions as-is in consumer software. It's not like Intel is helping matters by relegating it purely to Core i-series processors, once again placing a premium on features that should just be there.

 

It'll be more interesting to see what kind of optimizations Kaby Lake brings to the table; maybe we'll see something a la Maxwell.

Try telling that to one of Valve's senior game and engine developers. 

It's not just limited to Core i-series processors. You can find AVX2 on Cherry Trail Atoms as well.

 

Seriously, where do you ignoramuses come from?


Congratulations? AVX2 is still not an extension that's commonly available on consumer hardware. You still have plenty of gamers sitting on Nehalem, Phenoms, Sandy Bridges, and Bulldozer/Piledriver. Not everyone is on Haswell or newer. I mean, hell, look at the shitstorm Kojima Studios got because MGS4 depended on SSE4.1, and that's something Intel DOES put on Pentiums.
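For what it's worth, this is the problem runtime feature detection is supposed to solve: ship one binary, probe CPUID, and only take the fast path when it's safe. A minimal sketch using GCC/Clang built-ins (compiler-specific; MSVC would need its own __cpuid plumbing):

```cpp
#include <cstdio>

int main() {
    __builtin_cpu_init();  // populate GCC's cached CPUID feature flags
    std::printf("SSE4.1:  %s\n", __builtin_cpu_supports("sse4.1")  ? "yes" : "no");
    std::printf("AVX2:    %s\n", __builtin_cpu_supports("avx2")    ? "yes" : "no");
    std::printf("AVX512F: %s\n", __builtin_cpu_supports("avx512f") ? "yes" : "no");
    return 0;
}
```

A game that falls back to a scalar path when the check fails avoids exactly the kind of launch-day breakage described above.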


51 minutes ago, shdowhunt60 said:

Congratulations? AVX2 is still not an extension that's commonly available on consumer hardware. You still have plenty of gamers sitting on Nehalem, Phenoms, Sandy Bridges, and Bulldozer/Piledriver. Not everyone is on Haswell or newer. I mean, hell, look at the shitstorm Kojima Studios got because MGS4 depended on SSE4.1, and that's something Intel DOES put on Pentiums.

That's the compulsion for people to get hardware upgrades. If you can multiply your performance by 2, 4, 8, or even 16x, then do it. And Intel does put AVX on Pentiums, just not all of them.
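For context on where those multipliers come from: a 256-bit AVX register holds eight 32-bit floats, so a vectorized loop can do eight operations per instruction where scalar code does one — the idealized 8x. A minimal sketch (compile with -mavx on GCC/Clang; real-world gains are usually smaller because memory bandwidth intervenes):

```cpp
#include <immintrin.h>
#include <cstddef>

// Eight floats per iteration with 256-bit AVX versus one with scalar code.
void add_arrays(const float* a, const float* b, float* out, std::size_t n) {
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb));
    }
    for (; i < n; ++i)  // scalar tail for leftover elements
        out[i] = a[i] + b[i];
}
```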


Just now, patrickjp93 said:

That's the compulsion for people to get hardware upgrades. If you can multiply your performance by 2, 4, 8, or even 16x, then do it. And Intel does put AVX on Pentiums, just not all of them.

And there's been very little compulsion to ever get them. So why put the effort into optimizing for hardware features that only certain end users might have? 


4 hours ago, shdowhunt60 said:

The best thing to get would be Haswell-E. Skylake is still too expensive for what it's worth now, especially with the 5820K being $349 off of newegg.

I would if there were affordable options for Haswell-E here. A 5920K here is about 100.00 USD more than the 6700K. It's the same situation with the motherboards.


Just now, shdowhunt60 said:

And there's been very little compulsion to ever get them. So why put the effort into optimizing for hardware features that only certain end users might have? 

Because it's not hard to do, and providing a better experience sells more copies. And there's the prestige of being truly next-gen. SISD instructions are already about as optimized as they can be for the most part. People will have to get over that.
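On the "not hard to do" point: GCC's function multiversioning compiles one annotated function into several ISA-specific clones and picks the right one at load time via CPUID, so Phenom and Haswell owners run the same binary. A minimal sketch (the attribute is real GCC 6+ syntax; the function itself is just illustrative):

```cpp
#include <cstddef>

// GCC emits an AVX2 clone, an SSE4.1 clone, and a baseline clone of this
// function, plus a resolver that dispatches once at program load.
__attribute__((target_clones("avx2", "sse4.1", "default")))
void scale(float* v, std::size_t n, float k) {
    for (std::size_t i = 0; i < n; ++i)
        v[i] *= k;  // auto-vectorized differently per clone
}
```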


2 minutes ago, Reaver Seijuuro said:

I would if there were affordable options for Haswell-E here. A 5920K here is about 100.00 USD more than the 6700K. It's the same situation with the motherboards.

The 5820K is cheaper, at least in the U.S. If you mean the 5930K, well, did you expect an additional 24 PCIe lanes over the 6700K to come for free?


Just now, patrickjp93 said:

Because it's not hard to do, and providing a better experience sells more copies. And there's the prestige of being truly next-gen. SISD instructions are already about as optimized as they can be for the most part. People will have to get over that.

"Better experience"? To what extent? The majority of gaming is GPU bound more than anything. Once again, there's plenty of people out there with 5 year old CPU's that do just fine.


1 minute ago, shdowhunt60 said:

"Better experience"? To what extent? The majority of gaming is GPU bound more than anything. Once again, there's plenty of people out there with 5 year old CPU's that do just fine.

I'm sorry, but did you forget the CPU has processing power that can augment what the GPU is doing? Split-frame rendering is not limited to GPUs. Heck, it's been part of OpenGL for a very long time.
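To make the split-frame idea concrete: the frame is cut into horizontal bands, most handed to the GPU, the rest shaded by CPU threads and composited at the end. A conceptual sketch of just the CPU-side partitioning (the shading and the GPU hand-off are placeholders, not any real engine's code):

```cpp
#include <cstddef>
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical stand-in for per-band CPU shading work.
void shade_band_cpu(std::vector<std::uint8_t>& fb, int width, int y0, int y1) {
    for (int y = y0; y < y1; ++y)
        for (int x = 0; x < width; ++x)
            fb[static_cast<std::size_t>(y) * width + x] =
                static_cast<std::uint8_t>(x ^ y);  // placeholder pattern
}

int main() {
    const int W = 1920, H = 1080;
    std::vector<std::uint8_t> fb(static_cast<std::size_t>(W) * H);

    const int gpuRows = H * 3 / 4;  // pretend the GPU renders the top 75%...
    const int cpuThreads = 4;       // ...and four CPU threads split the rest

    std::vector<std::thread> pool;
    const int rows = H - gpuRows, chunk = rows / cpuThreads;
    for (int t = 0; t < cpuThreads; ++t) {
        int y0 = gpuRows + t * chunk;
        int y1 = (t == cpuThreads - 1) ? H : y0 + chunk;
        pool.emplace_back(shade_band_cpu, std::ref(fb), W, y0, y1);
    }
    for (auto& th : pool) th.join();
    // A real renderer would now composite fb with the GPU's bands.
}
```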


Just now, patrickjp93 said:

I'm sorry, but did you forget the CPU has processing power that can augment what the GPU is doing? Split-frame rendering is not limited to GPUs. Heck, it's been part of OpenGL for a very long time.

I'm well aware. But GPUs aren't being fully utilized by drivers and modern APIs as it is. That would be a more worthy venture, without the risk of possibly excluding the bottom margin.


Just now, shdowhunt60 said:

I'm well aware. But GPUs aren't being fully utilized by drivers and modern APIs as it is. That would be a more worthy venture, without the risk of possibly excluding the bottom margin.

If you really believe that, you're nuts. GPUs are being strained about as far as they can be, even if imperfectly. 


1 minute ago, patrickjp93 said:

If you really believe that, you're nuts. GPUs are being strained about as far as they can be, even if imperfectly. 

I actually do believe that, especially for AMD's hardware.

