
Intel and AMD's Future

9 minutes ago, Mihle said:

Then Intel would just buy it and have monopoly

That's terrifying.

I personally would like to see some stiff competition between the two of them 5, 10, and 15 years down the road.  That outcome would benefit consumers the most.

I don't want to live in a world where all Intel does is Kaby Lake levels of improvement, due to the lack of competition.


18 minutes ago, Cinnabar Sonar said:

That's terrifying.

I personally would like to see some stiff competition between the two of them 5, 10, and 15 years down the road.  That outcome would benefit consumers the most.

I don't want to live in a world where all Intel does is Kaby Lake levels of improvement, due to the lack of competition.

Intel still has competition. It's called ARM. Even if ARM doesn't compete in desktop PCs, the tablets, phones, and other devices that use ARM are still competition. It's like saying the Indian restaurant isn't competing with the Mexican restaurant next door because they're not serving the same style of food.

 

Also, I'm very inclined to believe that "Kaby Lake levels of improvement" have been the norm throughout most of x86's history, after reviewing as many benchmarks as I could. That is, IPC improvements have never really been more than 20% between generations. Most of what drove performance in the 90s and early 2000s was simply clock speed increases. Think about it: when the Pentium III launched, it had a 500 MHz model. A year later it had a 1 GHz model. A 500 MHz difference today is nothing to write home about, but back then that was a doubling of performance in one year.
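To put a quick number on that Pentium III jump (just back-of-the-envelope arithmetic, nothing from actual benchmarks):

```python
# Back-of-the-envelope: how big was the 500 MHz -> 1 GHz jump?
base_mhz = 500    # Pentium III launch model
later_mhz = 1000  # the 1 GHz model a year later

speedup = later_mhz / base_mhz
pct_gain = (speedup - 1) * 100
print(f"{speedup:.1f}x the clock, a {pct_gain:.0f}% gain in one year")
# -> 2.0x the clock, a 100% gain in one year
```

Compare that to today, where a 500 MHz bump on a roughly 4 GHz part is closer to a 12% gain.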

 

I mean, there were exceptions; Core and Zen are the only two I can think of that had a huge IPC improvement. But that's because they succeeded architectures that had poor IPC to begin with.


10 minutes ago, M.Yurizaki said:

Intel still has competition. It's called ARM. Even if ARM doesn't compete in desktop PCs, the tablets, phones, and other devices that use ARM are still competition. It's like saying the Indian restaurant isn't competing with the Mexican restaurant next door because they're not serving the same style of food.

 

Also, I'm very inclined to believe that "Kaby Lake levels of improvement" have been the norm throughout most of x86's history, after reviewing as many benchmarks as I could. That is, IPC improvements have never really been more than 20% between generations. Most of what drove performance in the 90s and early 2000s was simply clock speed increases. Think about it: when the Pentium III launched, it had a 500 MHz model. A year later it had a 1 GHz model. A 500 MHz difference today is nothing to write home about, but back then that was a doubling of performance in one year.

 

I mean, there were exceptions; Core and Zen are the only two I can think of that had a huge IPC improvement. But that's because they succeeded architectures that had poor IPC to begin with.

Also, Intel has IBM as competition in the enterprise space.

 

Hmm, maybe I can talk IBM into buying AMD... 5 GHz, 8 cores, 8-way SMT...


2 minutes ago, M.Yurizaki said:

It's like saying the Indian restaurant isn't competing with the Mexican restaurant next door because they're not serving the same style of food.

They aren't in direct competition. If I want a burrito, I'm only going to look at the Mexican restaurants.  Or, more likely in my area, the taco stands.  :P

If there is no competition in the desktop PC realm, that is an issue.  Gains on a mobile chip don't necessarily translate to gains on a desktop chip.

7 minutes ago, M.Yurizaki said:

Also, I'm very inclined to believe that "Kaby Lake levels of improvement" have been the norm throughout most of x86's history, after reviewing as many benchmarks as I could. That is, IPC improvements have never really been more than 20% between generations. Most of what drove performance in the 90s and early 2000s was simply clock speed increases. Think about it: when the Pentium III launched, it had a 500 MHz model. A year later it had a 1 GHz model. A 500 MHz difference today is nothing to write home about, but back then that was a doubling of performance in one year.

Correct me if I am wrong, but from what I have read, Kaby Lake was particularly bad, and now we have Coffee Lake with two extra cores and thus much improved multithreaded performance.  I have a feeling that if AMD had been competitive two years ago, we would have seen more cores on Intel's mainstream lineup sooner.

 


1 minute ago, Cinnabar Sonar said:

They aren't in direct competition. If I want a burrito, I'm only going to look at the Mexican restaurants.  Or, more likely in my area, the taco stands.  :P

If there is no competition in the desktop PC realm, that is an issue.  Gains on a mobile chip don't necessarily translate to gains on a desktop chip.

It doesn't matter. ARM devices compete for people's computing needs. More people probably use their phones or tablets for computer-related things than desktops or even laptops. Therefore, those devices are in direct competition with desktop PCs, simply because people use them instead of the PC.

 

I mean hell, I don't see much point in taking my laptop anywhere anymore unless I'm staying at a place for an extended period of time (i.e., a week or more) simply because I can get by with my tablet. There's also another plus: I don't have to take my tablet out at TSA checkpoints.

1 minute ago, Cinnabar Sonar said:

Correct me if I am wrong, but from what I have read, Kaby Lake was particularly bad, and now we have Coffee Lake with two extra cores and thus much improved multithreaded performance.  I have a feeling that if AMD had been competitive two years ago, we would have seen more cores on Intel's mainstream lineup sooner.

Well, I'll bite that Kaby Lake was basically Skylake with extra features. However, IPC improvement trends haven't changed over the years, and I'm going to argue that any improvement we see from adding two more cores is merely because multithreaded application development has matured to the point where it doesn't take a software developer with 10 years of experience to get it right, because there are tools that do it for them.
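As a toy sketch of what that maturing tooling looks like (my own illustration, not anything specific mentioned in the thread): Python's standard `concurrent.futures` module parallelizes a loop in a couple of lines, with no hand-rolled thread or lock management:

```python
from concurrent.futures import ThreadPoolExecutor

def crunch(n):
    # Stand-in for some per-item work
    return sum(i * i for i in range(n))

# The executor owns thread creation, scheduling, and teardown;
# the developer never touches a thread or a lock directly.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(crunch, [1_000] * 8))

print(len(results))  # 8
```

The same pattern a decade ago meant manually spawning threads, partitioning work, and joining them, which is exactly the kind of expertise barrier those libraries removed.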

 

And I also want to know whether Coffee Lake's configuration had been in the works long before anyone knew about it. It's not like Intel learned how Zen would perform and suddenly went "oh shit, AMD's competing again, let's slap on two more cores and release a new generation in five months," because IC manufacturing doesn't work that way.

 

I think people are giving AMD way too much credit for what Intel's been doing lately.


Personally:
Laptop > Tablet
All the time. I personally don't see the point of tablets.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


If AMD goes down, we're screwed, to put it simply. Profit margins on computer parts are already massive, and I don't look forward to parts getting any more expensive.

Custom pinewood case, Corsair CX 600W | Rampage 3 Extreme, i7 980x (@4.2GHz) with ML240 Cooler | MSI GTX 970, 24GB DDR3, 240GB OCZ TR150 SSD + 2TB Seagate Barracuda.

 

Advocate for used/older hardware. Also one of the resident petrol heads. 


1 hour ago, Mihle said:

Personally:
Laptop > Tablet
All the time. I personally don't see the point of tablets.

Same. It's not like tablets are THAT much easier to carry around, and nothing can replace a nice keyboard for me. Typing on a phone right now and my hands are cramping like crazy.


Intel vs AMD in CPUs: it may be a slugfest... or either side may make a mistake and put themselves in a bad position. Tough to say.

 

AMD's RTG: they need a fundamental change of thinking when it comes to the GPU arena. Vega had so much hype that it turned into a shit show. We NEED good shit from RTG to push Nvidia; if RTG can't produce anything worth a damn with their next generation of graphics chips, it's over. RTG is in the position the CPU division was in before the Ryzen launch: if Ryzen had been a dud, AMD wouldn't exist.


5 hours ago, Mr.Meerkat said:

Hold my drink...

I'm looking at the other end of the spectrum; the mid and low range is excellent. If I'm not mistaken, isn't the TR4 platform rather hard to cool?

Aurora | Built April '22
CPU: Intel i7 11700K | RAM: 64GB Corsair Vengeance RGB Pro 3200 | GPU: NVidia GTX 1080 Founders | Storage: Samsung 970 EVO Plus (500GB), Corsair MP400 (1TB) | PSU: Corsair RM1000

 

Moonlight | Built March '17
CPU: Intel i7 6900K | RAM: 64GB Corsair Vengeance LPX 2666 | GPU: NVidia GTX 750Ti | Storage: Intel 600P (512GB), WD Red (4TB) | PSU: Corsair AX860 | Cooling: Corsair H105 | Setup post >here< |

 

 

 


Just now, mudflapman said:

I'm looking at the other end of the spectrum; the mid and low range is excellent. If I'm not mistaken, isn't the TR4 platform rather hard to cool?

Still valid for one simple fact: AMD uses solder, Intel uses toothpaste ;)

Nonetheless, when you're hitting 60-70°C on a 4c/8t (or 8c/16t) CPU, what made you think a 16c or an 18c CPU would be easy to cool? Both LGA2066 and TR4 CPUs are extremely hard to cool, so...

Looking at my signature are we now? Well too bad there's nothing here...

 

What? As I said, there seriously is nothing here :)


On 10/19/2017 at 10:24 AM, M.Yurizaki said:

It doesn't matter. ARM devices compete for people's computing needs. More people probably use their phones or tablets for computer-related things than desktops or even laptops. Therefore, those devices are in direct competition with desktop PCs, simply because people use them instead of the PC.

 

I mean hell, I don't see much point in taking my laptop anywhere anymore unless I'm staying at a place for an extended period of time (i.e., a week or more) simply because I can get by with my tablet. There's also another plus: I don't have to take my tablet out at TSA checkpoints.

While smartphones are still quite popular, tablet sales have been declining since 2015.

http://www.telegraph.co.uk/technology/2017/01/11/tablet-sales-fall-third-successive-year-pc-market-stabilises/

While tablets still outsell laptops and desktops, new tablet sales have declined more sharply every year; if this continues, tablets will become less popular than laptops, and possibly even desktops.

 

Also, while smartphones and tablets are pretty capable for content consumption and light internet browsing, they fall short of laptops and PCs just about everywhere else, either due to less powerful hardware or interface restrictions.

 

The fact of the matter is that PCs and laptops are still a really large market (and that's not going to change anytime soon). If I wanted to buy a laptop or build a desktop, I would be stuck with Intel or AMD, and if either of them had a monopoly on that particular market, it could be very problematic.

On 10/19/2017 at 10:24 AM, M.Yurizaki said:

Well I'll bite that Kaby Lake was basically Skylake with extra features. However IPC improvement trends haven't changed over the years and I'm going to argue that any improvements we see by adding two more cores is merely because multithreaded application development is maturing to a point where it doesn't require a software developer with 10 years of experience to get it down right because there are tools that do that for them.

I don't fully understand your point; multithreaded application development is definitely better than it was even a few years ago.  However, that means nothing if the CPU doesn't have many cores to begin with.  It's very much a chicken-and-egg scenario, and like I said before, if AMD had been more competitive a few years ago, I have a feeling we would have seen 6 cores on Intel's mainstream lineup earlier.

On 10/19/2017 at 10:24 AM, M.Yurizaki said:

And I also want to know if Coffee Lake's configuration has been in the works long before anyone knew about it. It's not like Intel knew anything about Zen and how it would perform and suddenly went "oh shit, AMD's competing again, let's slap on two cores and release a new generation in five months" because IC manufacturing doesn't work that way.

We don't know for sure whether Coffee Lake was a reaction to Zen or not.  However, Intel releasing the platform early most likely was.  Furthermore, the fact that Intel announced higher-core-count Skylake-X chips, ones that weren't even ready yet, should say something.  I seriously doubt those higher-core chips weren't a knee-jerk reaction to Threadripper.

On 10/19/2017 at 10:24 AM, M.Yurizaki said:

I think people are giving AMD way too much credit for what Intel's been doing lately.

I think AMD has had, at least some influence on Intel's decisions.

 

Edit: I meant content consumption, not content creation.  Content creation on a smartphone/tablet sounds... unpleasant.

FL Studio might work well on a tablet.

Link to comment
Share on other sites

Link to post
Share on other sites

32 minutes ago, Cinnabar Sonar said:

While smartphones are still quite popular, tablet sales have been declining since 2015.

http://www.telegraph.co.uk/technology/2017/01/11/tablet-sales-fall-third-successive-year-pc-market-stabilises/

While tablets still outsell laptops and desktops, new tablet sales have declined more sharply every year; if this continues, tablets will become less popular than laptops, and possibly even desktops.

Sales mean nothing compared to actual usage. For all we know, the people buying new computers are just replacing their 4+ year old ones, and the tablet market has reached critical mass such that few people are interested in buying a new tablet because everyone and their mother already has one.

 

If we take a look at this page http://gs.statcounter.com/os-market-share#monthly-201709-201709-bar : while Windows has a large share, Android + iOS easily beat it.

 

EDIT: Looking at the US usage share, Android + iOS is almost double that of Windows

Quote

Also, while smartphones and tablets are pretty capable for content creation and light internet browsing, they fall short of laptops and PCs just about everywhere else, either due to less powerful hardware or interface restrictions.

And until a person actually encounters a use case their phone or tablet can't perform, their PC will remain sitting there unused.

 

Quote

I don't fully understand your point; multithreaded application development is definitely better than it was even a few years ago.  However, that means nothing if the CPU doesn't have many cores to begin with.  It's very much a chicken-and-egg scenario, and like I said before, if AMD had been more competitive a few years ago, I have a feeling we would have seen 6 cores on Intel's mainstream lineup earlier.

I doubt they would have, even if AMD was competitive. They would've done so only if what AMD offered had a clear advantage in performance alone. If Ryzen 7 had been, say, only six cores, that wouldn't really have been enough for Intel to go six-core as long as their quad cores could keep up. AMD touted for a long time that Bulldozer was 8-core; in some sense I'm inclined to agree, because most of what a computer does is integer-based. But it performed so badly that Intel didn't see a need to play a numbers game with AMD. And even if AMD's IPC had more or less kept up with Intel's, unless they were actually releasing more cores to begin with, I wouldn't have any reason to believe Intel would budge all that much in getting more cores out sooner.

 

Plus, really, there hasn't been much of a decent use case for multiple cores until recently, what with games using threads more effectively and people streaming their gaming sessions.

Link to comment
Share on other sites

Link to post
Share on other sites

On 10/18/2017 at 2:54 AM, DeezNoNos said:

And.....................

FIGHT

[attached image: amd_vs_intel.jpg]

Shouldn't there be pins on the AMD chip?

Cor Caeruleus Reborn v6

Spoiler

CPU: Intel - Core i7-8700K

CPU Cooler: be quiet! - PURE ROCK 
Thermal Compound: Arctic Silver - 5 High-Density Polysynthetic Silver 3.5g Thermal Paste 
Motherboard: ASRock Z370 Extreme4
Memory: G.Skill TridentZ RGB 2x8GB 3200/14
Storage: Samsung - 850 EVO-Series 500GB 2.5" Solid State Drive 
Storage: Samsung - 960 EVO 500GB M.2-2280 Solid State Drive
Storage: Western Digital - Blue 2TB 3.5" 5400RPM Internal Hard Drive
Storage: Western Digital - BLACK SERIES 3TB 3.5" 7200RPM Internal Hard Drive
Video Card: EVGA - 970 SSC ACX (1080 is in RMA)
Case: Fractal Design - Define R5 w/Window (Black) ATX Mid Tower Case
Power Supply: EVGA - SuperNOVA P2 750W with CableMod blue/black Pro Series
Optical Drive: LG - WH16NS40 Blu-Ray/DVD/CD Writer 
Operating System: Microsoft - Windows 10 Pro OEM 64-bit and Linux Mint Serena
Keyboard: Logitech - G910 Orion Spectrum RGB Wired Gaming Keyboard
Mouse: Logitech - G502 Wired Optical Mouse
Headphones: Logitech - G430 7.1 Channel  Headset
Speakers: Logitech - Z506 155W 5.1ch Speakers

 


1 hour ago, M.Yurizaki said:

Sales mean nothing compared to actual usage. For all we know, the people buying new computers are just replacing their 4+ year old ones, and the tablet market has reached critical mass such that few people are interested in buying a new tablet because everyone and their mother already has one.

 

If we take a look at this page http://gs.statcounter.com/os-market-share#monthly-201709-201709-bar : while Windows has a large share, Android + iOS easily beat it.

 

EDIT: Looking at the US usage share, Android + iOS is almost double that of Windows

You talking about this?

[attached screenshot: worldwide OS market share chart]

 

I never said that Windows had the largest market share.  However, it is still very large; definitely large enough to be considered a significant percentage.

 

The US usage share.

[attached screenshot: US OS usage share chart]

 

Also

1 hour ago, Cinnabar Sonar said:

While smartphones are still quite popular, tablet sales have been declining since 2015.

[attached screenshot: worldwide device market share chart]

 

US

[attached screenshot: US device market share chart]

 

As you can see, tablets have less than 10% market share both worldwide and in America.

Also, I'm not surprised that smart PHONE operating systems have a larger market share.  Phones are kind of a necessity nowadays.

1 hour ago, M.Yurizaki said:

And until a person actually encounters a use case their phone or tablet can't perform, their PC will remain sitting there unused.

According to the statistics that you showed me, that happens a lot.

1 hour ago, M.Yurizaki said:

I doubt they would have, even if AMD was competitive. They would've done so only if what AMD offered had a clear advantage in performance alone. If Ryzen 7 had been, say, only six cores, that wouldn't really have been enough for Intel to go six-core as long as their quad cores could keep up. AMD touted for a long time that Bulldozer was 8-core; in some sense I'm inclined to agree, because most of what a computer does is integer-based. But it performed so badly that Intel didn't see a need to play a numbers game with AMD. And even if AMD's IPC had more or less kept up with Intel's, unless they were actually releasing more cores to begin with, I wouldn't have any reason to believe Intel would budge all that much in getting more cores out sooner.

I suppose there isn't any way for either of us to know for sure.

1 hour ago, M.Yurizaki said:

Plus, really, there hasn't been much of a decent use case for multiple cores until recently, what with games using threads more effectively and people streaming their gaming sessions.

True, the 7700K is plenty for just gaming, but when you start streaming as well, it can start to stumble.


22 hours ago, Nuluvius said:

Indeed, Quantum Tunneling is the blocker. There are two context-specific articles on this here and here.

This may be true this time. However, I lived through the switch from µm to sub-µm transistor sizes in the late 1980s and early 1990s, and we had exactly the same kinds of articles, about similar types of "new materials" that would be needed to push down into the nm scale. So while the physical limitations may actually be real this time, I am a bit skeptical, because I read the exact same kind of "it can't be done because of Quantum Tunneling" claims about anything below the µm scale. Then, surprise, there was a breakthrough, and suddenly nobody was talking about how hard nm-scale fabs are anymore, just how expensive they are. So, a great big "well, let's wait and see if it's true this time" from someone who has lived through the "impossible" scale switch twice in his lifetime.

 

Unfortunately, I can't find the articles I read back then because none of them were online (in 1980-85 the world didn't have a single "online") and they were all paper, which I later recycled. Sorry.


10 hours ago, Cinnabar Sonar said:

You talking about this?

[attached screenshot: worldwide OS market share chart]

 

I never said that Windows had the largest market share.  However, it is still very large; definitely large enough to be considered a significant percentage.

 

The US usage share.

[attached screenshot: US OS usage share chart]

 

Also

[attached screenshot: worldwide device market share chart]

 

US

[attached screenshot: US device market share chart]

 

As you can see, tablets have less than 10% market share both worldwide and in America.

..

There are a couple of interesting things about this chart. First, it is missing laptops completely, which is what I see replacing desktops more than phones (maybe they are grouping laptops under mobile?). Second, how would they measure someone like me, who currently has a desktop, laptop, and tablet all doing different tasks at the same time, with a phone in my pocket in case I get a message on one of the apps that is phone-specific?

 

Basically, I don't believe there is a "one system" that works for anyone anymore. I have a phone for when I am moving around, a tablet for when I am not at my desk, a laptop for when I am traveling, away from my main system, or in a conference call, a desktop for my heavy local computing tasks, and my cloud accounts to tie it all together (with file storage, processing nodes, and databases available as I need them). I may be an early adopter of this style, but I can see a variant of it happening, with the "desktop" being a home theater PC or VR PC for a great many people, as opposed to the development machine I have.


My guess is that by the mid-2020s, India or China will develop a super-low-cost CPU that competes with both AMD's and Intel's top-of-the-line CPUs at a tenth of the cost, resulting in a mass migration away from both Western companies.


I personally see the trend going away from x86 in the next 10 years. Maybe more ARM and possibly POWER (doubtful) being used in the mainstream markets. Possibly something new even, especially considering the rapid progression into making radical technologies like quantum computing a reality. I can't comment on how well these new technologies would integrate with software that the mainstream consumers use, but who knows...

 

With Windows 10 (kind of) supporting ARM, that's the first timid step into that journey of divorcing mainstream computing from x86. 

 

Both Intel and AMD will have to adapt to these trends. They have no choice if they want to stay relevant.

 

 

New Build (The Compromise): CPU - i7 9700K @ 5.1GHz | Mobo - ASRock Z390 Taichi | RAM - 16GB G.SKILL TridentZ RGB 3200CL14 @ 3466 14-14-14-30 1T | GPU - ASUS Strix GTX 1080 Ti | Cooler - Corsair H100i Pro | SSDs - 500GB 960 EVO + 500GB 850 EVO + 1TB MX300 | Case - Cooler Master H500 | PSU - EVGA 850 P2 | Monitor - LG 32GK850G-B 144Hz 1440p | OS - Windows 10 Pro.

Peripherals - Corsair K70 Lux RGB | Corsair Scimitar RGB | Audio-technica ATH M50X + Antlion Modmic 5 |

CPU/GPU history: Athlon 6000+/HD4850 > i7 2600k/GTX 580, R9 390, R9 Fury > i7 7700K/R9 Fury, 1080TI > Ryzen 1700/1080TI > i7 9700K/1080TI.

Other tech: Surface Pro 4 (i5/128GB), Lenovo Ideapad Y510P w/ Kali, OnePlus 6T (8G/128G), PS4 Slim.


4 hours ago, AncientNerd said:

This may be true this time. However, I lived through the switch from µm to sub-µm transistor sizes in the late 1980s and early 1990s, and we had exactly the same kinds of articles, about similar types of "new materials" that would be needed to push down into the nm scale. So while the physical limitations may actually be real this time, I am a bit skeptical, because I read the exact same kind of "it can't be done because of Quantum Tunneling" claims about anything below the µm scale. Then, surprise, there was a breakthrough, and suddenly nobody was talking about how hard nm-scale fabs are anymore, just how expensive they are. So, a great big "well, let's wait and see if it's true this time" from someone who has lived through the "impossible" scale switch twice in his lifetime.

I was there for that as well. In my opinion, however, this time a paradigm shift is required.

The single biggest problem in communication is the illusion that it has taken place.


1 minute ago, Nuluvius said:

I was there for that as well. In my opinion, however, this time a paradigm shift is required.

Sure, it might be, just because we are much closer to the actual atomic scale this time. However, I keep hearing all the same arguments and reading papers that say the same things that were said when I was getting my first degree, and I can't help but think I have heard these arguments before and roll my eyes. Remember, there was a paradigm shift last time; it just wasn't a materials paradigm shift but a process one: changing the actual etching process was what moved things along. So maybe it will take a materials shift this time, or maybe it will be yet another process/manufacturing shift, or maybe a structural change in the actual design of the chips so they are laid out differently. There are lots of options for what kind of paradigm shift it could be, and not all of them involve a hard switch away from the existing fabs and manufacturing processes.


I think a lot of computers with processors like 1st or 2nd gen Intel chips will no longer be usable for gaming and other demanding tasks in 5-8 years; you would need at least a 4th or 5th generation processor. However, we do not know how fast technology will advance. I hope my computer lasts a little bit longer. The future is great and bad at the same time for some people...


9 hours ago, AncientNerd said:

There are a couple of interesting things about this chart. First, it is missing laptops completely, which is what I see replacing desktops more than phones (maybe they are grouping laptops under mobile?).

That is a huge oversight; that information is quite important.  I think that laptops and desktops are grouped together.

 

From the site's FAQ page:

Quote

MOBILE: How do you define a mobile device?

We define a mobile device as a pocket-sized computing device, typically having a display screen with touch input or a miniature keyboard.

I wouldn't consider laptops "pocket-sized".

9 hours ago, AncientNerd said:

Second, how would they measure someone like me, who currently has a desktop, laptop, and tablet all doing different tasks at the same time, with a phone in my pocket in case I get a message on one of the apps that is phone-specific?

Not sure.  I think there is a lot of overlap between those who own a computer and a phone.  These graphs lack too much information to honestly be helpful.

 

How many users have a phone, and not a desktop/laptop?

Frequency of use?

The demographics of those that own these devices?  (I don't expect a farmer to have a computer, but I do expect a programmer to.)

Hell, even where these devices are being used would be helpful.

 

Without this information, it's hard to say what is replacing what.

9 hours ago, AncientNerd said:

Basically, I don't believe there is a "one system" that works for anyone anymore.

Agreed, there are pros and cons to each device.

9 hours ago, AncientNerd said:

I have a phone for when I am moving around, a tablet for when I am not at my desk, a laptop for when I am traveling, away from my main system, or in a conference call, a desktop for my heavy local computing tasks, and my cloud accounts to tie it all together (with file storage, processing nodes, and databases available as I need them).

For me, the desktop is the "main base," so to speak.  While I promised my laptop to a family member, I could see myself using one for less intensive tasks when away or while the desktop is rendering (or if I'm feeling lazy and want a computer while lying in bed).  My phone is, well, a phone; self-explanatory.  I used to have a tablet, before it got stolen; I only really used it for content consumption.

9 hours ago, AncientNerd said:

I may be an early adopter of this style, but I can see a variant of it happening, with the "desktop" being a home theater PC or VR PC for a great many people, as opposed to the development machine I have.

Your style is more cost-prohibitive than others.

Although, I can definitely see a desktop taking the role of "main base" while others in a family have their own laptops.  Maybe even a "light server" or NAS.


3 minutes ago, Cinnabar Sonar said:

Your style is more cost-prohibitive than others.

Although, I can definitely see a desktop taking the role of "main base" while others in a family have their own laptops.  Maybe even a "light server" or NAS.

Maybe, but it seems to be what I am seeing in a fair number of the people I interact with in the "real world": my kids' friends (mid-20-something professionals), people I know from work, and family friends. It may just be that I interact with a tech-savvy/rich bunch, but the price of laptops/desktops is down from what just a desktop cost a few years ago (well... twenty; I guess my definition of a few may be skewed). Where you could once expect to spend $3K on a basic desktop, now you could get a laptop, desktop, phone, and tablet for ~$3K if you don't go crazy with any single element of the set. Say a $1K laptop, $800 desktop (or equivalent home PC of some sort), $800 phone, and $400 tablet, and you are at about what the equivalent desktop would have run you in the mid-to-late 1990s when the explosion of desktops happened.

 

So maybe it is cost-prohibitive, or maybe it just needs the right set of applications to make it viable.


I see ARM-based processors made by multiple companies competing to replace x86-64, at least in laptops. Windows 10 (the full version) now supports ARM, and we can expect to start seeing the first ARM-based laptops in Q2 2018. If you look at the performance of Apple's newest 64-bit A11 Bionic ARM chip, it's beating Intel's i5-7360U mobile chip while using less power.

 

[attached image: A11 Bionic vs Intel i5 benchmark chart]

