
Is Apple's Betrayal the END of Intel?

Intel has infinity thousand server contracts, not to mention their OEM orders. Oh, and did I mention that lots of huge companies out there run 100% Intel on their tens of thousands of PC laptops?

 

Intel will be just fine. The only thing I would be even slightly worried about is Intel deciding that Ryzen has made it not worth their while to keep sinking resources into the enthusiast market. Even that, I think, is very unlikely, because Intel would still be producing those CPUs for prebuilts. Why not sell them as standalone products on the side?

Aerocool DS are the best fans you've never tried.


LTT dropping my man DHH in a video. Nice!


My personal opinion (see also, opinion):

 

Intel has been eking out its 14nm process for one reason: it's still profitable. Otherwise they wouldn't bother.

 

I see Intel allowing AMD to "catch up" after the FX lineup's failure, in an effort to avoid becoming a monopoly in the CPU/server space that could be targeted for dismantling if they ever got too big:

https://www.ftc.gov/tips-advice/competition-guidance/guide-antitrust-laws/single-firm-conduct/monopolization-defined

 

And soon enough we will be blown away, as will AMD, like before.

 

Or they fucked up by not advancing (but I just don't see that happening at a tech company).

 

My 2 copper coins.

Workstation Laptop: Dell Precision 7540, Xeon E-2276M, 32gb DDR4, Quadro T2000 GPU, 4k display

Wife's Rig: ASRock B550m Riptide, Ryzen 5 5600X, Sapphire Nitro+ RX 6700 XT, 16gb (2x8) 3600mhz V-Color Skywalker RAM, ARESGAME AGS 850w PSU, 1tb WD Black SN750, 500gb Crucial m.2, DIYPC MA01-G case

My Rig: ASRock B450m Pro4, Ryzen 5 3600, ARESGAME River 5 CPU cooler, EVGA RTX 2060 KO, 16gb (2x8) 3600mhz TeamGroup T-Force RAM, ARESGAME AGV750w PSU, 1tb WD Black SN750 NVMe Win 10 boot drive, 3tb Hitachi 7200 RPM HDD, Fractal Design Focus G Mini custom painted.  

NVIDIA GeForce RTX 2060 video card benchmark result - AMD Ryzen 5 3600,ASRock B450M Pro4 (3dmark.com)

Daughter 1 Rig: ASRock B450 Pro4, Ryzen 7 1700 @ 4.2ghz all core 1.4vCore, AMD R9 Fury X w/ Swiftech KOMODO waterblock, Custom Loop 2x240mm + 1x120mm radiators in push/pull 16gb (2x8) Patriot Viper CL14 2666mhz RAM, Corsair HX850 PSU, 250gb Samsung 960 EVO NVMe Win 10 boot drive, 500gb Samsung 840 EVO SSD, 512GB TeamGroup MP30 M.2 SATA III SSD, SuperTalent 512gb SATA III SSD, CoolerMaster HAF XM Case.

https://www.3dmark.com/3dm/37004594?

Daughter 2 Rig: ASUS B350-PRIME ATX, Ryzen 7 1700, Sapphire Nitro+ R9 Fury Tri-X, 16gb (2x8) 3200mhz V-Color Skywalker, ANTEC Earthwatts 750w PSU, MasterLiquid Lite 120 AIO cooler in Push/Pull config as rear exhaust, 250gb Samsung 850 Evo SSD, Patriot Burst 240gb SSD, Cougar MX330-X Case

 


I've always wondered why every workplace I've been in uses Intel... do these businesses get big contracts where they receive discounts for buying Intel-based workstations in bulk?


7 hours ago, aisle9 said:

Intel has infinity thousand server contracts, not to mention their OEM orders. Oh, and did I mention that lots of huge companies out there run 100% Intel on their tens of thousands of PC laptops?

 

Intel will be just fine. The only thing I would be even slightly worried about is Intel deciding that Ryzen has made it not worth their while to keep sinking resources into the enthusiast market. Even that, I think, is very unlikely, because Intel would still be producing those CPUs for prebuilts. Why not sell them as standalone products on the side?

 

Not to mention that they're likely developing specialized co-processors and/or integrating into existing Xeons the ability to efficiently run arm64 and RISC-V code:

 

https://www.nextplatform.com/2018/08/30/intels-exascale-dataflow-engine-drops-x86-and-von-neuman/


I am not really surprised by this move. With Apple's anti-self-repair policy, bringing everything under one roof lets them claim that only they can repair the device. I was never much of a fan of the Apple brand because of their practices, and this does not help in that regard. They want, and have always wanted, everything fully proprietary. The only reason Apple even adopted USB Type-C is public and investor pressure to do so.

COMMUNITY STANDARDS   |   TECH NEWS POSTING GUIDELINES   |   FORUM STAFF

LTT Folding Users Tips, Tricks and FAQ   |   F@H & BOINC Badge Request   |   F@H Contribution    My Rig   |   Project Steamroller

I am a Moderator, but I am fallible. Discuss or debate with me as you will but please do not argue with me as that will get us nowhere.

 


 

Character is like a Tree and Reputation like its Shadow. The Shadow is what we think of it; The Tree is the Real thing.  ~ Abraham Lincoln

Reputation is a Lifetime to create but seconds to destroy.

You have enemies? Good. That means you've stood up for something, sometime in your life.  ~ Winston Churchill

Docendo discimus - "to teach is to learn"

 

 CHRISTIAN MEMBER 

 

 
 
 
 
 
 

 


This LTT video is not well done. First, ARM Macs do not mean App Store-only apps, as clearly stated during this long interview. Federighi and Joswiak give a few examples of how they see iOS and macOS as different platforms, and they insist that merging them makes no sense (around the 20-minute mark): a Mac has overlapping windows (by default, not as an exception), runs apps from outside the store, has ways to disable integrity features, and has a native terminal app (maybe others; watch the interview).

 

 

 

On the TDP calculation: it makes a lot of sense, but I also think Apple will try to fit as many things as possible inside their chips. It will not be just a CPU, memory controller, and GPU, but also the T2 functions that handle encryption. All of these will have to share that TDP.

On microcode: microcode has been, to the best of my knowledge, standard practice in microarchitectures for decades. Maybe the microcode layer on an ARM CPU is thinner than on Intel's. Even Intel says its CPUs are RISC-like internally and that the x86_64 ISA is just an abstraction implemented in microcode. Wendell on Level1Linux did an interview with GKH where they also talked about microcode (if I remember correctly); it gives some idea of how complex this abstraction layer is.

On missing equivalent instructions requiring workarounds: also shown in the video below. If I am the developer, I will simply recompile for ARM and x86. A single missing instruction will not matter to an application developer (unless I'm the compiler developer); what will matter are things like memory access patterns and cache sizes: which is faster, a 1MB statically allocated array or a different data structure on the heap? But that level of optimization does not even carry from one generation of the same architecture to the next.
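The static-vs-heap point can be sketched with a toy C fragment (a hypothetical illustration, not from the video; the function names are made up): the same reduction over a 1MB buffer, once statically allocated and once on the heap. Which variant is faster depends on the cache hierarchy and the allocator's placement decisions on the target chip, which is exactly the kind of knowledge that does not transfer between Intel and Apple silicon, or even between CPU generations.

```c
#include <stddef.h>
#include <stdlib.h>

#define BUF_SIZE (1024 * 1024)  /* the 1MB buffer from the example above */

/* Statically allocated: lives in .bss, zero-initialized, no malloc call. */
static unsigned char static_buf[BUF_SIZE];

unsigned long sum_static(void) {
    unsigned long s = 0;
    for (size_t i = 0; i < BUF_SIZE; i++)
        s += static_buf[i];
    return s;
}

/* Heap allocated: same computation, but the data's placement (and thus
   cache/TLB behavior) is up to the allocator and the OS. */
unsigned long sum_heap(void) {
    unsigned char *buf = calloc(BUF_SIZE, 1);
    if (buf == NULL)
        return 0;
    unsigned long s = 0;
    for (size_t i = 0; i < BUF_SIZE; i++)
        s += buf[i];
    free(buf);
    return s;
}
```

Both functions compute the same result; only profiling on the actual target (x86_64 vs. arm64) tells you which layout wins, which is why this kind of tuning has to be redone per microarchitecture.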

Jason Turner did an interesting video where he explains what to expect from Rosetta 2 with a very obscure example.

 

Aside from mobile chips, it will be interesting to see what kind of beast could be put inside a Mac Pro... I do not know how Apple could afford to build a CPU for such a low-volume product. Maybe current Mac Pro pricing is so high that this is not an issue?

Love you @GabenJr😇


4 hours ago, SansVarnic said:

They want and have always wanted full proprietary

Really? Is it always proprietary? What about the fact that Apple supported open web standards before some other companies, like Microsoft?

  • In 2005, Apple open sourced Safari's rendering engine (WebKit) and its Javascript engine (Nitro). In turn, the likes of Symbian, RIM, and Android used WebKit. Even Chrome used WebKit until it was forked by Google to become Chromium. Microsoft on the other hand for the longest time used its proprietary Trident and EdgeHTML rendering engine for IE and Edge.
  • Between Internet Explorer and Safari, the latter was the first to adopt HTML5 standards.

There is more that meets the eye
I see the soul that is inside

 

 


43 minutes ago, captain_to_fire said:

Really? Is it always proprietary? Notwithstanding the fact that Apple supported open web standards before some other companies like Microsoft?

  • In 2005, Apple open sourced Safari's rendering engine (WebKit) and its Javascript engine (Nitro). In turn, the likes of Symbian, RIM, and Android used WebKit. Even Chrome used WebKit until it was forked by Google to become Chromium. Microsoft on the other hand for the longest time used its proprietary Trident and EdgeHTML rendering engine for IE and Edge.
  • Between Internet Explorer and Safari, the latter was the first to adopt HTML5 standards.

It was a ploy. But in the end it worked against them. Similar to what happened when they partnered with Intel to develop Thunderbolt. Why?

Go back further to the '80s and '90s: Apple was very, very staunch about not opening its software up to anyone, the polar opposite of Microsoft, which is why Microsoft and IBM-compatible PCs basically ruled the market, and why Microsoft was sued for monopolizing the market during the Clinton and Bush administrations. Apple pretty much had to cooperate in order to give Microsoft a competitor. Linux was not going to be that competitor, since it was already open and freely available and mostly used by and for servers.

Today it is not about the software being open source; it's about the brand name, their hardware products, and their approach to right to repair. Apple is trying very hard to keep all repair under its control. This is not and has not been a secret; in fact, Apple has been quite vocal about it.

Now, my disclaimer: I am not a right-to-repair advocate, but I do agree that we should be able to fix our own stuff if we want to.

 

As a specific example, look at how many times Apple has changed its connectors instead of adopting already-developed, tried-and-true ones. Do you remember the meme video that came out about how Apple kept changing its connectors? It is satire, but funny. To quote myself:

5 hours ago, SansVarnic said:

The only reason Apple even adopted the USB Type C is because of public and investor pressure to do so.

 



12 hours ago, captain_to_fire said:

Really? Is it always proprietary? Notwithstanding the fact that Apple supported open web standards before some other companies like Microsoft?

  • In 2005, Apple open sourced Safari's rendering engine (WebKit) and its Javascript engine (Nitro). In turn, the likes of Symbian, RIM, and Android used WebKit. Even Chrome used WebKit until it was forked by Google to become Chromium. Microsoft on the other hand for the longest time used its proprietary Trident and EdgeHTML rendering engine for IE and Edge.
  • Between Internet Explorer and Safari, the latter was the first to adopt HTML5 standards.

It is not. Apple has done open source for a very long time, without making much noise about it. Parts of macOS and iOS are open source and available (even when they have no legal obligation to release them) here: https://developer.apple.com/opensource/

Besides WebKit, there is LLVM/Clang, which is huge.

Aside from this, their hardware is proprietary. From a repair point of view, the most common PCB failures are (by far) connectors, then power delivery/controllers. Fried CPUs are not common, so using an in-house chip will make repair more difficult... to a point. Is it really that easy to buy mobile Intel chips from a reputable source?

The big issues are the lack of schematics and sourcing replacement parts.

This switch just makes sense (which is why this move is not surprising): Apple's chips are able to reach Intel performance, and Apple is making SoCs, where it can integrate the CPU, GPU, memory controller, T2 storage functions, display controllers, and so on, reducing PCB size and cost while increasing power efficiency and performance. It costs a lot of money up front. To think that someone at Apple thought "let's stop using Intel to make repairs even harder" is just... I don't know.


14 hours ago, fulminemizzega said:

It is not. Apple has done open source for a very long time, without making much noise about it. Parts of macOS and iOS are open source and available (even when they have no legal obligation to release them) here: https://developer.apple.com/opensource/

Besides WebKit, there is LLVM/Clang, which is huge.

Aside from this, their hardware is proprietary. From a repair point of view, the most common PCB failures are (by far) connectors, then power delivery/controllers. Fried CPUs are not common, so using an in-house chip will make repair more difficult... to a point. Is it really that easy to buy mobile Intel chips from a reputable source?

The big issues are the lack of schematics and sourcing replacement parts.

This switch just makes sense (which is why this move is not surprising): Apple's chips are able to reach Intel performance, and Apple is making SoCs, where it can integrate the CPU, GPU, memory controller, T2 storage functions, display controllers, and so on, reducing PCB size and cost while increasing power efficiency and performance. It costs a lot of money up front. To think that someone at Apple thought "let's stop using Intel to make repairs even harder" is just... I don't know.

Let me finish your post for you: ”...tin foil hat stupid”


  • 2 weeks later...
On 6/25/2020 at 4:40 PM, TheLostTree said:

Do you think that thunderbolt 3 will also become proprietary to either apple or intel?

Since Thunderbolt 3 is available only on Intel devices, the new Apple silicon Macs probably won't use Thunderbolt and may instead use USB 4, which aims to mimic Thunderbolt and be similar.

