
Apple promises to support Thunderbolt on its new ARM Macs

Mario5

Apple is moving away from Intel's processors, but it still wants to keep Intel's Thunderbolt USB-C connectivity standard on its new Apple silicon computers.

 

 

Quote

“Over a decade ago, Apple partnered with Intel to design and develop Thunderbolt, and today our customers enjoy the speed and flexibility it brings to every Mac. We remain committed to the future of Thunderbolt and will support it in Macs with Apple silicon,” commented an Apple spokesperson.
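If you want to sanity-check Thunderbolt support on whatever Mac you're using, here is a minimal sketch (mine, not from the article) that shells out to macOS's built-in `system_profiler` tool; it assumes a recent macOS where the `-json` flag and the `SPThunderboltDataType` section are available and laid out as usual.

```python
# Minimal sketch: list the Thunderbolt buses macOS reports.
# Assumes a recent macOS where `system_profiler` supports -json.
import json
import subprocess

def thunderbolt_buses() -> list:
    """Return system_profiler's Thunderbolt section as a parsed list."""
    result = subprocess.run(
        ["system_profiler", "-json", "SPThunderboltDataType"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout).get("SPThunderboltDataType", [])

if __name__ == "__main__":
    buses = thunderbolt_buses()
    if not buses:
        print("No Thunderbolt controllers reported.")
    for bus in buses:
        # Each entry describes one Thunderbolt bus and any attached devices.
        print(bus.get("_name", "unknown bus"))
```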

 

My thoughts

 So is it going to come to the iPad Pro too?

**this is my first time writing a tech news article**

Sources

https://www.theverge.com/circuitbreaker/2020/7/8/21317980/apple-silicon-intel-thunderbolt-arm-macs-support-usb-c


I wouldn't say it's a surprise, but it's nice to see it stated.


If the rumors are true, Apple is also going to use their own in-house GPUs and would no longer rely on AMD Radeon for their high end Macs. Basically TB4 will only serve as a high speed data transfer I/O for Apple Silicon Macs unless they make their own discrete GPU. https://www.pcgamer.com/apple-ditch-amd-gpus-and-intel-cpus/


1 minute ago, captain_to_fire said:

If the rumors are true, Apple is also going to use their own in-house GPUs and would no longer rely on AMD Radeon for their high end Macs. Basically TB4 will only serve as a high speed data transfer I/O for Apple Silicon Macs unless they make their own discrete GPU. https://www.pcgamer.com/apple-ditch-amd-gpus-and-intel-cpus/

Why not do what they did with the Intel chips? Use the low-power iGPU normally and switch to the dedicated GPU for heavy tasks?


2 hours ago, captain_to_fire said:

If the rumors are true, Apple is also going to use their own in-house GPUs and would no longer rely on AMD Radeon for their high end Macs. Basically TB4 will only serve as a high speed data transfer I/O for Apple Silicon Macs unless they make their own discrete GPU. https://www.pcgamer.com/apple-ditch-amd-gpus-and-intel-cpus/

I expect the price-to-performance of future Macs to be even more laughable: a $40,000 Mac with 5 TFLOPS of GPU performance.


The good thing is that Apple uses Thunderbolt on their computers.

The bad thing is that they only use Thunderbolt.


But imagine running a Radeon 5950XT on an iPhone.


4 hours ago, captain_to_fire said:

If the rumors are true, Apple is also going to use their own in-house GPUs and would no longer rely on AMD Radeon for their high end Macs. Basically TB4 will only serve as a high speed data transfer I/O for Apple Silicon Macs unless they make their own discrete GPU. https://www.pcgamer.com/apple-ditch-amd-gpus-and-intel-cpus/

Which is pretty damn laughable, honestly. The GPU licensing pathway is as complex as, or more complex than, the x86 CPU situation, so good luck ever actually making a GPU competitive with AMD/Nvidia or, heck, even Intel's iGPUs.

 

There is a reason mobile games suck so much, and why the Switch, based on an ancient platform and using a tiny fraction of Maxwell-based compute, is still light-years ahead of flagship phones in performance.

 

**shrugs**

 

On topic anyway: supporting TB is good, and it indicates we might still, hopefully, see the death of the shitstorm that is Lightning (which, yes, was a big improvement at the time, but there are still tons of issues with that connector and its durability, particularly corrosion when people leave cables plugged into the wall while disconnected from the device).


59 minutes ago, Curufinwe_wins said:

but there are still tons of issues with that connector and durability

This is very subjective, but I think the Lightning connector feels sturdier and less likely to get pulled out than USB-C, imo.


3 hours ago, spartaman64 said:

I expect the price-to-performance of future Macs to be even more laughable: a $40,000 Mac with 5 TFLOPS of GPU performance.

That makes no sense in any context whatsoever.

1 hour ago, Curufinwe_wins said:

Which is pretty damn laughable, honestly. The GPU licensing pathway is as complex as, or more complex than, the x86 CPU situation, so good luck ever actually making a GPU competitive with AMD/Nvidia or, heck, even Intel's iGPUs.

 

There is a reason mobile games suck so much, and why the Switch, based on an ancient platform and using a tiny fraction of Maxwell-based compute, is still light-years ahead of flagship phones in performance.

 

**shrugs**

 

On topic anyway: supporting TB is good, and it indicates we might still, hopefully, see the death of the shitstorm that is Lightning (which, yes, was a big improvement at the time, but there are still tons of issues with that connector and its durability, particularly corrosion when people leave cables plugged into the wall while disconnected from the device).

Careful what you say. The same people who said ARM Macs would never be competitive with x86 are eating their words right now.

 

Regarding the licensing costs/nightmare: a similar thing happened when they announced they were shifting their GPUs in-house, away from Imagination. And look what happened? Absolutely nothing, and Apple is making amazing strides in mobile GPUs too (1000x performance gains according to their own keynote).

 

The iPad Pro from 2018 has the same graphics caliber as the Xbox One S, so it would be stupid to think Apple will never be able to reach GPU performance competitive with Nvidia or AMD (and the Maxwell GPU from your example is more powerful mostly because it has a much higher power and thermal budget than a phone can ever hope to have).

 

For once, we're ending the duopoly Intel and AMD enjoyed on the CPU side. Let the same thing happen on the GPU side. It's good for everyone.


2 minutes ago, RedRound2 said:

That makes no sense in any context whatsoever.

 

It's an exaggeration, but I'm just saying they are probably not going to pass the cost savings on to the consumer, and in fact I predict they are going to make it even worse, since they can slap on the custom-hardware excuse.


In a comment related to the topic: I don't even know why this was even a question, as Apple had plenty of Mac minis with the A12Z driving a Pro Display XDR during the keynote. Plus, if TB3 had issues on non-Intel platforms, USB4 (essentially the same thing) would've been the replacement.


3 minutes ago, spartaman64 said:

It's an exaggeration, but I'm just saying they are probably not going to pass the cost savings on to the consumer, and in fact I predict they are going to make it even worse, since they can slap on the custom-hardware excuse.

I disagree. They'll have powerful chips in lower-end Macs at more affordable price points like $799-999.

Example? The fact that you can buy an iPhone SE with an A13 chipset for $399

 

For higher-end Macs, they'll keep charging the same while having even more of a margin.


3 minutes ago, RedRound2 said:

I disagree. They'll have powerful chips in lower-end Macs at more affordable price points like $799-999.

Example? The fact that you can buy an iPhone SE with an A13 chipset for $399

 

For higher-end Macs, they'll keep charging the same while having even more of a margin.

I don't consider $999 low-end and affordable. Also, that's the thing for the higher end: I don't see how they are going to make an ARM chip that rivals a Threadripper 5000 in a few years, and certainly not a GPU that rivals, I don't know, a 5080 Ti.


1 minute ago, spartaman64 said:

I don't consider $999 low-end and affordable. Also, that's the thing: I don't see how they are going to make an ARM chip that rivals Threadripper in a few years.

Well, Apple doesn't play in the lower-end market. The CPU alone isn't the cost of the device; there's build quality, software, and the rest of the hardware. The shitty low-end Intel processors surely cost a lot more than slapping an ARM chip into their lowest-end MacBook Air. And from the looks and promises, their own chips are definitely going to be faster.

 

They ran full-scale professional software on an existing iPad chip. They ran Maya and Tomb Raider through a translation layer. Basically, they weren't even trying.

 

And they promised a two-year transition time. Do you think they don't have a full-scale lineup planned far into the future? They're confident in what they have. And Apple's transition to ARM chips will also force developers to make their apps better optimized for multi-core, heterogeneous compute. So, honestly, I feel it's just a matter of time.


4 hours ago, spartaman64 said:

its an exaggeration but im just saying they are probably not going to pass the cost savings onto the consumer and in fact i predict they are going to make it even worse since they can slap on the custom hardware excuse

I agree. Apple is going to say it's better because it's their hardware, so it would probably carry the usual Apple tax, and people assuming Apple is going to pass on any savings are extremely biased towards Apple.

As for Thunderbolt 4: it's nice for connectivity with devices you might already own, but I don't see the point in needing Thunderbolt if there won't be any AMD GPU support.


2 minutes ago, RedRound2 said:

Well, Apple doesn't play in the lower-end market. The CPU alone isn't the cost of the device; there's build quality, software, and the rest of the hardware. The shitty low-end Intel processors surely cost a lot more than slapping an ARM chip into their lowest-end MacBook Air. And from the looks and promises, their own chips are definitely going to be faster.

 

They ran full-scale professional software on an existing iPad chip. They ran Maya and Tomb Raider through a translation layer. Basically, they weren't even trying.

 

And they promised a two-year transition time. Do you think they don't have a full-scale lineup planned far into the future? They're confident in what they have. And Apple's transition to ARM chips will also force developers to make their apps better optimized for multi-core, heterogeneous compute. So, honestly, I feel it's just a matter of time.

Yeah, but I would not pay 999 dollars for a desktop computer with a mobile CPU, even if it's made out of titanium and sculpted by Michelangelo, because I buy a computer to do work or entertain myself on it; I don't just sit and stare at it. If they had said a five-year transition time, then I might have thought it could work, but two years is too little time to develop a high-end replacement.


49 minutes ago, spartaman64 said:

two years is too little time to develop a high-end replacement

They have been developing this for quite some time, too. It's not like they started working on it this year.

 

I imagine Apple, being Apple, thought ahead a lot and made a couple of prototypes in 2013 or something, but more as a side project. And then there was Skylake, and they started going full speed ahead. Look at their iPhones and iPads: you can freaking run Final Cut Pro (they did in the keynote) and other stuff on an already existing chip made for a different platform. Think about what might happen with a custom chip, with:

 

  1. better cooling
  2. Custom for more workloads
  3. 5nm (no doubt)
  4. more power

Just imagine. I bet you the A12Z could replace most people's MacBook Air, MacBook (if there were one), or low-end MacBook Pro, without much difference.

Of course, I would personally wait for the 2nd gen, as with all products that are either completely new to the market or redesigned, or with major OS updates, because the first time around there will be bugs, no matter what. And that might be what you are trying to say. But you just have to wait one more year for an improvement in speed and reliability.

 

57 minutes ago, spartaman64 said:

Yeah, but I would not pay 999 dollars for a desktop computer with a mobile CPU

 

Also, Apple might have two CPU lines, one for MacBooks and one for desktop Macs, so you aren't running a mobile chip in a Mac Pro. Even if they don't do that, they will at least underclock the MacBooks.


Correct me if I'm wrong: isn't Thunderbolt an Intel standard? I wonder how much money Apple is going to throw at Intel to license it?


4 hours ago, captain_to_fire said:

This is very subjective, but I think the Lightning connector feels sturdier and less likely to get pulled out than USB-C, imo.

The Lightning connector itself is only problematic in that its exposed contacts can suffer rapid corrosion and degradation in moist air if left plugged in (moist meaning just sitting in a car or something, not particularly damp, just not bone dry). The cable itself is intentionally fragile, which is a related but separate issue.


4 hours ago, captain_to_fire said:

This is very subjective, but I think the Lightning connector feels sturdier and less likely to get pulled out than USB-C, imo.

Looking at a Lightning and a USB-C connector morphologically, at least, USB-C is basically a sort of miniaturized, inside-out Lightning connector that, as a result, tends to collect garbage in its crannies, unlike Lightning. As for rumors of Apple doing its own GPU, I have not seen these, only CPU. Apple has been doing GPUs since the iPhone X, but I can't see them being 10 times as efficient as Intel at this, and Intel is only just starting to even talk about GPUs large enough to be given consideration beyond minimal internal stuff.


4 hours ago, RedRound2 said:

 

Careful what you say. The same people who said ARM Macs would never be competitive with x86 are eating their words right now.

 

Regarding the licensing costs/nightmare: a similar thing happened when they announced they were shifting their GPUs in-house, away from Imagination. And look what happened? Absolutely nothing, and Apple is making amazing strides in mobile GPUs too (1000x performance gains according to their own keynote).

 

The iPad Pro from 2018 has the same graphics caliber as the Xbox One S, so it would be stupid to think Apple will never be able to reach GPU performance competitive with Nvidia or AMD (and the Maxwell GPU from your example is more powerful mostly because it has a much higher power and thermal budget than a phone can ever hope to have).

 

For once, we're ending the duopoly Intel and AMD enjoyed on the CPU side. Let the same thing happen on the GPU side. It's good for everyone.

 

Actually, the licensing agreements continued with Imagination; that's how they were able to keep going relatively undisturbed. There were tons of lawsuits and a huge stink over it. And the complexity and capabilities of those GPUs are still massively far behind the non-mobile space. Apple made a big deal out of tile-based rendering for the first time at this past WWDC, and that is old tech by now.

 

The iPad Pro does not have the same graphics caliber as the Xbox One S. It can't do anything close to the same level of complexity. In low-complexity renders, it can be as fast. Apple's comment on it was a marketing stunt, nothing more. It didn't even say it was as good; it said "Xbox One class," as if that means anything, and supposedly proved it by using a mobile game with a hilariously low-complexity render. Not that low-complexity renders can't make good games, but it isn't remotely the same (and those Jaguar cores, ugh).

 

The Tegra X1 chip that powers the Switch is five years old at this point; it launched in January 2015. In the Switch, undocked power consumption is <8 W with the screen and all that. That sounds like a lot more than a phone, until you remember that it includes the Joy-Cons, a much less efficient LCD than the Samsung-built OLEDs on top-of-the-range iPhones, and a much older, lower-performance/higher-power process node; and, by the way, iPhones still use almost the same amount of power at peak in GPU loads.

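To make that perf-per-watt framing concrete, here is some back-of-the-envelope arithmetic (my own numbers, taken from commonly reported Switch specs rather than from the post): FP32 throughput estimated as 2 FLOPs per CUDA core per clock.

```python
# Illustrative only: rough FP32 throughput of the Switch's Tegra X1 GPU,
# using commonly reported clock speeds and the usual 2 FLOPs (one FMA)
# per CUDA core per cycle. Figures are approximations for discussion.
CUDA_CORES = 256             # Maxwell cores in the Tegra X1
DOCKED_CLOCK_GHZ = 0.768     # commonly reported docked GPU clock
HANDHELD_CLOCK_GHZ = 0.3072  # commonly reported handheld GPU clock
HANDHELD_SYSTEM_WATTS = 8    # the "<8 W" whole-system figure cited above

def fp32_gflops(cores: int, clock_ghz: float) -> float:
    # 2 FLOPs per core per clock; clock in GHz gives GFLOPS directly.
    return 2 * cores * clock_ghz

docked = fp32_gflops(CUDA_CORES, DOCKED_CLOCK_GHZ)      # ~393 GFLOPS
handheld = fp32_gflops(CUDA_CORES, HANDHELD_CLOCK_GHZ)  # ~157 GFLOPS

print(f"docked:   ~{docked:.0f} GFLOPS")
print(f"handheld: ~{handheld:.0f} GFLOPS, "
      f"~{handheld / HANDHELD_SYSTEM_WATTS:.0f} GFLOPS per system watt")
```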

 

Apple has a LONG way to go to catch up with the AMD/Nvidia GPUs of five years ago, let alone today or five years from today. It wouldn't be terribly surprising, though, to see them license AMD's IP for their own designs, just as Samsung announced they were doing (in conjunction with actually using those designs).

 

 

This isn't a situation where Nvidia has been sitting on their ass for almost 10 years (the way Intel has since Sandy Bridge in 2011), and making a GPU has always been much more modular and scalable than those CPU systems as well.

 

I'm not saying Apple can't eventually do it, but there will be one heck of a transition period before games of similar complexity can actually be rendered by Apple's own bottom-up design, and that's IF they get licensing from one of the players to do it. The honest truth is that Apple doesn't care about that, and clearly hasn't cared for ages; otherwise they wouldn't have been sticking with inferior GPUs from AMD and abhorrently poor desktop/laptop game support (considering their status as a luxury item) on the Mac. Clearly they had the money to get game engines and developers using Metal if they wanted to. They just honestly didn't/don't care.


18 minutes ago, Curufinwe_wins said:

It wouldn't be terribly surprising, though, to see them license AMD's IP for their own designs, just as Samsung announced they were doing (in conjunction with actually using those designs).

Honestly, that's odd to me, since AMD sold their mobile Radeon graphics (now Adreno) to Qualcomm. Like, how could it compete, at least on the mobile phone side?


6 hours ago, Drama Lama said:

The good thing is that Apple uses Thunderbolt on their computers.

The bad thing is that they only use Thunderbolt.

That's not too much of an issue, since those ports also work as regular USB-C ports. Add a port adapter or dock into the mix and you have all the I/O required.
 

Is it annoying that they no longer have USB-A ports? Kind of, yeah. But it’s not that annoying. 

1 hour ago, Donut417 said:

Correct me if I'm wrong: isn't Thunderbolt an Intel standard? I wonder how much money Apple is going to throw at Intel to license it?

Apple is a co-creator of Thunderbolt; Intel originally made it *for use* in Macs. I'm not sure how the licensing situation works, but either Apple will already implicitly have licenses or they will be able to secure them easily.

4 hours ago, RedRound2 said:

The iPad Pro from 2018 has the same graphics caliber as the Xbox One S

Woooooah there. *HIGHLY* doubt that claim. Please back it up. 
 

While I have no doubt that the 2018 iPad Pro has excellent performance, that's a stretch. And even if it were true, that means they've matched 2013 lower-mid-range desktop graphics performance, at best.

