
Apple promises to support Thunderbolt on its new ARM Macs

Mario5
5 hours ago, RedRound2 said:

The iPad Pro from 2018 has the same graphics caliber as the Xbox One S...

 

35 minutes ago, dalekphalm said:

Woooooah there. *HIGHLY* doubt that claim. Please back it up. 

From AnandTech

 

Quote

Apple’s custom GPU in the iPad Pro is the same one found in the iPhone, but with more cores available. And with the larger surface area of the iPad compared to the phone, likely a higher frequency as well. There are now seven of the A12 GPU cores, compared to just four on the iPhone, and Apple claims the GPU in the iPad Pro is equivalent to an Xbox One S, although how they came to this conclusion is difficult to say since we know so little about the underpinnings of the GPU.

 

In rough terms, the Xbox One S is roughly 1.4 TFLOPS at its peak. But for better or worse, when the PC moved to unified shaders, the industry moved to FP32 for all GPU functions. This is as opposed to the mobile world, where power is an absolute factor for everything: vertex shaders are typically 32bpc, while pixel and compute shaders can often be 16bpc. We’ve seen some movement on the PC side to use half-precision GPUs for compute, but for gaming, that’s not currently the case.

 

Overall, that makes like-for-like PC comparisons difficult. An AMD Ryzen 2700U SoC has a Vega GPU which offers 1.66 TFLOPS of FP32 performance, in theory. If run at 16-bit, that number would double, in theory. The iPad Pro would likely use half-precision for some of the GPU workload. This has been an issue for years and has made it difficult to easily compare any cross-platform benchmark against the PC.

-AnandTech
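For what it's worth, the theoretical numbers in the quote fall out of the standard peak-FLOPS formula. A back-of-the-envelope sketch (the shader counts and clocks below are the commonly cited figures for these chips, not something from the article, so treat them as assumptions):

```python
# Theoretical peak = ALUs x 2 ops/cycle (fused multiply-add) x clock.
# The shader counts and clocks below are the commonly cited figures for
# these chips -- assumptions for illustration, not measured values.

def peak_tflops(alus: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical peak throughput in TFLOPS (1e12 FLOPS)."""
    return alus * ops_per_cycle * clock_ghz / 1000.0

# Xbox One S: 768 GCN shaders at ~914 MHz -> the ~1.4 TFLOPS FP32 in the quote
xbox_one_s = peak_tflops(768, 0.914)

# Ryzen 7 2700U (Vega 10): 640 shaders at ~1.3 GHz -> the quoted 1.66 TFLOPS FP32
vega_10 = peak_tflops(640, 1.3)

# Mobile GPUs often run pixel/compute work at FP16, which doubles the
# per-cycle rate -- which is why raw FP16 and FP32 numbers aren't comparable.
vega_10_fp16 = vega_10 * 2

print(f"Xbox One S: ~{xbox_one_s:.2f} TFLOPS FP32")
print(f"Vega 10:    ~{vega_10:.2f} TFLOPS FP32 / ~{vega_10_fp16:.2f} TFLOPS FP16")
```

That doubled FP16 rate is exactly why comparing an iPad's benchmark score against an FP32-only PC number is apples to oranges.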

Quote

So is the iPad Pro an Xbox One S class of GPU? Likely it is. The Xbox One S is only slightly quicker than the original Xbox One launched in 2013, and that console would struggle to achieve 1080p in games of that vintage. The Vega iGPU in the Ryzen 7 2700U offers more theoretical FLOPS than the Xbox One S, although at a higher TDP of 15-Watts, compared to the iPad Pro. In the synthetic tests, the iPad Pro scored higher than the Vega GPU, albeit at a lower precision, but regardless, there’s little doubt that the GPU in the iPad Pro is quite powerful. Add in the efficiency and the lower TDP, and results are even stronger. On the sustained performance run, the iPad was averaging just under 8 Watts of draw for the entire device. -AnandTech

There you have it. Interesting.


1 hour ago, Orcblood said:

Honestly odd to me since AMD sold their mobile Radeon graphics (Adreno) to Qualcomm. Like how could it compete, at least on the mobile phone side?

These days mobile (AMD/NVIDIA) graphics cards are just normal graphics cards; the work on idle power consumption and modularity has already resulted in the vast majority of the required optimizations (again, note the Tegra X1 on a 20nm process powering the Switch using 2015 ARM cores and a 256 CUDA core Maxwell arrangement).

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


19 minutes ago, StDragon said:

 

From AnandTech

 

There you have it. Interesting.

At lower precision, which is literally double-plus the rate; it also isn't feature-complete/competitive. The raw FLOPS, sure. And in a low-complexity scene, maybe. That's it.



16 hours ago, spartaman64 said:

yeah but i would not pay 999 dollars for a desktop computer with mobile cpu even if its made out of titanium and sculpted by michaelangelo because i buy a computer to do work or entertain myself on it I dont just sit and stare at it. if they had said a 5 year transition time then i might have thought that it could work but 2 years is too little time to develop a high end replacement

Wha--? Why do you think that the $999 computer they'll be offering with ARM chips will be somehow 'mobile' or slower than current-gen Intel chips? What is mobile? Just because the processor was derived from an iPhone processor?

 

It would be stupid if ARM Macs weren't as powerful as Intel. But they're clearly confident, and their current production chip runs heavy apps really well. So, honestly, I'm not even sure what your argument is. It's more like you can't possibly think of a scenario where they're cooking up more powerful processors - and that's quite a dumb assumption to make, since you don't work for Apple and have no idea what they have running in their labs.

 

12 hours ago, Curufinwe_wins said:

The Tegra X1 chip that powers the Switch is 5 years old at this point. Launched in Jan of 2015. In the Switch, the undocked power consumption is <8W with screen and all that. Sounds like a lot more than a phone, except that you remember that includes Joy-Cons, a much less efficient LCD than the Samsung-built OLEDs on top-of-the-range iPhones, a much older and lower performance/higher power process node, and BTW iPhones still use almost the same amount of power at peak in GPU loads.

But the Switch doesn't have good graphics. And the Tegra X1 does need more cooling than what a thin phone or iPad can provide. What you mentioned is the peak power consumption, while sustained draw is nowhere near the <8W of the Switch.

12 hours ago, Curufinwe_wins said:

Apple has a LONG way to go to catch up with AMD/Nvidia GPUs of 5 years ago. Let alone today or 5 years from today. It wouldn't be terribly surprising, though, to see them license AMD's IP for their own designs just as Samsung announced they were doing (in conjunction with actually using those designs).

I still don't understand how you can say that definitively. You speak as if Nvidia and AMD have access to some pixie dust. Maybe they might end up licensing, at least in the beginning, but I still don't see how that's going to stop or prevent Apple from making powerful GPUs.

12 hours ago, Curufinwe_wins said:

Not saying Apple can't eventually do it, but there will be one heck of a transition period before any similarly complex games can actually be rendered by Apple's own bottom-up design, and that's IF they get licensing from one of the players to do it. The honest truth is that Apple doesn't care about that, and clearly hasn't cared about that for ages, otherwise they wouldn't have been sticking with inferior GPUs from AMD and abhorrently poor desktop/laptop game support (considering their status as a luxury item) on Mac. Clearly they had the money around to get game engines and developers using Metal if they wanted to. They just honestly didn't/don't care.

Why they use AMD GPUs isn't a secret. It's well known that Apple and Nvidia aren't really on talking terms. And I'm not sure why they would force game developers to take advantage of Metal and make a Mac game from the ground up when the existing user base is very small. They already had a gaming community on iOS, and they just capitalized on their existing customers rather than creating new ones.

 

And investing in and trying to get game developers on board the Mac platform is quite stupid. Like, what will they achieve with that?

12 hours ago, dalekphalm said:

Woooooah there. *HIGHLY* doubt that claim. Please back it up. 
 

While I have no doubt that the 2018 iPad Pro has excellent performance, that’s a stretch. And even if it were true, that means they’ve matched 2013 lower-mid-range desktop graphics performance, at best.

It does have raw performance close to an Xbox; you must've seen the AnandTech article. But platform limitations prevent any actual AAA development. And it doesn't look like that's Apple's priority.


10 minutes ago, RedRound2 said:

And investing in and trying to get game developers on board the Mac platform is quite stupid. Like, what will they achieve with that?

 

If Apple did what MS or Sony do and just paid off a lot of developers for exclusives, they would be like the PlayStation or Xbox. Content is always king. Apple has the needed tech (in many ways they have much better tech), but that does not matter much; what matters is content, and large AAA games cost money (and time) to make. If Apple pays for larger AAA exclusives (they might be already, as Apple only started the Apple Arcade payments in Jan 2019, and big games take a lot longer than what we have seen so far), they could compete with Xbox or PlayStation if they want to. Apple has been putting enough money into the Apple TV+ area; if they are putting the same money into games we might get some large AAA titles.
 

The thing to remember is that, unlike the TV/film industry, the game industry is very good at keeping secrets (Apple's kind of industry), so Apple could have an entire line of AAA games in development that might ship in 2022 or 2023 (they only started funding in 2019, so I would not expect anything large for many years still).

 

16 minutes ago, RedRound2 said:

prevent Apple from making powerful GPUs

Apple can also license any such IP. Apple has a bigger R&D budget than Nvidia and AMD combined, and much of that ends up in hardware.

 


1 hour ago, RedRound2 said:

Wha--? Why do you think that the $999 computer they'll be offering with ARM chips will be somehow 'mobile' or slower than current-gen Intel chips? What is mobile? Just because the processor was derived from an iPhone processor?

 

It would be stupid if ARM Macs weren't as powerful as Intel. But they're clearly confident, and their current production chip runs heavy apps really well. So, honestly, I'm not even sure what your argument is. It's more like you can't possibly think of a scenario where they're cooking up more powerful processors - and that's quite a dumb assumption to make, since you don't work for Apple and have no idea what they have running in their labs.

 

you think a company could just cook up an arm chip as strong as a threadripper idk 5990x with maybe 128 cores in like 2 years and probably even intel would have stepped up their game significantly by then. and what about gpus you think intel can make a gpu thats more powerful than like 7500 xt thats maybe like 30% faster than a 2080 ti in 2 years because thats what apple would have to do to have a competitive offering on the high end and you can probably build that 5990x and 7500 xt computer for like 4500 dollars and what is apple going to charge like 50000 with a cpu that probably doesnt even match up with the 3990x and a gpu that probably doesnt even match with the 5700xt. you see the problem apple not only has to compete with hardware today they have to compete with hardware 2 years from now 


2 hours ago, RedRound2 said:

But the Switch doesn't have good graphics. And the Tegra X1 does need more cooling than what a thin phone or iPad can provide. What you mentioned is the peak power consumption, while sustained draw is nowhere near the <8W of the Switch.

"Good graphics" is subjective. Mario Kart 8 Deluxe is fantastic; the effects are done well without dropping below 60 FPS (as near as I can tell). Now, if you define "good graphics" as having lots of GPU RAM to support high resolution texture mapping, then you would have a point.

 

Also, keep in mind that the Nintendo Switch has two modes: undocked @ 720p on the native LCD pulling 8.9 watts, and docked @ 1080p pulling 11 watts.

 

Nintendo Switch power consumption -AnandTech
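Those two draws make for a quick perf-per-watt sanity check. A rough sketch only: the GPU figures assume the widely reported Tegra X1 configuration in the Switch (256 Maxwell CUDA cores at 307.2 MHz undocked / 768 MHz docked), and the wattage is whole-device draw, not GPU draw:

```python
# Back-of-the-envelope GFLOPS per watt for the Switch, using the
# whole-device draws quoted above (8.9 W undocked, 11 W docked).
# GPU figures assume the widely reported Tegra X1 configuration in the
# Switch: 256 Maxwell CUDA cores at 307.2 MHz undocked / 768 MHz docked.

def peak_gflops(cores: int, clock_ghz: float) -> float:
    """Theoretical FP32 peak: 2 ops/cycle (FMA) per core."""
    return cores * 2 * clock_ghz

undocked = peak_gflops(256, 0.3072) / 8.9   # ~17.7 GFLOPS/W at the wall
docked = peak_gflops(256, 0.768) / 11.0     # ~35.7 GFLOPS/W at the wall

print(f"undocked: ~{undocked:.1f} GFLOPS/W, docked: ~{docked:.1f} GFLOPS/W")
```

Device-level watts include the screen, Joy-Cons, and the rest of the board, so the GPU-only efficiency is higher than these numbers suggest.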


4 hours ago, spartaman64 said:

you think a company could just cook up an arm chip as strong as a threadripper idk 5990x with maybe 128 cores in like 2 years and probably even intel would have stepped up their game significantly by then. and what about gpus you think intel can make a gpu thats more powerful than like 7500 xt thats maybe like 30% faster than a 2080 ti in 2 years because thats what apple would have to do to have a competitive offering on the high end and you can probably build that 5990x and 7500 xt computer for like 4500 dollars and what is apple going to charge like 50000 with a cpu that probably doesnt even match up with the 3990x and a gpu that probably doesnt even match with the 5700xt. you see the problem apple not only has to compete with hardware today they have to compete with hardware 2 years from now 

What is this mindless rant?

 

Do you think Apple just somehow made a one-time miracle chip, and then decided to completely shift based on that? Do you really think they don't have a roadmap 2-5 years down the line? The rate of perf improvements in ARM CPUs has been massive compared to the 10-15% we've been getting on desktops for the past decade. Threadripper is just Ryzen scaled up to more cores and more watts. Do you think Apple can't do that if they want to? Scale up their low-power-consumption processor to 120W peak, or whatever Threadripper's peak is? Of course it's not as easy, but Apple definitely has the budget to look into it. And you're forgetting some fundamentals: RISC architecture is better than CISC.

 

And what about heterogeneous computing, which you glossed over? It's pretty obvious that many workloads today can easily be offloaded to neural engines for faster execution. Apple has the advantage here of making developers start from scratch and actually optimize their software for the mix of low-power, high-power, NN and GPU cores for the most optimal performance. So in a nutshell, there's so much more room to grow in ARM processors. And only Apple can pull something like this off.

 

If you have trouble imagining it, just wait and let's see what happens. You're the same person who kept raving about how ARM will never catch up to x86-64 three years ago. And there were plenty of people who said that. But given what they demoed with the A12Z, I think it's stupid to assume Apple doesn't have anything else.

Quote me in 2 years if they horribly fail at their endeavour.

3 hours ago, StDragon said:

"Good graphics" is subjective. Mario Kart 8 Deluxe is fantastic; the effects are done well without dropping below 60 FPS (as near as I can tell). Now, if you define "good graphics" as having lots of GPU RAM to support high resolution texture mapping, then you would have a point.

 

Also, keep in mind that the Nintendo Switch has two modes: undocked @ 720p on the native LCD pulling 8.9 watts, and docked @ 1080p pulling 11 watts.

 

Nintendo Switch power consumption -AnandTech

By good graphics I mean a lot of textures and realism. I wasn't talking about subjective animation style. And I'm quite sure iPads and even iPhones can handle games with the same fidelity as Switch games with no issues. That was the point.


1 minute ago, RedRound2 said:

What is this mindless rant?

 

Do you think Apple just somehow made a one-time miracle chip, and then decided to completely shift based on that? Do you really think they don't have a roadmap 2-5 years down the line? The rate of perf improvements in ARM CPUs has been massive compared to the 10-15% we've been getting on desktops for the past decade. Threadripper is just Ryzen scaled up to more cores and more watts. Do you think Apple can't do that if they want to? Scale up their low-power-consumption processor to 120W peak, or whatever Threadripper's peak is? Of course it's not as easy, but Apple definitely has the budget to look into it. And you're forgetting some fundamentals: RISC architecture is better than CISC.

 

And what about heterogeneous computing, which you glossed over? It's pretty obvious that many workloads today can easily be offloaded to neural engines for faster execution. Apple has the advantage here of making developers start from scratch and actually optimize their software for the mix of low-power, high-power, NN and GPU cores for the most optimal performance. So in a nutshell, there's so much more room to grow in ARM processors. And only Apple can pull something like this off.

By good graphics I mean a lot of textures and realism. I wasn't talking about subjective animation style. And I'm quite sure iPads and even iPhones can handle games with the same fidelity as Switch games with no issues. That was the point.

I don’t know if I agree with you or not. I stopped reading after “what is this mindless rant”. Name-calling and pre-statements like that usually mean BS is forthcoming.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 minute ago, Bombastinator said:

I don’t know if I agree with you or not. I stopped reading after “what is this mindless rant”. Name-calling and pre-statements like that usually mean BS is forthcoming.

And you didn't have anything better to do than quote me? I didn't name-call anyone. If you look at the conversation we're having, you can clearly see how he went from legible English to a mindless rant about how Apple's chips will never catch up with Threadripper in 2 years. Which, for the numerous reasons I stated, is quite a dumb statement to make this early on.


Just now, RedRound2 said:

What is this mindless rant?

 

Do you think Apple just somehow made a one-time miracle chip, and then decided to completely shift based on that? Do you really think they don't have a roadmap 2-5 years down the line? The rate of perf improvements in ARM CPUs has been massive compared to the 10-15% we've been getting on desktops for the past decade. Threadripper is just Ryzen scaled up to more cores and more watts. Do you think Apple can't do that if they want to? Scale up their low-power-consumption processor to 120W peak, or whatever Threadripper's peak is? Of course it's not as easy, but Apple definitely has the budget to look into it. And you're forgetting some fundamentals: RISC architecture is better than CISC.

 

And what about heterogeneous computing, which you glossed over? It's pretty obvious that many workloads today can easily be offloaded to neural engines for faster execution. Apple has the advantage here of making developers start from scratch and actually optimize their software for the mix of low-power, high-power, NN and GPU cores for the most optimal performance. So in a nutshell, there's so much more room to grow in ARM processors. And only Apple can pull something like this off.

if they had been working on a high power arm chip for the past 5 years pretty sure they would have told investors that when they announce the move away from intel. also i dont think thats how processor design works you cant just scale up the TDP of a chip designed for low power and expect performance to follow and it seems pretty disingenuous to announce that you are going to use all custom hardware and then make your computers into basically thin clients and expect people to buy third party hardware to do the actual work


Just now, RedRound2 said:

And you didn't have anything better to do than quote me? I didn't name-call anyone. If you look at the conversation we're having, you can clearly see how he went from legible English to a mindless rant about how Apple's chips will never catch up with Threadripper in 2 years. Which, for the numerous reasons I stated, is quite a dumb statement to make this early on.

if you cant read my english i think you need to work on your english skills since everyone else seemed to be able to understand me just fine


3 minutes ago, RedRound2 said:

And you didn't have anything better to do than quote me? I didn't name-call anyone. If you look at the conversation we're having, you can clearly see how he went from legible English to a mindless rant about how Apple's chips will never catch up with Threadripper in 2 years. Which, for the numerous reasons I stated, is quite a dumb statement to make this early on.

The phrase “what is this mindless rant” speaks for itself. I notice you are careful to say you didn’t call anyone (meaning any person) a name. You did name-call the post, though. When that takes place, a persuasive attempt is generally forthcoming. It’s an attempt to set the stage for an argument, and often means the argument can’t hold up without set dressing. It’s a common technique amongst the ultra right of my country when they want to put forth a crap argument. It’s so common I take it as a “beyond here lies garbage” flag. Decent chance the statement was partially wrong. I dunno.



1 minute ago, spartaman64 said:

if they had been working on a high power arm chip for the past 5 years pretty sure they would have told investors that when they announce the move away from intel. also i dont think thats how processor design works you cant just scale up the TDP of a chip designed for low power and expect performance to follow and it seems pretty disingenuous to announce that you are going to use all custom hardware and then make your computers into basically thin clients and expect people to buy third party hardware to do the actual work

Why would Apple tell their upcoming plans to investors? When has Apple ever done that? It's been rumored for like 4-5 years now that Apple's been planning this. Skylake was the point of no return for Apple. The Intel chip from 4 years ago. They didn't have an epiphany earlier this year. They've been planning this for a long time.

 

I specifically said it's not as easy as I stated. But you get the point. How are you still fixated on the idea that they're going to use iPhone chips in Macs? They're not.

1 minute ago, spartaman64 said:

if you cant read my english i think you need to work on your english skills since everyone else seemed to be able to understand me just fine

Are you actually going to make me point out your grammatical mistakes and absurdly long sentences? Please, I'm not going to, but it's easy to sense how a conversation turns from rational thinking to rants when people become too preoccupied. Just reread your comment. That's all I'm going to say.


12 minutes ago, RedRound2 said:

Why would Apple tell their upcoming plans to investors? When has Apple ever done that? It's been rumored for like 4-5 years now that Apple's been planning this. Skylake was the point of no return for Apple. The Intel chip from 4 years ago. They didn't have an epiphany earlier this year. They've been planning this for a long time.

 

I specifically said it's not as easy as I stated. But you get the point. How are you still fixated on the idea that they're going to use iPhone chips in Macs? They're not.

Are you actually going to make me point out your grammatical mistakes and absurdly long sentences? Please, I'm not going to, but it's easy to sense how a conversation turns from rational thinking to rants when people become too preoccupied. Just reread your comment. That's all I'm going to say.

we were talking about the entry level i never said they are going to use the iphone chip for the high end thats a strawman. and why even mention that as a counterpoint if you know its not that easy. they probably would tell investors the same reason why they told them a lot of the main mac apps are already translated to arm to prevent panic but ok lets say for whatever reason they didnt care enough to mention that you did make a good point that they could have started right after skylake but I still have trouble believing they are going to be able to make a chip that competes with threadripper but i guess we'll just have to wait and see. when someone starts getting fixated on grammar instead of substance its a sign that they know they lost the argument i would agree if my post is unreadable as you said but everyone else understood me just fine. i know im not using proper punctuation and grammar but thats irrelevant in an informal discussion 


1 hour ago, RedRound2 said:

By good graphics I mean a lot of textures and realism. I wasn't talking about subjective animation style. And I'm quite sure iPads and even iPhones can handle games with the same fidelity as Switch games with no issues. That was the point.

Fair enough.

 

I think the bigger question to ask is this: does Apple have any motive for jumping into console or serious gaming beyond mobile apps? I would say "no", they don't. It's not that Apple can't develop a capable SoC, because clearly they've proven to be capable of it, more so if they actually had gaming ambitions. But they don't, and that's really the point.

 

So back on topic then. What does Thunderbolt really attempt to solve for an Apple A-series laptop? The only primary advantage is docking solutions and disk/networking I/O capability. I don't ever see an external GPU being doable, specifically as that requires an ARM driver. And well, that defeats the whole point of an SoC platform, so that ain't going to happen.

 

My gut feeling is that Apple announced this purely as a marketing ploy to ensure their fan base doesn't go cold on this in-house CPU move. I don't see it actually getting much use above and beyond the aforementioned I/O reasons.


On 7/10/2020 at 9:50 PM, spartaman64 said:

we were talking about the entry level i never said they are going to use the iphone chip for the high end thats a strawman.

Whoa, strawman argument? Didn't you just say how they won't be able to compete with Threadripper? Looks like you don't even seem to know what you're trying to say.

Quote

and why even mention that as a counterpoint if you know its not that easy. they probably would tell investors the same reason why they told them a lot of the main mac apps are already translated to arm to prevent panic but ok lets say for whatever reason they didnt care enough to mention that you did make a good point that they could have started right after skylake but I still have trouble believing they are going to be able to make a chip that competes with threadripper but i guess we'll just have to wait and see.

What are you even trying to say here? 

Quote

when someone starts getting fixated on grammar instead of substance its a sign that they know they lost the argument i would agree if my post is unreadable as you said but everyone else understood me just fine. i know im not using proper punctuation and grammar but thats irrelevant in an informal discussion 

I did reply to whatever I could understand from your replies. Sure, let's all pretend that you weren't able to see that. And degrading grammar is a strong indication of someone going fanboy and not thinking straight. In other words, they go into a mindless rant about how they think they're right, without any direct replies and with gaslighting techniques. And how did you conclude that everyone else understood you when nobody else bothered replying to you? Proper grammar and punctuation are important, otherwise we'll all just be screaming a bunch of words at each other.

On 7/10/2020 at 10:47 PM, StDragon said:

Fair enough.

 

I think the bigger question to ask is this: does Apple have any motive for jumping into console or serious gaming beyond mobile apps? I would say "no", they don't. It's not that Apple can't develop a capable SoC, because clearly they've proven to be capable of it, more so if they actually had gaming ambitions. But they don't, and that's really the point.

They don't have any gaming ambition. The max I can see is them growing off of Apple Arcade if that proves to be a success. Otherwise I doubt they'll deliberately try and go after the AAA or competitive gaming segment.

Quote

So back on topic then. What does Thunderbolt really attempt to solve for an Apple A-series laptop? The only primary advantage is docking solutions and disk/networking I/O capability. I don't ever see an external GPU being doable, specifically as that requires an ARM driver. And well, that defeats the whole point of an SoC platform, so that ain't going to happen.

Docking solutions are a lot more popular than you think among Mac users. The one-cable solution. That is so Apple's forte.

Quote

My gut feeling is that Apple announced this purely as a marketing ploy to ensure their fan base doesn't go cold on this in-house CPU move. I don't see it actually getting much use above and beyond the aforementioned I/O reasons.

They could eventually one day offer a desktop-grade in-house GPU specifically for Mac laptops, the same ones they might end up using in iMacs and the Mac Pro. Or offer things like the Afterburner card. There are so many possibilities, so I'm confused as to why you would oppose a move that doesn't restrict them.


11 minutes ago, RedRound2 said:

Whoa, strawman argument? Didn't you just say how they won't be able to compete with Threadripper? Looks like you don't even seem to know what you're trying to say.

What are you even trying to say here? 

I did reply to whatever I could understand from your replies. Sure, let's all pretend that you weren't able to see that. And degrading grammar is a strong indication of someone going fanboy and not thinking straight. In other words, they go into a mindless rant about how they think they're right, without any direct replies and with gaslighting techniques. And how did you conclude that everyone else understood you when nobody else bothered replying to you? Proper grammar and punctuation are important, otherwise we'll all just be screaming a bunch of words at each other.

They don't have any gaming ambition. The max I can see is them growing off of Apple Arcade if that proves to be a success. Otherwise I doubt they'll deliberately try and go after the AAA or competitive gaming segment.

Docking solutions are a lot more popular than you think among Mac users. The one-cable solution is very much Apple's forte.

They could eventually offer a desktop-grade in-house GPU specifically for Mac laptops, the same ones they might end up using in iMacs and the Mac Pro. Or offer things like the Afterburner card. There are so many possibilities, so I'm confused as to why you would oppose a move that doesn't restrict them.

“Straw man”? Is this team competitive debate? Calling an argument a straw man does not mean it is invalid or incorrect. It merely means it does not conform to debate-team rules for policy debate. “Straw man” is frequently used simply to attack summaries of a previous statement when an actual negation of that summary is impossible.

I rarely see the term used in a way that is not an attempt to divert attention. Generally, if there is a flaw in the summary, that flaw is brought to light. “Straw man” is most commonly called out when there isn’t one.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


4 minutes ago, Bombastinator said:

“Straw man”? Is this team competitive debate? Calling an argument a straw man does not mean it is invalid or incorrect. It merely means it does not conform to debate-team rules for policy debate. “Straw man” is frequently used simply to attack summaries of a previous statement when an actual negation of that summary is impossible.

I rarely see the term used in a way that is not an attempt to divert attention. Generally, if there is a flaw in the summary, that flaw is brought to light. “Straw man” is most commonly called out when there isn’t one.

Did you even bother to see what I was replying to? I was quoting him, smh.

And sure, I was trying to divert my replies — as evidenced by the 15 lines I wrote after that. /s

You've got to find something better to do than this.


5 minutes ago, RedRound2 said:

Did you even bother to see what I was replying to? I was quoting him, smh.

Did. Pseudo-rhetorical questions aren't a reasonable tactic either; they're just another pure attack vector. You did follow with a negation, but a fairly weak one. Hiding the weakness of an argument with bluster can be effective for “winning”, but it doesn't make the argument stronger. It mostly says “there is weakness in this stance” to me.



7 minutes ago, Bombastinator said:

Did. Pseudo-rhetorical questions aren't a reasonable tactic either; they're just another pure attack vector. You did follow with a negation, but a fairly weak one. Hiding the weakness of an argument with bluster can be effective for “winning”, but it doesn't make the argument stronger. It mostly says “there is weakness in this stance” to me.

Cool story, dude. I have absolutely no idea what you're even talking about. So unless you have something to actually contribute to either of our conversations, you need to find something else to do.

And if you did find a weakness in my stance, I'd rather you point it out than waste both our time with weird tactics like this.


15 minutes ago, RedRound2 said:

Cool story, dude. I have absolutely no idea what you're even talking about. So unless you have something to actually contribute to either of our conversations, you need to find something else to do.

And if you did find a weakness in my stance, I'd rather you point it out than waste both our time with weird tactics like this.

“Cool story dude” would be an example of a straw man. You are summarizing my entire statement as worthless.

I don't believe you didn't understand my statement. You say it makes no sense and then go on to clearly show that you understood it.
I actually agree that it is too early to say what Apple may come up with, but ARM has advantages and disadvantages; it is by no means a sure thing. ARM has a reputation for lacking raw horsepower. Whether it deserves that reputation is still unanswered: ARM has had time to develop such performance, yet it has not so far emerged.

 

I was always more interested in the competitive-debate argumentative techniques you are injecting. No one is scoring this.



1 hour ago, RedRound2 said:

Whoa, strawman argument? Didn't you just say how they won't be able to compete with Threadripper? Looks like you don't even know what you're trying to say.

What are you even trying to say here? 

I did reply to whatever I could understand from your replies. Sure, let's all pretend you weren't able to see that. And degrading grammar is a strong indication of someone going fanboy and not thinking straight. In other words, they go into a mindless rant about how they think they're right, without any direct replies and with gaslighting techniques. And how did you conclude that everyone else understood you, when nobody else bothered replying to you? Proper grammar and punctuation are important; otherwise we'll all just be screaming a bunch of words at each other.

They don't have any gaming ambition. The most I can see is them growing off of Apple Arcade, if that proves to be a success. Otherwise I doubt they'll deliberately go after the AAA or competitive gaming segment.

Docking solutions are a lot more popular than you think among Mac users. The one-cable solution is very much Apple's forte.

They could eventually offer a desktop-grade in-house GPU specifically for Mac laptops, the same ones they might end up using in iMacs and the Mac Pro. Or offer things like the Afterburner card. There are so many possibilities, so I'm confused as to why you would oppose a move that doesn't restrict them.

You said that I said they are going to use the phone chip to compete with Threadripper, which I never said; I'm saying they are probably going to use it for their entry level. Grammar*. And you seem to be the fanboy here, since you always attack anyone who thinks Apple made any mistake. vegetablestu seemed to understand me just fine, and by that logic, how do you know you're not going into mindless rants, since few people are replying to you as well?

And I clearly summed up my point at the end. If you think of it as Apple needing to make a chip that competes with a 9900K or something, then OK, they might be able to do that. But they'd have to make a chip that competes with an 11900K, and on the higher end with Xeon 1100s and Threadripper 5000s. And apparently they are planning to make their own GPUs too, so they'd have to simultaneously create a GPU that competes with a 7700 XT.


On 7/10/2020 at 1:47 AM, RedRound2 said:

 

Careful what you say. The same people who said ARM Macs would never be competitive with x86 are eating their words right now.

1. Whether they will be competitive or not remains to be seen, so no word-eating is occurring. Besides that, what defines "competitive" for most end users is very specific at best and subjective at worst. There are faster ARM processors for some things, but they are not competitive if the software you use doesn't leverage them or doesn't run well on them.

Grammar and spelling are not indicative of intelligence or knowledge. Not having the same opinion does not always mean a lack of understanding.


18 hours ago, spartaman64 said:

And apparently they are planning to make their own GPUs too, so they'd have to simultaneously create a GPU that competes with a 7700 XT.

While also not getting shot to pieces by patent-infringement claims from AMD, Nvidia, or Intel. Unlike ARM CPUs, making high-performance GPUs is much more restricted by patents; not quite as badly as x86 CPUs, but the issue is similar. Being capable of doing something isn't the same as being allowed to do it.

 

20 hours ago, RedRound2 said:

And degrading grammar is a strong indication of someone going fanboy and not thinking straight

 

18 hours ago, spartaman64 said:

You seem to be the fanboy here, since you always attack anyone who thinks Apple made any mistake

Just be mindful of using the word "fanboy". Intent matters most, but a better choice of words is safer and less likely to be misunderstood or mischaracterized.

 

