
Apple promises to support Thunderbolt on its new ARM Macs

Mario5
On 7/11/2020 at 3:53 AM, RedRound2 said:

And you're forgetting about some fundamentals: RISC architecture is better than CISC.

Well, no, it isn't. Neither is better than the other, and with the number of extended instruction sets and hardware-accelerated paths in today's CPUs, across all ISAs, architectures and brand names, almost nothing is purely RISC or CISC.

What is best is what works, and "what works" is just the cover page to an immensely large discussion and comparison of factors. Because of how big this topic is, I'm really not willing to get into it, and if I were, I'd only do it with people I know would take such a thing seriously and have the knowledge to do so.

But if RISC were so fundamentally better, why would consoles have moved from RISC origins to CISC-origin hardware? Why would Sony, a company that has always been willing to build highly custom, specialized hardware, move to AMD x86 hardware while still using its own customized console OS and graphics APIs? It's not like Microsoft, which just used a modified Windows NT kernel and unified on DirectX; Microsoft had a lot more to gain from such a move, but Sony?

"RISC is better than CISC" or "CISC is better than RISC" doesn't do justice to the complexity of the matter.


1 hour ago, leadeater said:

Well, no, it isn't. Neither is better than the other, and with the number of extended instruction sets and hardware-accelerated paths in today's CPUs, across all ISAs, architectures and brand names, almost nothing is purely RISC or CISC.

What is best is what works, and "what works" is just the cover page to an immensely large discussion and comparison of factors. Because of how big this topic is, I'm really not willing to get into it, and if I were, I'd only do it with people I know would take such a thing seriously and have the knowledge to do so.

But if RISC were so fundamentally better, why would consoles have moved from RISC origins to CISC-origin hardware? Why would Sony, a company that has always been willing to build highly custom, specialized hardware, move to AMD x86 hardware while still using its own customized console OS and graphics APIs? It's not like Microsoft, which just used a modified Windows NT kernel and unified on DirectX; Microsoft had a lot more to gain from such a move, but Sony?

"RISC is better than CISC" or "CISC is better than RISC" doesn't do justice to the complexity of the matter.

To make a statement in support of your point, I think.

 

I don't want to attempt to debate your statement, as I am one of those people without a proper understanding. It sounds like there is a disconnect between the simplified explanations commonly given and the actual reality of the situation. That's a common problem in a lot of things; I've run into it in medicine several times, and it seems to be a primary driver of a lot of choices made by laymen that experts view as poor.

I can recount a couple of apparently oversimplified explanations which might describe the confusion among laymen such as myself, even if they don't describe the actual issue. So, to parade my ignorance:

The explanation I read for the switch was that they wanted to take advantage of systems developed for Windows, such as DX, which would allow them to simplify game development. A convenience.

I also read that the difference between RISC and CISC is that RISC doesn't use microcode, which was originally developed primarily because old systems were so slow that it was a useful way to do work while the CPU was waiting on a cycle. This implies CISC would be less efficient on modern systems. It sounds kind of weird to me, though, because those delays would still be there; they would just be shorter.

This is why I said advantages and disadvantages. One is an advantage of RISC and the other is a disadvantage.

You seem to be saying it's actually a lot more complicated than that, which I find at the very least reasonable and highly likely.

Simplification is very often quite inaccurate. There is generally a large difference between what amounts to science writing and actual science. So it makes sense if only science-writing-level knowledge is available. That doesn't make it true, though.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


22 hours ago, Bombastinator said:

“Cool story dude” would be an example of “straw man”.  You are summarizing my entire statement as worthless.  

I don’t believe you didn’t understand my statement.  You say it makes no sense and then go on clearly showing that you understood it.

Exactly; whatever you said was a worthless piece of info to me.

When I read something where the commenter (talking about the other guy) is intentionally using gaslighting techniques, claiming that their original statement was something else, alongside sentences that run to 5-6 lines of unfiltered thoughts, I may well comment asking what you are trying to say and what this mindless rant is.

If you want to convince someone of your viewpoint or idea, you need to articulate it clearly and reply to everything I originally said, rather than just going on about how it cannot be possible, with no relevance whatsoever to my post. When I see bullshit, I call it out. That doesn't mean I see them as any less of a person; if that were the case, I honestly wouldn't bother replying to anyone and would get back to my life.

22 hours ago, Bombastinator said:

I actually agree that it is too early to say what Apple may come up with, but ARM has advantages and disadvantages; it is by no means a sure thing. ARM has a reputation for a lack of raw horsepower. Whether it deserves it or not is still unanswered. ARM has had time to develop such horsepower, yet it has not so far emerged.

I was always more interested in the competitive-debate argumentative techniques you are injecting. No one is scoring this.

"ARM has reputation for lack of horsepower". Except nobody like Apple ever needed to make some high powered device, or even try since x86 was so dominant. Power efficiency has always been the number one priority in mobile devices. And obviously that is a huge limiting factor in performance wise. Now they're shifting and they don't have the same restrictions anymore. And based on performance and optimizability of a 16W (theoretical peak) of the A12Z chip, it's safe to assume that a 25W or 60W chip can go much further than what was displayed. And what was displayed with 16W chip was impressive and I doubt a 10W Intel chip can run three 4k streams in final cut like it was nothing

 

21 hours ago, spartaman64 said:

You said that I said they are going to use the phone chip to compete with Threadripper, which I never said; I'm saying they are probably going to use it for their entry level.

And you take these literally, huh? You do realize that Ryzen and Threadripper are based on the same architecture, right? Zen 2. Threadripper is just a bigger chip ("glued-together" Ryzen chips, to use the technical term) with better power delivery and cooling. So do you think, for some reason, that whatever desktop chip they have in their labs wouldn't be able to scale similarly? Threadripper is out and available to the public; Apple sure as hell would have a Threadripper machine running in their labs to compare against. And you know what, instead of switching to AMD, they're confident they can match or beat the competition within two years with their own creation.

21 hours ago, spartaman64 said:

Grammar*, and you seem to be the fanboy here, since you always attack anyone who thinks Apple made any mistake.

Lol, who did I attack? I didn't attack anybody; I pointed out flaws in your statement. People hate it because there are religious Apple haters here. If you want me to point out Apple's mistakes, I'll gladly list them. But many people on this forum and elsewhere hate on every single thing Apple does, and this thread is no exception.

21 hours ago, spartaman64 said:

And vegetablestu seemed to understand me just fine. By that logic, how do you know you are not going into mindless rants, since few people are replying to you as well?

Did you have a personal conversation with him where he directly addressed each word you said? I don't think so; that's an assumption you're making. As for Mr. Moose, he will agree with any comment against me, which is a very normal thing for him to do, so don't bother wearing that as a badge of pride.

Also, if you're going to reply with whatever you want to say, write it properly. You can't have debates in broken English. And I know you can write properly, because your first two or three comments were written properly.

21 hours ago, spartaman64 said:

And I clearly summed up my point at the end: if you think of it as Apple needing to make a chip that competes with a 9900K or something, then OK, they might be able to do that, but they have to make a chip that competes with an 11900K, and on the higher end with Xeon 1100s and Threadripper 5000s. And apparently they are planning to make their own GPUs too, so at the same time they have to create a GPU that competes with a 7700 XT.

I replied to all of this above. Also, heterogeneous computing, which you keep ignoring.

3 hours ago, mr moose said:

1. Whether they will be competitive or not remains to be seen, so no word-eating is occurring. Besides that, what defines "competitive" to most end users is very specific at best and subjective at worst. There are faster ARM processors for some things, but they are not competitive if the software you use doesn't leverage them or work on them very well.

Sure, show me a 16W Intel chip (assuming all cores on the A12Z were pinned at 100%, which is highly unlikely, so realistically the 6-8W, if not 4W, peak that the iPad runs at) that can run three streams of 4K video simultaneously in Final Cut or Premiere Pro. Or an AAA title like Tomb Raider running at a similar TDP, after accounting for at least a 30-40% loss in performance from translation.

At this point you're just in denial. Apple rarely ever makes a short-sighted move like this. Everybody knows that, but of course you wanted to reply with something here, didn't you?

2 hours ago, leadeater said:

Well, no, it isn't. Neither is better than the other, and with the number of extended instruction sets and hardware-accelerated paths in today's CPUs, across all ISAs, architectures and brand names, almost nothing is purely RISC or CISC.

What is best is what works, and "what works" is just the cover page to an immensely large discussion and comparison of factors. Because of how big this topic is, I'm really not willing to get into it, and if I were, I'd only do it with people I know would take such a thing seriously and have the knowledge to do so.

But if RISC were so fundamentally better, why would consoles have moved from RISC origins to CISC-origin hardware? Why would Sony, a company that has always been willing to build highly custom, specialized hardware, move to AMD x86 hardware while still using its own customized console OS and graphics APIs? It's not like Microsoft, which just used a modified Windows NT kernel and unified on DirectX; Microsoft had a lot more to gain from such a move, but Sony?

"RISC is better than CISC" or "CISC is better than RISC" doesn't do justice to the complexity of the matter.

You're talking about the processors as a whole. Intel is mostly CISC, but uses RISC techniques in small segments where it makes sense. Back in the day, CISC was much easier to program for and to scale performance-wise, and power consumption in the early days was never too big a deal because these were desktops. RISC has historically been more difficult due to the complexity of programming for it.

But things have changed now. We can't go on much longer with the same formula of shrinking transistors to increase performance. Now the potential gains lie in micro-optimizing the processors at a fundamental level, and with RISC it's much easier to do that, due to the more granular control. I don't know if this is a valid comparison, but it's like Python and C++: Python is much easier, and hence easier to develop complicated programs like ML and AI in, but an equivalent C++ program can have much faster execution times, albeit being very complicated to write.


12 hours ago, RedRound2 said:

Sure, show me a 16W Intel chip (assuming all cores on the A12Z were pinned at 100%, which is highly unlikely, so realistically the 6-8W, if not 4W, peak that the iPad runs at) that can run three streams of 4K video simultaneously in Final Cut or Premiere Pro. Or an AAA title like Tomb Raider running at a similar TDP, after accounting for at least a 30-40% loss in performance from translation.

Specs don't mean shit; the real world is where it's at. Always has been, always will be. Until we actually have the product in our hands, you are making way more assumptions than the people you accuse of being wrong.

 

12 hours ago, RedRound2 said:

At this point you're just in denial. Apple rarely ever makes a short-sighted move like this. Everybody knows that, but of course you wanted to reply with something here, didn't you?

Denial of what? We literally have no ARM-based MacBooks, we have no prices, we have no performance metrics. This is not about what you think is short-sighted or what you think everyone "knows"; this is about logical appraisal of the topic. No product to appraise = no facts about competitiveness.

 

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


13 hours ago, RedRound2 said:

Exactly; whatever you said was a worthless piece of info to me.

When I read something where the commenter (talking about the other guy) is intentionally using gaslighting techniques, claiming that their original statement was something else, alongside sentences that run to 5-6 lines of unfiltered thoughts, I may well comment asking what you are trying to say and what this mindless rant is.

If you want to convince someone of your viewpoint or idea, you need to articulate it clearly and reply to everything I originally said, rather than just going on about how it cannot be possible, with no relevance whatsoever to my post. When I see bullshit, I call it out. That doesn't mean I see them as any less of a person; if that were the case, I honestly wouldn't bother replying to anyone and would get back to my life.

"ARM has a reputation for lack of horsepower." Except nobody like Apple has ever needed to make a high-powered ARM device, or even tried, since x86 was so dominant. Power efficiency has always been the number-one priority in mobile devices, and obviously that is a huge limiting factor performance-wise. Now they're shifting, and they don't have the same restrictions anymore. Based on the performance and optimizability of the A12Z chip at a theoretical peak of 16W, it's safe to assume that a 25W or 60W chip can go much further than what was displayed. And what was displayed with a 16W chip was impressive; I doubt a 10W Intel chip can run three 4K streams in Final Cut like it was nothing.

And you take these literally, huh? You do realize that Ryzen and Threadripper are based on the same architecture, right? Zen 2. Threadripper is just a bigger chip ("glued-together" Ryzen chips, to use the technical term) with better power delivery and cooling. So do you think, for some reason, that whatever desktop chip they have in their labs wouldn't be able to scale similarly? Threadripper is out and available to the public; Apple sure as hell would have a Threadripper machine running in their labs to compare against. And you know what, instead of switching to AMD, they're confident they can match or beat the competition within two years with their own creation.

Lol, who did I attack? I didn't attack anybody; I pointed out flaws in your statement. People hate it because there are religious Apple haters here. If you want me to point out Apple's mistakes, I'll gladly list them. But many people on this forum and elsewhere hate on every single thing Apple does, and this thread is no exception.

Did you have a personal conversation with him where he directly addressed each word you said? I don't think so; that's an assumption you're making. As for Mr. Moose, he will agree with any comment against me, which is a very normal thing for him to do, so don't bother wearing that as a badge of pride.

Also, if you're going to reply with whatever you want to say, write it properly. You can't have debates in broken English. And I know you can write properly, because your first two or three comments were written properly.

I replied to all of this above. Also, heterogeneous computing, which you keep ignoring.

Sure, show me a 16W Intel chip (assuming all cores on the A12Z were pinned at 100%, which is highly unlikely, so realistically the 6-8W, if not 4W, peak that the iPad runs at) that can run three streams of 4K video simultaneously in Final Cut or Premiere Pro. Or an AAA title like Tomb Raider running at a similar TDP, after accounting for at least a 30-40% loss in performance from translation.

At this point you're just in denial. Apple rarely ever makes a short-sighted move like this. Everybody knows that, but of course you wanted to reply with something here, didn't you?

You're talking about the processors as a whole. Intel is mostly CISC, but uses RISC techniques in small segments where it makes sense. Back in the day, CISC was much easier to program for and to scale performance-wise, and power consumption in the early days was never too big a deal because these were desktops. RISC has historically been more difficult due to the complexity of programming for it.

But things have changed now. We can't go on much longer with the same formula of shrinking transistors to increase performance. Now the potential gains lie in micro-optimizing the processors at a fundamental level, and with RISC it's much easier to do that, due to the more granular control. I don't know if this is a valid comparison, but it's like Python and C++: Python is much easier, and hence easier to develop complicated programs like ML and AI in, but an equivalent C++ program can have much faster execution times, albeit being very complicated to write.

Re: horsepower

Actually, they have, sort of.

Newly created RISC server chips, anyway. POWER5 is RISC. They make seriously large chips for servers. Total power is awesome if you can use all the cores, but individually the cores aren't awesome.

Apple has made ARM chips with better single-thread performance than anyone else. Part of the issue, as I understand it, is:

A: There's a lot less difference between chips labeled RISC and chips labeled CISC than there used to be. This reduces the advantage/disadvantage spread. Meanwhile, x86 has had a lot of development.

B: The vast majority of the code base and software architecture is designed around CISC. This is a primary reason RISC chips developed CISC-like features: they needed to use that code base. I mentioned advantages and disadvantages; one of the advantages of CISC is that current programming is based on it. RISC stuff is less optimized, which makes it slower. Apple has to deal with that less because they can customize their code. They can't do it totally, though, because they're not the only group that writes code for their machines. From a user's perspective, it doesn't matter how a chip does what it does; as far as a user is concerned, a computer could execute by frantically farting and tap dancing. What matters is how fast it can make useful things happen.

I did a bit of reading on this subject today. It was still what amounts to science writing. I should really be quoting it, but I don't have links, so I'm parroting what I remember. The thing about science writing, or any other massive simplification, is that derivation produces errors; you can't combine simplified pieces and expect the result to have any connection to reality.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


On 7/9/2020 at 12:54 PM, Curufinwe_wins said:

The Lightning connector itself is only problematic in that exposed contacts in moist air can experience rapid corrosion and degradation if left plugged in (moist meaning just sitting in a car or something, not particularly damp, just not bone dry). The cable itself is intentionally fragile, which is a related but separate issue.

To be fair, Apple's Lightning cables are fragile. I have an Anker Lightning cable that scoffs at the idea that it is fragile.


4 minutes ago, Warin said:

To be fair, Apple's Lightning cables are fragile. I have an Anker Lightning cable that scoffs at the idea that it is fragile.

Gratz. Tell Apple; maybe they'll actually fix it. There was one instance where Apple paid out on a massive lawsuit regarding chargers. The issue was that while they designed the chargers correctly, the OEM cheaped out on them and put on garbage strain reliefs because it was cheaper. The chargers understandably broke at their base, and it was a whole massive stink.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Just now, Bombastinator said:

Gratz. Tell Apple; maybe they'll actually fix it. There was one instance where Apple paid out on a massive lawsuit regarding chargers. The issue was that while they designed the chargers correctly, the OEM cheaped out on them and put on garbage strain reliefs because it was cheaper. The chargers understandably broke at their base, and it was a whole massive stink.

Oh, you won't see me defend Apple's stock cables; they are absolute rubbish. But that was exactly my point: just because Apple makes shit cables doesn't mean it is an inherent flaw in the cable design, since aftermarket vendors like Anker can make far better cables that are more than price-competitive with Apple's.


6 minutes ago, Warin said:

Oh, you won't see me defend Apple's stock cables; they are absolute rubbish. But that was exactly my point: just because Apple makes shit cables doesn't mean it is an inherent flaw in the cable design, since aftermarket vendors like Anker can make far better cables that are more than price-competitive with Apple's.

Which means if you tell them, Apple will likely simply switch producers, possibly to the same one Anker is using.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


To end the fanboy stuff, though: do they mean Thunderbolt 4 or Thunderbolt 3? Because USB4 incorporates Thunderbolt 3, all they'd have to do is put in USB4, and maybe not even need a dongle. USB4 isn't much different from Thunderbolt 4; it's mostly improved cable specs. Intel is even incorporating Thunderbolt 4 controllers into their CPUs out of the gate.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


3 hours ago, Bombastinator said:

To end the fanboy stuff, though: do they mean Thunderbolt 4 or Thunderbolt 3?

Since they have only said "Thunderbolt", people are assuming they are talking about TB3 (at least).

When it comes to the extra features of TB4, Apple has all the parts needed to comply, as they have the DMA protections for PCIe (they have talked about this already). So they will likely be TB4 (even if they don't get certification due to not being Intel CPUs...).

Apple co-developed TB3 (and TB2) with Intel, with people at Apple working on it from the start, so they clearly have the needed talent. They even added the T2 chip to Macs to ensure they could initialize VT-d protections early enough in boot, so that even if the TB3 controller were compromised, the system would be safe.

 


9 hours ago, mr moose said:

Specs don't mean shit; the real world is where it's at. Always has been, always will be. Until we actually have the product in our hands, you are making way more assumptions than the people you accuse of being wrong.

Who's talking about specs? Three streams of 4K and running heavy applications through a translation layer are real-world performance metrics.

Apple generally never talks about specs (unless they designed the part) and focuses mostly on the end-user experience.

Quote

Denial of what? We literally have no ARM-based MacBooks, we have no prices, we have no performance metrics. This is not about what you think is short-sighted or what you think everyone "knows"; this is about logical appraisal of the topic. No product to appraise = no facts about competitiveness.

ARM-based products are coming this year; it was officially announced, if you didn't know. The October event is usually when Apple focuses on Macs, especially since they're not planning to release new iPads this year.

Prices will remain the same; there's no logical explanation for them to be more expensive. My hunch is they're going to be lower, especially on entry-level products like the MacBook Air and Mac mini. It's easy to see the pattern, since they actually have a phone for $399 powered by the fastest A13 chip.

We saw performance metrics, and that was on the A12 architecture. Their next CPU will be at minimum 25% faster in iPhones. So who's to say what increase in performance those chips will have, especially with good cooling and power delivery? They would never have announced something as huge as this if they weren't able to make competitive processors. It's common sense at this point.

8 hours ago, Bombastinator said:

Re: horsepower

Actually, they have, sort of.

Newly created RISC server chips, anyway. POWER5 is RISC. They make seriously large chips for servers. Total power is awesome if you can use all the cores, but individually the cores aren't awesome.

Apple has made ARM chips with better single-thread performance than anyone else. Part of the issue, as I understand it, is:

A: There's a lot less difference between chips labeled RISC and chips labeled CISC than there used to be. This reduces the advantage/disadvantage spread. Meanwhile, x86 has had a lot of development.

B: The vast majority of the code base and software architecture is designed around CISC. This is a primary reason RISC chips developed CISC-like features: they needed to use that code base. I mentioned advantages and disadvantages; one of the advantages of CISC is that current programming is based on it. RISC stuff is less optimized, which makes it slower. Apple has to deal with that less because they can customize their code. They can't do it totally, though, because they're not the only group that writes code for their machines. From a user's perspective, it doesn't matter how a chip does what it does; as far as a user is concerned, a computer could execute by frantically farting and tap dancing. What matters is how fast it can make useful things happen.

I did a bit of reading on this subject today. It was still what amounts to science writing. I should really be quoting it, but I don't have links, so I'm parroting what I remember. The thing about science writing, or any other massive simplification, is that derivation produces errors; you can't combine simplified pieces and expect the result to have any connection to reality.

Are you replying to me or to leadeater? This is similar to the reply I wrote to leadeater's comment.


1 hour ago, RedRound2 said:

Who's talking about specs? Three streams of 4K and running heavy applications through a translation layer are real-world performance metrics.

Apple generally never talks about specs (unless they designed the part) and focuses mostly on the end-user experience.

ARM-based products are coming this year; it was officially announced, if you didn't know. The October event is usually when Apple focuses on Macs, especially since they're not planning to release new iPads this year.

Prices will remain the same; there's no logical explanation for them to be more expensive. My hunch is they're going to be lower, especially on entry-level products like the MacBook Air and Mac mini. It's easy to see the pattern, since they actually have a phone for $399 powered by the fastest A13 chip.

We saw performance metrics, and that was on the A12 architecture. Their next CPU will be at minimum 25% faster in iPhones. So who's to say what increase in performance those chips will have, especially with good cooling and power delivery? They would never have announced something as huge as this if they weren't able to make competitive processors. It's common sense at this point.

Are you replying to me or to leadeater? This is similar to the reply I wrote to leadeater's comment.

I was replying to you, but keeping leadeater's comment in mind, as I think it had merit.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


9 hours ago, RedRound2 said:

ARM-based products are coming this year.

So we don't have any products yet, yet you are confident you know how they are going to perform and what they are going to cost. Well done; when you've finished with your crystal ball, I want to see next week's lotto numbers.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


7 hours ago, mr moose said:

So we don't have any products yet, yet you are confident you know how they are going to perform and what they are going to cost. Well done; when you've finished with your crystal ball, I want to see next week's lotto numbers.

And you conveniently ignored the part explaining why I thought that. Is this to intentionally mislead others by taking my comment out of context?


Just now, RedRound2 said:

And you conveniently ignored the quote explaining why I thought that. Is this to intentionally mislead others by taking my comment out of context?

Is this another attempt to distract by looking for procedural errors? You don't seem to be accepting the concept of "these things don't exist yet," and are using argumentative technique rather than actual points to back it.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


22 minutes ago, Bombastinator said:

Is this another attempt to distract by looking for procedural errors? You don't seem to be accepting the concept of "these things don't exist yet," and are using argumentative technique rather than actual points to back it.

I just don't know how else to explain it: we can't claim facts about an as-yet-unreleased product, either positive or negative.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


31 minutes ago, Bombastinator said:

Is this another attempt to distract by looking for procedural errors? You don't seem to be accepting the concept of "these things don't exist yet," and are using argumentative technique rather than actual points to back it.

Procedural errors that are intentionally designed to misrepresent someone are something that should be called out and bashed. This is exactly what happens in politics, if you didn't know (and you clearly don't seem to).

It's common sense. They demoed on a two-year-old chip, the A12, in a device that is already available to some people (so they do exist). They already have the A13, which is faster. They're shifting nodes to 5nm, and the new Macs will probably be based on the A14. At minimum, in the worst case, there will be a 25% increase in performance, making it more capable than what they demoed and showed to the world. And what they showed to the world is not something a similar-grade Intel processor can do.

So I'm sorry if you both don't get the concept of roadmaps and future lineups. Do you, for any reasonable reason, think Intel, AMD and Nvidia are going to release new graphics cards and processors that perform worse than the previous ones? I don't think so. The same concept applies here, except since it's Apple, Mr. Moose has got to grasp at straws to counter every point I make.


35 minutes ago, RedRound2 said:

Procedural errors that are intentionally designed to misrepresent someone are something that should be called out and bashed. This is exactly what happens in politics, if you didn't know (and you clearly don't seem to).

It's common sense. They demoed on a two-year-old chip, the A12, in a device that is already available to some people (so they do exist). They already have the A13, which is faster. They're shifting nodes to 5nm, and the new Macs will probably be based on the A14. At minimum, in the worst case, there will be a 25% increase in performance, making it more capable than what they demoed and showed to the world. And what they showed to the world is not something a similar-grade Intel processor can do.

So I'm sorry if you both don't get the concept of roadmaps and future lineups. Do you, for any reasonable reason, think Intel, AMD and Nvidia are going to release new graphics cards and processors that perform worse than the previous ones? I don't think so. The same concept applies here, except since it's Apple, Mr. Moose has got to grasp at straws to counter every point I make.

What intent? You seem to be missing the substance of the argument entirely. This is not a competitive debate; procedural stuff is worthless. This is the point I've been making here. I could, for example, accuse you of deliberately avoiding the facts of the matter, but I don't actually know what your intent is, and "deliberate" is silly. You could have pointed out that @mr moose did not address your concern, but instead you make an accusation as if that action proves your point and weakens his.

Re: politics

It does now, I suppose. It didn't used to as much; or maybe it did and the techniques were merely less ham-handed. You do seem to be taking many pages from the Rush Limbaugh school of competitive argument, and he mostly argues politics.
 

 This isn’t politics.  

 

There is objective reality and perceptual reality. Warping a perception, which is mostly what your attempt seems to be about, won't DO anything in this case. Warping a perception away from objective reality is in some ways merely a form of lying if the state of objective reality is known, or delusion if it's not. What kind of CPU power the new Macs will have isn't subjective; it's objective. You can have faith in Apple as some have faith in other things, but it's not functional in this case. Objectively, the performance of the Apple design is unknown. It might possibly be more powerful. Somewhat more likely it won't be, if Apple is unable to come up with a new way to do this, simply because no one else so far seems to have done it. I hold out the possibility; breakthroughs happen, and I even hope for it. Apple is going to have to work awfully hard to do that, though, I think, and even then it might not be achieved, simply because it hasn't been by others in the past.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


21 minutes ago, RedRound2 said:

 

So I'm sorry if you both don't get the concept of roadmaps and future lineups. Do you, for any reasonable reason, think Intel, AMD and Nvidia are going to release new graphics cards and processors that perform worse than the previous ones? I don't think so. The same concept applies here, except since it's Apple, Mr. Moose has got to grasp at straws to counter every point I make.

None of what you are saying actually addresses what I said or why I said it. The premise is simple: you can't evaluate a product that doesn't yet exist. Therefore, telling us that people are eating their words, when you have nothing to prove their words were wrong, is conjecture at best.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


11 hours ago, Bombastinator said:

What intent? You seem to be missing the substance of the argument entirely. This is not a competitive debate; procedural stuff is worthless. This is the point I've been making here. I could, for example, accuse you of deliberately avoiding the facts of the matter, but I don't actually know what your intent is, and "deliberate" is silly. You could have pointed out that @mr moose did not address your concern, but instead you make an accusation as if that action proves your point and weakens his.

Which point am I ignoring? If you notice, most of the time I quote everything somebody says (if I don't, it's probably something that doesn't require a reply). I don't take things out of context or cherry-pick certain lines. Please take a look at how others reply to me; Mr. Moose deliberately cherry-picks my lines out of context.

I accuse him because this isn't the first time he has done it. I don't call out everybody, just the ones who intentionally take things out of context or deliberately ignore something I keep saying. You don't have to play the good Samaritan here and inject yourself into debates I'm having. I've told you this before: either have something substantial to contribute here, or just move on and find better things to do.

 

Quote

 This isn’t politics.  

It isn't, but the same techniques are being used; whether intentionally or not, I don't know.

Quote

There is objective reality and perceptual reality. Warping a perception, which is mostly what your attempt seems to be about, won't DO anything in this case. Warping a perception away from objective reality is in some ways merely a form of lying if the state of objective reality is known, or delusion if it's not. What kind of CPU power the new Macs will have isn't subjective; it's objective. You can have faith in Apple as some have faith in other things, but it's not functional in this case. Objectively, the performance of the Apple design is unknown. It might possibly be more powerful. Somewhat more likely it won't be, if Apple is unable to come up with a new way to do this, simply because no one else so far seems to have done it. I hold out the possibility; breakthroughs happen, and I even hope for it. Apple is going to have to work awfully hard to do that, though, I think, and even then it might not be achieved, simply because it hasn't been by others in the past.

The first few lines are you going full nerd on a hypothetical problem that you somehow created in your head, so I really don't have anything to say, other than that you're reading way too much into everything, and into my dynamic with some people here.

Performance is objective. And you keep ignoring that they showed us the performance of a two-year-old chip, and some people actually have those devices. Any logical conclusion would say they'll get at minimum a 25% increase in performance with the A14 at the same TDP; it's basic math. You can keep talking about warping perceptions and try to divert attention by bringing up useless theories, but that's a simple fact, observation and rule of life. If you're still not convinced, just keep quiet and wait until the end of the year. You can quote me if the performance is worse than what was demoed (common-sense exceptions apply, like applications running under Rosetta).

And as for "no one else has been able to do it, hence Apple can't": that's an incorrect conclusion, because no invention would ever have occurred if that were the case. Apple has a more than good track record of developing their own chips and making breakthroughs.

10 hours ago, mr moose said:

None of what you are saying actually addresses what I said or why I said it. The premise is simple: you can't evaluate a product that doesn't yet exist. Therefore, telling us that people are eating their words, when you have nothing to prove their words were wrong, is conjecture at best.

See the paragraph above. Well, it does exist for a limited number of people; that's an irrefutable fact.


Re: points

A quote without a reply to the point would qualify. I forget the specifics. Quoting someone doesn't help with that. I'm vaguely recalling statements on the order of "this is like that," but there was no reply as to whether it wasn't like that, just a complaint about another part of the statement that had nothing to do with its body. That implies disagreement without actually addressing it.

Re: not the first time

Other times could be just as assumptive. Not agreeing with something is not ignoring it. It came out of nowhere.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


4 hours ago, RedRound2 said:

See the paragraph above. Well, it does exist for a limited number of people; that's an irrefutable fact.

It does not exist for you, for people commenting on this forum, for most of the media, nor for me, and it certainly doesn't exist in an end product yet. So there is nothing to claim about its performance, relevance or competitiveness yet.

Also, I do not cherry-pick. You made an absolute statement that I questioned/corrected.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


On 7/13/2020 at 11:23 AM, RedRound2 said:

"ARM has reputation for lack of horsepower". Except nobody like Apple ever needed to make some high powered device, or even try since x86 was so dominant. Power efficiency has always been the number one priority in mobile devices. And obviously that is a huge limiting factor in performance wise. Now they're shifting and they don't have the same restrictions anymore. And based on performance and optimizability of a 16W (theoretical peak) of the A12Z chip, it's safe to assume that a 25W or 60W chip can go much further than what was displayed. And what was displayed with 16W chip was impressive and I doubt a 10W Intel chip can run three 4k streams in final cut like it was nothing

But there are really high powerful ARM CPUs available, you can rent one from AWS right now. They're usually slower than their x86 competitors in most tasks, and are able to match them in others. Here are some benchmarks, where an equivalent Epyc is 50% faster.

 

Quote

Sure, show me a 16W Intel chip (assuming all cores on the A12Z were pinned at 100%, which is highly unlikely, so realistically the 6-8W, if not 4W, peak that the iPad runs at) that can run three streams of 4K video simultaneously in Final Cut or Premiere Pro. Or an AAA title like Tomb Raider running at a similar TDP, after accounting for at least a 30-40% loss in performance from translation.

You're comparing apples to oranges. Most devices nowadays have dedicated hardware for video decoding that barely sips any power, so the CPU is barely used; heterogeneous computing isn't something exclusive to Apple. Where they have the upper hand is with more purpose-specific devices, such as their neural IPs and cards like the Afterburner (a huge FPGA meant solely to decode media).
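As a rough, untested sketch of what "dedicated hardware for video decoding" looks like from the software side (this is just my own illustration in C against Apple's VideoToolbox framework, not anything from Apple's announcement or this thread), you can literally ask the OS whether a codec has a hardware decoder before the CPU cores ever get involved:

// Rough sketch, assuming macOS 10.13+ with the VideoToolbox framework available.
// Build with something like:
//   clang check_hwdecode.c -framework VideoToolbox -framework CoreMedia -framework CoreFoundation
#include <stdio.h>
#include <VideoToolbox/VideoToolbox.h>

int main(void) {
    // VTIsHardwareDecodeSupported() reports whether the system has a dedicated
    // hardware decoder for the given codec, i.e. decode work that does not have
    // to run on the general-purpose CPU cores.
    Boolean hevc = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC);
    Boolean h264 = VTIsHardwareDecodeSupported(kCMVideoCodecType_H264);
    printf("HEVC hardware decode: %s\n", hevc ? "yes" : "no");
    printf("H.264 hardware decode: %s\n", h264 ? "yes" : "no");
    return 0;
}

If that returns yes, playback of that format is mostly handled by the fixed-function block, which is why "three 4K streams" isn't automatically a CPU benchmark.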

 

Quote

You're talking about the processors as a whole. Intel is mostly CISC, but uses RISC techniques in small segments where it makes sense. Back in the day, CISC was much easier to program for and to scale performance-wise, and power consumption in the early days was never too big a deal because these were desktops. RISC has historically been more difficult due to the complexity of programming for it.

But things have changed now. We can't go on much longer with the same formula of shrinking transistors to increase performance. Now the potential gains lie in micro-optimizing the processors at a fundamental level, and with RISC it's much easier to do that, due to the more granular control. I don't know if this is a valid comparison, but it's like Python and C++: Python is much easier, and hence easier to develop complicated programs like ML and AI in, but an equivalent C++ program can have much faster execution times, albeit being very complicated to write.

Except that it's the other way around. CISC is way more complex, since it packs tons of functionality behind a single instruction (good luck doing a VPMASKMOVQ on an ARM CPU), while RISC CPUs try to be much simpler, but in the end you have to write way more code because of that simplicity. Also, ARM is getting more complex by the day; I wouldn't dare call it a RISC architecture anymore, and Digikey even has a nice article about that.
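To make that VPMASKMOVQ point concrete, here's a small hedged sketch in C (my own illustration, not code from any vendor): on x86 with AVX2, the whole masked store is reached through the _mm256_maskstore_epi64 intrinsic and ends up as that single instruction, while a target without an equivalent has to spell the same semantics out as an explicit loop.

#include <stddef.h>
#include <stdint.h>
#if defined(__AVX2__)
#include <immintrin.h>
#endif

/* Store src[i] into dst[i] for each of 4 lanes whose 64-bit mask element
 * has its sign bit set (the semantics of VPMASKMOVQ). */
void masked_store_4x64(int64_t *dst, const int64_t *src, const int64_t *mask)
{
#if defined(__AVX2__)
    /* x86 with AVX2: the store maps to a single VPMASKMOVQ. */
    __m256i m = _mm256_loadu_si256((const __m256i *)mask);
    __m256i v = _mm256_loadu_si256((const __m256i *)src);
    _mm256_maskstore_epi64((long long *)dst, m, v);
#else
    /* Generic / RISC-style fallback: the same work becomes explicit
     * loads, compares, branches and stores. */
    for (size_t i = 0; i < 4; i++) {
        if (mask[i] < 0) {   /* sign bit set */
            dst[i] = src[i];
        }
    }
#endif
}

Either way the observable result is identical; the difference is just how much work one opcode is allowed to carry, which is the trade-off being described above.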

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


13 hours ago, igormp said:

But there are really high-powered ARM CPUs available; you can rent one from AWS right now. They're usually slower than their x86 competitors in most tasks, and able to match them in others. Here are some benchmarks, where an equivalent Epyc is 50% faster.

I am aware of ARM server processors, but they're designed with a specific use case in mind and not general desktop computing. Obviously that's what Apple is targeting here, and it's uncharted territory in the modern age, if you don't count the Qualcomm 8CX (which is a joke).

Quote

You're comparing apples to oranges. Most devices nowadays have dedicated hardware for video decoding that barely sips any power, so the CPU is barely used; heterogeneous computing isn't something exclusive to Apple. Where they have the upper hand is with more purpose-specific devices, such as their neural IPs and cards like the Afterburner (a huge FPGA meant solely to decode media).

Isn't that for video playback? There is usually a lot of CPU usage when you scrub through timelines.

I never said heterogeneous computing is exclusive to Apple. Rather, Apple Silicon itself can encompass a wide variety of different parts (for lack of a better term), like the Neural Engine and blocks specific to Metal, allowing them to make OS- and device-specific optimizations.

Quote

Except that it's the other way around. CISC is way more complex, since it packs tons of functionality behind a single instruction (good luck doing a VPMASKMOVQ on an ARM CPU), while RISC CPUs try to be much simpler, but in the end you have to write way more code because of that simplicity. Also, ARM is getting more complex by the day; I wouldn't dare call it a RISC architecture anymore, and Digikey even has a nice article about that.

CISC is more complex; it's there in the name.

I think you didn't understand what I said. Please read it again.

RISC is more complicated to program for due to its very granular control, while CISC has usually had tonnes of functionality packed into a single instruction, as you said. But with ARM, they can further optimize and drop the unnecessary computation that inevitably happens in x86. That's where I believe the huge efficiency gains of a RISC-based architecture lie. And it turns out, so far at least, that you can squeeze performance out of a RISC-based architecture comparable to current x86-64 processors. The iPad Pro's A12Z processor is already more powerful than the lower-end Intel processors, and they seem to have a clear roadmap to match the competition in a two-year timeframe. Let's just wait and see what happens.

