Apple M1 = the rest of us are living in the stone age!?

3 minutes ago, Lord Vile said:

What do you mean by price increase? Intel Macs cost about the same as building your own would; I think that's fairly well documented.

As in, upgrading the RAM is a 200-dollar increase. That's a no from me, although I love Apple.


4 minutes ago, Ankh Tech said:

As in, upgrading the RAM is a 200-dollar increase. That's a no from me, although I love Apple.

Yeah, obviously the upgrades are stupidly expensive, but RAM upgrades regularly go for north of £100 on laptops, so it's not an Apple-only thing.



1 minute ago, Lord Vile said:

Yeah, obviously the upgrades are stupidly expensive, but RAM upgrades regularly go for north of £100 on laptops, so it's not an Apple-only thing.

Yes, but getting RAM on a PC is so easy, and much cheaper. Also, you can run macOS on a PC, but with M1 there's no Windows yet, so early adopters have to wait.


7 minutes ago, Ankh Tech said:

Yes, but getting RAM on a PC is so easy, and much cheaper. Also, you can run macOS on a PC, but with M1 there's no Windows yet, so early adopters have to wait.

You can't run macOS legally on a PC, and you can't run it on just any PC. Also, there's a difference between an ultrabook, an SFF PC, and a desktop. Funny thing: you can upgrade the RAM on the iMac yourself, same with the Mac Pro. With the iMac it's actually easier to upgrade the RAM than it is in a traditional PC case.

 

Why would you buy an M1 Mac to run Windows on it... They literally still sell the Intel models if you want that.



 

2 hours ago, Den-Fi said:

Yes, but the summary showed a complete lack of understanding.

Part of this is the fact that you haven't touched the stuff.

You're applying super loose PC logic to a platform that's even further beyond your scope of experience.

The only thing you're accomplishing here is letting everyone know who to completely ignore.

I think you're being overly hyperbolic without enlightening us one bit. Would you care to dispel at least a single misunderstanding?

As far as I can see:

 

3 hours ago, Ankh Tech said:

ARM in general has very good performance per watt; this is not new at all

This is undeniably correct.

 

3 hours ago, Ankh Tech said:

M1 knows how to utilise it.

This I have no clue what it means.

 

3 hours ago, Ankh Tech said:

M1 has great single-core performance, but this will be irrelevant soon, as almost all tasks are getting optimised for more cores and multi-core

This is just made up.

 

3 hours ago, Ankh Tech said:

4 cores can't be used for any productivity

A 1050 ti can't be used for productivity

I can't tell whether you mean the M1's 4 cores can't be used, or that quad-cores in general can't, but it doesn't matter, because more generally: the way "productivity" gets used, from YouTubers to forum members, is so vague that it's impossible to say whether something "can be used" for it or not.

 

3 hours ago, Ankh Tech said:

M1 emulation is very bad, and will not be as good as x86 native anytime soon

No clue. Is this what's entirely wrong?

 

3 hours ago, Ankh Tech said:

It has an excellent accelerator and scaler

Also no clue. It sounds like he's talking about the dedicated hardware?

 

3 hours ago, Ankh Tech said:

We need to wait for a M1 refresh for good productivity.

See above 🤷‍♂️

 

3 hours ago, Ankh Tech said:

One last thing

 

Can it run Crysis?

Can it?

 

 

Overall, there aren't that many points with an actual meaning, so it shouldn't be too hard to set the record straight on the subset that is wrong.


1 minute ago, SpaceGhostC2C said:

 

I think you're being overly hyperbolic without enlightening us one bit. Would you care to dispel at least a single misunderstanding? [...]

As I said, this is a summary of what I understood; I'm looking for clarification.


3 minutes ago, SpaceGhostC2C said:

 

I think you're being overly hyperbolic without enlightening us one bit. Would you care to dispel at least a single misunderstanding?

Yes, yes I can:

 

[screenshot attachment]


Every time I hear or read the word "productivity", I think of office suites first... So yes, both quad-cores and the 1050 Ti can run productivity applications. Come to think of it, you don't even need a dGPU at all for such usage; iGPUs will do just fine.


7 hours ago, Blade of Grass said:

I would say this is distinctly false. Lots of applications run as a single process, and lots of applications are faster without multi-processing. Only embarrassingly parallel tasks are able to benefit significantly from multi-core architectures. Multi-core isn't some magic that speeds up applications; any task that requires synchronization (think anything that requires state) can be significantly impacted by multiple processes contending over locks. IPC can also be quite expensive for these tasks.

 

Amdahl's law also provides a good ceiling on how optimized tasks can theoretically become, and that's before the minutiae of practical implementation.

You would be wrong, and that's a bad assumption: a single process doesn't equal single-core performance. That's a misunderstanding, as a single process can spawn multiple threads running in parallel on different cores.

Suggest looking at https://www.backblaze.com/blog/whats-the-diff-programs-processes-and-threads/

 

All the test measures is the performance of one thread, but no modern program runs only one thread or process at a time. The problem is more that a program isn't properly optimized and doesn't spawn enough threads to take advantage of the available resources.

 

If you're curious, you can go into Windows, open Task Manager, and add a column to see the number of threads for each of your processes. You will find many processes running more than one thread.
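
If you want to see it in code too, here's a minimal Python sketch (my own toy example, nothing from any benchmark): one process, several threads, all reporting the same PID, which is exactly what Task Manager's thread column is counting.

import os
import threading
import time

def worker(n):
    # Every thread reports the same PID: one process, many threads.
    print(f"thread {n}: pid={os.getpid()}, tid={threading.get_ident()}")
    time.sleep(0.1)  # keep the thread alive long enough to be counted below

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()

# Main thread + 4 workers: the same count Task Manager's "Threads" column shows.
print(f"live threads in this process: {threading.active_count()}")

for t in threads:
    t.join()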

 


6 hours ago, tech.guru said:

You would be wrong, and that's a bad assumption: a single process doesn't equal single-thread performance. That's a misunderstanding, as a single process can spawn multiple threads running in parallel on different cores.

Suggest looking at https://www.backblaze.com/blog/whats-the-diff-programs-processes-and-threads/

 

All the test measures is the performance of one thread, but no modern process runs only one thread; having fewer cores leads to queuing and delays for many processing tasks.

 

Now, if you look at AMD, they have improved how threads talk to one another through a unified cache, reducing latency and the performance penalty when threads on different cores need to communicate with each other.

Eh, I was borrowing terms from Python, which I'm sure created some confusion. Let me rephrase it, then:

Lots of applications (one process) run with a single kernel thread and use user-space threads to achieve concurrency. The user-space threads all share time-slices of the kernel threads their process has. If a process has multiple kernel threads, it can achieve parallelism by scheduling its user-space threads across those kernel threads.
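
(In Python terms, loosely and just as an illustration: asyncio tasks behave like user-space threads multiplexed onto a single kernel thread, while threading.Thread gives you actual kernel threads.)

import asyncio
import threading

async def task(n):
    # Cooperative concurrency: every task runs on the same kernel thread,
    # taking turns at each await point.
    await asyncio.sleep(0)
    print(f"task {n} on kernel thread {threading.get_ident()}")

async def main():
    await asyncio.gather(*(task(i) for i in range(3)))

asyncio.run(main())  # concurrency without parallelism: one kernel thread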

Regardless, many applications actually slow down when running across different kernel threads, because the user-space threads have to contend over locks. Passing data between threads requires synchronization, so it is always going to be slower when threads are homed on different kernel threads, because there's no way for Thread A to yield time to Thread B to get it the data it needs.
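
To make that concrete, here's a toy Python sketch (my own example, not the vote program below; Python's GIL adds its own serialization on top, but the lock-contention effect is the same idea). The same total work, split across threads sharing one lock, comes out slower than just doing it on one thread:

import threading
import time

N = 1_000_000
lock = threading.Lock()
counter = 0

def contended(iterations):
    global counter
    for _ in range(iterations):
        with lock:  # every increment fights over the same lock
            counter += 1

# Single-threaded baseline: all N increments, no contention.
start = time.perf_counter()
contended(N)
single = time.perf_counter() - start

# Same total work split across 4 threads sharing one lock.
counter = 0
threads = [threading.Thread(target=contended, args=(N // 4,)) for _ in range(4)]
start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
parallel = time.perf_counter() - start

print(f"1 thread:  {single:.2f}s")
print(f"4 threads: {parallel:.2f}s  (typically slower, not faster)")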

 

There is always going to be buffering/queueing in a concurrent system, as it's impossible to have perfect flows of data (i.e. producers and consumers being perfectly synchronized and operating at the same rate). In fact, buffers are almost always basically full or almost empty; doesn't that seem inefficient? ;)¹
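
You can watch this with a bounded queue (again, just an illustrative sketch): make the consumer slower than the producer and the buffer pins near full; flip the rates and it pins near empty.

import queue
import threading
import time

buf = queue.Queue(maxsize=10)

def producer():
    for i in range(50):
        buf.put(i)         # blocks whenever the buffer is full
        time.sleep(0.001)  # fast producer...

def consumer():
    for _ in range(50):
        buf.get()          # blocks whenever the buffer is empty
        time.sleep(0.005)  # ...slow consumer: the buffer sits near full

p = threading.Thread(target=producer)
c = threading.Thread(target=consumer)
p.start(); c.start()
for _ in range(5):
    time.sleep(0.05)
    print(f"buffer occupancy: {buf.qsize()}/10")
p.join(); c.join()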

 

One example of an application that suffers when run in parallel is a vote-consensus algorithm for planning tour groups: you have N visitors, you're trying to form G groups, and each visitor gets to vote V times. Every G votes you form a group with the voters, determine what kind of tour they want to go on, then start the tour. You run this until you run out of visitors or they run out of votes.

 

Execution time of the algorithm using internal scheduling², where each visitor is a thread, is as follows (note: the number of processors here is the number of kernel threads used, i.e. the level of parallelism):

vote N G V seed [ processors | default = 1 ]
/usr/bin/time -f "%Uu %Ss %Er %Mkb" ./vote 100 10 10000 1003
1.58u 0.04s 0:01.62r 32972kb
/usr/bin/time -f "%Uu %Ss %Er %Mkb" ./vote 100 10 10000 1003 2
50.13u 0.09s 0:25.08r 33048kb

Double the parallelism, 31x the user time, 2x the kernel time, and 15x the real time. 

Why is this? Mainly lock contention and IPC cost, but also extra scheduling cost and kernel context-switching³ cost. When run on a single processor there is very, very little contention, but when running across kernel threads this is not true.

 

Hopefully this helps explain concurrency and parallelism more. Going back to the original point though: not every application benefits from running in a parallel fashion, which is why you'll see a lot of things like games not benefiting beyond half a dozen cores. Again, see Amdahl's law. Tasks that are embarrassingly parallel exhibit the most significant speedups when run in parallel because they don't need to share data and contend over locks.
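
Amdahl's law is also easy to sanity-check numerically. With p the parallelizable fraction of the work and n the number of cores:

def amdahl_speedup(p: float, n: int) -> float:
    # Upper bound on speedup when a fraction p of the work runs across n cores.
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, 8 cores give under 6x,
# and the ceiling as n grows is only 1 / (1 - p) = 20x.
for n in (2, 4, 8, 16):
    print(f"n={n:2d}: {amdahl_speedup(0.95, n):.2f}x")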

 

Footnotes:

  1. Only really true if the production and consumption times are not similar or have variance, which is common
  2. Generous here, internal scheduling is one of the faster ways of doing it
  3. This can actually be quite expensive too, but I would wager lock contention is a bigger cost

 

Edit: also just want to point out, this code written without threading is an order of magnitude faster. Threads are slow. 

Edited by Blade of Grass

15" MBP TB

AMD 5800X | Gigabyte Aorus Master | EVGA 2060 KO Ultra | Define 7 || Blade Server: Intel 3570k | GD65 | Corsair C70 | 13TB


On 11/27/2020 at 1:07 AM, gal-m said:

Anyways, what do you guys think?

The Stone Ages would be welcome here, because it would mean no spying on users from Apple, Microsoft, Google, and countless others. They don't exist there.

 

Or we can use older devices and not just imagine personal computing gadgets being personal again: we'd be using them instead.

https://sneak.berlin/20201112/your-computer-isnt-yours/

As far as I'm concerned, the value of older machines (20+ years old) has just gone up substantially for anyone who values privacy and is less keen on today's ever-pervasive software and hardware backdoors.

 

Bring on the Stone Ages!

 

 

 


4 minutes ago, unsorted said:

The Stone Ages would be welcome here, because it would mean no spying on users from Apple, Microsoft, Google, and countless others. [...]

 

 

 

You can do that now by switching to FOSS OSes and applications, in addition to being careful with the online services you use.


On 11/26/2020 at 3:13 PM, Ankh Tech said:

No matter what, ARM will never replace x86. ARM is optimised for low-end, efficient systems; high end will never be achieved by ARM.

OK, fanboy. Never mind that the M1 outperforms Core i9 MacBook Pros, right? And trades blows with iMacs that have desktop Intel CPUs in them?



1 minute ago, Ashley xD said:

OK, fanboy. Never mind that the M1 outperforms Core i9 MacBook Pros, right? And trades blows with iMacs that have desktop Intel CPUs in them?

OK, first, I love Apple; I'm sort of a fanboy. Second, if ARM is so good, why hasn't it already taken over x86? It's been around for years, slowly advancing. M1 is nothing new; it's just the way Apple hyped it up that made it seem like it can beat a 5950X.


2 minutes ago, Ankh Tech said:

OK, first, I love Apple; I'm sort of a fanboy. Second, if ARM is so good, why hasn't it already taken over x86? It's been around for years, slowly advancing. M1 is nothing new; it's just the way Apple hyped it up that made it seem like it can beat a 5950X.

Nobody felt like they needed to make a high-performance ARM chip; there was no point. Now that x96 is lacking in Apple's view, they had the need to make one, so they did. ARM is capable of a lot.



2 minutes ago, Ashley xD said:

x96

Wait, is there an x96, or is that a typo?

 

Anyway, this isn't the first time Apple has changed architecture; this is the third, IIRC.

 

Too much hype for no apparent reason. Apple may go further than this and beat x86 if they stay away from big.LITTLE and make an 8- or 10-core big-core CPU. Anyway, that's my opinion; everybody has their own.


28 minutes ago, whm1974 said:

You can do that now by switching to FOSS OSes and applications, in addition to being careful with the online services you use.

True, but personally I don't want the Intel Management Engine or AMD PSP, among others. I'm not hopeful about what security researchers will find there.


19 hours ago, unsorted said:

True, but personally I don't want the Intel Management Engine or AMD PSP, among others. I'm not hopeful about what security researchers will find there.

You do know that while there have been vulnerabilities in the Intel Management Engine, a lot of its modules actually improve security?

Such as (see https://en.wikipedia.org/wiki/Intel_Management_Engine): secure boot and the platform trust features are two great examples, used for things like enabling disk encryption and ensuring malware hasn't tampered with the low-level boot process.

 

Some scenarios come to mind that these technologies help prevent:

  • A laptop gets lost, and without disk encryption its sensitive information is exposed.
  • Malware infects the bootloader as part of a ransomware attack and only allows the system to boot once payment is made.

On 11/29/2020 at 1:26 PM, Crazywizard said:

So you need a "normal" x86 laptop for school for anything related to computers.

Yep, Windows is definitely the go-to for school work. Though, with the new MacBook's performance and battery life, I think a lot of people would want to carry a nice, light, portable laptop for tasks that don't require Windows-based applications.


On 11/29/2020 at 2:01 PM, Uttamattamakin said:

If it doesn't run the software you need it doesn't matter how powerful it is.

Couldn't have said it better myself @Uttamattamakin!

Thanks for the reply :) 


Speaking of the Stone Age, are we gonna talk about these nice little perks afforded by the custom/console-like nature of M1 Macs?

 

 


4 hours ago, saltycaramel said:

Speaking of the Stone Age, are we gonna talk about these nice little perks afforded by the custom/console-like nature of M1 Macs?

 

 

What little perks of M1 Macs? And what is that about changing resolutions? All modern OSes can do that...


36 minutes ago, whm1974 said:

What little perks of M1 Macs? And what is that about changing resolutions? All modern OSes can do that...

Without flashing black, not even for an instant?


On 12/3/2020 at 2:02 AM, Ashley xD said:

Nobody felt like they needed to make a high-performance ARM chip; there was no point. Now that x96 is lacking in Apple's view, they had the need to make one, so they did. ARM is capable of a lot.

There has been a big push for ARM in supercomputers and servers for years, and while it's making some progress now, x86 is still more popular and in some cases even more efficient.

IMO much of the M1's advantage comes from being on TSMC 5nm, which AMD is also using for Zen 4.


This topic is now closed to further replies.