
Apple Siri powered by ReALM LLM

11 minutes ago, Sauron said:

It's the same source so... why would you assume that? Are you just so invested in iphones having a better battery life that you'll ignore data to the contrary?

Same source, different graph with no mention of the test methodology whatsoever. Could be all kinds of testing scenarios, including those that do not reflect typical use cases, which would make one or both of the graphs meaningless.

13 minutes ago, Sauron said:

...no it's not...? 1800 minutes is 30 hours...

Yeah sorry, stupid division error on my end.

14 minutes ago, Sauron said:

Yeah, it is just your personal anecdotal impression, so I'll just ignore it. Maybe you don't do that but a lot of people, myself included, keep their phones under charge while at their desk even when they're not low on battery - because why not?

Because it makes no flippin sense at all, if your phone easily lasts 24h+, to go through the trouble of keeping a charger or even a battery bank at your desk, plugging/unplugging every time you leave or arrive, wearing out connectors, cables, and so on. No one in their right mind would charge their phone when it actually lasts 2x 12h of typical usage.

16 minutes ago, Sauron said:

Not to mention, it's not like all android phones have equally amazing battery life, and I never argued as much. The android market is vast and diverse, offering models across all price ranges and of varying quality. All I'm saying is that compared to some of the most popular competitors, iphone battery life is not especially impressive and is really bad in some models.

You showed me a graph over a rather large range of Android phones from all kinds of manufacturers that all claim excellent battery life.

17 minutes ago, Sauron said:

I'm saying your usage of the word "engineering" is interchangeable with magic, because you throw it out as a thought-terminating cliché. You can't just say "engineering" will solve a problem without explaining how it might do that.

Excuse me for not putting several decades of SW/HW co-design and custom AI accelerator design into a forum post. The former is part of my job, and you can either trust me that one can achieve energy-efficiency improvements of 10x or more for a given task, or read some academic papers on the matter, or simply stay ignorant and claim that good engineering only changes the outcome a little and all that matters is slapping a brick-style battery in there.

What you cannot do, however, is claim that I somehow said anything about this whole matter was/is magic.


1 minute ago, Dracarris said:

Excuse me for not putting several decades of SW/HW co-design and custom AI accelerator design into a forum post. The former is part of my job, and you can either trust me that one can achieve energy-efficiency improvements of 10x or more for a given task, or read some academic papers on the matter, or simply stay ignorant and claim that good engineering only changes the outcome a little and all that matters is slapping a brick-style battery in there.

If anyone here is putting words in people's mouths it's you - I never said AI accelerators don't or can't exist, or that they aren't faster than non custom hardware. Your argument from the start, however, was that Apple "will just figure it out" because they have "a track record":

23 hours ago, Dracarris said:

I think Apples track record shows that we can rely on them figuring out the energy efficiency part just fine.

so excuse me if I don't take your backtracking from this position particularly seriously.

How about you show me any indication that a purpose-built NN accelerator can run a network like the one in the article at a sufficient speed for interactive tasks while using an amount of energy that is reasonable for a phone? If you have expertise in the field, maybe you could have led with that, rather than handwaving away the concern with a "they'll figure it out because Apple".


1 hour ago, Sauron said:

If anyone here is putting words in people's mouths it's you - I never said AI accelerators don't or can't exist, or that they aren't faster than non custom hardware. Your argument from the start, however, was that Apple "will just figure it out" because they have "a track record":

You said that they won't make a big enough difference to keep battery life from being significantly hampered, which is (probably) wrong.

And my argument from the start remains very much true. Apple has demonstrated over and over again that they are capable of designing energy-efficient custom hardware and making good use of it with their SW/OS stack. A graph that - based on whatever test - shows varying battery life across different iPhone generations does not disprove that argument in any way whatsoever.

1 hour ago, Sauron said:

so excuse me if I don't take your backtracking

I am not, not in the slightest.

1 hour ago, Sauron said:

How about you show me any indication that a purpose built NN accelerator can run a network like the one in the article at a sufficient speed for interactive tasks while using an amount of energy that is reasonable for a phone? If you have expertise in the field maybe you could have lead with that, rather than handwaving away the concern with a "they'll figure it out because Apple"

I am not an expert when it comes to the design of NN accelerators. So I suggest you wait until the next WWDC or September event to have your mind blown by Tim Apple presenting what, in your words, would be "magic".


6 hours ago, Sauron said:

...if there's retraining involved it can be done asynchronously but the meat of the operation would still be the inference of a response, which you can't get from just cutting down the model (if we knew which specific parts of the network are needed to answer specific types of question this would be a whole lot easier...).

To be fair I'm not specialized in neural networks so maybe it's possible and I just don't know about it.

That's correct.

5 hours ago, Sauron said:

The question here is whether running an LLM like this won't draw significantly more power (on the same device of course) compared to a more constrained, but arguably more than adequate, assistant chatbot.

Also correct; however, keep in mind that NPUs use significantly less power. So probably not much of an increase in battery usage (though the consumption would be measurable).
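To put "measurable but not much" in perspective, here's a back-of-envelope calculation. All the numbers are assumptions for illustration (NPU draw, inference time, query count, battery capacity), not measurements of any actual iPhone:

```python
# Back-of-envelope estimate of on-device LLM battery cost.
# Every constant below is an assumption, not a measured value.

NPU_POWER_W = 2.0        # assumed NPU draw during inference
SECONDS_PER_QUERY = 3.0  # assumed inference time per Siri query
QUERIES_PER_DAY = 50     # assumed daily query count
BATTERY_WH = 13.0        # roughly an iPhone-class battery capacity

# Energy per day in watt-hours: W * s * queries / 3600 s/h
energy_wh = NPU_POWER_W * SECONDS_PER_QUERY * QUERIES_PER_DAY / 3600
share = energy_wh / BATTERY_WH

print(f"{energy_wh:.3f} Wh/day, {share:.2%} of the battery")
```

Under these assumptions the daily cost lands well under 1% of the battery, which is measurable but in the noise compared to the display or radios.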

The LLM behind Siri would be small enough to run on the phone yet capable enough to perform local tasks. Think of Siri being the same, but with common sense.

It's probably trained so that if it can't answer your question appropriately based on locally available data, it will just forward it to a much larger GPT in the cloud. To the user, it will seem like one seamless operation regardless of the bifurcation between local and cloud-based LLMs.
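That local-first, cloud-fallback routing can be sketched in a few lines. This is purely illustrative: the function names, the confidence threshold, and both "models" (stubs here) are hypothetical, not Apple's actual API or architecture:

```python
# Hypothetical sketch of local-first LLM routing: a small on-device model
# answers when it is confident, otherwise the query is forwarded to a
# larger cloud-hosted model. Both models are stubs for illustration.

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff for trusting the local answer

def local_model(query: str) -> tuple[str, float]:
    """Stub on-device model: returns (answer, confidence)."""
    if "timer" in query:  # a task answerable from on-device context
        return ("Timer set for 10 minutes.", 0.95)
    return ("", 0.1)      # low confidence -> escalate to the cloud

def cloud_model(query: str) -> str:
    """Stub for the much larger cloud-hosted LLM."""
    return f"[cloud answer for: {query}]"

def answer(query: str) -> str:
    """Seamless to the user: try local first, fall back to the cloud."""
    reply, confidence = local_model(query)
    if confidence >= CONFIDENCE_THRESHOLD:
        return reply
    return cloud_model(query)

print(answer("set a timer for 10 minutes"))     # handled on-device
print(answer("summarize the history of Rome"))  # forwarded to the cloud
```

The design choice worth noting: the router never asks the user which path to take; the local model's own confidence score decides, which is what makes the bifurcation invisible.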

