We Must Run Doom on Everything - TalkLinked #10

tl;dr: I didn't find a better place to post this, but the gist is: security is really hard, and while a phone is less likely to kill you than a tractor, it's still the security design that makes things the way they are. I'm sure the money/legal/retail departments aren't sad about it either, but so far there hasn't been a better way to do security.

Part of the security reasoning behind certain parts not being 'drop-in' the way external monitors used to be is that a screen isn't just a screen anymore. There are two issues at play:

 

- A phone is usually an extremely private device, one that you always have with you

- Instead of having "a computer" with "a screen" and "a camera", lots of low-level functionality is now distributed into the parts that used to be 'dumb'

 

The first point matters because the impact of getting your device pwned is a lot bigger when it's your "external brain"; the second matters because optimisation and specialisation are among the few ways to pack the features and integrations mobile devices have into such a small form factor. Putting a display controller (not to be confused with the GPU) inside a CPU or SoC makes little sense when the display controller needs to be specific to a screen.

 

Why a manufacturer chooses one method over the other is usually a matter of scale, level of vertical integration, and what they are actually trying to accomplish. If you want to build a cheap device, you'll grab as many off-the-shelf parts as possible where someone else has already done all the work. The downside is that you can't really distinguish yourself in the market. (Imagine trying to make a screwdriver that turns out exactly the same as all the other screwdrivers, because that was cheaper and easier than making sure your product is different or better.)

 

If you look at what you want to do with a device where you both advertise privacy and try to design privacy into the architecture of the hardware and software, you can't just go around using random input and output devices and 'hoping for the best'. Your camera actually needs to be really hard to fake, your biometric scanners really must not be swappable, your audio in and out really need to be switched out-of-band. Even a battery can be used to break into a device (well, not the battery itself, but the management portion, just as with the Sony PSP jailbreak).

 

Some of those can of course easily be cut off from the rest of the system with no real problem. A vibration motor or haptic engine can't do much in the way of faking your identity, stealing your data or vibrating you to death. Worst case, it either breaks and does nothing, or it uses more power and as a result drains the battery a bit faster or warms up the phone a bit more.

 

Biometrics, data input and display are a different story. A fingerprint scanner, for example, is something you can either centralise in the SoC (bad for separation of concerns and security) or equip with a trusted chip of its own, making sure it never transmits your fingerprint, only the 'yes, this is a known finger' result. And to make sure nobody can simply plug in a fake sensor that always says "yes, this is a known finger", you have to sign and possibly encrypt that message from the sensor to the SoC. If you don't, you simply cannot trust that the messages received from the fingerprint sensor are true. The same applies to USB, WiFi, Bluetooth, Face ID, cameras, etc.
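To make the "sign the verdict, never the fingerprint" idea concrete, here's a minimal sketch in Python. It's purely illustrative: real sensors use asymmetric keys provisioned in hardware, not a shared secret in software, and all names here (PAIRING_KEY, sensor_reply, soc_accepts) are made up for the example. The SoC sends a fresh challenge so a recorded "yes" can't be replayed.

```python
import hashlib
import hmac
import os

# Hypothetical secret shared between sensor and SoC at factory pairing
# time; real designs use asymmetric keys burned into silicon.
PAIRING_KEY = os.urandom(32)

def sensor_reply(challenge: bytes, match: bool, key: bytes = PAIRING_KEY):
    """The sensor never sends the fingerprint itself, only a yes/no
    verdict, authenticated against the SoC's fresh challenge."""
    verdict = b"match" if match else b"no-match"
    tag = hmac.new(key, challenge + verdict, hashlib.sha256).digest()
    return verdict, tag

def soc_accepts(challenge: bytes, verdict: bytes, tag: bytes,
                key: bytes = PAIRING_KEY) -> bool:
    """The SoC trusts the verdict only if the MAC checks out."""
    expected = hmac.new(key, challenge + verdict, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Genuine sensor: the signed verdict is accepted.
challenge = os.urandom(16)
verdict, tag = sensor_reply(challenge, match=True)
assert soc_accepts(challenge, verdict, tag)

# Fake sensor shouting "match" without the key: rejected.
assert not soc_accepts(challenge, b"match", b"\x00" * 32)
```

The fresh challenge is what makes a swapped-in sensor useless even if it captured an old, valid "match" message: the old tag won't verify against the new challenge.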

 

Some people might argue that the middle ground here is simply to let the user tell the phone "I know this sensor cannot be trusted, but I want to use it anyway". That's great, but how is the phone to know it actually was the user who asked for that? It cannot trust the sensor, so it cannot know whether it was the user, the creepy nephew, the boss at work, or the scary ninjas from a scary country. It makes the whole security/privacy story ambiguous, which to some manufacturers is either outright unacceptable, or simply bad PR or bad for their legal department (likely a bit of both). At the same time, you can't really do security well without a simple enforcement rule: "if it has this digital signature we trust it, otherwise we don't".

This is why Apple has a Secure Enclave and why the SSD controller is in the SoC and not on the DRAM board; it's why Microsoft has Pluton, Intel has PTT, AMD has PSP, and Google has Titan. Everything is essentially converging on the same conclusion; the main difference is how far along each company is in its capability to execute. Samsung, for example, is still trying to get Knox off the ground.

 

The great team at Asahi Linux (Linux for Apple chips) has a rather technical overview: https://github.com/AsahiLinux/docs/wiki/Introduction-to-Apple-Silicon but the important takeaway from that page is the table near the "Firmware overview" heading, which shows that instead of being "one big computer" the way traditional Intel PCs used to be, the chip contains many times the number of 'sub-computers' that even a 'modern' PC would have. There is of course the 'computer part', mainly the CPU cores, but the risky and performance-specialised parts run as separate sub-systems with their own software, and their firmware needs trusted signatures before the main computer will "listen" to them and trust what they communicate. This means that if someone compromises one of those parts, they can't actually do much harm, and they can be spotted much more easily. None of this works if digital trust cannot be established.
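The "signed firmware per sub-computer" idea can be sketched in a few lines: the boot chain holds a signed manifest of expected firmware hashes, and each sub-system's blob is only trusted if it matches. This is a toy model, not Apple's actual scheme: real systems sign the manifest with an asymmetric key whose public half lives in boot ROM, and the component names below are invented for the example.

```python
import hashlib
import hmac
import os

# Hypothetical vendor signing key; real boot chains use asymmetric
# signatures verified against a key burned into the boot ROM.
VENDOR_KEY = os.urandom(32)

def sign_manifest(firmwares: dict, key: bytes = VENDOR_KEY):
    """Build a manifest of firmware hashes and authenticate it."""
    manifest = {name: hashlib.sha256(blob).hexdigest()
                for name, blob in firmwares.items()}
    payload = "".join(f"{n}:{h};" for n, h in sorted(manifest.items())).encode()
    return manifest, hmac.new(key, payload, hashlib.sha256).digest()

def verify_subsystem(name: str, blob: bytes, manifest: dict, sig: bytes,
                     key: bytes = VENDOR_KEY) -> bool:
    """Trust a sub-computer's firmware only if the manifest is authentic
    AND the blob's hash matches the manifest entry."""
    payload = "".join(f"{n}:{h};" for n, h in sorted(manifest.items())).encode()
    if not hmac.compare_digest(sig, hmac.new(key, payload, hashlib.sha256).digest()):
        return False  # manifest itself was tampered with
    return manifest.get(name) == hashlib.sha256(blob).hexdigest()

firmwares = {"display-coproc": b"dcp firmware v1", "storage-ctrl": b"nvme fw v3"}
manifest, sig = sign_manifest(firmwares)
assert verify_subsystem("display-coproc", b"dcp firmware v1", manifest, sig)
assert not verify_subsystem("display-coproc", b"evil firmware", manifest, sig)
```

A compromised sub-system can't forge a new manifest entry without the signing key, which is exactly why a tampered part gets "spotted" at boot instead of silently trusted.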

 

That same GitHub wiki page has a section on the design goals, where somewhere in the middle they list how this actually works out for the user, and also the trade-offs you have to make with those design goals in mind.


As for how good this is for the world and for people, I don't know. Most people are busy with other topics, or scratch the surface, get tired and move on. Most security measures are to some degree at odds with various interpretations of freedom (ironically, being free from bad hosts and infected computers on the internet isn't one of them), but the distributed model seems to be the best bet so far. The downside is that you can't maintain it if you also allow overrides like "I found it in a dumpster, please unlock it": if someone pretends a parked car is a dumpster and walks off with a device, I'm pretty sure the owner wouldn't be pleased if the information on it suddenly became accessible to someone else. For recycling, however, the fix is simple: switch off Find My and the device can be wiped and used by anyone.

The other issues that were mentioned are (I believe) much easier to resolve, the Chromebook way: if a vendor/manufacturer thinks their way/shape/form is best, and whatever a user (or owner) wants to change would be at odds with that, the PR/marketing/brand-protection department probably shouldn't worry too much about it and should simply provide an 'open mode' switch or screw. On Chromebooks with coreboot firmware, there is usually a switch, PCB pad or shorting screw somewhere you can use to make the device delete its secret keys and start in developer (or 'open') mode. At that point Google no longer marks your device as trustworthy (which makes sense: without secure boot there is no telling what the actual state of the system is), but you can still use it however you please. The downside is that if you need services that only accept trusted devices, you either have to use a different device or, in some cases, switch back into 'trusted' mode (which again can wipe everything). This too is a choice, a balance: how can you be sure a device that was once untrusted is suddenly trusted again, and at what price? (A cost based on resources spent verifying this, or accounting for possible future losses, etc.) Apple probably won't do that with mobile devices, but they are doing exactly that on the computers.
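The developer-mode trade-off boils down to a simple invariant: 'open' and 'trusted' can never both be true, because entering open mode destroys the secret keys that attestation depends on. A toy model (all names illustrative, not any vendor's actual firmware logic):

```python
class DevModeSwitch:
    """Toy model of the Chromebook-style trade-off: flipping to
    developer mode erases the device's secret keys, so the device
    can no longer attest as trusted. Purely illustrative."""

    def __init__(self):
        # Keys a real device would hold in a secure element.
        self.secret_keys = {"attestation": "key1", "storage": "key2"}
        self.developer_mode = False

    def enter_developer_mode(self):
        """The shorting screw / switch: keys are gone for good."""
        self.secret_keys.clear()
        self.developer_mode = True

    @property
    def trusted(self) -> bool:
        # Only a device that still holds its keys can attest as trusted.
        return bool(self.secret_keys) and not self.developer_mode

device = DevModeSwitch()
assert device.trusted           # fresh device attests fine
device.enter_developer_mode()
assert not device.trusted       # open mode: keys wiped, no attestation
assert device.secret_keys == {}
```

Going back to 'trusted' would mean re-provisioning keys and wiping user data, which is the "at what price" part of the bargain.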
