SecureMac, Inc.

Checklist 187: Talking Encryption with Patrick Wardle

May 28, 2020

This week, we sit down with a special guest to discuss the ins and outs of encryption — touching on everything from iPhone passcodes and brute-force attacks to political posturing and quantum computing!

iOS encryption: a deep dive with Patrick Wardle

Apple’s iOS devices are famous for their strong encryption. But that same encryption can cause problems for law enforcement in the course of their investigations — because the encryption is so strong that even the federal government can’t unlock a suspect’s iPhone!

Over the years, there have been a number of high-profile cases in which the FBI has clashed with Apple over this issue. Recently, the U.S. Department of Justice has renewed calls for Apple to develop a way for law enforcement to access an encrypted device if need be, while at the same time protecting user privacy. Apple has demurred, saying that it’s not possible to do what the government is asking without seriously undermining the security and privacy protections of iOS. The government, of course, begs to differ.

In order to engage in a rational, fact-based discussion of the issue, it’s essential to first have a solid understanding of how iOS encryption works — something that’s often absent from heated public debates.

This week on the Checklist, we spoke with Apple security guru Patrick Wardle to discuss all things iOS encryption. Wardle’s bona fides are beyond reproach — he’s Principal Security Researcher at Jamf, a leading Apple device management solution used by thousands of organizations worldwide. He is the founder of Objective by the Sea, the world’s premier Mac security conference, and he also develops and maintains a suite of free macOS security tools at Objective-See. It’s also worth mentioning that earlier in his career, Wardle worked for the U.S. National Security Agency (NSA), giving him an unusually broad perspective on this issue.

We started our conversation by asking for a basic explanation of the extent of iOS encryption (in other words, what exactly is encrypted on an out-of-the-box iPhone).

PW: Apple has done a tremendous job of ensuring that the majority of information on an iPhone is encrypted when you set it up.

That’s because they understand that there’s a ton of very sensitive personal information on your phone. I mean think about it: It’s your chats, your photos, your location as you carry it around … so if that falls into the wrong hands, or if that information is able to be retrieved by law enforcement, there are some real privacy concerns. 

So Apple has designed a very secure consumer device that has a lot of encrypted components. When you set up your phone and you create a passcode, that encrypts essentially all of the information on the phone. That includes the majority of the filesystem, user content, etc. This means that if law enforcement gets access to your phone, all of that underlying information is fully encrypted — very strongly — based off of your passcode and some other encryption keys as well.

So the short answer to the question of what is encrypted on an iPhone is: basically everything. And again, that’s designed specifically by Apple to protect the privacy of their users.

Wardle’s remark that the encrypted content is based on the device’s passcode may come as a surprise to many of us, since we’re more accustomed to thinking of our passcode as nothing more than a key that grants access to our device. But as he explains, an iPhone’s passcode is actually much more:

PW: A passcode is like a key in one sense, in that you have to put in that passcode just like you need to put in a key to open a door. But behind the scenes, as you’re inserting that key and turning the lock, if that key is correct (i.e. if your passcode is correct), it is also decrypting the information on your device.

This is an efficient approach to killing two birds with one stone: locking your phone, but also allowing an encryption key to be derived off of something that the user created. This is one of the reasons why Apple cannot decrypt your information: because it’s actually encrypted with your specific key (that Apple doesn’t know about). They did that by design so if or when law enforcement comes to them and says “can you decrypt this phone”, they can say “we technically can’t, because we’ve designed it in such a way that the encryption key that protects everything is based on information the user creates and that only that specific user knows”. 
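To make the idea of a passcode-derived key a little more concrete, here is a minimal Python sketch using PBKDF2. It is not Apple's actual implementation (on a real iPhone the passcode is entangled with a hardware key inside the Secure Enclave), but it illustrates how a short passcode can be stretched into a full-length encryption key that never has to be stored anywhere:

```python
# Illustrative sketch only: derive an encryption key from a passcode with PBKDF2.
# Real iOS key derivation also mixes in a per-device hardware key (Secure Enclave).
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a short passcode into a 256-bit key using PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)           # stored alongside the data; not a secret
key = derive_key("1234", salt)  # even a weak passcode yields a 256-bit key...
print(key.hex())                # ...but there are still only 10,000 passcodes to guess
```

The point of the sketch is simply that the key only exists when the passcode is supplied, which is why Apple can truthfully say it has nothing to hand over.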

Knowing that entering your passcode is doing more than just giving you access to your files — and is actually decrypting them as you enter it — raises some interesting questions. For example, most of us probably think of that email we wrote yesterday as existing on our iPhone “in email form”, even if we’re not looking at it, or even if our device is turned off. But given what we now know about encryption, is that really what’s going on?

To gain a better understanding of how encryption works, and to grasp what it means to say that “the information on your phone is encrypted”, we need to consider the state of data on your device when you’re not using it. As Wardle explains:

PW: When your phone is off, essentially everything is encrypted: your email, your photos, etc. If someone tries to access your phone — and say they have a way to access the hard drive, the actual bits on that — that’s all encrypted. So it basically gives an attacker nothing. If they power on the phone, they’ll see the passcode prompt. And it’s only when you enter the passcode that it’s going to fully decrypt the majority of the user-specific content.

So if there is an email physically sitting on your iPhone, it is going to be encrypted when the phone is off, and it will remain encrypted until you enter your passcode. As an aside, it’s important to differentiate between the copy that’s on your iPhone versus the one on Google’s servers versus the copy that goes across the network, because those are all different transmission mediums that are encrypted in different ways.

Many times when you hear about law enforcement and iOS encryption, what they’re really trying to do is gain access to the unlock passcode — because with that, they can decrypt the content that’s physically sitting on the iOS device.

One way to think about what’s actually going on is by analogy. Imagine you’re checking into a hotel room. You go to the front desk to check in and they give you a key that’s unique to you; that’s been rekeyed just as you check in. That’s kind of like your passcode (1-2-3-4 or whatever). When you get to the room, you swipe your card and that opens the door — unlocks it — but at the same time it might turn on the power, turn on the TV, the lights come on, the AC comes on, etc. So that one key has two purposes: It lets you in physically, but it also takes some action to prepare the room for your entry. 

So your passcode on your phone can be thought of somewhat similarly. Yes, it provides access to your apps — you unlock everything and you can now use the phone — but in the background, what it’s doing is taking those specific numbers and using them to open up other pieces of information: decrypting. We’re taking some liberties here for the purposes of our discussion, but decrypting is kind of like unwrapping something. You need a key, the correct fingerprint, or the correct numbers, to open the package — but once you do that, the information is available.

So again, that passcode for your phone has a dual purpose: It lets you in, allows you to use the phone; but in the background, it’s also opening up all the information that has been encrypted so that apps (for example your email app) can read and use the content which has been transparently decrypted behind the scenes based on your passcode. 
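Here is a small Python sketch of what "encrypted at rest" looks like in practice, using AES-GCM from the third-party cryptography package. It is a simplification (iOS actually wraps per-file keys with class keys managed by the Secure Enclave), but it captures the basic flow Wardle describes: while the device is locked, only ciphertext sits on storage; once the passcode-derived key is available, the plaintext becomes readable again.

```python
# Illustrative sketch: content is ciphertext until the (passcode-derived) key is available.
# Requires the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = os.urandom(32)        # stand-in for the passcode-derived key
nonce = os.urandom(12)      # AES-GCM nonce, stored alongside the ciphertext
email = b"Hi Mom, see you Sunday!"

# Phone locked or powered off: only ciphertext exists on flash storage.
ciphertext = AESGCM(key).encrypt(nonce, email, None)

# Passcode entered: the key is available, so the content can be read again.
recovered = AESGCM(key).decrypt(nonce, ciphertext, None)
assert recovered == email
```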

We’ve talked about the importance of using strong, unique passwords on the Checklist before. Because your iOS device’s passcode functions as a decryption key, it’s especially important not to use something that’s trivially easy for someone else to guess. Wardle elaborates:

PW: It’s very important to have a good user passcode — and not something like 1-2-3-4! Apple has even built in some mitigations here; for example, if you try using 1-2-3-4 they’ll yell at you when you’re setting up your phone. In addition, they do things like requiring a passcode after reboot even if you’re using Face ID or Touch ID, because in theory a lot of the biometrics are easier to break than a passcode: you can lift someone’s fingerprint, perhaps off of a cup they were drinking from or something, whereas the passcode exists only in their mind, which is safe…for now.

When we spoke with Bart Busschots about password safety a few weeks ago, one theme that came up repeatedly was that complexity and length were both crucial for creating safe passwords. As Wardle explains, there are similar concerns at play with mobile device security, both in terms of passcodes and also encryption key lengths: 

PW: It all comes back to brute-forcing. Imagine someone has gained physical access to your phone, but it’s locked. They need to access your data and, as we discussed, the way to do that is to unlock the phone with the correct passcode so that you can decrypt the contents. If you don’t have the passcode, well, you might start guessing: 0-0-0-0, 0-0-0-1, 0-0-0-2, etc., all the way up to 9-9-9-9. That would be manual brute-forcing. So the idea is that if you have a longer passcode, and a longer encryption key “under the hood”, it resists brute-force attacks better. Here’s why: assuming the encryption algorithm is secure (and the ones that Apple uses are, by all definitions of our current understanding of mathematics and cryptography), the only way to decrypt the data without the passcode is to guess it: a brute-force attack. Different encryption algorithms have different key lengths (128-bit, 256-bit, 512-bit), and the longer the key length, the more variations there are to guess. So there are parallels between choosing a passcode that’s longer, and that has special characters and numbers and uppercase and lowercase letters, and the length of the encryption key. In both cases, more complex is better, because it’s more difficult to guess.
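Some rough arithmetic shows why length matters so much. The short Python snippet below just counts the possibilities an attacker would have to try; the larger figures are purely illustrative of scale, not of any specific Apple algorithm:

```python
# Counting the guesses a brute-force attacker faces at different lengths.
from itertools import product

# Manually brute-forcing a 4-digit passcode: 0000, 0001, ..., 9999
four_digit_guesses = ["".join(p) for p in product("0123456789", repeat=4)]
print(len(four_digit_guesses))   # 10,000 possibilities

# Possibility counts grow exponentially with length and alphabet size
print(10 ** 6)    # 6-digit numeric passcode: 1,000,000
print(62 ** 10)   # 10-character alphanumeric password: roughly 8.4 * 10**17
print(2 ** 128)   # 128-bit key: roughly 3.4 * 10**38
print(2 ** 256)   # 256-bit key: roughly 1.2 * 10**77
```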

There is more to encryption, however, than just “on-device” encryption — because data that we send to other people or services has to be sent over a network, and has to be received on the other end. To offer one example, if someone sends an email to their mother, the text of the email may be encrypted on their phone, but it still has to travel over a network to get to her, and then it has to be stored on her device (which is hopefully protected by a strong passcode). Wardle explains why this wider context should be considered when thinking about encryption and data security: 

PW: It’s important to understand that when your email “arrives” on her phone, the bits and bytes coming over the network are encrypted, albeit in a different way — that’s SSL-level security, network-level security. Once it’s on her phone, when she shuts off her phone or locks it, that information is then locked on her phone using her specific encryption key and passcode.

This actually brings up an interesting point: You have to be careful about where you’re sending information. First of all, because there is this network layer, which can perhaps be attacked in different ways if that encryption is not very secure. And secondly, because if the recipient doesn’t use a passcode or has a very weak one, then that becomes the weak link. 

The network level is actually where most attackers now try to play, because it’s often more difficult to secure things at the network level than on a physical device. One great example of this is text messages and cell phone traffic. That’s encrypted — but it’s not very strong encryption. Two people can be having a discussion on their cell phones, but if they’re using a traditional cell phone network, there are various attacks against that protocol; a network level attack could potentially recover plaintext data that’s occurring between two devices.
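For readers curious about the "network layer" Wardle mentions, here is a short Python sketch that inspects the transport encryption a server negotiates. The host name is just an example and the snippet assumes you have network access; it simply reports which TLS version and cipher suite protect the connection, which is a separate layer from anything encrypted on the device itself:

```python
# Inspect the transport (SSL/TLS) encryption negotiated with a server.
import socket
import ssl

host = "www.apple.com"   # example host; any HTTPS server will do
context = ssl.create_default_context()

with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        print(tls.version())   # e.g. 'TLSv1.3'
        print(tls.cipher())    # negotiated cipher suite and key length
```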

Once the basics of encryption are understood, it’s possible to start asking questions about the kind of “good guys only” backdoor that the Department of Justice is asking for. Apple’s model of encryption, as we have seen, is specifically designed to ensure that only the individual user can decrypt their data — and that no one else, not even Apple, can do this. But could we create an alternative model of encryption which would allow law enforcement to access devices when necessary, while still protecting individual privacy? Apple (and many privacy advocates) say that this can’t be done, but the Department of Justice (DOJ) and Attorney General William Barr vehemently insist that it is possible.

PW: So here’s the funny thing: The DOJ doesn’t actually believe that. Anybody you talk to that has a technical understanding of what encryption is doesn’t believe that. There may be some politicians who just can’t wrap their heads around encryption, and that’s fine, they’re not necessarily supposed to be experts. But people at the FBI, people with the DOJ, and I’m sure Mr. Barr, all know that this is an untrue statement. I’m not saying they’re lying — I’m just saying we have to understand why they’re saying that. 

It’s funny because people get all spun up about it, but I think they’re not focusing on what’s really going on. So we have politicians saying “Silicon Valley should be able to create a device that protects the user but sometimes can be accessed by law enforcement”. And anybody who understands cryptography, or technology, knows that this is just an absurd claim. And people get upset that politicians are saying this, but the politicians know that what they’re asking for is absurd, and incorrect, and likely impossible. 

But just asking for it actually does something. It gives the politicians plausible deniability — so that when there’s a mass shooting or a terrorist attack and they aren’t able to unlock a suspect’s phone, they can say “Hey, look, we knew this was going to happen, because encryption by definition protects devices, and now there’s this device we can’t break into because the encryption is working, and we told you guys in the past that if this happened we weren’t going to be able to break into the device, so it’s really not our fault, it’s your fault!” 

So this is the DOJ preemptively making statements so that if and when something bad happens, they have something to point back to and say “Yeah, this is the problem we knew we were going to run into; we tried to do something about it, but Silicon Valley and Apple didn’t want to work with us”. Even though they know there’s no solution, it really helps them shift the blame. And this is exactly what they’re doing. I don’t think they really believe that there is a secure way to create an encrypted device that also has some backdoor. I mean, you see state secrets leaked from the FBI, the NSA gets their exploits stolen … everyone knows that it would be very difficult to secure backdoor encryption keys. Invariably they would leak, or hackers would get to them, and it would undermine the whole system.

So I really believe the DOJ is just trying to pre-position themselves and make these claims so that they can pass the buck: if something bad happens and they can’t decrypt a device, they can point at Apple or other companies and blame them.

I mean, I’m sure they would also love it if Apple did give them a way to decrypt phones, but I don’t think they believe that’s actually going to happen. Apple has built its entire company around the idea of user security, and if they came up with a way for governments to unlock devices, it would completely undermine that. So Apple would likely die on that hill to protect user privacy. 

We’ve seen them basically do that: The FBI shows up with a phone, says “Hey, we need you to unlock this”, and Apple basically says “Sorry, we can’t help you”. Again, that’s by design. Apple has taken a very smart approach where they develop the system so that, from a technical point of view, they essentially can’t help, even if they wanted to. And the good thing is that they don’t want to, which helps sell a lot of phones, helps protect their users, but at the same time helps give Apple plausible deniability — because they can say “Guys, even if we wanted to help you, we couldn’t”. Still, I think we’re going to see the DOJ continue to make these requests. Governments have always complained about encryption, and it obviously does make their job harder. 

At the same time, however, it does benefit governments as well. This is kind of interesting … maybe somewhat tangential, but it does play into encryption and phones. If I send you a text message via iMessage, that’s encrypted by Apple’s encryption technology. And it’s end-to-end encrypted, meaning only your phone and my phone can see the message; even Apple can’t see it. And governments will complain about that, and will say, “We want to be able to read those text messages, those iMessages, and we can’t”. And Apple says, “Sorry, we can’t either, it’s only between those two people”. 

Well, what an attacker (perhaps even a government attacker) can do, if they have a vulnerability or an iMessage exploit, is send you a malicious message that can infect your phone and gain access to all your content — because when your phone’s unlocked, processing iMessages, that content is unencrypted. And since the iMessage itself is encrypted, neither Apple nor anybody else can actually see that. So in a way, this strong encryption can actually be abused by the same government that’s complaining about it to deliver exploits that won’t be detected. So it’s interesting that you have this dual-edged sword: On the one hand, the government’s complaining about the encryption … but on the other hand, it does benefit them in their offensive hacking capabilities.

And just to tie everything back to that DOJ claim: I don’t believe for a minute that they actually expect Silicon Valley to create this magical, non-existent, secure backdoor scheme — I think they’re just doing it so they can point back to it and say “Hey, we tried to do this” and if something bad happens they go “Hey, can’t blame us”.

There is another reason to think the DOJ knows its “secure backdoor” is easier said than done: Past discussions of how this might be implemented suggest that the government expects Apple to safeguard the encryption keys. On the face of it, that might sound innocent enough, but reading between the lines a bit, it may also indicate that the politicians are well aware of how difficult it would be to secure encryption keys.

PW: If you talk to any cryptographer, in general the most complex component of a usable cryptographic system is key management. That’s always kind of, like, the “pain in the ass”, for lack of a more mathematically correct term!

So I think this, again, is the DOJ passing the buck saying, “Yeah, we don’t want to manage these millions of keys, or this one master key, because if we get hacked, we look bad … so, Apple, you’ve got to manage that”. 

But of course it’s not something Apple wants to take on, because for one thing it puts a big target on their back, but also it would likely be very complicated to implement a system that’s both secure and extensible. And again, just ripe for attack. Because, OK, some Apple employee now has access to that. Well, maybe the Chinese government shows up with, let’s say, a billion dollars, and says, “We will give you a billion dollars, you just need to give us a copy of that key”. Eventually, someone will take some amount of money and grant access to that key.

So again, once you start looking at a scheme like this … this is why I really don’t believe that the DOJ actually believes it’s feasible. But, again, they have to make noise, and try, and it’s kind of their job to do that; it protects them against future stuff. But any scheme that’s proposed is just torn to shreds by cryptographers, engineers, everyone — because it basically goes against the entire idea of encryption. You can’t have your cake and eat it too.

In general, we tend to prioritize user privacy over the needs of law enforcement or the intelligence community, but this is not to suggest that we’re entirely unsympathetic to the government’s concerns. Wardle agrees, pointing out that there are legitimate reasons for agencies like the FBI to want the kind of capabilities they’re seeking — even if what they’re asking for isn’t really feasible.

PW: You can definitely understand both sides of the argument. The government wants to be able to do its job and protect its citizens. And you can imagine, for example, a terrorist running around the country with an encrypted phone: That encryption may pose a challenge for law enforcement, who want to access that and prevent an attack against its citizens. There’s definitely two sides to the argument. And you want the government to be able to do its job and protect its citizens, but at the same time provide ways for citizens not to have to fear the government, and to be able to secure their data. It’s a very large, complex debate. 

And to go back to DOJ, that’s one of their claims: They’re saying, “Hey, look, we don’t care about the average American’s text messages or the photos on their phone, we just want to be able to do our job and make sure people are secure”. And I think there’s a lot of truth to that; there are a lot of good people working with government organizations. Balancing user privacy and government insight is just — and always will be — a very complex debate. 

For the moment, Apple and iOS users appear to have the upper hand when it comes to privacy and encryption. But there has been much speculation of late around the possibility that advances in computing (or even radically new technologies like quantum computers) may one day be able to crack even the strongest encryption algorithms. 

Wardle acknowledges that this is a possibility, but is skeptical about the idea that such a change is imminent. He also points out that there is a far more immediate danger to strong encryption — and tells us what iOS users can do about it:

PW: Supercomputers are one way that encryption has been attacked. It goes back to this brute-forcing we talked about earlier, where they can basically just try a large number of passcodes simultaneously and ultimately decrypt a device.

Apple has done a really good job of defending against this, though, because they bake in this idea of “number of max tries”. So if you try to decrypt the phone and you get to ten attempts, the phone locks itself because it knows it’s a brute-force attempt.
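To picture that protection, here is a toy sketch in Python. It is not how iOS actually implements the limit (the real device adds escalating delays and an optional erase-after-ten-failures setting, none of which is modeled here), but it shows how a simple attempt counter defeats naive brute-forcing:

```python
# Toy model of a "maximum attempts" lockout; not Apple's implementation.
MAX_ATTEMPTS = 10

def try_unlock(correct_passcode: str, guesses) -> bool:
    """Accept guesses until the limit is hit, then refuse all further attempts."""
    for attempt, guess in enumerate(guesses, start=1):
        if attempt > MAX_ATTEMPTS:
            print("Device locked: too many failed attempts")
            return False
        if guess == correct_passcode:
            return True
    return False

# A brute-forcer cycling 0000, 0001, ... is cut off after ten tries.
print(try_unlock("4921", (f"{i:04d}" for i in range(10_000))))
```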

And the way encryption algorithms and key lengths have grown, the complexity of breaking those systems is exponential. Even the fastest supercomputer in the world can’t crack the most secure encryption algorithms, which is good.

Quantum computing is interesting, though, because certain encryption algorithms use factoring, or the complexity of factoring, or leverage prime numbers to protect information. There’s been a lot of talk and theory that quantum computing can perhaps break some of these concepts or these foundational ideas that protect systems and information. The idea is that there are these mathematical problems, factoring prime numbers is one example, that are hard or time-consuming to do. But if quantum computing can swoop in and turn those into trivial problems, then systems that are protected by those assumptions — and factoring being hard is one of them — will potentially be broken.
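As a purely illustrative example of the factoring problem being described, the snippet below builds a toy RSA-style modulus from two small primes and then factors it by trial division. At real key sizes (2048-bit moduli), reversing the multiplication is computationally infeasible on classical hardware, and that hardness assumption is exactly what quantum algorithms such as Shor's would threaten:

```python
# Toy illustration of why factoring matters; real RSA primes are hundreds of digits long.
p, q = 61, 53
n = p * q          # the public modulus: trivially easy to compute...

def factor(n: int):
    """Trial division: fine for toy numbers, hopeless at real key sizes."""
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    return None

print(n)           # 3233
print(factor(n))   # ...but recovering (53, 61) is the "hard" direction at scale
```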

But people have been talking about quantum computing and its impact on cryptography for ten plus years now. I don’t think that there will be a breakthrough overnight that will change everything. I think as the technology matures and the capabilities of quantum computing increase, something like “quantum-resistant cryptography” will grow in parallel. That may not be the case, but I think that’s likely to be the case. It’s unlikely that we’ll wake up tomorrow and the FBI will have a quantum computer that can somehow magically unlock any iPhone.

To take a step back from all of that, there’s actually a simpler solution to all of this, from an attacker’s perspective: exploits and vulnerabilities. There are lots of hackers, security researchers, and even companies that are developing capabilities that can unlock or bypass some of Apple’s security mitigations. And this is exactly what the government uses and leverages when they want to try to break into a phone. I believe it was the FBI a few years ago that had a locked phone that Apple wouldn’t unlock, and so the FBI just turned around and bought a capability from a hacker or some security company and used that to unlock the phone.

The unfortunate reality is that an iPhone is just a computer, and any computing system is going to have flaws and vulnerabilities. There are ways that those can be leveraged by law enforcement to gain access to phones. So while quantum computing or supercomputers may pose a threat in the future, I think there’s a more immediate threat, which is the existence of vulnerabilities and exploits.

The takeaway is that it’s very important for users to always run the most recent version of iOS, because Apple a lot of times will patch some of these flaws and vulnerabilities — so if you have the latest version, you’re protected. There are also vulnerabilities or flaws at the hardware level that can be exploited, but that the newer model iPhones prevent. So again, if you are very concerned about your privacy and the security of your iPhone, it’s best to run the latest software and then, if you can, understanding that this is expensive, always try to run the latest hardware, because it has built-in security mechanisms that can prevent these attacks.

The Checklist would like to thank Patrick Wardle for joining us on the podcast. If you’d like to learn more about Patrick’s work, you can follow him on Twitter or check out his technical blog at the Objective-See website. 

And as always, if you have security questions or suggestions for future show topics, please write to us at Checklist@SecureMac.com
