SecureMac, Inc.

The Checklist Podcast

SecureMac presents The Checklist. Each week, Nicholas Raba, Nicholas Ptacek, and Ken Ray hit security topics for your Mac and iOS devices. From getting an old iPhone, iPad, iPod, Mac, or other Apple gear ready to sell to the first steps to take to secure new hardware, each show contains a set of easy-to-follow steps meant to keep you safe from identity thieves, hackers, malware, and other digital downfalls. Check in each Thursday for a new Checklist!

Checklist 182: Tech Versus Tech Versus Coronavirus

Posted on April 27, 2020

This week, a special guest talks about the Apple/Google contact tracing tool (and the one proposed by France and Germany), plus we’ll share an important warning about a new text messaging scam.

A conversation about contact tracing apps

Last week on the Checklist, we told you about Apple and Google’s joint project to develop a contact tracing tool designed to let you know if you’ve been exposed to someone who has Covid-19.

The two tech giants are still in the early stages of development, but they’ve released technical specifications that shed light on what the final product may look like. The plan is to use the Bluetooth technology in mobile devices to emit rotating, anonymized beacons that can be detected and recorded by nearby phones. If someone is diagnosed with Covid-19, they can allow their data to be uploaded to a centralized server, which will then warn the people they’ve come into contact with that they may have been exposed to the virus.
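The published specification is considerably more involved, but the basic rotating-beacon idea can be sketched in a few lines of Python. To be clear, the key size, rotation schedule, and HMAC-based derivation below are simplified illustrations for this article, not the actual Apple/Google scheme:

```python
import hashlib
import hmac
import secrets

def daily_key() -> bytes:
    # A fresh random key generated on-device each day; it never leaves
    # the phone unless the user chooses to report a positive test.
    return secrets.token_bytes(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    # Derive a short-lived beacon ID for one time slot. Rotating the ID
    # keeps passive trackers from following a single device all day.
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

# A phone broadcasts rolling_id(key, i) over Bluetooth; nearby phones
# record what they hear. If the day key is later published as belonging
# to a positive case, anyone can re-derive that day's IDs and check
# their own encounter log locally -- no location data required.
key = daily_key()
ids = {rolling_id(key, i) for i in range(144)}  # 144 ten-minute slots/day
```

The important property is that the server only ever sees opaque identifiers (and only those a user volunteers); the matching happens on each phone.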

Officials in France and Germany have recently announced that they will develop their own contact tracing tool, leading many to wonder how — if at all — their project differs from the one already underway in Silicon Valley. In particular, while Apple and Google’s tool seems to rely heavily on decentralization and anonymity to ensure user privacy, it’s not clear that the European version will offer the same protections.

George Starcher is a senior security engineer and data analyst who is well-positioned to speak to these issues. He joined us on the Checklist this week to explore concerns around these proposed tools. 

Regarding the similarities between the two projects mentioned above, Starcher doesn’t see that much of a difference:

GS: The Apple/Google plan and the European one pretty much sound the same, and the European agencies that want to create their own app are asking for access to the Apple/Google API so that they can get iOS and Android phones to emit that beacon. From everything we’ve seen, it’s essentially the same deal in terms of the protocol. There was some mention of something like true peer-to-peer functionality, in terms of possibly passing along who announced themselves as testing positive for the virus between phones. I’m not sure exactly how that would work, because ultimately these are still just plans that are being laid out, and with these things the implementation is always where the rubber meets the road. I mean, we know from every other web service that what the specs say and how they actually implement it are often totally different. 

That gap between specification and implementation leaves room for questions about the proposed contact tracing tools, and Starcher has plenty of them.

GS: Both plans look like they’re going to rely on decentralization to some extent, though they’ll still need that centralized server, so it’s really a “both / and” kind of situation. They also say they’re not going to collect any location information, so, OK. But all of this still brings up lots of concerns.

First of all, with respect to the Bluetooth beacons, you wonder if whatever they’re using to generate that beacon address is going to produce something large enough to avoid duplication at large scales, so that, for example, you and I don’t end up with the same beacon ID.

Also, radios are different between phones, even if you’re talking about the same manufacturer, they’re different between models sometimes, and signal strength can vary. There are a lot of variables — so how does that work out if you’re trying to use signal strength to determine proximity? 

In addition, what happens if you and I are driving and we stop at the same intersection? Remember, these tools supposedly don’t know anything about location, so they wouldn’t be able to see that we were in the middle of the street, or in cars. They can’t rule out a false positive triggered because we were near each other but weren’t truly exposed to each other. What is that going to do in terms of driving up, for example, consumption of testing?

Another thing you have to ask is how anonymous this is really going to be. Let’s say I’m doing a really good job of social distancing, for the most part, but I get a notification that I may have been exposed to Covid-19. What is that notification going to look like? I’m not entirely sure. Are they just going to tell me the date? Are they going to give me a range, like, sometime in the last week? Or are they going to tell me the date and the hour? Because if it gets down to that kind of resolution then I would have a pretty good idea of who I was around. And what happens if I’m just walking around the neighborhood, and my phone gets close enough to other people’s devices to get heard, even though they weren’t physically near me? Then I get sick, and mark myself as exposed, and well, now everyone has that alert that they may have been exposed, everyone starts communicating, and they figure out that it’s me.  


So “anonymous” is really more like “pseudo-anonymous.” Names aren’t being used, but in certain edge cases you can infer a lot about who it might be.
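Starcher’s first question, about whether the identifier space is big enough to avoid duplicates, actually yields to a quick back-of-the-envelope check. Assuming identifiers on the order of 128 bits (an assumption on our part, though in line with the published drafts), the standard birthday-bound approximation shows that even planet-scale usage makes an accidental collision vanishingly unlikely:

```python
def collision_probability(n_ids: int, bits: int) -> float:
    # Birthday-bound approximation: p ~ n^2 / 2^(bits + 1).
    return n_ids ** 2 / 2 ** (bits + 1)

# Extreme upper bound: every phone on Earth emitting ~150 rotating
# IDs a day, every day, for a full year.
n = 8_000_000_000 * 150 * 365
p = collision_probability(n, 128)
print(f"{p:.1e}")  # on the order of 1e-10
```

In other words, if the generators are seeded properly, two strangers sharing a beacon ID is not the failure mode to worry about; the implementation details Starcher flags are.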

Some of these questions have possible answers. For example, regarding the issue of two cars stopped at an intersection resulting in a false positive, one imagines that Apple and Google could build accelerometer functionality into the API that would allow the system to disregard beacons generated by a user who was stationary one moment and accelerating to 35 MPH the next, i.e., clearly driving their car. 
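A filter like that might look something like the following sketch. Every name, field, and threshold here is hypothetical; nothing like this has been announced by Apple or Google:

```python
from dataclasses import dataclass

@dataclass
class Encounter:
    beacon_id: str
    duration_min: float    # how long the other beacon was heard
    peak_speed_mps: float  # inferred from accelerometer data, no GPS

def is_plausible_exposure(e: Encounter,
                          min_duration: float = 10.0,
                          max_speed_mps: float = 4.0) -> bool:
    # Ignore brief encounters and anything logged while the device was
    # moving at driving speed -- e.g., two cars stopped at a light.
    return (e.duration_min >= min_duration
            and e.peak_speed_mps <= max_speed_mps)

stoplight = Encounter("a1b2", duration_min=1.5, peak_speed_mps=15.6)
checkout  = Encounter("c3d4", duration_min=12.0, peak_speed_mps=1.2)
print(is_plausible_exposure(stoplight), is_plausible_exposure(checkout))
# False True
```

The appeal of this approach is that motion data stays on the device, so false positives could be pruned without ever collecting location.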

Starcher agrees that there are possible solutions to some of the concerns he raises, but reemphasizes the issue of implementation, pointing out that there will be multiple parties involved in developing the actual mobile applications in question — not just two tech companies.

GS: I think what it comes down to is, sure, you could use acceleration data to rule out edge cases that might generate false positives without needing to access location (although we don’t really know if they’re going to do that). Fine, but although you trust Apple and Google to be that smart … do you trust an app developed by the government to do that? I don’t know. 

There are similar issues when it comes to privacy. The problem with a tool like this is: what happens if you take this data and throw it up against other government systems, and use it to “enhance” the contact tracing that they already do on people? Do you trust that government? How does this tool get dismantled when it’s no longer needed, and at what point? Because once this thing exists, it exists, and even if you trust Apple and Google, what’s to keep a government from giving them a secret security letter that says, “You’re going to hand over all of this data and you’re not going to tell anyone you did it”?

The question of whether or not to trust governments is obviously extremely important, and it raises complex issues. Apple and Google have assured the public that their API will only be made available to qualified public health authorities who have legitimate reasons to develop a contact tracing app. But while this would likely prevent other apps or digital marketers from abusing the tracing functionality, it’s harder to predict how governments will behave. 

For one thing, there may be honest differences of opinion, from country to country, about what constitutes a legitimate public health authority. The United States and Germany, for example, may have very different ideas about what this means than Russia or China. In addition, providing this functionality to governments is problematic when not every country is fully invested in the idea of “privacy as a human right”, to say the least. But it’s difficult to know where to draw lines, because while we’re used to talking about how or even whether Apple should be cooperating with Beijing, this particular discussion is happening in the context of trying to save lives. There are no easy answers.

All of these worries about privacy should prompt some reflection, as well. While we are all rightly concerned about the possibility for this contact tracing tool to be abused by governments and corporations, it’s worth taking a moment to pause and ask ourselves if this is really a new danger at all. When iOS 13 was released, its enhanced security and privacy protections made it alarmingly clear just how many apps were constantly requesting location and Bluetooth information in the background. Wouldn’t it be more accurate to say that Apple and Google are simply leveraging capabilities that are already being used on our phones in potentially questionable ways?

GS: The short answer is yes. If you go to the mall, there are retail outlets that are capturing all of the Bluetooth addresses, or, say, the fact that your phone is asking “Is the hotspot called ‘Fred’ available”, and they’re recording all of that. They could potentially have sensors near a cash register, and if I perform a transaction, especially if I’m using their loyalty card, they could theoretically take the purchase transaction, know that these beacon identifiers are near that register, and make a pretty good bet that those are my devices.

In addition to privacy concerns, there are also questions about the efficacy and practicality of these contact tracing tools. One major issue has to do with opt-in rates: researchers estimate that around 60% of a country’s population would have to use a mobile contact tracing system in order for it to be effective. Some people with older operating systems that are incompatible with the new apps will be left out, and others might decide that opting in just isn’t worth the risk to their privacy. For his part, Starcher says that he would likely make use of a contact tracing tool if one became available, despite his reservations:

GS: I’m managing my risk well. I saw it coming; I stocked up early. I stayed home except for one or two top-off runs and I’m staying home. Will I use a contact tracing app? Probably, if I start going out. 

If I’m staying at home I probably don’t need to, personally, but I might do it just to try to help. I work from home (have for years), I’m not leaving the house. My idea of going out is sitting on the back deck and working there or eating my dinner. But maybe if I go to a store, or if I start trying to get out more to get groceries or things like that, then maybe I would turn it on — maybe even just for that period of time. 

I probably would, in that case, personally — but there are these implications to consider. You have to decide for yourself. It’s a tough call, and it’s a tough situation, and it’s a situation we’re facing where, before, we didn’t have this kind of technology option at all.
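One footnote on the opt-in estimate above: the math is harsher than it first appears, because both people in an encounter must be running the app for that encounter to be logged at all. A one-line illustration:

```python
def encounter_coverage(adoption: float) -> float:
    # Both parties in an encounter must be running the app,
    # so coverage falls off as the square of the adoption rate.
    return adoption ** 2

print(f"{encounter_coverage(0.60):.0%}")  # 36%
```

So even at the 60% adoption level researchers cite, only about a third of person-to-person encounters would actually be visible to the system, which is why every user who opts out matters more than they might think.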

Don’t click that link!

We’ve covered Covid-19 scams before, both on the Checklist and over at the SecureMac blog, so we know you’re pretty clued-in when it comes to the subject. But not everyone is, so we’re sharing another coronavirus scam with you this week in the hopes that doing so can help you keep other people in your life safe.

Police are warning of a text message-based coronavirus scam that attempts to impersonate one of the contact tracing systems that we’ve just been discussing. Victims receive a text message saying that they’ve come into contact with someone who has tested positive for Covid-19, and are told that they should go get tested — and are given a link that they can click to get “more information”. That link is malicious and clearly should not be clicked!

There have also been reports of a similar scam targeting seniors. According to officials, some older folks have been receiving text messages from bad actors posing as the U.S. Department of Health. They are told that they need to take a mandatory online Covid-19 test in order to receive their economic stimulus payments, and are provided with a link to the “test”. Again, a scam, and a link that should definitely not be clicked!

We know that you would probably never fall for one of these scams, but far too many people do, either because they don’t know any better or because their judgment is clouded by fear and stress (something which scammers love to take advantage of at times like this). So if you know someone like that, please take a moment to share this information with them to help keep them safe.

That brings us to the end of another Checklist, but be sure to take a look at our archives for past shows and notes so that you can keep learning about digital security all week long. And if you have a question you’d like answered on an upcoming podcast or a suggestion for a topic you’d like to hear more about, please take a moment to drop us a line and let us know.