SecureMac, Inc.

Checklist 153: Grades and Zeroes and Dumb Shoes

September 5, 2019

On this edition of The Checklist: Grading the Siri “grading” situation, Project Zero’s “iPhone Hack”, and Making the case for shoes that aren’t that smart

This week on the Checklist, we’ll grade the graders as we take a look at Apple’s response to their Siri privacy debacle. We’ll share some telltale signs that your iPhone may have been compromised. And we’ll ask (yet again) if it’s really necessary to have a “smart” version of absolutely everything.

It’s all on this week’s Checklist:

  • Grading the Siri grading situation
  • Subtle signs that your iPhone has been compromised
  • Grumpy old men go shoe shopping

Grading the graders

A few weeks ago, it came out that Apple had been listening in on Siri users’ most private moments—including everything from confidential conversations with physicians to drug deals and amorous encounters. 

Apple responded by suspending the program for review and providing some basic explanation as to why, exactly, they’d been sending recordings of users’ Siri interactions to their development group in the first place.

As it turns out, Apple was basically just doing quality control: performing what’s known as “grading” to ensure that Siri was being activated at appropriate times and interpreting user voice input correctly.

The problem, however, was that the third-party contractors tasked with Siri quality control were accidentally overhearing quite a few extremely personal moments—and that none of this was made as clear to the public as it should have been. This was compounded by the fact that, according to the whistleblower who leaked the information in the first place, a substantial chunk of these recordings came about due to accidental Siri activations.

Apple has since apologized for not living up to its own standards, and has reiterated its commitment to user privacy. The company also clarified that stored Siri data has never been used for profiling or marketing purposes, sold on to third parties, or put to any other use than improving Siri’s performance.

All of which sounds reasonable—but still probably leaves a lot of us feeling uneasy at the thought that our voice recordings are being collected and stored by Apple, whatever the reason.

Still, Apple’s heart seems to be in the right place, and they point out that Siri is engineered to safeguard user privacy as part of its basic design.

First of all, Siri is intentionally minimalistic when it comes to collecting and transmitting data. Basically, if Siri doesn’t need to send data to Apple’s servers in order to fulfill a user’s request, it doesn’t: It simply handles the task locally, using the processing power of the device itself.

This is the idea behind what Apple calls “Core ML”, a machine learning framework for developers aimed at improving software performance by enabling machine learning to take place right on a user’s phone, with no network connection required and no data sent off the device. In theory it’s win-win: Apps get better without users having to sacrifice their privacy.
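For readers curious what that looks like in practice, here’s a minimal Swift sketch of on-device inference with Core ML. It assumes a hypothetical compiled model named “SampleClassifier.mlmodelc” bundled with an app; it’s just an illustration of the general pattern Apple describes (the model and the prediction both stay on the device), not a peek at how Siri itself is built.

```swift
import CoreML

// Minimal sketch: run a prediction entirely on the device with Core ML.
// "SampleClassifier.mlmodelc" is a hypothetical compiled model bundled
// with the app; nothing here requires a network connection.
func classifyLocally(features: [String: Double]) throws -> MLFeatureProvider {
    guard let modelURL = Bundle.main.url(forResource: "SampleClassifier",
                                         withExtension: "mlmodelc") else {
        throw NSError(domain: "ExampleApp", code: 1, userInfo: nil)
    }

    let config = MLModelConfiguration()
    config.computeUnits = .all   // let Core ML choose CPU, GPU, or Neural Engine

    let model = try MLModel(contentsOf: modelURL, configuration: config)

    // Wrap the raw inputs in a feature provider and run the prediction locally.
    let input = try MLDictionaryFeatureProvider(
        dictionary: features.mapValues { MLFeatureValue(double: $0) })
    return try model.prediction(from: input)
}
```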

Apple also pointed out that when they do have to collect and analyze user data for development purposes, they use a random identifier to track the data without tying it to a phone number or Apple ID. What’s more, the connection between the data and this identifier is only maintained for a maximum of six months, providing another layer of privacy.
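To make the idea a bit more concrete, here’s a toy Swift sketch of what a random, time-limited identifier can look like. It is purely illustrative and not Apple’s actual implementation: the point is simply that the data gets tagged with a random UUID rather than anything tied to the user’s account, and the link is treated as expired after roughly six months.

```swift
import Foundation

// Toy illustration of a pseudonymous, time-limited identifier.
// Not Apple's real system; just the general concept.
struct AnalyticsTag {
    let randomID: UUID        // random identifier; no phone number or Apple ID involved
    let createdAt: Date

    init() {
        self.randomID = UUID()
        self.createdAt = Date()
    }

    // After roughly six months, the identifier should no longer be linked to the data.
    var isExpired: Bool {
        let sixMonths: TimeInterval = 183 * 24 * 60 * 60
        return Date().timeIntervalSince(createdAt) > sixMonths
    }
}

// Tag a batch of samples with one random identifier so they can be analyzed
// together without ever referencing the user's identity.
let tag = AnalyticsTag()
print("Samples tagged with \(tag.randomID); linkage expired: \(tag.isExpired)")
```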

Considering how conservative Apple has been with handling user data, it’s hard to imagine a scenario in which a snippet of a recorded user-Siri interaction could have resulted in a genuine security risk. There would be little chance of a user identifying themselves in any meaningful way while at the same time disclosing some sensitive piece of information. 

But it’s also fair to say that Apple should have been more transparent about collecting and storing personal moments from their customers’ lives, and could have done a lot better here—which they seem to recognize based on both their apology and their strategy for Siri development going forward. 

Apple has now laid out how they plan to respect user privacy while continuing the necessary work of Siri development. First of all, while they are going to restart the grading program, in the future they will allow users to opt in or out of having their recorded voices sent to quality assurance teams. Secondly, Apple is going to keep the program strictly in-house from now on: No more third-party contractors listening to voice recordings. Some folks may find this reassuring, as Apple will be able to keep an even closer eye on how the development teams are handling user data.

However, Apple will still need some way to make sure that Siri is working as intended. To this end, they will continue to send computer-generated transcripts of recordings to their grading teams in order to assist with development work…and it looks like there’s not going to be any way to opt out of that without getting rid of Siri altogether. So if you’re still uncomfortable with your conversations being stored by Apple as transcripts, you’ll have to go to Settings and disable both Siri and Dictation.

Has your iPhone been compromised?

Google’s Project Zero announced last week that they’d discovered a network of websites that had been hacking iPhones. According to the researchers, merely visiting one of these sites was enough to launch an attack on your device that could result in the installation of monitoring software. The story was picked up by the media and reported on somewhat dramatically, with the implication that the discovery of these “thousands” of hacked iPhones meant we were all at risk. 

Sounds pretty bad, right? 

Turns out that there’s a bit more to the story than meets the eye. What Project Zero didn’t say (but what surfaced in later reporting) was that the websites in question weren’t especially well-known. In fact, they weren’t even sites that most iPhone users were likely to come across. Many observers are speculating that the sites in question were set up by the Chinese government and aimed at the country’s Uyghur Muslim minority. In other words, while Project Zero had done some excellent security research, the likelihood that most of the world’s millions of iPhone users could ever even find these sites was pretty low.

So why make a dramatic announcement in a way certain to create media buzz? It might have something to do with timing: Apple recently announced an upcoming press event at which they may discuss new iPhones. In addition, the story was presented as an “iPhone hack”, when in reality it would be more accurate to describe it as a bug in Safari (one which Apple knew about and had patched). All of this raises some serious questions about Google’s motivations.

But while the story might be a little suspicious in terms of its timing and spin, it does provide a good opportunity to talk about iOS security—namely, how you can tell if your iPhone may have been compromised. 

We’d like to share a useful list of common signs of a hacked device, along with some steps you can take if you think you may have been the victim of an attack. It’s worth noting that these signs aren’t absolute proof of a hack, but if you notice them, they definitely warrant a little digging, especially if you have some other reason to think your device may have been compromised.

Battery drain

If you start to notice a sudden drop in battery performance, your phone may have picked up some malware. It’s worth heading over to Settings > Battery and seeing if there’s some app that seems to be using an unusual amount of power—especially one you don’t recognize. If you do find a strange app there, delete it!

Data usage

If you notice your mobile data usage has increased, even though you’re not doing anything out of the ordinary, that’s a pretty good indication that there’s a lot of data exchange going on in the background. Again, take a look at the apps on your phone to see if there’s something there that doesn’t look familiar and might be the culprit.

Buggy behavior

If you’re seeing apps crash more than usual, you might want to delete any apps that seem to be going down frequently and then reinstall them from the App Store.

Unknown apps and certificates

If you spot something on your device that you don’t remember putting there, it’s probably best to remove it. That certainly applies to apps, but it applies to certificates as well.

Again, none of these are absolutely certain signs of a hack. Sometimes app updates introduce bugs, and sometimes we install an app and forget that we installed it. But these things are common signs of something fishy going on with a device, and are worth investigating.

Smart sneakers: As dumb as they sound?

Switching gears a bit, we turn to a story that might not be as funny as it sounds: Siri and Apple Watch support for a new line of Nike shoes.

Yes, now you can adjust the fit of your shoes with an app! It sounds like something out of Back to the Future, but Nike’s new “Adapt Huarache” smart sneakers will allow wearers to tighten or loosen their shoes with a watchOS app or by using Siri. 

On the one hand, it’s easy to make fun of a product like this—and both of our hosts had a laugh at the thought of someone reinventing the humble shoelace using Bluetooth technology.

But in fairness, hardcore runners might want a completely customized fit from their footwear, and it’s possible that Adapt Huarache could deliver that. So from a security point of view, is there any issue here?

Frankly, while someone could conceivably hack a running shoe, it’s hard to imagine what harm they could cause if they did. Loosen a shoe mid-sprint and cause the victim to trip and fall? Inflict some arch pain on a target? 

But in general, the more tech you incorporate into things that don’t actually need it, the more chance there is for something to go awry. And in fact, earlier in the year, Nike’s very same Adapt system suffered a firmware delivery issue that effectively bricked the shoes for people using the Android version of the app. Frustrated owners were left with very expensive, very ordinary sneakers. So it boils down to this: Are the benefits of making a smart sneaker, or coffee maker, or anything else, sufficient to offset the risk of technical failure? In the case of shoes and shoelaces, we’d have to say no.

But there’s another issue here as well—one that goes far beyond Nike’s new shoes. The growing trend of manufacturers trying to create a “smart” version of every single product has resulted in a proliferation of Internet of Things (IoT) devices. And all of these new IoT things are fast becoming a genuine cybersecurity issue, as these millions of new networked devices in our homes and offices have vastly expanded the attack surface available to malicious actors. Add to that the fact that many of these devices are not exactly developed with security in mind, and are often set up and connected to a network using only factory default passwords, and you have a recipe for disaster (courtesy of your baby monitor or fridge).

So while smart sneakers may be good for a Marty McFly reference and a chuckle, the trend they represent is no laughing matter. As for what to do about it, we’d reiterate the advice we’ve given out before: Ask yourself if you really, really need a smarter version of whatever thing they’re trying to sell you. Unless the answer is a definite yes, then you’re probably better off going with the old, dumb version you’ve always used. 
