SecureMac, Inc.

The Checklist Podcast

SecureMac presents The Checklist. Each week, Nicholas Raba, Nicholas Ptacek, and Ken Ray hit security topics for your Mac and iOS devices. From getting an old iPhone, iPad, iPod, Mac, and other Apple gear ready to sell to the first steps to take to secure new hardware, each show contains a set of easy to follow steps meant to keep you safe from identity thieves, hackers, malware, and other digital downfalls. Check in each Thursday for a new Checklist!

Checklist 125: Pirates on the Enterprise

Posted on February 14, 2019

Touring around the headlines this week, we’ve got an exciting and eclectic mix of stories to hit. From apps looking back at you to a story from the past few weeks that just keeps getting deeper, we’re covering privacy and app security today — and a bit of a silly story to round things out, as Apple must face down a lawsuit that we might generously describe as “just a little bit frivolous.” On our list for today:

  • Apps That Record What You’re Up To 
  • Pirates on the Enterprise
  • A Disgruntled User Sues Apple for… Taking Too Long?

We’ll hit each of those stories in turn today, but let’s start with a story about apps that might be taking a peek at your screen even when you aren’t expecting them to do so.



Apps That Record What You’re Up To 

Over the past week, TechCrunch ran a piece with a startling headline: “Many popular iPhone apps secretly record your screen without asking.” What’s the deal with that? The TechCrunch reporter investigated and discovered that a slew of apps, from those designed for banking to those created by hotels, airlines, and others, directly record user screens while the app is in use. Most of these apps don’t disclose that they do this, and when they do, the disclosure isn’t easy to find anywhere you’d normally look.

How does it work? These apps, including Hotels.com, all rely on a specific company’s service. This business, called Glassbox, develops software known as “session replay.” It’s exactly what it says on the box: while you use a Glassbox-enabled app, it essentially takes a running series of screenshots to record your every interaction. That means where you tap, what icons you choose, when you use the keyboard, and more. All this ends up back in the hands of the developers, ostensibly to allow them to refine the user experience.
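To make the idea concrete, here’s a toy sketch of what a session-replay event log might look like. The class name, event types, and fields are invented for illustration; this is not Glassbox’s actual SDK:

```python
import json
import time

class SessionReplayRecorder:
    """Toy sketch of a session-replay event log (not a real vendor API)."""

    def __init__(self):
        self.events = []

    def record(self, kind, **details):
        # Each interaction is stored with a timestamp, building a
        # replayable timeline of everything the user did in the app.
        self.events.append({"t": time.time(), "kind": kind, **details})

    def export(self):
        # In a real SDK, this payload would be streamed back to the
        # developer's analytics servers for later "replay."
        return json.dumps(self.events)

recorder = SessionReplayRecorder()
recorder.record("tap", x=120, y=310, target="BookRoomButton")
recorder.record("keyboard", field="email", length=18)
payload = recorder.export()
```

Even this tiny sketch shows why the practice is sensitive: every tap and keystroke event ends up in a payload that leaves the device.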

However, TechCrunch enlisted a third-party researcher to take a closer look at these apps, and what they found isn’t great. Using a commonly available tool called Charles Proxy, the researcher was able to capture the session replay data as it streamed out of the app on its way to the developer’s servers. In other words, someone savvy enough could potentially steal that information in transit. 

The researcher also demonstrated that some apps, like Air Canada’s, were failing to mask data in the stream properly. This “bad masking” could leave your address or credit card number exposed if session replay captures your input. So that’s problem #1 — problem #2 is that users don’t know any of this is even happening, even though every app is required to have a privacy policy. Of all the apps TechCrunch analyzed, none disclosed the session replay recording — and since users never have to grant permission, there’s effectively no way for the average user to know.

After the story ran, Apple released a statement to TechCrunch clarifying its position on the recording. It noted that the screenshotting process itself was acceptable, but that clearly informing users of the process was key. According to the company, developers are required to let users know when their actions are being recorded, and users must opt in to allow the process. Apple went on to say that developers would be reminded of this requirement and that it would take action against violators if necessary.

While the whole thing sounds nefarious, there is a real purpose at the heart of it all — gathering analytics about how users interact with software is nothing new. In fact, many websites record your clicks for the same developmental purposes. So, while Glassbox may not have been used appropriately in these cases, there are legitimate uses — and the company itself seems to be trying its best to protect end users, too. It released a statement pointing out that it gives developers the software tools necessary to mask data safely. Of course, having the tools clearly doesn’t mean developers will always use them.
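As a rough illustration of what such masking tools do, here is a minimal sketch. The field names and masking rule are our own assumptions for the example, not Glassbox’s actual API:

```python
SENSITIVE_FIELDS = {"credit_card", "password", "ssn"}

def mask_event(event):
    """Redact sensitive values before an analytics event leaves the device.

    A minimal illustration of the kind of masking a session-replay
    vendor might offer; the field names here are hypothetical.
    """
    masked = dict(event)
    for key, value in masked.items():
        if key in SENSITIVE_FIELDS and isinstance(value, str):
            # Keep only the last four characters, so records can still
            # be correlated without exposing the full value.
            masked[key] = "*" * max(len(value) - 4, 0) + value[-4:]
    return masked

event = {"field": "payment", "credit_card": "4111111111111111"}
print(mask_event(event))  # credit_card becomes "************1111"
```

The “bad masking” TechCrunch’s researcher found amounts to skipping a step like this, so the raw card number travels in the replay stream.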

From this story, too, it’s clear that we cannot trust app developers to be up-front about such practices. Apple has taken swift action on some apps that were illegitimately recording users, though, with one developer given less than 24 hours to modify and resubmit his application before its automatic removal from the App Store. Whether Apple can stamp out this practice by forcing more apps to require explicit consent remains to be seen. Then again, a warning that you are being recorded, appearing every time you launch an app, would certainly get someone’s attention!

Pirates on the Enterprise

Next up, we have another story that just keeps on giving. A few weeks ago, we brought you a story, first reported by TechCrunch, about how Facebook was circumventing the App Store and using a “VPN” app to suck up massive amounts of data about users in exchange for $20 a month. It worked because Facebook was using its Enterprise Certificate, which allows businesses to develop and deploy apps for employee-only use. Apple slapped Facebook down by revoking the certificate for a short period, causing chaos for employees.

Not long after that, it came out that Google was doing the same thing — and Apple responded similarly, although Google’s ban only lasted a few hours. During those discussions, we speculated that Facebook and Google were hardly the only companies abusing the Enterprise Certificate to get apps to end users outside Apple’s rigorous App Store review. TechCrunch has now broken another story that all but confirms that suspicion — and shines the spotlight back on Apple.

According to TechCrunch’s investigation, there are dozens — and possibly more — of illegitimate apps that can be “sideloaded” onto an iOS device, without jailbreaking, through the abuse of Enterprise Certificates. These include apps offering pornography and even real-money gambling; many were developed overseas and are primarily available in Chinese, though others target Western users directly. In every case, users only had to go through the simple process of trusting the developer’s profile so the app could be installed on the device. No App Store needed.

A quick look at the process Apple uses for granting enterprise developer certificates reveals why the abuse of the program is so widespread — there is very little oversight, and much of the process consists of making promises. Shady developers need only pay a $299 fee to join the program and provide a legitimate D-U-N-S business ID number. While some developers actually registered their apps to their own businesses, many did not. In fact, Apple does not verify — beyond the applicant’s affirmation — that they have the right to use the D-U-N-S number they supply.

The company even offers a D-U-N-S lookup tool, which has led many developers to simply choose a business at random and register their certificate to that company name. These “rogue certificates” are what allow such developers to skate past the App Store. Within a few weeks, they must field a phone call from an Apple representative and reaffirm that they have the right to represent the named business. Of course, what’s one more lie at that point? With access to the program granted, the developer is free to instruct users on how to set up their phones to install the illegitimate apps.

TechCrunch points out that Apple seems lax not merely about who it allows into the program, but also about who it allows to stay. Until the recent reports on certificate abuse, there seemed to be very little oversight of developers within the program; Apple, for now, seems to have assumed good faith. Perhaps the company didn’t foresee this becoming a problem — it merely wanted to provide a useful tool for businesses, with a barrier to entry high enough to keep out the riff-raff. That’s just not how it played out in the real world.

Perhaps most interestingly, though, none of the apps TechCrunch examined were out to harvest user data — they just relied on gambling and adult content to make money the old-fashioned way. We wonder… what does it say when Facebook is collecting more data than an illegitimate app developer?

Whether Apple will crack down on these apps and retool the program remains to be seen. Implementing auditing or oversight of the enterprise program would be a good place to start; currently, there is no real process in place for review once someone obtains their enterprise developer certificate. 

A Disgruntled User Sues Apple for… Taking Too Long?

We’ll end today with a story that’s a little bit more lighthearted — at least, it’s certainly funny to think about, because a man in California has chosen to sue Apple for offering two-factor authentication without user consent. Wait… what?

According to a man named Jay Brodsky, Apple is at fault for causing harm to users due to the amount of time it takes to use iOS’s built-in two-factor system, and because he was forced to accept two-factor authentication. The filing actually contains several errors, such as misidentifying when Apple rolled out 2FA on iOS and mistakenly alleging that Apple did not offer an opt-in procedure; in fact, the procedure to accept the new practice was quite explicit. Users even received an email with a link to revert their settings if they did not like the change.

The lawsuit claims it takes up to five minutes for the plaintiff to access his devices now, owing to the process described in the filing:

First, the Plaintiff must enter his selected password on the device he is interested in logging in. Second, the Plaintiff must enter a password on another trusted device to log in. Third, optionally, the Plaintiff must select a Trust or Don’t Trust pop-up message response. Fourth, the Plaintiff then must wait to receive a six-digit verification code on that second device that is sent by an Apple Server on the internet. Finally, the Plaintiff must input the received six-digit verification code on the first device he is trying to log into. Each login process takes an additional estimated 2-5 or more minutes with 2FA.

This procedure sounds to us like the account of someone who may not be very savvy with his devices. AppleInsider, which broke the story, did some testing of its own — and the average time to unlock a device with 2FA turned on was a whopping 22 seconds. This story is undoubtedly a first; we can’t think of another time someone claimed that a company trying to protect its users was actually a bad thing.

If this story demonstrates anything, it’s the truth of the old adage: you can lead a horse to water, but you can’t make it drink. Or, in this case, you can give a user security, but you can’t convince them it’s worth their time. Will Apple really face a day in court over the “burden” imposed by authenticating yourself on your device?

For now, that remains to be seen — although we would be a little surprised if a judge didn’t toss this lawsuit out before it ever had a chance of making it in front of a jury, let alone resulting in a finding of damages. The plaintiff wants a boatload of cash for inconvenience. We think he’ll probably leave the courthouse empty-handed.

A quick reminder: 2FA is not just a good thing, it’s a very good thing. It’s the easiest way to protect yourself from unauthorized logins, such as when you’ve reused a password and someone is trying to find a way into your other accounts. When you get a code you didn’t ask for, you know something’s up — meanwhile, the would-be intruder is left with no way to gain access to your account. Lawsuit or not, it’s worth your time to keep two-factor turned on everywhere you can take advantage of it.
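For the curious, the server side of a six-digit-code scheme like the one described in the lawsuit is simple to sketch. This is a toy illustration, not Apple’s actual implementation; the two-minute expiry window is an assumed value:

```python
import hmac
import secrets
import time

CODE_TTL = 120  # seconds a verification code stays valid (assumed value)

def issue_code():
    """Generate a random six-digit code, like the one sent to a trusted device."""
    return f"{secrets.randbelow(1_000_000):06d}", time.time()

def verify_code(submitted, issued_code, issued_at):
    # Reject codes past their expiry window, limiting how long a
    # stolen code stays useful.
    if time.time() - issued_at > CODE_TTL:
        return False
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(submitted, issued_code)

code, issued_at = issue_code()
print(verify_code(code, code, issued_at))  # True for a fresh, matching code
```

The expiry and the randomness are what make the system work: a code you didn’t request is useless to an attacker within minutes, even if they somehow see it.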

Should something else develop, we’ll bring you the news! For now, that’s the end of our discussion for this week on The Checklist. Have you missed a recent episode or don’t quite remember everything that’s happened with the Enterprise Developer certificates discussed earlier in today’s show? We suggest visiting The Checklist archives, where you can quickly find complete show notes, helpful links, and even the audio recordings of every episode we’ve done in the past. It’s a resource you won’t want to miss.

Join our mailing list for the latest security news and deals