Checklist 127: Once More, with Facebook!
Once more, with Facebook! This week, we’re looking at — you guessed it — yet another story in which Facebook has been hoovering up data left and right, only to claim “Oops, we didn’t mean it!” once it was caught. From there, we’ll investigate what’s going on in the world of consumer encryption and what you need to know about it right now, before wrapping up with a round of follow-ups on stories we’ve covered in past weeks and months. Without further ado, here’s the list we’re running down today:
- Once More, with Facebook
- The State of Play on Consumer Encryption
- Onavo Goes the Way of the Dodo
- Cleaning Up the Nest
- The Hill Has Qs
We’ve got a lot on the menu for today, so let’s not waste any more time getting started. What’s Facebook up to this time?
Once More, with Facebook
Facebook’s got a new mystery on its hands: how does it keep ending up with so much personal information on people that it doesn’t want? It’s all just little things, too — such as financial information, or your heart rate, or even “that time of the month.” And Facebook promises, for real this time, that it didn’t mean to collect it and didn’t want any of this data!
Sure thing, Facebook.
Our story today comes to us courtesy of the Wall Street Journal and Apple Insider. The WSJ ran tests on smart devices to see whether there was any interaction between Facebook and certain other apps. What they found shouldn’t surprise anyone: Facebook harvests data from some third-party apps on both iOS and Android with no disclosure whatsoever.
Some of the offending apps interacting with Facebook in this way on iOS include “Instant Heart Rate: HR Monitor” by developer Azumio, which apparently sent users’ heart rate data to Facebook’s servers as soon as the app registered a reading. As one of the biggest apps in this class on the App Store, you can imagine that’s quite a lot of health data flowing to the social media giant. Another app, the “Flo Period and Ovulation Tracker,” boasts more than 25 million active monthly users, and it too passed this private information along to FB. One more we know about: Realtor.com, which told Facebook when a user looked at home listings, along with the geographical area and price range the user searched.
Did we mention that all this happens for users of the app who aren’t even on Facebook? It’s true: even if you’ve never made an account on the world’s largest social network, if you use one of these apps, Facebook learns about you. The WSJ discovered that’s precisely what was happening — and there’s a “good” (not really) reason Facebook wants that data: shadow profiles. For an in-depth rundown on what those are, you can head back to Episode 32 of The Checklist: Shadow Profiles on Social Media.
Facebook, for its part, says, “Oh no, we don’t want that!” and stated that the business’s own terms stipulate that developers should not transmit such data to its servers, and that it has since flagged the apps and requested that they stop the process or face the consequences. So, should we believe that this was all a big misunderstanding?
Well — there’s plenty of reason to doubt. After all, Facebook has been caught time and time again with its hand in the digital cookie jar; how many times can we find it there and accept “I didn’t mean to” as a reasonable excuse? Plus, even if the collection was accidental, Facebook surely knew it was capturing this information before the story broke in the media — and it had no problem devouring that data for months or years beforehand. So why would an app as intimate as an ovulation tracker send its data to Facebook in the first place?
The answer is the same as always: money. More than likely this is all about mining data to better target ads so Facebook can continue to make more money — and it wouldn’t surprise us to learn that there was a behind-the-scenes quid pro quo between Facebook and these developers to exchange the records. Despite FB’s insistence that this was all an accident, there’s little reason to believe that’s actually the case.
Let’s give Facebook the benefit of the doubt, though, and play devil’s advocate for a moment. What if this was all just a mistake? Well, it’s ultimately just code, and code can change. So why wasn’t Facebook rejecting this information outright? Why not simply make changes on its end to ensure these apps can’t get through? These are good questions, but not ones we can answer.
What about Apple’s stance on all this? They reiterated to Apple Insider that the company requires clear disclosures of data collection to users and that it would act against any apps found to hide the sharing of information. After recent events, though, we have to say some of the blame falls on Apple’s shoulders here, too. Sure, there are thousands of apps on the store, and policing them all is difficult. However, Facebook is a repeat offender in this area, and in combination with the recent enterprise certificate fiasco we’ve previously discussed, we wonder who will come out on top in this tug of war.
The State of Play on Consumer Encryption
Who’s that at the back door?
Even though we haven’t had a story about it lately, the war over a back door in consumer-level encryption rages on, with Apple Insider highlighting another piece from the WSJ on the issue. In this story, an FBI executive assistant director referred to the encryption you enjoy on an iPhone, Mac, or any number of messaging apps as an “infection.”
Speaking to the Journal, Amy Hess said that law enforcement agencies face an increasing problem with users “going dark” — that is, slipping behind the veil of encryption, away from the prying eyes of surveillance. Law enforcement agencies such as the FBI contend that the ease of access users enjoy for encryption today only compounds problems as it allows criminals and potential terrorists to plan and work in secrecy.
Although we understand the arguments about reducing crime, backdoors hurt everyone — and that includes average users. Everyone has a right to keep their private data private; if we must sacrifice that right for the idea of safety, is it really worth the cost? 9to5Mac also has a quote from the Journal discussing why governments want to be able to bypass encryption: to “track threats” and solve criminal cases. However, perhaps it would be more accurate to say that governments merely claim this is all they want from the technology.
So why is all this coming up again? It has to do with a law Australia passed late last year that seemed to mandate back doors. A large group of tech companies, among them Google, Apple, Microsoft, and Twitter, sent a joint statement to the Australian government warning of the consequences of weakening user security choices. Australia’s law is currently seen as a “test case” for efforts by other members of the Five Eyes intelligence-sharing arrangement, which includes the US, to weaken consumer encryption. As we’ve said here time and again in the past, weakening encryption puts everyone at risk and is ripe for abuse.
Want an example of how there’s no such thing as a backdoor that only works for the good guys? Apple Insider recently pointed out that iPhone hacking tools from Cellebrite are currently available on eBay for as little as $100. Remember, these are systems that are only supposed to be used by law enforcement agencies, and the company itself is adamant that it provides its products only to verified LEOs. Now it seems those same agencies are reselling the devices, prompting Cellebrite to warn its clients against doing so.
The story gets worse: these devices are going out into other hands without sanitization. Matthew Hickey, a security researcher who purchased some of the secondhand units, found that they were full of information extracted from phones in law enforcement custody, including identifiers that could allow an unscrupulous individual to pinpoint a device’s current location.
Perhaps Apple has picked up a few units itself to start figuring out how to close these loopholes. For now, we’ll keep on hoping that the fight against encryption is a losing battle.
Onavo Goes the Way of the Dodo
Okay, let’s pull on some loose threads and follow up on some of the stories we’ve hit in the past. First up, some good news out of the Facebook dungeon: the company is finally killing off the Onavo Project. As you may remember, that’s the so-called “Facebook VPN” that claimed to help you secure your info while using your device, but in fact, it collected all kinds of data about your device usage. It caused quite a stir when it came out, and we’ve discussed it a few times on The Checklist.
Now, though, Facebook says it is ending the Onavo program in favor of “reward-based market research.” Sound familiar? That’s exactly how you could describe the side-loaded data collection app Facebook got dinged for distributing through Apple’s Enterprise Developer Program just a few weeks ago. Overall, though, this is good news — it’s one less scammy VPN around to fool users.
Maybe it’s just time for Apple to toss Facebook off iOS, though.
Cleaning Up the Nest
Feeling a little uneasy about the Nest devices in your home after our stories about built-in microphones that Google failed to disclose? Well, here’s some more good news: Google has released a complete list of the products in which you’ll find a built-in mic. This way, if you own one of these devices, you can ensure you have disabled its listening capabilities. The affected devices include:
- Nest Hello
- Nest Cam Indoor
- Nest Cam IQ Indoor
- Nest Cam Outdoor
- Nest Cam IQ Outdoor
- Nest Protect (2nd gen. only)
- Nest Guard Hub
If you have a Nest Thermostat, though, don’t worry — there isn’t a microphone in that. Here’s a quick reminder on how to disable that feature:
- Open the Settings app in your Nest software, or tap “Nest Guard” if you own that product.
- Tap “Google Assistant.”
- Tap on the switch to disable Google Assistant.
And you’re done!
The Hill Has Qs
Finally, CNET reports that Congress has taken an interest in Google. The US Senate Commerce Committee has demanded that Google CEO Sundar Pichai address the microphone controversy directly. Saying that the lack of disclosure “raises serious questions” about Google’s commitment to transparency with consumers, the committee wants answers to six key questions before it decides what further action to take. Those questions include:
- Were there always microphones in Nest devices?
- When did Google discover it had failed to disclose the microphone?
- Why weren’t people told in the first place?
- What has Google done to correct the problem since it became known?
- Have third parties ever used the microphones illegitimately?
- What else is Google hiding in other hardware?
The committee requested answers by March 12 — and that Pichai appear in person before the full committee at the end of the month.
We look forward to bringing you the results of this story as it plays out, so stay tuned.