SecureMac, Inc.

The Checklist Podcast

SecureMac presents The Checklist. Each week, Nicholas Raba, Nicholas Ptacek, and Ken Ray hit security topics for your Mac and iOS devices. From getting an old iPhone, iPad, iPod, Mac, and other Apple gear ready to sell to the first steps to take to secure new hardware, each show contains a set of easy to follow steps meant to keep you safe from identity thieves, hackers, malware, and other digital downfalls. Check in each Thursday for a new Checklist!

Checklist 130: A Sick Phone, Your Face, and MySpace

Posted on March 21, 2019

Here on The Checklist, we spend a not-insignificant amount of time discussing the many ways that malware can attack your Mac, but what about your iPhone? Could it be vulnerable as well? Meanwhile, Congress is back at it again, with more lawmakers looking at legislating in the digital arena — this time with facial recognition technology as the target of their legislative pens. That, plus a big boo-boo by MySpace that left many old files circling the digital drain make up our stories for today, as we check those items off our list:

  • Viruses or Malware: Can Your iPhone Get Either?
  • Congress Considers Facial Recognition Legislation
  • MySpace Teaches Us All About Backups

So, what do you need to know about your iPhone and malware?

Viruses or Malware: Can Your iPhone Get Either?

Here’s the truth: just as Macs can be prone to contracting malware infections, so too can iPhones. There’s good news, though: even though it’s possible for iPhones to get sick, it’s a lot harder for it to happen than it is on the Mac. By and large, if your phone ends up with some nasty malware on it, well—chances are, it was because of something you did. That’s not a guilt trip from the Checklist team to you, though; it’s just a simple fact. Here’s another one of those: it’s easy to protect your iPhone. 

Digital Trends brings us an article this week that has the Chief Technical Officer of AV-Test admitting in an interview that iPhones can — and do — get viruses. CTO Maik Morgenstern went on to say, though, that normal users are not likely to encounter the infections that are out there, and the bar one needs to clear even to get one of those infections is very high indeed. There are vulnerabilities, though — we talk about the ones Apple patches all the time, after all — and attackers can use them to dump all kinds of things onto an insecure and/or unpatched phone.

It’s not only viruses, though. Plain old malware is far more likely to show up on a phone, sniffing for data or looking in places it shouldn’t. However, it’s tough for malware authors to find open holes in iOS to get through, so often the real vulnerability lies in the wetware — in other words, you, the user. From jailbreaking to visiting unreliable sites that could be loaded down with malware of their own, often it’s the user who opens the door — not malware authors finding their way in through the back.

Apple’s “walled garden” approach, however, has taken its fair share of criticism — including, at times, on this very show — because it doesn’t allow for as much choice on the user’s part. With that said, it’s clear there are some benefits to being more restrictive. 

That doesn’t mean iOS is completely blemish-free, though. The Digital Trends piece points out that there have been times when Apple-approved apps on the App Store themselves became infected after approval. Ultimately, though, the first line of defense is you. So, what should you do, or not do?

Don’t jailbreak your phone. It’s the same as leaving your car parked in the middle of the street with the keys in the ignition and the doors unlocked. It’s an invitation to let malware take you for a ride.

Beware of phishing scams. If a company asks for your personal info, you need to be extremely confident they’re legitimate. Don’t click suspicious links in emails. If someone calls you to ask for information, ask tough questions to make sure they’re for real. In other words: always be asking.

Beware of “smishing” scams. Smishing is just a fancy word for phishing done via text message. For example, you might receive a text message purporting to be from your mobile service provider with a link in the text message. Click it, and suddenly, you’re on a fake website or triggering some kind of malware download. Ask yourself: is the company going to text you this way? When in doubt, get in touch with the company directly and ask if it was a legitimate text message. 

Avoid shady profile installations. Some services will try to walk you through installing new developer profiles on the phone to allow you to access services banned from the App Store. The permissions you give away, though, are often more trouble than they’re worth. Just as we saw with stories in the past with Facebook, Google, and others, you never know what information you’re giving away by doing that. Want to double-check? Go to your Settings app, tap General, and look for “Profile and Device Management.” If you don’t see it, you’re totally fine! Otherwise, tap on it and see what profiles you have installed.

Now, every week we discuss the importance of running a good antimalware suite for your Mac. Do you need something like that for your iPhone or iPad? Nope — steer clear of them. In fact, share that fact with your friends and family. Many of the anti-virus apps on the App Store are scams or trying to steal data, and Apple’s policies are such that any legitimate antivirus app will likely be so limited in what it can do that it won’t really provide you with any additional protection.

Congress Considers Facial Recognition Legislation

Next, we’re heading back to Capitol Hill to cover a bill proposed by Hawaii Senator Brian Schatz and Missouri Senator Roy Blunt. TechCrunch brings us this story about the proposed bill, titled the Commercial Facial Recognition Privacy Act. The bill has several purposes.

First, it would require companies, such as Apple, to ask users directly for their permission before collecting any data used for facial recognition. Next, the bill would prohibit companies from sharing that data “freely” with any third parties. Discussing the legislation, Senator Blunt had this to say:

“Consumers are increasingly concerned about how their data is being collected and used, including data collected through facial recognition technology… That’s why we need guardrails to ensure that, as this technology continues to develop, it is implemented responsibly.”

Sounds good, right? Well, about that—there are a few reasons why the proposed legislation isn’t necessarily so great. Let’s first quote from the bill itself:

it shall be unlawful for a controller to knowingly— (1) use facial recognition technology to collect facial recognition data, unless the controller— (A) obtains from an end user affirmative consent in accordance with subsection (b); and (B) to the extent possible, if facial recognition technology is present, provides to the end user— (i) a concise notice that facial recognition technology is present, and, if contextually appropriate, where the end user can find more information about the use of facial recognition technology by the controller…

Talk about a mouthful! Let’s break down the problems here. First: this doesn’t cover any sort of government use of facial recognition data, only commercial purposes — and we should absolutely be more concerned about the former use in the long run. One only needs to look at how China deploys facial recognition software on a large scale to get an example of what can go wrong. Second, how can we trust companies to inform consumers? 

Let’s use an example one of our hosts encountered personally on a recent trip to an amusement park. Within a particular section of the park, a video was being filmed for promotional purposes. A sign near the area stated that for those who did not want to be in the video, the only option was to navigate the park a different way. Going past the sign was considered consent to appear in the video. Here’s the thing: the sign wasn’t particularly noticeable, and this doesn’t really seem like the best way to gather consent. 

So, when businesses do things like this, how can we trust that they won’t merely make the process of granting consent so frustrating (making users click through without reading) or so seamless that users don’t even know it’s there? On the surface, though, this legislation is heartening — it is good to see that lawmakers are starting to think more seriously about how to safeguard our rights and privacy in the face of new challenges. However, its shortcomings mean this is not quite a big enough step in the right direction. 

Interestingly, Microsoft has endorsed the legislation, saying that it’s now or never — we’ve got to get something done before the genie is entirely “out of the bottle” so to speak. Isn’t it already, though? 

MySpace Teaches Us All About Backups

Our final story for today is of the kind that leaves us asking, “How could this happen?”

According to TechCrunch, the old social networking giant MySpace says it “accidentally” lost massive amounts of music uploaded by users over a period of more than ten years. Savvy listeners may recall that MySpace launched the careers of numerous bands and musicians, and it was a haven for high school indie bands and professionals alike for many years. Now, that content has vanished into the digital ether forever. According to the company, more than 50 million songs representing 14 million different artists on the site were lost during a “server migration issue.”

There are a lot of questions here, especially since MySpace is being so tight-lipped about everything. It seems like an extremely poorly executed software update — did they not use a test server? Why were there no backups? Why couldn’t they simply roll the server back to its previous state? We don’t know, but we do know this: this is amateur-level stuff.

What can we learn from this? Well, for the average user, this is a perfect example of why keeping safe backups of your data is so important. There are a lot of different ways to handle them, but here are the ones you can consider:

First, you could back up to an external hard drive — though, if it’s your only other copy, you’re still at risk if something were to happen to that hard drive. Apple’s Time Machine is handy for backing up to external drives, but if your backups are all on-site, something such as a house fire can still lead to the destruction of all your data. Next, you can use off-site backups, where your data is stored on a hard drive somewhere other than your home, like in a safe deposit box.

Cloud services are useful, too. iCloud isn’t exactly a backup service, but it can help you hold on to all kinds of valuable data. Other services, such as Dropbox and Amazon’s S3, allow you to keep copies of your files in the cloud, though these can be clunky. There are also some very robust cloud backup services that give you access to multiple backups over time, so you always have protection.

Ultimately, a combination can be the most useful solution. A local solution, such as Time Machine, combined with a third-party cloud backup service will give you the best coverage. Covering all your bases with a broader backup solution is what lets you enjoy true peace of mind. For content creators, it’s important to remember that you can’t trust others with your data — not MySpace, not YouTube, and not Facebook. Even services set up to hold your data, such as Flickr, will sometimes fail you. Take personal responsibility for your own backups!

Join our mailing list for the latest security news and deals