SecureMac, Inc.

Checklist 57: SecureMac Roundup: More Questions from Our Listeners

October 5, 2017

Part 1 of a two-episode series where we go over questions from our listeners!

Today, we’re looking at some of your questions! We try to cover everything we can each week for every topic we discuss, but questions do still come up, and our listeners have sent in some of theirs to see if we can provide some clarity. Today, we’ll be covering topics such as:

  • Passwords
  • The psychology of hacking
  • Jailbreaking for security
  • More thoughts on VPNs

Join us as we break down our listeners’ questions and provide you with the answers you need.

Protecting your passwords with apps

This one goes all the way back to Episode #8 when we looked at some password do’s and don’ts. Listener Colin wrote in with a couple of questions, but let’s begin with the first thing he’s curious about: Does Apple’s Keychain count as a password app, and how secure is it?

For those unfamiliar with this type of software, let’s quickly go back over what a password app is and why you might want one. On a basic level, it’s just an app whose primary purpose is storing your passwords, so you don’t have to remember them; these apps often make it easy to log in to all your favorite sites without the need to store your passwords in the browser.

The reasoning behind their development is something we’re all familiar with: it’s a bad idea to reuse the same password across tons of sites, like your email, your bank, and even iTunes. If any one of those accounts gets hacked, the bad guys can take your passwords, run with them to other websites, and break into your accounts.

Good, strong passwords are hard to make, too, which is another reason these apps are so useful. Computers are very good at churning through a vast number of tasks very quickly, which makes it easy for hackers to attempt a “brute force” attack: using a fast computer to try every possible combination of letters and numbers until it hits on the right password — in other words, trying 123, 1234, 12345 and so on until it figures out your password.
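To get a feel for the numbers, here’s a rough back-of-the-envelope sketch in Swift. The guess rate is an assumed figure for illustration, not a measurement of any real attacker’s hardware.

```swift
import Foundation

// Rough sketch: how long would it take to exhaust a password's search space?
// The guess rate below is an assumption for illustration only.
let guessesPerSecond = 1e10  // assumed: ten billion guesses per second

func secondsToExhaust(alphabetSize: Double, length: Double) -> Double {
    return pow(alphabetSize, length) / guessesPerSecond
}

// 8 lowercase letters: 26^8 ≈ 2.1e11 combinations
print(secondsToExhaust(alphabetSize: 26, length: 8))           // ≈ 21 seconds

// 16 characters from a ~72-symbol set: 72^16 ≈ 5.2e29 combinations
let years = secondsToExhaust(alphabetSize: 72, length: 16) / (3600.0 * 24 * 365)
print(years)                                                   // ≈ 1.6 trillion years
```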

A secure password – something significantly lengthy and complex – makes a brute force attack unfeasible, because it would take even the fastest computer decades or centuries to try enough combinations. However, people are bad at making passwords; we tend to use easily guessable things like the names of pets and our birthdays. Sure, those are easy for you to remember off-hand, but they’re also easy for a hacker to figure out. Password apps often have strong password generators built in, so you can create good security for yourself without ever needing to remember a long, confusing string.
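As an illustration, here’s a minimal sketch (in Swift, with a function name and character set we made up for this example, not any particular app’s API) of what such a generator might do: pull bytes from the system’s cryptographically secure random number generator and map them onto a large character set.

```swift
import Foundation
import Security

// Minimal sketch of a strong password generator. The function name and
// character set are our own choices for illustration.
func generatePassword(length: Int = 20) -> String {
    let alphabet = Array("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*()-_=+")
    var bytes = [UInt8](repeating: 0, count: length)
    // Pull random bytes from the system's secure RNG.
    guard SecRandomCopyBytes(kSecRandomDefault, bytes.count, &bytes) == errSecSuccess else {
        fatalError("Secure random number generator unavailable")
    }
    // Map each byte onto the character set (ignoring the tiny modulo bias for brevity).
    return String(bytes.map { alphabet[Int($0) % alphabet.count] })
}

print(generatePassword())  // e.g. "q7#Vd2!kLm9_ZpRt"
```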

So, does Apple’s Keychain count as a password app these days? The answer is yes, and it has improved a lot over its earlier incarnations. You might’ve seen your Mac suggest complex alphanumeric passwords for you to use before. In the past, these weren’t as strong as they could have been, but today the passwords it generates provide strong security. Keychain isn’t as customizable as third-party apps are, but it still provides a reasonable level of functionality.

Keychain stores your data securely on disk, locked behind a master password that unlocks the whole thing. This password is usually the account password for your Mac, but you have the option to change it if you wish. Once you’ve unlocked your Keychain, you’re authenticated and can use the items stored inside it, like your other account passwords. When you’re not logged in or haven’t entered the master password, all that sensitive data is encrypted and inaccessible. In this way, it covers the same bases as many password apps.
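For the curious, here’s a small sketch of how an app hands a secret to the Keychain through Apple’s Keychain Services API. The service and account names below are placeholders for this example.

```swift
import Foundation
import Security

// Store a secret in the Keychain as a generic password item.
let secret = Data("correct-horse-battery-staple".utf8)
let addQuery: [String: Any] = [
    kSecClass as String:       kSecClassGenericPassword,
    kSecAttrService as String: "com.example.demo",      // placeholder service name
    kSecAttrAccount as String: "listener@example.com",  // placeholder account name
    kSecValueData as String:   secret
]
// The item is encrypted on disk; unlocking the keychain (usually with your
// account password) is what makes it readable again.
let addStatus = SecItemAdd(addQuery as CFDictionary, nil)

// Reading it back requires an unlocked keychain.
var result: AnyObject?
let readQuery: [String: Any] = [
    kSecClass as String:       kSecClassGenericPassword,
    kSecAttrService as String: "com.example.demo",
    kSecAttrAccount as String: "listener@example.com",
    kSecReturnData as String:  true
]
let readStatus = SecItemCopyMatching(readQuery as CFDictionary, &result)
if readStatus == errSecSuccess, let data = result as? Data {
    print(String(decoding: data, as: UTF8.self))
}
```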

In recent years, iOS gained the ability to autofill passwords from your Keychain into login forms in Safari — so now it’s even acting like many of the most popular third-party apps. Apple has made leaps and bounds in bringing Keychain up to speed, so it is an excellent integrated and free solution for password storage. Now, that doesn’t mean there aren’t some concerns, too. To sync your Keychain between devices, the data must be stored in iCloud. Apple does rigorously secure this data, but there will always be some listeners out there who are wary about trusting a cloud service with their data.

We frequently hear people asking whether using iCloud and Keychain together is safe. If you would be comfortable using an app like 1Password, you should feel comfortable with this setup. For most people, the appeal of these apps comes from the ability to use the same app across all your devices for easy access to your login information, which means you’ll be relying on someone’s cloud service no matter where you go. There are options for offline storage, though that choice comes with its own challenges and a loss of convenience.

If you’re a target for hackers going after your data, you might want to take steps to ensure you don’t store sensitive data on someone else’s cloud servers, whether it’s Keychain or not. However, the number of people who really fit into that category is very small. If you’re a typical, everyday user and you trust Apple for everything else, you can trust iCloud.

What if hackers don’t target your information individually, but instead compromise an entire password service? We’ve seen this happen before. The good news is that these companies use strong encryption to keep your data safe. The bad news is that we simply have to expect that some of these breaches will occur; it’s the world we live in now. When they do happen, look to see what your provider says about how they stored your data and what was compromised. That information allows security pros to offer advice to average users about how concerned they should be. As we go forward, we’ll have to consider each breach on a case-by-case basis; encryption makes some breaches less of a threat, while a lack of proper security makes other incidents much more troubling.

Let’s summarize with a real-world example: say you have a safe in your home. Not only does it have a sturdy lock, but it’s also incredibly heavy. A burglar breaking into your house might want what’s inside, but the effort involved in getting to it keeps them from even trying. Maybe your password app isn’t perfectly safe, but it’s certainly better than leaving your passwords written down in plain sight. Strong encryption is like that heavy safe: it just takes too long for the bad guys to break in, giving your other security measures time to work.

Does a company ever need to know your password?

Now let’s move on to Colin’s second question. He wonders:

If you want your phone unlocked so you can use it on whatever network you want, is it necessary to give the cell company your phone password? A friend of Colin’s recently had their phone unlocked by a cell phone shop, and the shop asked for their passcode before doing the work.

First, let’s clear up some terminology: “unlocking” can mean two different things. It might mean supplying your phone password to get to your Home Screen, or it can mean changing some device settings to allow your phone to be used on other carrier networks. Colin wonders if you need to do the first type of unlocking to facilitate the second type. The basic answer is yes — if you want to move between networks, and are having someone else do it, you’ll need to provide them with access to your phone so they can gather information and change the appropriate settings.

At an Apple Store, an associate will hand your phone back, ask you to input your passcode, and look away. In this case, it sounds as though the shop in question didn’t care about its customers’ security quite so much. Perhaps there was a long line, or maybe it was just a matter of poor practices. Either way, you shouldn’t be forced into a situation where you must hand over your password.

Apple recommends that you simply call your carrier directly and ask for a “device unlock.” While not every provider will do this for you, many will be accommodating and unlock your device through a remote process. If you must use a third-party service, think ahead: you could enter your passcode to unlock the phone before you hand it over, or change the passcode after the process is over. If you go this route, do your homework and choose a store that you can trust.

What if you’re concerned about giving someone access to your phone with sensitive data on board? Just make a backup of your phone at home and then wipe the device. The store can unlock your phone, and no employees will be able to pry into your information. When you get home, just restore from backup! You’ll have your data back, and your phone will still be unlocked from its previous carrier.

Picking apart the psychology of hacking

Next up, Kevin asks an interesting question about the psychology behind hacking: In your experience, what makes one person become a black hat hacker while another goes the white hat route? Do you know people who started out black hat and changed over to white hat? Is it always pure greed that makes one go black hat, or are there other motivations in play for both sides?

This question is a tough one to answer, and it’s not one we can offer a definitive conclusion on; in the end, it’s really situational. It depends, to an extent, on where a person is in their life’s journey, how old they are, their level of education, and many other factors. It can also sometimes be part of growing up. If you have a kid who’s interested in computers, programming, and security, you may find them doing minor black hat-type activities to impress their friends or others online; they’re showing off. To an extent, we can see this as “kids being kids.” Many of these individuals who take an interest in security when they’re young grow out of this phase and start to view it as a potential career.

The ability to turn hacking into a career has led to a significant shift in the balance between white and black hats. The prevalence of bug bounty programs has helped convert hackers from rogue programmers into legitimate freelancers who help companies develop better practices. Corporations invite hackers to break into their systems and identify issues, providing payouts if the hackers can document and prove a bug exists.

These programs involve many rules to provide a secure atmosphere for hacking, such as limiting activities to specific test servers or limiting hacks to one’s own account. If you follow the rules, you can nab a sizeable payout. After all, it’s cheaper to pay out a bug bounty than deal with the fallout of a major security incident.

Up until about ten years ago or so, these programs didn’t exist. Before that, if you tried reporting a security flaw, it was a 50/50 shot whether the company would thank you or report you to the FBI for cybercrime! Slowly, these attitudes have changed, as corporations see that it’s better to enlist hackers than to try to dissuade them from probing their systems altogether.

However, you will still have people who are in it out of greed, sometimes revenge, or just because they want to cause chaos. Perhaps they don’t have a job that makes a lot of money, but they do have extensive technical skills, and they realize they could be making a lot more if they were doing nefarious things with their computer knowledge rather than spinning their wheels. Who decides to make the leap into black hat hacking? It depends on the person. Sometimes, even when a good security job is available, these people would rather keep hacking systems to cause havoc.

The resources that exist now, though, are much greater than when the previous generation of hackers came up in the world, and that opens other pathways. Now we have books, classes, online tutorials and more, all offering a wide window into computer security. Years ago, security guides were written by the same people who were hacking the systems. Now you can get a degree in computer security — itself a new development. As a result, there are many more options for people to go down the “white hat” road these days.

A lot of the major contributors to the security field these days got their start as black hats. There’s Kevin Mitnick, of course, who did a lot of illegal hacking and social engineering and eventually spent time in jail for his activities. Today he’s a major consultant with some pretty influential books on the shelves. There’s Kevin Poulsen as well, a black hat who did his time and is now a researcher and author known around the world. So again, it comes down to the individual; not every black hat who gets into trouble goes on to redeem themselves.

With security research more in the mainstream these days, though, there are more opportunities for people interested in security to start out with white or grey hat hacking rather than beginning on the dark side. Remember, hacker culture has changed enormously as well. Back when things started, people were just exploring and trying to gather more knowledge — we can see that clearly in the stories of both Captain Crunch and Steve Wozniak, which we delved into in our episode on Apple’s Hacker History. Now, if you want to learn something new, there are just so many more resources.

To be a good researcher, do you have to be a good hacker? In a sense, yes. You might not need to be a proficient hacker and a wizard with code, but you should be able to look at systems as if you were a black hat. If you aren’t, you’re not doing your due diligence. It’s important to remember that even as we strive to stay away from black hat activities, we need to keep up on what they’re doing and how they work. Otherwise, how can we protect against them?

Jailbreaking: is it ever a good idea?

We’ve talked about jailbreaking before a time or two on The Checklist. We’ve got an email from Stefan, who says:

“In a recent episode, you talked about jailbreaking and its risks and benefits. My understanding is that when you are browsing the web on an iOS device, you are susceptible to all the methods of user tracking that companies have devised, except, perhaps, for getting malware through ads, if you use an ad blocker. The main reason I jailbroke my iOS device was to install an outbound firewall comparable to Little Snitch, and be able to control which apps ‘phone home’ or track me through various services like Google Analytics or Google Safe Browsing without my consent.

“If you have any suggestions on how to achieve safe browsing on iOS comparable to using a Tor browser and/or a specifically hardened Firefox and/or an outbound firewall like on macOS, without resorting to a jailbreak, please share them.”

As a reminder to our listeners: when you jailbreak a device, you are exploiting a flaw in iOS to bypass its security restrictions and install software that Apple normally wouldn’t allow. This is perhaps the first time we’ve encountered the idea of jailbreaking for security purposes. For the vast majority of users, it’s simply not a good idea, and we suggest that you do not jailbreak your phone.

Apple has been playing cat and mouse with jailbreakers for some time now, with each new iOS update closing the holes that jailbreaks rely on. Usually, the current working jailbreak is a few versions behind the current iOS release. So, if you jailbreak your device, you can’t update your phone, as updating will reverse the process. In other words, you’ll be missing out on all the security patches Apple pushes with every update, leaving your device wide open to some nasty hacks. So even if you plan to jailbreak to improve your security, you’re really creating more holes in your defenses.

Stopping apps and sites from tracking you is a complex undertaking. You can achieve some degree of privacy with Safari Content Blockers, which let you block certain types of content and tracking in the browser. As Stefan mentioned, there is also a special, more security-conscious version of Firefox for iOS. As far as the apps themselves go, though, tracking is very tricky to stop. We saw this in our discussion of AccuWeather’s GPS tracking on a recent episode. Mostly, you must rely on word of mouth and the news to know which apps to avoid.
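For listeners curious what a content blocker actually is under the hood, here’s a minimal sketch modeled on Apple’s Xcode template: a tiny extension that hands Safari a JSON rule list, which Safari then enforces on its own, so the extension never sees your browsing. The rule shown in the comment is just an illustrative example.

```swift
import Foundation

// Sketch of a Safari Content Blocker extension's request handler.
// The blocking logic lives in a JSON rule list bundled with the extension;
// an example rule that blocks third-party cookies might look like:
//
//   [{ "trigger": { "url-filter": ".*", "load-type": ["third-party"] },
//      "action":  { "type": "block-cookies" } }]
//
class ContentBlockerRequestHandler: NSObject, NSExtensionRequestHandling {
    func beginRequest(with context: NSExtensionContext) {
        // Hand Safari the rule list; Safari enforces the rules itself.
        guard let url = Bundle.main.url(forResource: "blockerList", withExtension: "json"),
              let attachment = NSItemProvider(contentsOf: url) else {
            context.cancelRequest(withError: CocoaError(.fileNoSuchFile))
            return
        }
        let item = NSExtensionItem()
        item.attachments = [attachment]
        context.completeRequest(returningItems: [item], completionHandler: nil)
    }
}
```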

However, the researcher who uncovered the AccuWeather issue, Will Strafach, is building a new service that might interest Stefan. Called Verify.ly, it is leaving beta, and Strafach uses it to analyze iOS apps to see what data they might send off to servers. The service then publishes security alerts to help users know, to an extent, which apps track their usage or expose their data.

So, there’s no real way to have an outbound firewall like Little Snitch without a jailbreak right now — but as we can see, jailbreaking tips the scales too far in the wrong direction. There are other, smaller changes you can make as well. For example, you could set your default search engine to DuckDuckGo, which does not track users in the ways Google does. And in iOS 11, Safari blocks cross-site tracking by default.

VPNs revisited

Finally, our last listener question for the day:

“I’ve listened to your Checklist on VPNs, and it’s a great introduction to the topic. It was good to see you address the difference between privacy and anonymity, which seems to escape a lot of people.” Our listener also brings up the anecdote we discussed about Netflix during that episode, which involved accessing a US version of Netflix from the UK, and shares concerns about the news that Netflix has begun to crack down on VPN usage. Is that true?

It is, and it isn’t just Netflix. Hulu and many other streaming services are compiling lists of which IP addresses belong to VPN servers, and they’ve started blocking access from those addresses. Our listener shares that he has created a website that lets you compare VPN providers that still work with Netflix; you can find that resource here: https://www.comparitech.com/blog/vpn-privacy/netflix-proxy-error-here-are-6-vpns-that-still-work-in-2016/. For those who want to get their Netflix fix while traveling outside the US, this could be a helpful resource.

Here’s another VPN question: how do you know when you’re doing things right? Put another way, if you’re going to use your laptop at Starbucks, when should you turn on your VPN?

If you connect to a public Wi-Fi network and see a captive portal page with Terms of Service and an agreement button, you must agree and join the network before anything else; until then, you won’t be able to access the Internet, so you can’t reach your VPN’s servers. Afterward, though, you’re free to turn on your VPN immediately. One thing to keep in mind is that many apps on your computer will try to connect as soon as they detect an active connection, so you may want to quit any software that auto-connects beforehand. Secure your VPN connection first, then launch your apps.
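One quick sanity check, if you like to be sure: before launching anything else, ask an outside service which IP address your traffic appears to come from. If it still shows the coffee shop’s address rather than your VPN provider’s, the tunnel isn’t up yet. Here’s a small Swift sketch; api.ipify.org is just one example of such a lookup service, used here for illustration.

```swift
import Foundation

// Print the public IP address the outside world currently sees for this machine.
let url = URL(string: "https://api.ipify.org")!
let semaphore = DispatchSemaphore(value: 0)

URLSession.shared.dataTask(with: url) { data, _, error in
    if let data = data, let ip = String(data: data, encoding: .utf8) {
        print("Your traffic currently exits from: \(ip)")
    } else {
        print("Could not reach the lookup service: \(error?.localizedDescription ?? "unknown error")")
    }
    semaphore.signal()
}.resume()

semaphore.wait()  // keep this command-line script alive until the request finishes
```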

On iOS, double-tap your Home button and swipe up on the apps you want to close before making your VPN connection. On a laptop, some apps and services run in the background where you can’t easily see or close them — so an outbound firewall that blocks all connections until you’re on the VPN is a smart idea.

What if you aren’t connecting to public Wi-Fi on your iOS device, but instead rely solely on your cellular connection? Do you still need to use a VPN? No, not necessarily — you should be able to trust your cell carrier, after all! In fact, in some cases you may even be able to trust your carrier more than a VPN provider.

That’s everything we have for this show! Do you have a question of your own? Somebody else might benefit from the answer, too, so feel free to send us your questions in an email. Just drop a line to checklist@securemac.com, and remember that you can find show notes for all our other shows conveniently right here.
