SecureMac, Inc.

Karen Gullo on government monitoring

June 16, 2020

Karen Gullo is an analyst and senior media relations specialist for the Electronic Frontier Foundation (EFF). She is an award-winning journalist who has written for both The Associated Press and Bloomberg News. Her writing has focused on the intersection of law, politics, and technology. In her current role, Gullo supports EFF’s mission of defending civil liberties and enhancing rights in the digital world.

If you would like to learn more about Karen and her work, you can follow her on Twitter or read her past articles and blog posts on the EFF website.

If you follow digital privacy news, you’re probably well aware of the traditional threat landscape. Hackers make headlines with each new data breach, and the cybersecurity press is constantly warning us about the latest phishing scam aimed at stealing our personal information. Similarly, the privacy issues with large tech companies like Facebook and Google have been amply documented and discussed.

But what you may not be aware of is the extent to which your own government is deploying technological surveillance tools — both nationally as well as locally — and using them to keep tabs on you.

As Karen Gullo of the Electronic Frontier Foundation (EFF) explains, the scope of these government monitoring programs is truly staggering:

Governments and/or law enforcement are increasingly relying on tech tools at the local, state, and federal level to monitor, track, and collect information about people. 

This includes street-level surveillance — cameras on telephone poles that capture facial scans or the license plate number on your car, surveillance drones, “stingrays” that trick cellphones into connecting with them, police bodycams, etc. — as well as social media monitoring, GPS tracking, facial recognition cameras at airports, mass collection of Americans’ call records authorized under laws enacted after 9/11, and much more.

This affects everyone, but disproportionately affects marginalized, over-policed communities.

One particular area of concern is the increasing use of face recognition technology by law enforcement. It has been enthusiastically embraced by organizations of all sizes, from federal agencies like the FBI and ICE to local police departments. However, there are serious issues with the technology. Public officials have started to weigh in on the matter, with U.S. Senator Ed Markey (D-MA) standing out as perhaps the most vocal critic of face recognition in Washington. EFF, for its part, has strongly opposed the use of face recognition tech by law enforcement — and has encouraged citizens to take action to end its use in their communities.

Calls for a complete ban may strike some as overkill — especially those who believe that face recognition has the potential to prevent crimes. But Gullo challenges assumptions about the technology’s basic efficacy, and points out that it could end up adversely impacting civil liberties and social justice.

Face recognition may seem convenient and useful, but is actually a deeply flawed technology that exposes people to constant scrutiny by the government, and has the potential to chill free speech and movement by identifying and tracking people as they visit their doctors, lawyers, houses of worship, or political demonstrations. It also can generate inaccurate reports, with racial and gender bias embedded in many systems, that disproportionately affect marginalized communities.
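
To make that concern concrete, here is a minimal, purely hypothetical sketch of how an auditor might compare false-match rates across demographic groups. The data, group names, and threshold below are invented for illustration — this is not code from EFF or any vendor — but it shows why a gap in error rates matters: the people in the higher-error group are the ones who end up wrongly flagged.

```python
# Hypothetical audit sketch: compare false-match rates of a face recognition
# system across demographic groups. The scores, groups, and threshold below
# are invented for illustration; real audits use large, controlled datasets.

from collections import defaultdict

# Each record: (demographic group, similarity score, whether the two faces
# actually belong to the same person)
comparisons = [
    ("group_a", 0.61, False), ("group_a", 0.42, False), ("group_a", 0.95, True),
    ("group_b", 0.88, False), ("group_b", 0.83, False), ("group_b", 0.97, True),
]

THRESHOLD = 0.80  # scores at or above this are treated as a "match"

false_matches = defaultdict(int)
different_person_pairs = defaultdict(int)

for group, score, same_person in comparisons:
    if not same_person:  # only pairs of different people can false-match
        different_person_pairs[group] += 1
        if score >= THRESHOLD:
            false_matches[group] += 1

for group in sorted(different_person_pairs):
    rate = false_matches[group] / different_person_pairs[group]
    print(f"{group}: false-match rate = {rate:.0%}")

# An unbiased system should show similar rates across groups; a large gap
# means one community bears far more of the misidentification risk.
```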

In addition to law enforcement, governments have also shown interest in using technology to monitor citizens for reasons of public health. During the COVID-19 pandemic, authorities around the world have turned to technology as a way to implement contact tracing and try to slow the spread of the virus. 

Apple and Google have developed a cross-platform API to facilitate mobile-based contact tracing that also protects people’s privacy. Within the tech community itself, there are skeptics who doubt that Silicon Valley can deliver on its promises to safeguard user privacy — but perhaps more worryingly, many countries and locales are forgoing the new API entirely, opting to create their own (even more intrusive) versions of contact tracing.
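
For readers curious how the privacy-preserving approach works in broad strokes, here is a minimal sketch of the general idea behind Bluetooth proximity notification. It is not the actual Apple/Google Exposure Notification API — the class and function names are invented — but it captures the design: phones broadcast short-lived random identifiers, remember the identifiers they hear nearby, and later match them on-device against identifiers voluntarily published by users who report a diagnosis.

```python
# Simplified illustration of decentralized Bluetooth proximity notification.
# This is not the actual Apple/Google Exposure Notification API -- the names
# and data structures here are invented to show the privacy-preserving idea:
# rotating random identifiers, no location data, and matching done on-device.

import secrets


class Phone:
    def __init__(self):
        self.my_ids = []         # random identifiers this phone has broadcast
        self.heard_ids = set()   # identifiers observed from nearby phones

    def broadcast_id(self):
        """Generate a fresh random identifier (rotated frequently in practice)."""
        rolling_id = secrets.token_hex(16)
        self.my_ids.append(rolling_id)
        return rolling_id

    def hear(self, rolling_id):
        """Record an identifier received over Bluetooth from a nearby phone."""
        self.heard_ids.add(rolling_id)

    def check_exposure(self, published_ids):
        """Compare locally stored identifiers against those voluntarily
        published by diagnosed users; no central list of contacts is built."""
        return bool(self.heard_ids & set(published_ids))


# Two phones near each other exchange rotating identifiers.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast_id())

# If Alice later tests positive and chooses to publish her identifiers,
# Bob's phone can detect the exposure entirely on his own device.
print(bob.check_exposure(alice.my_ids))  # True
```

Note that in this decentralized model the matching step never leaves the user’s device, which is exactly the property that skeptics worry more intrusive, home-grown tracing apps will abandon.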

Gullo warns that we must be extremely careful about the powers we grant the government during an emergency — and proposes a three-part test to answer the question of whether a particular measure is acceptable:

No COVID tracking app will work absent widespread testing and interview-based contact tracing. Bluetooth proximity is the most promising approach so far, but needs rigorous security testing and data minimization. No one should be forced to use it.

Thus, when governments demand new surveillance powers — especially now, in the midst of a crisis like the ongoing COVID-19 outbreak — EFF has three questions: 

1) Has the government shown its surveillance would be effective at solving the problem?

2) If the government shows efficacy, would the surveillance do too much harm to our freedoms?

3) If the government shows efficacy, and the harm to our freedoms is not excessive, are there sufficient guardrails around the surveillance?

This approach may seem overly cautious to some, but it’s important to keep in mind that the current pandemic will one day end — and when it does, governments may be reluctant to surrender their new powers and tools. As Gullo explains, this could have serious repercussions:

Many of the new surveillance powers now sought by the government to address the COVID-19 crisis would harm our First Amendment rights for years to come. People will be chilled and deterred from speaking out, protesting in public places, and associating with like-minded advocates if they fear scrutiny from cameras, drones, face recognition, thermal imaging, and location trackers. It is all too easy for governments to redeploy the infrastructure of surveillance from pandemic containment to political spying. It won’t be easy to get the government to suspend its newly acquired tech and surveillance powers.

COVID-19 has transformed our society, virtually overnight. Millions of people are now working at home, and many companies are being forced to deal with the cybersecurity challenges of a remote workforce for the first time ever.  

For students, online learning has become the norm. But this new reality has created serious privacy issues, as more and more young people are being subjected to surveillance in the name of education. Universities and school districts are turning to exam proctoring software in order to allow students to take tests from home — software that includes such invasive features as webcam and microphone recording, AI gaze detection, and the ability to take screenshots of students’ computers.

In response to this new privacy threat, EFF has created a toolkit for students and parents who want to better understand the issue (and learn how to push back against school administrations that require such software):

We published a Surveillance Self-Defense guide for students that shows students and concerned parents what kind of technologies to watch for, how they can track you, and what it means for privacy. Often, the best solution is to simply not use the systems that schools have set up, if you’re able to, and encourage your friends to do the same. But the new guide also shows students how to gather information on what’s happening and how to talk to adults about it.

Discussions of digital privacy often assume a context of democracy and constitutional law, in which people facing threats to their privacy have a political and legal basis to challenge their governments’ surveillance activities. But for billions around the world, this is simply not the case. Many governments can and do violate their citizens’ privacy and human rights, seemingly with impunity.

Complicating the issue is the fact that most tech companies are based in democratic countries with strong privacy protections — and yet these same companies have often enabled abusive governments abroad. 

Gullo says that there are a number of things ordinary people can do to hold tech companies to account — and to further the cause of digital privacy and human rights in other countries:

This has been an issue for many years, not just during COVID. It’s deeply troubling when companies assist in human rights abuse, whether it’s Cisco selling tools custom-built to help China target minorities, or FinFisher selling spyware to the government of Ethiopia, or NSO Group selling technology to Saudi Arabia that was used to target a U.S.-based journalist.

Tech workers and customers can insist that tech companies selling to potential abusers adopt a robust Know Your Customer program, to make sure that they aren’t selling tools that are being used to repress people or populations. 

Customers still have a great deal of leverage of their own. Boycotts, or simply deciding that you won’t use technology that doesn’t reflect your values, can be a strong strategy in some situations. You can choose, support, and even help to build free and open source alternatives. Another way to advocate for more ethical technology is for Internet users to band together to change laws or public policies.

People tend to take action when they feel that change is possible. Unfortunately, there is widespread pessimism about digital privacy issues. One Pew survey showed that most Americans believed it was “impossible” to go about their daily lives without being monitored by the government or corporations. While the vast majority of those polled expressed concern over the use of their data, and believed that the risks of such monitoring outweighed the benefits, an alarming 81% of respondents said they felt as if they had little or no control over the data collected about them.

But although this pessimism is understandable, it isn’t entirely supported by the facts. While she cautions that the fight for privacy is far from over, Gullo points out that coordinated political and legal action is already producing results: 

We are all better informed about privacy abuses and the collection and sale of our private information, and users are demanding that tech companies do more to protect our privacy. Communities are pushing for bans or moratoriums on technologies like facial recognition, while states are passing strong privacy protection laws like the California Consumer Privacy Act. 

Courts have ruled in favor of consumer privacy rights in a number of very important cases, such as the “Carpenter” decision, in which the U.S. Supreme Court found that people have a reasonable expectation of privacy in their physical movements and that police need a warrant to obtain their cellphone location records. Other court rulings have strengthened privacy rights: in Pennsylvania, the state supreme court recently ruled that police can’t force people to turn over their passwords, and a federal court in Boston ruled that suspicionless searches of travelers’ electronic devices by federal agents at airports and other U.S. ports of entry are unconstitutional.

This is all good news, but much more needs to be done to protect digital privacy.  

To receive updates and news from Electronic Frontier Foundation, follow them on Twitter. To access their extensive body of informational resources and practical toolkits, visit EFF.org. EFF is a 501(c)(3) non-profit organization; if you would like to support their work, please consider making a donation through their website.
