When Edward Snowden met with lawyers in Hong Kong in 2013 to get legal advice after he told the world about the National Security Agency’s (NSA) spying program, he asked everyone at the meeting to put their phones in the refrigerator.
It was an odd request, but the idea was that the refrigerator’s thickly insulated metal walls (not the temperature inside) would act as a shield. Working something like a Faraday cage, they would block radio signals and prevent eavesdropping.
When big tech, advertisers and even your boss seem to shadow you, keeping your phone in the refrigerator may look appealing. But is this just paranoia, or are our phones really spying on us? A new study investigates ways to protect our privacy.
Restricting App Access
Our phones are not spying on us, says Jonathan Weissman, a cybersecurity expert at the Rochester Institute of Technology in Rochester, N.Y. But our apps might be. An app that has access to your phone’s microphone can certainly hear what you’re saying. Weissman suggests being wary of the access you give apps to things like the camera, your contacts list and your photos.
“Ask yourself,” says Weissman, “why would a weather app need access to my microphone? In just about all cases, an app that asks for permission to access things it doesn’t need or shouldn’t have is malicious.”
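If you’re curious, you don’t have to take an app’s word for it. The sketch below is a minimal, do-it-yourself audit; it assumes adb (Android’s command-line debugging tool) is installed, a phone is connected with USB debugging enabled, and the package name is a made-up placeholder you’d swap for a real one.

```python
# Minimal permission audit for an Android app via adb.
# Assumes adb is installed and a device is connected with USB debugging on.
import subprocess

SENSITIVE = {"RECORD_AUDIO", "CAMERA", "READ_CONTACTS", "ACCESS_FINE_LOCATION"}

def granted_permissions(package):
    """Return the android.permission lines reported as granted for a package."""
    dump = subprocess.run(
        ["adb", "shell", "dumpsys", "package", package],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.strip() for line in dump.splitlines()
            if "android.permission." in line and "granted=true" in line]

# "com.example.weather" is a made-up package name; substitute a real one.
for perm in granted_permissions("com.example.weather"):
    name = perm.split(":")[0].rsplit(".", 1)[-1]
    flag = "   <-- does a weather app really need this?" if name in SENSITIVE else ""
    print(perm.split(":")[0] + flag)
```

Running it prints every permission the app actually holds and flags the sensitive ones, which is Weissman’s weather-app test in code form.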
Of course, voice assistants like Alexa and Google Assistant are supposed to listen to us; otherwise, how would they know to set our alarm or order a pizza? These devices do listen, says Weissman, but they don’t record or share any information until we activate them with the wake word.
“A lot of people think these devices are collecting conversations, and that’s just not true,” he says. “The recording starts only when the trigger word is spoken.” He also points out that with Alexa, at least, you can have what you say sent to the cloud as text, so there is no recording of your voice.
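That gating is simple enough to picture in code. The toy sketch below is purely illustrative, not any vendor’s actual implementation: audio lives only in a short rolling buffer that constantly overwrites itself, and nothing leaves the device until a (hypothetical) wake-word detector fires.

```python
# Toy illustration of wake-word gating. detect_wake_word and send_to_cloud
# are hypothetical stand-ins, not any vendor's real implementation.
from collections import deque

BUFFER_CHUNKS = 20  # roughly two seconds of audio, at ten chunks per second

def detect_wake_word(chunk):
    """Stand-in for an on-device wake-word model."""
    return b"alexa" in chunk.lower()

def send_to_cloud(audio):
    """Stand-in for the upload that happens only after the trigger."""
    print(f"uploading {len(audio)} bytes for processing")

def listen(stream):
    buffer = deque(maxlen=BUFFER_CHUNKS)  # old audio falls off and is gone
    for chunk in stream:
        buffer.append(chunk)
        if detect_wake_word(chunk):
            send_to_cloud(b"".join(buffer))  # only now does audio leave the device
            buffer.clear()
```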
Using Artificial Intelligence Against Itself
Still feeling a little paranoid? That’s understandable. Big corporations haven’t exactly earned our trust with the way they handle our information. Plus, people can hack devices. But machine learning — the same technology that helps apps understand you — can make it difficult for those apps to make sense of what you’re saying.
The new technique uses machine learning to generate background noise while you talk. But it’s not just any background noise: it’s specifically designed to confuse the artificial intelligence (AI) that’s listening to you. The system finds patterns in your speech, then alters your words just slightly, in a way that a human could still understand but that would baffle a fellow AI.
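At its simplest, this is the classic adversarial-example trick: nudge the audio in whatever direction most increases the listening AI’s transcription error, while keeping the nudge too quiet for a human to mind. Here is a minimal sketch using the fast gradient sign method, where asr_model and transcription_loss are hypothetical stand-ins for a differentiable speech recognizer and its loss:

```python
# Fast-gradient-sign-style adversarial noise for audio. asr_model and
# transcription_loss are hypothetical stand-ins; epsilon caps the volume.
import torch

def camouflage_noise(audio, transcript, asr_model, transcription_loss,
                     epsilon=0.01):
    """Return a barely audible perturbation that degrades transcription."""
    audio = audio.clone().detach().requires_grad_(True)
    loss = transcription_loss(asr_model(audio), transcript)
    loss.backward()
    # Step *up* the loss gradient: much worse for the AI, faint to our ears.
    return epsilon * audio.grad.sign()

# protected = audio + camouflage_noise(audio, transcript, asr_model, loss_fn)
```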
There’s a catch, though: a system that can only alter your words after you’ve said them is always a step too late. Researchers at Columbia University came up with a way around this. Much as a large language model predicts the next word in a sentence, their program predicts what you’re about to say, then prepares the distortion in advance.
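Here is a loose sketch of that idea, with predictor and make_noise as hypothetical stand-ins (say, a forecasting network and the gradient step above): forecast the next slice of speech from what has been heard so far, compute the camouflage for the forecast, and have it ready the instant the real words arrive.

```python
# Loose illustration of a predictive attack: compute noise for speech that
# hasn't happened yet. predictor and make_noise are hypothetical stand-ins.
import torch

def stream_camouflage(chunks, predictor, make_noise, history_len=16):
    """Yield (chunk, noise) pairs where the noise was computed beforehand."""
    history, pending_noise = [], None  # no forecast yet for the first chunk
    for chunk in chunks:
        yield chunk, pending_noise                     # prepared one step ago
        history = (history + [chunk])[-history_len:]
        predicted_next = predictor(torch.cat(history))  # guess the next chunk
        pending_noise = make_noise(predicted_next)      # camouflage in advance
```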
In a study testing the approach, the technology disguised about 80 percent of what was said. To the study’s lead researcher, Mia Chiquier, that’s even better than it sounds: the words the system distorted tended to be the ones carrying the most information. That means an eavesdropping program may pick up plenty of filler like “so,” “have” and “you,” but completely miss the important words, the ones that explain how you found out about the NSA, for example.
Chiquier and her team are not producing an app that does this; they have simply shown that it can be done. But if someone decides to market the technology, they probably wouldn’t have trouble finding customers. Not many of us are leaking government secrets, but most of us would like a little privacy.