Hackers can use white noise to break into your Alexa

By making small, carefully crafted changes to an audio file, an attacker could essentially override the message the voice assistant is supposed to receive and substitute sounds that will be interpreted differently, giving the assistant a command that is virtually unrecognizable to the human ear.

Researchers at the University of California, Berkeley were able to activate the AI assistants on smartphones and smart speakers, prompting them to open websites or dial phone numbers.

The hidden instruction is inaudible to the human ear, so there's no easy way of telling when Alexa might be tricked into adding an item to your Amazon shopping cart or unlocking your front door, for example.

Researchers have found a way to hide malicious voice commands in music that Alexa, Siri, and Google Assistant will follow.

Over the past two years, researchers working in university labs have developed hidden commands that Siri, Alexa, and Google Assistant will pick up but humans will not. Although the commands are inaudible to people, they can be heard and acted on by the machine learning software that powers these digital assistants. "My assumption is that the malicious people already employ people to do what I do," said Nicholas Carlini, one of the UC Berkeley researchers.

What these studies prove is that it's possible to manipulate speech-recognition systems by making minute changes to speech or other audio files. In the latest development of the research, the UC Berkeley team found a way to hide commands within recordings of music or spoken text.

They also embedded other commands into music clips. Try as we might, we couldn't discern the difference.

The researchers have now demonstrated that automatic speech recognition, like image recognition before it, is vulnerable to such adversarial attacks. Speech-recognition systems typically translate each sound to a letter and compile these into words and phrases. By making slight changes to audio files, the researchers were able to cancel out the sound the speech-recognition system was supposed to hear and replace it with one that machines transcribe differently while remaining almost undetectable to the human ear.
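The approach described above is essentially an adversarial-example attack on a speech-to-text model. Below is a minimal sketch of the idea, assuming a PyTorch environment; the ToySpeechModel, the character set, and the perturbation budget eps are illustrative stand-ins of my own, not the researchers' actual model or code. A real attack would target a trained system such as Mozilla DeepSpeech, whereas the untrained toy network here only shows the shape of the optimization: nudge the waveform, within a tiny budget, until it transcribes as an attacker-chosen phrase.

```python
# Hypothetical sketch of an adversarial-audio attack loop (not the published code).
import torch
import torch.nn as nn

CHARS = " abcdefghijklmnopqrstuvwxyz"   # class 0 is reserved for the CTC blank
NUM_CLASSES = len(CHARS) + 1

class ToySpeechModel(nn.Module):
    """Stand-in acoustic model: raw waveform -> per-frame character logits."""
    def __init__(self, frame: int = 320):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=frame, stride=frame),  # crude framing
            nn.ReLU(),
            nn.Conv1d(64, NUM_CLASSES, kernel_size=1),
        )

    def forward(self, wav):                 # wav: (batch, samples)
        logits = self.net(wav.unsqueeze(1)) # (batch, classes, frames)
        return logits.permute(2, 0, 1)      # (frames, batch, classes) for CTC

def encode(text):
    # Map characters to class indices; +1 skips the CTC blank at index 0.
    return torch.tensor([[CHARS.index(c) + 1 for c in text]])

def adversarial_perturbation(model, wav, target_text, steps=500, lr=1e-3, eps=0.005):
    """Search for a small delta so that model(wav + delta) decodes to target_text."""
    ctc = nn.CTCLoss(blank=0)
    target = encode(target_text)
    delta = torch.zeros_like(wav, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        log_probs = model(wav + delta).log_softmax(dim=-1)
        in_len = torch.tensor([log_probs.shape[0]])
        tgt_len = torch.tensor([target.shape[1]])
        loss = ctc(log_probs, target, in_len, tgt_len)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            # Keep the change tiny so the clip still sounds unchanged to a person.
            delta.clamp_(-eps, eps)
    return delta.detach()

if __name__ == "__main__":
    model = ToySpeechModel()
    song = torch.randn(1, 16000)            # pretend this is a second of music
    delta = adversarial_perturbation(model, song, "open the door")
    print("max perturbation:", delta.abs().max().item())  # stays within eps
```

In a real attack the loss would also be weighted against an audibility measure, so the perturbation hides under the music rather than simply staying below a fixed amplitude.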

Consider that Amazon, which is behind Alexa, wants to use a series of connected locks and cameras to allow deliveries into your home or auto while you're not there.

Amazon told The New York Times it has taken steps to ensure its speaker is secure. Apple says that its HomePod won't act on commands that unlock doors, and iPhone and iPad models must be unlocked before Siri can access personal data, call up websites, or open apps.

But Carlini explained that the researchers' objective is to flag the security problem and then try to fix it. Does all of this sound far-fetched?

Are you turning the microphones off on your smart speakers as we speak?
