
University of Michigan researchers hack into Alexa, Google Home, Siri with lasers

Research team discovers vulnerability of microphones in voice assistants

ANN ARBOR – Google Home. Amazon Alexa. Apple’s Siri.

Many of us own these virtual assistants and use them to play our favorite music, search for recipes, check the weather and more.

Aside from the occasional concern associated with privacy and microphones on the devices, most users don’t think twice about the power these assistants hold.

Until now.

Researchers at the University of Michigan and the University of Electro-Communications in Tokyo have found a way to hijack these assistants with lasers. That's right: devices designed to respond only to voice commands can be commandeered with light.

How does it work?

The vulnerability lies in the MEMS (micro-electro-mechanical systems) microphones these assistants use: attackers can inject invisible, silent commands into a device by aiming a laser beam at the microphone. The researchers have dubbed the attack Light Commands.

Microphones work by picking up sound and converting it into electrical signals. The researchers found they could trick MEMS microphones into producing those same electrical signals with modulated light instead of sound.
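The principle can be sketched in a few lines of code. This is an illustrative model with assumed parameters, not the researchers' actual setup: the attacker amplitude-modulates a laser's intensity with an audio waveform, and the microphone's diaphragm responds to the light's intensity envelope as if it were sound pressure.

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz, a typical audio sample rate

def amplitude_modulate(audio, bias=0.5, depth=0.5):
    """Map an audio signal in [-1, 1] to a laser drive signal in [0, 1].

    A laser cannot emit negative light, so the audio rides on a DC bias:
    intensity = bias + depth * audio. (bias and depth are illustrative
    values, not measured attack parameters.)
    """
    audio = np.clip(audio, -1.0, 1.0)
    return np.clip(bias + depth * audio, 0.0, 1.0)

def microphone_response(drive):
    """Model the mic recovering the AC part of the intensity envelope."""
    return drive - drive.mean()

# A 1 kHz test tone standing in for a spoken command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.8 * np.sin(2 * np.pi * 1000 * t)

drive = amplitude_modulate(command)        # what the laser emits
recovered = microphone_response(drive)     # what the mic "hears"

# The recovered signal is proportional to the injected command.
correlation = np.corrcoef(command, recovered)[0, 1]
print(round(correlation, 3))
```

The recovered signal correlates almost perfectly with the original audio, which is why the device processes the laser-borne command exactly as if someone had spoken it.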

The Michigan team of Profs. Kevin Fu and Daniel Genkin, Dr. Sara Rampazzi and Ph.D. student Benjamin Cyr, along with Prof. Takeshi Sugawara of the University of Electro-Communications in Tokyo, demonstrated the effect by injecting malicious commands into voice assistants, phones, tablets and smart speakers from long distances, even through glass windows.

What does this mean?

Depending on what victims have tied to their devices, the threat ranges from minimal to severe. For instance, the researchers were able to unlock and start autonomous vehicles, and even unlock the front door of a victim's home that was fitted with a smart lock.

In many cases, a hacked smartphone or voice assistant could open the door to other systems. Credit cards, connected medical devices and e-commerce accounts could all be vulnerable if they are linked to an assistant.

Moral of the story? Keep your virtual assistant far away from windows.

Learn more at https://lightcommands.com.