Light Commands: Hacking Voice Assistants with Lasers

Conference: Black Hat Europe 2020



The presentation discusses how voice assistants and IoT devices are vulnerable to light-based command injection, and the need for better security measures that minimize attack surfaces.
  • Voice assistants and IoT devices are vulnerable to light-based command injection, which can compromise their security and allow unauthorized critical operations to be executed.
  • Device manufacturers have applied software patches to mitigate the attack, but security checks can still be overlooked, leaving unauthorized operations possible.
  • The attack's success depends on the attacker's ability to aim at the device's acoustic ports and maintain a line of sight to the device.
  • Future research is needed to understand the effects of different injection attacks and to develop software and hardware defenses against them.
  • Sacrificing security for usability is rarely a good trade-off, especially when all of these devices are connected to each other.
In the presentation, the speaker demonstrated a realistic attack: firing a laser from a bell tower, 75 meters away, into a room on the third floor of a nearby office building, injecting a command telling the smart device to open the garage door, and successfully executing that command.


In the near future, our homes will employ potentially dozens of IoT devices. These devices listen to our voice commands using sophisticated microphones. Our laser-based injection attack, Light Commands, shows how microphones can respond to light as if it were sound. By simply modulating the amplitude of laser light, we can inject fully inaudible and invisible commands into the microphones of smart speakers, phones, and tablets, across large distances and through glass windows.

In this talk, we will show:
  • How Light Commands works by exploiting a physical vulnerability of MEMS microphones,
  • How it's possible to remotely inject and execute unauthorized commands on Alexa, Portal, Google, and Siri voice assistants,
  • How the ecosystem of devices connected to these voice assistants, such as smart locks, home switches, and even cars, fails under common security vulnerabilities (e.g. PIN brute-forcing) that make the attack more dangerous.
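The amplitude-modulation step described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual rig: the bias and swing currents below are assumed placeholder values for a generic laser diode driver, and the only idea being shown is that the audio waveform is mapped linearly onto the diode's drive current, so the light intensity carries the audio envelope that the MEMS microphone then demodulates as if it were sound.

```python
import numpy as np

# Assumed, illustrative parameters (not from the talk):
FS = 44_100        # audio sample rate, Hz
I_BIAS = 200.0     # DC bias current in mA, keeps the diode lasing
I_SWING = 150.0    # peak modulation depth in mA

def laser_drive(audio: np.ndarray) -> np.ndarray:
    """Map an audio signal in [-1, 1] onto a diode drive current (mA).

    The emitted light intensity tracks the drive current, so the
    audio waveform rides on the laser beam as amplitude modulation.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return I_BIAS + I_SWING * audio

# Example: modulate a 1 kHz test tone, one second long.
t = np.arange(FS) / FS
tone = 0.8 * np.sin(2 * np.pi * 1000 * t)
drive = laser_drive(tone)
```

In practice the "audio" would be a recorded voice command rather than a tone, and the drive signal would feed a laser diode current driver; the point is only that nothing acoustic is ever transmitted.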
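The PIN brute-forcing point can also be made concrete. The sketch below is hypothetical (the phrase format and the idea of speaking PINs to an assistant are illustrative, not a specific vendor's interface): a four-digit PIN spoken as a voice command has only 10,000 possibilities, so without rate limiting or lockout an attacker who can inject commands can simply enumerate them.

```python
from itertools import product

def pin_phrases(prefix: str = "unlock the front door, PIN"):
    """Yield every spoken 4-digit PIN attempt (hypothetical phrasing)."""
    for digits in product("0123456789", repeat=4):
        yield f"{prefix} {' '.join(digits)}"

phrases = list(pin_phrases())
# 10,000 candidate commands cover the entire 4-digit keyspace;
# absent lockout or rate limiting, one of them must succeed.
```

This is why the talk argues the ecosystem fails as a whole: even if the microphone-level injection is patched, a downstream device that accepts unlimited PIN attempts remains exploitable.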