
The Light Commands vulnerability seems to be something of a perfect storm: since hackers can use lasers to activate smart speakers, tablets, or phones, they don't need to be within the typical pick-up range of the device's microphones.

Researchers at Japan's University of Electro-Communications and at the University of Michigan have discovered a way to hack into Amazon's Alexa, Apple's Siri, Facebook's Portal, and Google Assistant from just over a football field away (110 meters), they announced in a white paper released on Monday.

According to the researchers' website, Light Commands is a vulnerability in Micro-Electro-Mechanical Systems (MEMS) microphones that allows attackers to remotely inject inaudible commands into popular voice assistants.

While Google Home is quickly expanding its library of commands, the reality is that Alexa remains the most robust digital assistant when it comes to smart speakers.

Daniel Genkin, one of the paper's co-authors and an assistant professor at the University of Michigan, told the Times that there is an easy fix for the time being: keep your voice-controlled assistant out of the line of sight from outside your home, "and don't give it access to anything you don't want someone else to access". In a statement sent to WIRED, both Google and Amazon said they are reviewing the research paper.

Voice-activated digital assistants can be remotely hijacked by lasers from as far as 350 feet away and made to order products, start cars, and otherwise drive a smart-home owner insane, researchers have discovered. Such an attack would only work on an unattended device, however, since an owner could notice the light beam reflecting off the speaker.


Amazon says that 85,000 smart home gadgets now integrate with Alexa, while Apple is trying to get more gadgets to work with its HomeKit system.

While this is a troubling development for fans of smart home technology, Light Commands isn't going to make all your smart speakers and displays easily hackable.

The speakers responded to the light as if it were voice-based sound waves, which raises serious security concerns, especially since most of these devices do not require user verification before carrying out commands, at least not by default. However, attackers had to be much closer to some devices than to an Echo.

They wrote: "Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio". In one test, they took over a Google Home on the fourth floor of an office building from the top of a bell tower at the University of Michigan, more than 200 feet away. Naturally, if your voice assistant is disabled, the light has no effect. But assuming a smart speaker is visible from a window, attackers could use Light Commands to unlock smart doors, garage doors, and car doors.
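The quoted sentence describes simple amplitude modulation: the laser's brightness is varied in step with the audio waveform of a spoken command, and the microphone's output tracks that brightness. The following Python sketch illustrates only that modulation step; it is not the researchers' tooling, and the sample rate, DC bias, and modulation depth are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the amplitude-modulation idea behind Light Commands.
# All parameters here are illustrative assumptions, not values from the paper.

SAMPLE_RATE = 16_000  # Hz; a typical rate for voice audio


def modulate_intensity(audio: np.ndarray, dc_bias: float = 0.5,
                       depth: float = 0.4) -> np.ndarray:
    """Map an audio waveform onto a laser-intensity signal.

    The laser stays on at a constant DC bias and the audio rides on top
    of it: intensity(t) = dc_bias + depth * audio(t), clipped to [0, 1].
    A MEMS microphone struck by this beam produces an electrical signal
    that tracks the intensity envelope -- that is, the original audio.
    """
    # Normalize the audio to [-1, 1] so the modulation depth is meaningful.
    audio = audio / max(1e-9, float(np.max(np.abs(audio))))
    intensity = dc_bias + depth * audio
    return np.clip(intensity, 0.0, 1.0)


if __name__ == "__main__":
    # Stand-in for a recorded voice command: a one-second 440 Hz tone.
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    command_audio = np.sin(2 * np.pi * 440 * t)

    intensity = modulate_intensity(command_audio)
    print(f"intensity range: {intensity.min():.2f}..{intensity.max():.2f}")
```

Because the MEMS microphone converts that intensity envelope back into an electrical signal, whatever audio is encoded in the beam is what the assistant "hears", even though no sound is ever produced.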

The researchers exploited the vulnerability in tests to do things like trigger a smart garage door opener and ask what time it is.

The researchers demonstrated this kind of attack in a video.
