Security researchers from the University of Cambridge discovered that attackers could use smart assistants such as Alexa or Google Home to spy on people nearby by listening for the sounds of typing on virtual keyboards and recovering PINs and passwords. Using microphones to eavesdrop on what people type is nothing new; such techniques are known as “side-channel” attacks. In fact, the same team had already published research on the subject, but that earlier work targeted mechanical keyboards, and capturing sounds from a virtual keyboard is an entirely different challenge.
Typing on a virtual keyboard also produces sounds, even if they are not as loud as those from a physical keyboard. “Acoustic side channels can also be exploited with virtual keyboards such as phone touchscreens, which despite not having moving parts still generate sound,” the researchers say. “The attack is based on the fact that microphones located close to the screen can hear screen vibrations and use them successfully to reconstruct the tap location.”
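To make the mechanism concrete, here is a minimal, purely illustrative sketch (not the researchers’ actual pipeline) of how a recording could be scanned for candidate tap events: each tap shows up as a brief spike in short-time energy. The file name, frame length and threshold are assumptions chosen for the example.

```python
# Illustrative tap-onset detection: threshold short-time energy in a recording.
# "nearby_typing.wav", the frame size and the threshold are assumptions for the
# sake of the example, not values taken from the research.
import numpy as np
from scipy.io import wavfile

RATE, audio = wavfile.read("nearby_typing.wav")    # hypothetical recording
audio = audio.astype(np.float64)
if audio.ndim > 1:                                 # mix down if multi-channel
    audio = audio.mean(axis=1)
peak = np.max(np.abs(audio))
if peak > 0:
    audio /= peak

FRAME = int(0.010 * RATE)                          # 10 ms analysis frames
frames = audio[: len(audio) // FRAME * FRAME].reshape(-1, FRAME)
energy = (frames ** 2).sum(axis=1)

# A screen tap shows up as a brief energy spike well above the background level.
threshold = energy.mean() + 4 * energy.std()
onsets = np.flatnonzero(energy > threshold) * FRAME / RATE
print("candidate tap onsets (s):", np.round(onsets, 3))
```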
A single microphone would struggle to pick up such faint sounds, but smart home assistants use microphone arrays that are far more sensitive and can capture sound coming from any direction.
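As a rough illustration of why an array helps, the sketch below applies simple delay-and-sum beamforming to the channels of a hypothetical 6-mic circular array, so that sound arriving from a chosen direction adds up coherently while off-axis noise averages out. The array radius, sample rate and steering angle are assumptions, not details from the paper.

```python
# Minimal delay-and-sum beamforming sketch for an assumed 6-mic circular array
# (4.6 cm radius, similar to a ReSpeaker-style board). Steering the array toward
# the device under attack boosts faint tap sounds relative to off-axis noise.
import numpy as np

RATE = 16000
SPEED_OF_SOUND = 343.0           # m/s
RADIUS = 0.046                   # assumed array radius in metres
N_MICS = 6
mic_angles = np.arange(N_MICS) * 2 * np.pi / N_MICS   # mics spaced every 60 degrees

def delay_and_sum(channels: np.ndarray, steer_angle_deg: float) -> np.ndarray:
    """channels: shape (N_MICS, n_samples). Returns the beamformed mono signal."""
    theta = np.radians(steer_angle_deg)
    # Per-mic arrival-time offset relative to the array centre for a far-field source.
    delays_s = -RADIUS * np.cos(theta - mic_angles) / SPEED_OF_SOUND
    delays = np.round((delays_s - delays_s.min()) * RATE).astype(int)
    n = channels.shape[1] - delays.max()
    aligned = np.stack([ch[d:d + n] for ch, d in zip(channels, delays)])
    return aligned.mean(axis=0)

# Toy usage with synthetic data standing in for a real multi-channel recording:
noise = np.random.randn(N_MICS, RATE)              # 1 s of per-mic noise
beam = delay_and_sum(noise, steer_angle_deg=30.0)
print(beam.shape)
```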
This is a good news, bad news situation. It turns out that using smart home assistants directly for this purpose is not easy, as their APIs don’t provide direct access to the microphones. The problem stems instead from the poor privacy track record surrounding smart home assistants: some companies are known to share recorded data with third-party organizations.
Researchers have already shown that assistants record audio when sounds from various household items accidentally activate them, even if only for brief periods. If people happened to be typing near a smart assistant, those mistakenly recorded snippets could be enough to reconstruct much of the typed content.
The researchers used a Raspberry Pi with a ReSpeaker 6-mic circular array to collect data, and ultimately determined that a smart speaker running an assistant such as Alexa could, in theory, snoop on PINs or text entered on a nearby phone or tablet. Up to around 40% of taps could be correctly classified on the first guess, for both numerical and text keyboards.
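For readers curious what such a data-collection step might look like, the sketch below records a short multi-channel clip from a 6-mic USB array with the Python sounddevice library. The sample rate, clip length and channel count are assumptions about a typical setup, not the researchers’ exact configuration.

```python
# Hypothetical data-collection sketch: capture a short multi-channel clip from a
# 6-mic USB array (e.g. a ReSpeaker board) attached to a Raspberry Pi.
# Device settings below are assumptions, not the researchers' configuration.
import sounddevice as sd
from scipy.io import wavfile

RATE = 16000        # Hz
SECONDS = 5
CHANNELS = 6        # one track per microphone in the array

recording = sd.rec(int(SECONDS * RATE), samplerate=RATE,
                   channels=CHANNELS, dtype="int16")
sd.wait()                                   # block until the clip is captured
wavfile.write("tap_capture.wav", RATE, recording)
print("saved", recording.shape, "samples (frames x channels)")
```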
The best mitigation for this type of attack would have to come from phone manufacturers, who could inject false positives into the data stream by randomly playing quiet tap-like sounds whenever the keyboard is open. With enough false positives, picking out the real taps would become practically impossible.
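In spirit, such a defence might resemble the sketch below, which emits quiet, randomly timed decoy bursts while a (hypothetical) on-screen keyboard is active. A real mitigation would have to live inside the phone’s keyboard or operating system rather than in application code.

```python
# Decoy-injection sketch: while an on-screen keyboard is (hypothetically) open,
# play short, quiet tap-like noise bursts at random intervals so that genuine
# taps are hidden among false positives. Timings and levels are assumptions.
import random
import time

import numpy as np
import sounddevice as sd

RATE = 16000

def fake_tap(duration_s: float = 0.01, level: float = 0.02) -> np.ndarray:
    """A very short, low-level noise burst resembling a tap transient."""
    return (np.random.randn(int(duration_s * RATE)) * level).astype(np.float32)

def keyboard_is_open() -> bool:
    return True   # placeholder for a real "keyboard visible" check

end = time.time() + 5                       # demo: emit decoys for five seconds
while keyboard_is_open() and time.time() < end:
    sd.play(fake_tap(), RATE)
    sd.wait()                               # let the burst finish playing
    time.sleep(random.uniform(0.05, 0.4))   # random spacing between decoys
```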