AI can now work out passwords by listening to you type. But it isn't as scary as it sounds

New research has discovered a way to use machine learning to understand exactly what you’re typing, including passwords and secret information.


Published: August 18, 2023 at 8:00 am

All the hard work you’ve put into making strong passwords, blending pet names with numbers, symbols and birthdates, could soon be for nothing, as a new artificial intelligence model can identify keystrokes with 95 per cent accuracy.

Or at least that’s the most alarming reading of a new piece of research from a team of British researchers. Using a deep learning model, the team were able to work out what was being typed on a laptop’s keyboard from nothing more than a microphone recording.

This, in theory, would allow hackers who gain access to your laptop or a nearby device to obtain a transcript of what is being typed, including messages, passwords and other sensitive information.

Nosey AI models

The first step in this attack is recording the keystrokes on someone’s keyboard, which is needed to train the algorithm. While this could be done through the laptop’s own microphone, it could equally be achieved by placing a device with a microphone, such as a smartphone, near the computer.

By pressing 36 keys on a modern MacBook Pro 25 times each and recording the sounds produced, the researchers gathered a full set of training data. These recordings were then turned into waveforms that show identifiable differences between the sound of each key.
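To make this step concrete, here is a minimal sketch in Python of how a recording of key presses might be split into individual keystrokes and turned into spectrogram ‘fingerprints’. The library, window length and other settings below are illustrative assumptions rather than the exact tools and parameters used in the study.

    import numpy as np
    import librosa

    def keystroke_spectrograms(wav_path, sr=44100, window_s=0.33, n_mels=64):
        """Return one mel spectrogram per detected keystroke in a recording."""
        audio, sr = librosa.load(wav_path, sr=sr, mono=True)
        # Key presses show up as sharp bursts of energy, so onset detection finds them
        onsets = librosa.onset.onset_detect(y=audio, sr=sr, units="samples", backtrack=True)
        half = int(window_s * sr / 2)
        spectrograms = []
        for onset in onsets:
            clip = audio[max(0, onset - half):onset + half]
            if len(clip) < 2 * half:      # skip keystrokes cut off at the edges
                continue
            mel = librosa.feature.melspectrogram(y=clip, sr=sr, n_mels=n_mels)
            spectrograms.append(librosa.power_to_db(mel))  # log scale highlights differences
        return np.stack(spectrograms)     # shape: (keystrokes, n_mels, time frames)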

With this information in hand, the researchers could then build a machine-learning model to understand which of these waveforms lines up with which key.
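As a rough illustration of this modelling step, the sketch below trains a small convolutional network to map each keystroke spectrogram to one of the 36 keys. The researchers used a more sophisticated deep-learning model; the architecture and training settings here are illustrative assumptions only.

    import torch
    import torch.nn as nn

    class KeystrokeClassifier(nn.Module):
        """Toy convolutional network: spectrogram in, one of 36 key labels out."""
        def __init__(self, n_keys=36):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d((4, 4)),
            )
            self.classifier = nn.Linear(32 * 4 * 4, n_keys)

        def forward(self, x):             # x: (batch, 1, n_mels, time frames)
            return self.classifier(self.features(x).flatten(1))

    def train(model, spectrograms, labels, epochs=20, lr=1e-3):
        """spectrograms: (N, 1, n_mels, time) float tensor; labels: (N,) long tensor."""
        optimiser = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            optimiser.zero_grad()
            loss = loss_fn(model(spectrograms), labels)
            loss.backward()
            optimiser.step()
        return model

Once trained, a model like this predicts which key most likely produced each new sound clip, which is how a transcript of someone’s typing could, in principle, be reconstructed.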


“If you get enough data, a model can be built pretty easily. They’ve used a MacBook which is a common computer and one that Apple has used the same keyboards on for three or four years,” Oli Buckley, a professor of cyber security at the University of East Anglia, told BBC Science Focus.

“If it works on one keyboard, it will likely work on the next. The MacBook has a nice, quiet keyboard, not something mechanical and clicky, so the idea is that if it works on something quiet, it will have a wide-reaching ability on louder keyboards”.

Why we're probably safe

While this all sounds pretty dystopian, not to mention like a new form of hacking to look out for, it isn’t quite as worrying as it first appears.

“A good sample of data is needed for it to work, so this changes if you’re using a Dell, a MacBook or an external keyboard. Also, factors change. Some people type louder and harder, or my keyboard’s full of cat hair so that impacts things slightly”, says Buckley.

“They also mentioned that touch typing reduces the accuracy. It doesn’t have that percussive nature of typing, it’s more sedate and harder to pick up.”

While building an accurate dataset for a specific typing style and keyboard can be difficult, there is the added problem of capturing a clean recording of the typing as it happens.

If a computer or smartphone is hacked into and used as a microphone, the hacker needs to capture enough clear data before the model can be used. If there is noise in the background – like music or chatting – this becomes more difficult.

The model could also be used by placing a recording device near your laptop rather than hacking in. However, the same problem applies here.

“They placed a standard iPhone about 15-20cm away from the laptop. While that’s impressive, it depends on how commonplace it is. You need to get data about someone typing, know details about their keyboard and want to target this particular person,” explains Buckley.

“Then they’re going to need to be recorded typing even more, to actually capture the data that is being stolen. I can’t see this being a broad-scale issue in phishing attacks, so there’s no need to start playing white noise all the time.”

Even if the data is accurately captured, anyone utilising this model would need to sift through the data without context. While there could be obvious clues, such as an email address followed by something that looks like a password, it won’t necessarily be clear what this applies to.

There are also limitations in the model that was tested, such as its struggle to detect the shift key – a button mostly used for capitalising letters or typing symbols.

While the chances of the technology being used via your hacked laptop’s microphone are slim, the risk still exists from opportunists in places like coffee shops and on public transport where people are working. These situations, however, are far messier than the conditions of the study.

“There’s a lot of background noise on trains and in coffee shops. Of course, someone could learn the dataset for MacBook Pros specifically, and target these laptops out in public, but there is still a lot of context that is needed,” says Buckley.

“You could record an hour of someone typing, using a pre-existing model of their laptop’s typing data, but there’s no guarantee they’ll type anything interesting in that time. Typing is also a very unique thing. People pick up quirks and traits that aren’t necessarily easily translated.”
