- AI can steal passwords from keystroke sounds recorded over Zoom with up to 93% accuracy, per a new study.
- The accuracy rate ratcheted up to 95% when keystrokes were recorded using an iPhone 13 mini.
- "Passwords containing full words may be at greater risk of attack," per the study.
An AI tool could decipher text — including passwords — from keystroke sounds recorded over Zoom and be right over nine times out of ten, a group of researchers said in a paper published on August 3.
The AI model developed by the researchers achieved a 93% accuracy rate in deciphering keystrokes from a MacBook recorded over the video-conferencing software Zoom, according to the paper, whose authors are affiliated with Durham University, the University of Surrey, and Royal Holloway, University of London.
Moreover, the accuracy rate rose to 95% when keystrokes were recorded using an iPhone 13 mini.
"Acoustic side-channel attacks" are a growing threat to keyboards, the researchers said. Side-channel attacks are used by hackers to exploit information — like how much power your computer is using or the keystroke sounds it makes — rather than directly attacking the system's code.
For their experiment, the team used a 2021 16-inch MacBook Pro, noting that its keyboard design is consistent with that of other recent MacBook models.
Their AI tool was based on "deep learning" — a subset of machine learning that trains computers to analyze data similarly to how the human brain functions.
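To illustrate the general idea, here is a minimal, hypothetical sketch of that kind of approach: a small deep-learning model that classifies individual keystrokes from mel-spectrograms of their sound. This is not the authors' actual model; the label set, sample rate, clip length, and network size are all illustrative assumptions.

```python
# Hypothetical sketch of an acoustic keystroke classifier (not the study's code).
import torch
import torch.nn as nn
import torchaudio

N_KEYS = 36  # assumed label set: a-z plus 0-9

# Turn a short audio clip of one keystroke into a mel-spectrogram "image".
to_mel = torchaudio.transforms.MelSpectrogram(sample_rate=44_100, n_mels=64)

class KeystrokeClassifier(nn.Module):
    """Small CNN that maps one keystroke spectrogram to a key label."""
    def __init__(self, n_keys: int = N_KEYS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse time/frequency dimensions
        )
        self.head = nn.Linear(32, n_keys)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        # waveform: (batch, samples) for a single isolated keystroke
        spec = to_mel(waveform).unsqueeze(1)  # (batch, 1, mels, frames)
        return self.head(self.features(spec).flatten(1))

# Usage: 0.3 seconds of random audio stands in for a recorded keystroke.
model = KeystrokeClassifier()
clip = torch.randn(1, int(0.3 * 44_100))
predicted_key = model(clip).argmax(dim=1)
```

In practice, such a model would be trained on many labeled recordings of each key before it could recover typed text from new audio.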
The study also highlighted risk factors and potential countermeasures. "Passwords containing full words may be at greater risk of attack," the authors wrote, while touch typing and adding background noise seemed to lower the AI tool's accuracy.
The authors said that while these types of attacks are under-studied, they have a long history. "Acoustic emanations" were even mentioned as a vulnerability in a partially declassified NSA document from 1982, the authors wrote.
The study adds to recent concerns over how AI tools could be used to compromise security and privacy.
AI tools can make online scams harder to detect because they make it easier to personalize a scam for each target, Insider reported last Tuesday.
Two specialists raised the alarm in 2019 over how the advancement of AI and 5G technology would heighten vulnerabilities in internet-connected devices, amplifying cybersecurity threats.
The study's authors did not immediately respond to a request for comment from Insider, sent outside regular business hours.