A new study has found that artificial intelligence (AI) models can be used to steal passwords by listening to the sound of keystrokes. The study, conducted by researchers from Durham University, the University of Surrey, and Royal Holloway, University of London, found that an AI model could identify typed passwords with 95% accuracy.
The researchers trained the AI model on a dataset of keystroke sounds recorded from a variety of devices, including laptops, smartphones, and tablets. The model could then identify the sounds of individual keys, and combinations of keys, even when the typing was quiet or muffled.
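To make the idea concrete, here is a minimal, self-contained sketch of the general technique: extract a frequency-domain "fingerprint" from each keystroke sound and match it against known keys. This is not the study's model (the researchers used deep learning on spectrograms of real recordings); the keystrokes below are simulated as damped tones, and the key frequencies and function names are illustrative assumptions.

```python
# Illustrative sketch of an acoustic keystroke classifier.
# NOT the study's method: real attacks train deep networks on spectrograms
# of recorded audio. Here each "key" is simulated as a damped tone so the
# pipeline runs without any recordings.
import numpy as np

SAMPLE_RATE = 16_000
rng = np.random.default_rng(0)

def synth_keystroke(freq_hz, n=1024):
    """Simulate one keystroke: a damped sinusoid plus background noise."""
    t = np.arange(n) / SAMPLE_RATE
    tone = np.sin(2 * np.pi * freq_hz * t) * np.exp(-t * 200)
    return tone + 0.05 * rng.standard_normal(n)

def features(signal):
    """Normalized FFT magnitude spectrum -- a crude stand-in for a spectrogram."""
    mag = np.abs(np.fft.rfft(signal))
    return mag / np.linalg.norm(mag)

# Hypothetical per-key acoustic signatures (the frequencies are made up).
KEY_FREQS = {"a": 900.0, "s": 1400.0, "d": 2100.0}

# "Training": average feature vector (centroid) per key over a few samples.
centroids = {
    key: np.mean([features(synth_keystroke(f)) for _ in range(10)], axis=0)
    for key, f in KEY_FREQS.items()
}

def classify(signal):
    """Assign a recorded keystroke to the key with the nearest centroid."""
    feat = features(signal)
    return min(centroids, key=lambda k: np.linalg.norm(feat - centroids[k]))

if __name__ == "__main__":
    typed = [classify(synth_keystroke(KEY_FREQS[k])) for k in "sad"]
    print("".join(typed))
```

The point of the sketch is that each key's sound carries a stable, learnable signature; a real attacker replaces the synthetic tones with microphone recordings and the nearest-centroid matcher with a trained neural network.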
The researchers say that hackers could use the AI model to steal passwords in a variety of ways. For example, the model could transcribe keystrokes recorded during a Zoom call, or process audio captured by a recording device left near a victim's keyboard.
The researchers say that the study highlights the need for users to be more aware of the risks of acoustic side-channel attacks. These attacks exploit the fact that each key produces a subtly different sound when pressed. By recording and analyzing these sounds, hackers can potentially steal sensitive information, such as passwords and credit card numbers.
To protect yourself from acoustic side-channel attacks, the researchers recommend taking the following steps:
- Use a strong password manager to generate and store your passwords.
- Avoid typing passwords in public or over insecure channels.
- Keep your devices up to date with the latest security patches.
- Avoid using common passwords or passwords that are easily guessed.
- Change your passwords regularly.
- Enable two-factor authentication (2FA) whenever possible.
By following these tips, you can help to protect yourself from acoustic side-channel attacks and other forms of cybercrime.