AI can steal data by listening to the sound of keyboard keys

With AI, it seems almost anything is possible. A team of researchers from Cornell University (USA) explained in a detailed paper how they managed to train an artificial intelligence model to interpret keystrokes on a keyboard by listening to their sounds alone.

Trained simply on the sound produced by individual keys, the AI model, according to the report, was able to predict what was typed on the keyboard with up to 95% accuracy. However, the system does not work with just any keyboard, only with keyboards where there is a precise correspondence between each sound and a single key. To demonstrate the project, the team used a MacBook Pro and pressed each of 36 individual keys 25 times. In short, these recordings formed the basis for training the AI model, which was possible thanks to the distinguishable waveform produced by each individual key.
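The idea of matching each key to its own acoustic signature can be illustrated with a toy experiment. The sketch below is purely illustrative and is not the researchers' actual deep-learning pipeline: the per-key tone frequencies, the single-bin spectral features, and the nearest-template classifier are all invented stand-ins for real recordings and a real model. It only shows the principle that 25 presses per key can be averaged into a template, and that a new press can then be matched against those templates.

```python
import math
import random

random.seed(0)
SAMPLE_RATE = 16_000          # Hz, assumed recording rate for the toy demo
KEYS = list("abcdef")         # toy subset standing in for the 36 keys in the study
PRESSES_PER_KEY = 25          # matches the press count reported in the article
N = int(SAMPLE_RATE * 0.05)   # 50 ms "click" per keystroke

def key_freq(i):
    # Invented: give each key its own dominant frequency (multiples of the
    # 20 Hz bin spacing, so the probe tones are orthogonal).
    return 400 + 160 * i

def synth_keystroke(i, noise=0.05):
    """Toy stand-in for a recorded key press: a pure tone plus a little
    per-recording noise, mimicking a distinguishable waveform."""
    return [math.sin(2 * math.pi * key_freq(i) * n / SAMPLE_RATE)
            + random.gauss(0, noise) for n in range(N)]

def energy_at(wave, freq):
    """Single-bin DFT: signal energy at one probe frequency."""
    s = sum(x * math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            for n, x in enumerate(wave))
    c = sum(x * math.cos(2 * math.pi * freq * n / SAMPLE_RATE)
            for n, x in enumerate(wave))
    return s * s + c * c

def features(wave):
    """Spectral signature: energy at each key's characteristic frequency."""
    return [energy_at(wave, key_freq(i)) for i in range(len(KEYS))]

# "Training": average the spectra of 25 presses per key into one template.
templates = {
    k: [sum(col) / PRESSES_PER_KEY for col in
        zip(*(features(synth_keystroke(i)) for _ in range(PRESSES_PER_KEY)))]
    for i, k in enumerate(KEYS)
}

def classify(wave):
    """Predict which key produced a recording: nearest template wins."""
    f = features(wave)
    return min(templates, key=lambda k: sum(
        (a - b) ** 2 for a, b in zip(f, templates[k])))

# Recover a short "typed" sequence from the audio alone.
secret = "badcafe"
recovered = "".join(classify(synth_keystroke(KEYS.index(ch))) for ch in secret)
print(recovered)  # → badcafe
```

In this clean setting the templates separate perfectly; the actual attack has to cope with overlapping keystrokes, room acoustics, and microphone quality, which is why the researchers used a deep-learning model rather than simple spectral templates.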

A new tool for hackers: AI to steal information

Given its potential, a tool of this type could be exploited by cybercriminals to steal sensitive and/or confidential information. However, such an attack is not without weaknesses: the system's accuracy can be degraded, for example, by simply changing one's typing style (pressure, speed, etc.). Another fairly trivial but effective way to frustrate an eavesdropping hacker is to use software that plays white noise or extra keystroke sounds, so as to completely distort the real input.
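The masking countermeasure can be sketched in a few lines. Again this is a hypothetical illustration, not real attack or defense software: each key is assigned an invented characteristic tone, the "attacker" simply picks the loudest one, and the "defense" overlays white noise plus a louder decoy keystroke tone on whatever a microphone would capture.

```python
import math
import random

random.seed(1)
SAMPLE_RATE = 16_000
N = 800  # 50 ms of audio
KEY_FREQS = [400 + 160 * i for i in range(6)]  # invented per-key tones

def tone(freq, amp=1.0):
    """Pure tone standing in for one key's acoustic signature."""
    return [amp * math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            for n in range(N)]

def loudest_key(wave):
    """Attacker's guess: which key tone dominates the recording."""
    def energy(freq):
        s = sum(x * math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
                for n, x in enumerate(wave))
        c = sum(x * math.cos(2 * math.pi * freq * n / SAMPLE_RATE)
                for n, x in enumerate(wave))
        return s * s + c * c
    return max(range(len(KEY_FREQS)), key=lambda i: energy(KEY_FREQS[i]))

def mask(recording, decoy_idx, noise_amp=2.0):
    """Countermeasure: overlay white noise plus a louder decoy
    'keystroke' on top of the real audio."""
    decoy = tone(KEY_FREQS[decoy_idx], amp=3.0)
    return [a + b + random.gauss(0, noise_amp)
            for a, b in zip(recording, decoy)]

clean = tone(KEY_FREQS[2])          # the user really pressed "key 2"
print(loudest_key(clean))           # → 2: clean audio betrays the key
masked = mask(clean, decoy_idx=4)   # defense plays a fake "key 4"
print(loudest_key(masked))          # → 4: attacker now guesses wrong
```

The decoy does not need to be louder in practice; even noise of comparable volume degrades a classifier trained on clean recordings, since the spectral templates no longer match what the microphone hears.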

It goes without saying that the most "problematic" keyboards are the noisier ones. Mechanical keyboards (usually aimed at gaming) produce a loud click, but membrane keyboards also generate distinguishable sounds, even the quietest models. The best defense against attacks of this type would therefore be the use of dedicated masking software.

