This AI tool can steal your data just by listening to what keys you press

New research has revealed a novel way to steal sensitive information: deducing what a target has typed from the sound of their keystrokes alone.

A team at Cornell University has published a paper detailing their exploits, offering, in its words, “a practical implementation of a state-of-the-art deep learning model in order to classify laptop keystrokes, using a smartphone integrated microphone.”
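The general idea behind such an acoustic side-channel attack is to turn each keystroke recording into a spectral feature vector and then train a classifier to map those features back to keys. The paper uses a deep learning model; the sketch below is only a toy illustration of the same pipeline, substituting a nearest-centroid classifier and synthetic tones (the `key_clip` signals are stand-ins, not real keystroke audio).

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a windowed short-time FFT (illustrative only)."""
    frames = [signal[i:i + frame_len] * np.hanning(frame_len)
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

# Simulate two keys whose "clicks" have distinct dominant frequencies.
rng = np.random.default_rng(0)
t = np.arange(4096) / 16000.0  # 0.256 s at a 16 kHz sample rate

def key_clip(freq):
    """Hypothetical keystroke recording: a noisy tone at the given frequency."""
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(t.size)

# "Training": average spectrum per key label (a real attack would train a CNN here).
centroids = {"a": spectrogram(key_clip(900)).mean(axis=0),
             "b": spectrogram(key_clip(2100)).mean(axis=0)}

def classify(clip):
    """Assign the key whose average spectrum is closest to the clip's."""
    feat = spectrogram(clip).mean(axis=0)
    return min(centroids, key=lambda k: np.linalg.norm(feat - centroids[k]))

print(classify(key_clip(2100)))  # the 2100 Hz "key" should match centroid "b"
```

The real attack replaces the synthetic tones with isolated keystroke recordings and the centroid comparison with a trained deep network, but the feature-then-classify structure is the same.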


This post originally appeared on TechToday.
