Scam attempt against LastPass with AI imitating the CEO’s voice

The use of artificial intelligence by cybercriminals is raising serious concerns.

Among the technologies most often used for this purpose is software capable of simulating human voices, which lets attackers hold conversations with potential victims while impersonating almost anyone. This is what happened in a recent attack against LastPass, the well-known password manager, in an attempted fraud that used an AI-generated imitation of the voice of its CEO, Karim Toubba.

In recent days, a company employee reported receiving voice messages and calls from an individual claiming to be Toubba. The sense of urgency and the contact methods, foreign to company policy, alerted the employee, who recognized the attempt as an attack.

No damage was done to LastPass, even though the technique the cybercriminal used was impressive in its sophistication and credibility.

LastPass attack foiled: a very important lesson for everyone

The company praised the employee's attentiveness, noting that his vigilance pushed LastPass to raise awareness of how real a threat these attacks can be to anyone.

For LastPass, sharing the experience was a way to increase user awareness and, at the same time, highlight that deepfake-related threats are now commonplace. A few years ago, apps like FakeYou could recreate the voices of famous (or not-so-famous) people only at a rudimentary level; the proliferation of AI has made such software remarkably capable, and therefore usable by attackers to simulate realistic calls.

Users have few defenses against this kind of attack beyond extreme caution. Suspicious calls that leverage a sense of urgency should always be verified carefully to avoid potential scams.
