A new wave of scams is leveraging artificial intelligence voice cloning to extort money from victims through a terrifying tactic known as "virtual kidnapping." Scammers use AI to replicate the voice of a loved one, making it sound as if they are in distress or danger, then demand immediate payment.
This psychological manipulation exploits fear and urgency, leaving victims little time to verify whether the call is genuine. The technology, once a novelty, has become a tool for criminals to extract significant sums.
Cybersecurity experts urge the public to stay alert and independently verify any urgent call from a supposed family member, especially one requesting money or personal information, for example by hanging up and calling the person back on a known number. As AI voice cloning becomes more accessible, the risk of these scams continues to grow.