A sophisticated AI deepfake romance scam has cost a U.S. woman her life savings—over $630,000—after a fraudster used realistic AI-generated video messages impersonating a well-known actor to build a fake relationship over several months.
Vivian Ruvalcaba's mother, described as educated, sharp, and aware, fell victim to the elaborate scheme. The scammer used deepfake video technology to create personalized messages that appeared to come from the actor, addressing her by name and making her feel special. The emotional manipulation was so effective that she eventually sold her home and transferred the proceeds to someone she had never met in person.
"AI deepfake romance scams are not just convincing—they are indistinguishable from reality. And they are destroying families financially and emotionally," said Ruvalcaba.
The case highlights how deepfake technology makes it possible to impersonate celebrities with convincing video and audio, leaving even typically cautious people vulnerable. The scammer exploited trust built through repeated personalized interactions, a tactic common in romance fraud.
Ruvalcaba shared the story to warn others that no one is immune. She emphasized common red flags: unsolicited contact from a celebrity, requests for money, and a reluctance to meet in person. As AI tools become more accessible, such scams are expected to rise.