DailyGlimpse

AI Voice Clone Scam Nearly Dupes Veteran Attorney with Fake Bail Plea

AI
May 1, 2026 · 2:19 AM

A veteran attorney with decades of courtroom experience nearly fell victim to a sophisticated AI voice cloning scam that used his son's cloned voice to demand fake bail money.

Gary Schildhorn received a phone call from a panicked voice that sounded exactly like his son's, claiming he had been in a car accident and urgently needed bail money. The AI had captured his son's exact tone, inflection, and speech patterns from publicly available audio. A second caller, posing as a public defender, stood by to collect the payment.

The scam relied on creating panic and isolation, leaving the victim no time to think critically. Schildhorn avoided losing thousands of dollars only by taking one crucial step: he hung up and called his son directly on his known number, confirming the call was a fraud.

This incident highlights the growing threat of deepfake audio scams, in which fraudsters need only a few seconds of voice audio, scraped from social media posts or voicemails, to clone a voice convincingly. Experts warn that as AI technology improves, these scams will become harder to detect, and they recommend that families establish a code word or verification protocol for emergency calls.