Jennifer DeStefano, a mother from Arizona, claims that scammers used AI technology to clone her daughter's voice in a $1 million kidnapping scheme. She received a call from someone claiming to have kidnapped her 15-year-old daughter while the girl was on a ski trip. The caller used an AI simulation to mimic her daughter's voice and demanded a ransom of $1 million, later reduced to $50,000. DeStefano said that she heard her daughter crying and begging for help in the background, which convinced her that the call was real — fortunately, the mother managed to confirm her daughter's location and safety after contacting the police.
The technology used to clone the daughter's voice was an AI simulation that replicated her voice from brief soundbites. The incident has raised concerns about the misuse of AI and the potential for scammers to use it to manipulate people in new and sophisticated ways. Subbarao Kambhampati, a computer science professor and AI expert at Arizona State University, explained that AI simulations can come close to replicating someone's voice with only three seconds of audio.
Jennifer DeStefano says she was targeted in a $1 million scam involving AI
To avoid such scams, Dan Mayo, an assistant special agent in the FBI's Phoenix office, advises people to be cautious with their personal information on social media, as scammers can use it to dig into their lives and construct a convincing story. Mayo recommends verifying a caller's identity by asking questions about family members that scammers would not know.
This incident highlights the importance of staying vigilant and being aware of the potential for AI technology to be used in scams.