A terrifying new scam is using artificial intelligence to trick people by cloning a loved one’s voice. That’s exactly what happened to a San Jose mother who recently lost hundreds of dollars.
Several weeks ago, Livier Hernandez picked up the phone and heard her 20-year-old daughter screaming on the other end of the line.
It was a call from a New Jersey number. A man identified himself as a drug trafficker and explained that Hernandez’ daughter had witnessed a drug deal and that they had kidnapped her.
Then, he demanded thousands of dollars. Otherwise, he threatened, he would take her daughter to Tijuana and Hernandez would never see her again.
So she rushed to a nearby money transfer service in San Jose as the voice instructed, first depositing $400 under a woman’s name in Tultepec, Mexico.
Hernandez told the man that was all the cash she had — but she says she then heard her daughter’s screams again, pleading for help. She was able to come up with the rest of the money, and then transferred it to the same woman.
Hernandez never heard from them again.
Finally, she called her daughter’s cellphone — and realized that she had been safe all along. The call was a scam.
“We talked to this money transfer service and they tell us this isn’t the first fraud case they’ve seen here,” said FBI Special Agent Gilberto Lujan, who added, “A.I. has given the capability to many that didn’t have prior technical experience, or deep technical experience, to be able to create these attacks.”
Lujan works with the Cyber Security Squad at the San Francisco Division of the FBI. He said the bureau is now seeing extortion cases like Hernandez’ where the crime scene is a virtual one. That makes leads harder to track.
In fact, Hernandez said that’s exactly what a family friend, a police officer, told her, which is why she didn’t formally report it to San Jose police.
When NBC Bay Area reached out to the number that had called her, it was already disconnected.
Still, the FBI recommends reporting such an incident to local authorities.
Some Bay Area tech companies are also stepping in to help.
“Our models can actually detect if a piece of content is A.I.-generated, even if the human eye maybe can’t tell,” said Hive CEO Kevin Guo.
Hive, based in San Francisco, offers a free online tool that can tell you whether an image or audio clip is authentic.
While it might not always help in a moment of panic — as in Hernandez’ case — Guo recommends trying to carry on a longer conversation with the person claiming to be a loved one.
“So if you were to ask the person certain questions, could they actually respond instantaneously in that character, in that voice,” Guo said. “Usually they’re pre-recorded right? ‘Cause it’s very expensive to generate.”
Now, Hernandez hopes no one else has to go through what she did, and that the people who defrauded her will one day face consequences.