Arizona mom who fell victim to deepfake kidnapping scam delivers gripping testimony

An Arizona mother has made an emotional statement, recounting a horrifying ordeal in which scammers used artificial intelligence to mimic her daughter’s voice and staged a fake kidnapping for ransom.

Speaking in a Senate Judiciary Committee hearing on Tuesday, Jennifer DeStefano described her fear when she received a call from scammers in April demanding $1 million for the safe return of her 15-year-old daughter Brie.

While the ploy fell apart within minutes after DeStefano reached Brie and confirmed she was safe on a ski trip, the sheer terror the mother felt upon hearing what sounded like the girl’s cry for help was absolutely real.

Though Brie doesn’t have any public social media accounts, her voice can be heard in some school and sports interviews, her mother said – warning parents to be aware of how easily scammers can impersonate a loved one’s voice.

DeStefano testified that it was “a typical Friday afternoon” when she received a call from an unknown number, which she answered thinking it might be a call from a doctor.

Speaking in a Senate Judiciary Committee hearing on Tuesday, Jennifer DeStefano described a kidnapping scam in which scammers used AI to mimic her daughter’s voice

DeStefano’s 15-year-old daughter Brie (pictured above with her) was safe on a ski trip with her father, but scammers briefly convinced her mother they had kidnapped the girl

“I answered the phone and said, ‘Hello,’ and on the other end, our daughter, Briana, was sobbing and crying and said, ‘Mom,'” DeStefano told the Senate panel.

The mother initially believed her daughter had injured herself on the ski trip, but kept her cool and asked the girl what had happened.

“Briana continued, ‘Mom, I screwed up,’ and cried and sobbed some more. Without thinking twice, I asked her again, ‘Okay, what happened?’” the mother continued.

“Suddenly, a man’s voice called out to her, ‘Lie down and put your head back.’ At that moment, I panicked. My concern escalated and I wanted to know what was going on, but nothing could have prepared me for her reaction.

“Mom, these bad men got me, help me, help me!!” she begged and pleaded as the phone was taken from her.

“A threatening and vulgar man took over: ‘Listen, I have your daughter. If you tell anyone or call the police, I’m going to pump her so full of drugs, I’m going to have my way with her, drop her off in Mexico, and you’ll never see her again!’”

“The whole time, Briana was in the background desperately begging, ‘Mom, help me!’”

At the time of the call, DeStefano was at another daughter’s rehearsal; she muted the phone and yelled for help, alerting other moms, who began calling 911 and trying to reach her husband and Brie.

“Mom, these bad men got me, help me, help me!!” she begged and pleaded as the phone was taken from her, DeStefano testified

Meanwhile, DeStefano did her best to keep the kidnappers talking until police could arrive.

The kidnappers demanded a ransom of $1 million, but when DeStefano panicked and told them that was impossible, they quickly lowered their demand to $50,000.

She testified: “At that moment, the mother who had called 911 walked in and told me that 911 was familiar with an AI scam that can mimic the voice of a loved one.”

Brie was safe on a ski trip, unaware of the horror her mother was enduring

“I didn’t believe it was a scam. It wasn’t just Brie’s voice; it was her screams, her sobs that were unique to her. There was no faking that, I protested.

“She told me that AI can also reproduce tone of voice and emotion. That gave me a little hope, but not enough.”

She continued: “I asked for wiring instructions and routing numbers for the $50,000, but that was refused. ‘Oh no,’ the man snapped, ‘that’s traceable; that’s not going to happen. We’re coming to pick you up!’”

“‘What?’ I yelled. ‘You will agree to be picked up in a white van with a bag over your head so you don’t know where we’re taking you. You’d better have all $50,000 in cash, or you and your daughter are dead! If you don’t agree to this, you will never see your daughter again!’ he screamed.”

Despite her horror, DeStefano kept her cool and continued to negotiate the details of her own kidnapping to buy time.

At that point, another mother approached her and confirmed that she had reached Brie on the phone: the girl was perfectly safe and with her father on the ski trip.

“My head was spinning. I can’t remember how many times I needed reassurance, but when I finally realized she was safe, I was furious,” DeStefano testified.

Friends were able to confirm Brie’s safety within minutes of the hoax call

Meanwhile, the outraged mother still had the hoax kidnappers on the phone.

“I laid into the men for their horrific attempt to extort money by going so far as to fake my daughter’s kidnapping.

“They kept threatening to kill Brie. I vowed I would stop them: not only would they never harm my daughter, they would not go on harming anyone else with their scheme.”

DeStefano said angrily that when she tried to file a police report, the matter was dismissed as a “prank call.”

She called on Congress to take action to prevent criminal misuse of the new AI technology.

“As our world moves at breakneck speed, the human element of familiarity that underpins our social fabric of ‘knowledge’ and ‘truth’ is being revolutionized by artificial intelligence. Some for good, some for evil,” she said.

“If left uncontrolled, unguarded, and without consequence, it will redefine our understanding and perception of what is and is not truth. It will erode our sense of familiarity, undermining our confidence in what is real and what is not.”

DeStefano called on Congress to take action to prevent criminal misuse of the new AI technology

Senators and witnesses are seen at Tuesday’s hearing titled “Artificial Intelligence and Human Rights”

AI voice cloning tools are rife online, and DeStefano’s experience is part of an alarming spate of similar hoaxes sweeping the country.

“AI voice cloning has become almost indistinguishable from human speech, allowing threat actors like scammers to more effectively extract information and funds from victims,” Blackbird.AI chief executive Wasim Khaled told AFP.

A simple web search brings up a multitude of apps, many of them free, that can create an AI voice from a small sample – sometimes just a few seconds – of a person’s real voice, which can easily be stolen from content posted online.

“With a small sample of audio, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls,” Khaled said.

“Scammers can use different accents and genders, or even mimic the speech patterns of loved ones. [The technology] enables the creation of convincing deepfakes.”

In a global survey of 7,000 people from nine countries including the United States, one in four said they had experienced or knew someone who had experienced an AI voice cloning scam.

Seventy percent of respondents said they weren’t sure they “could tell the difference between a cloned voice and a real voice,” according to the poll released last month by US-based McAfee Labs.

American officials have warned of an increase in so-called “grandparent scams,” in which a scammer poses as a grandchild in dire need of money.

“You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble: he wrecked the car and landed in jail. But you can help by sending money,” the U.S. Federal Trade Commission said in a warning in March.

“It sounds just like him. How could it be a scam? Voice cloning, that’s how.”

The comments below the FTC’s warning included several accounts from older people who had been deceived in this way.