Consumer Alert: Scammers use AI to try to fool News10 NBC anchor
Imagine this scenario. You get a call from a family member in serious trouble. The voice on the phone sounds just like them. But it’s not. AI has taken the family emergency scam to a whole new level.
We’ve all heard of the grandparent scam. You get a call from someone who says he’s your grandson. He’s in jail, and he needs money.
But with AI voice cloning, all a scammer needs is a short audio clip of someone’s voice, like a video posted on social media, and voila! It’s your grandson. That’s what happened to News10 NBC morning anchor Lynette Adams. Late one afternoon, she got a call from someone who said he was a police officer.
Lynette: “The person said, ‘Ma’am, I have some bad news for you. I hope you’re in a place where you can listen.’ And I said, ‘Okay.’ He said, ‘Your daughter’s been in a terrible accident.’ And immediately my heart started beating.”
The scammer’s lie was designed to make Lynette panic. His next line was designed to terrify her. The so-called police officer told her this.
Lynette: “‘We want to tell you that she was texting and driving and crashed into a woman who’s expecting a baby, and the woman was injured and is at the hospital. And could lose her baby.’ And he goes on to tell me about the trouble that she’s in. He says right now these are traffic violations, but if the baby dies, the charges will escalate.”
Deanna: “And any mother would panic! I know I would.”
Lynette: “I was panicked and worried about my daughter and worried about this woman who may lose her baby.”
Deanna: “So at some point he said, ‘I’m going to put your daughter on the phone.’”
Lynette: “Right. And he said, ‘I’m going to put her on the phone, but I want to let you know that she has been hysterical. She’s been crying, and we’re trying to calm her down and we can’t.’ So, he puts what is supposed to be my daughter on the phone — sounding like a young woman — and actually sounding just like my daughter!”
Deanna: “And you’re her mother, so of course, you would know her voice.”
Lynette: “Absolutely. It was her voice.”
That’s the danger of AI. The scammer replicated her daughter’s voice so well that he fooled even her own mother. That so-called police officer then said her daughter’s public defender was going to call. And he did. Now the hook.
Lynette: “He said that he thought he could get my daughter out tonight, that the judge would be lenient with her if we had a significant amount of money for bail.”
Bam. That’s how the scammers planned to steal her money. Lynette didn’t fall for it. Instead, she made a phone call.
Lynette: “I called her dad and asked if he’d seen her. He said, ‘Yeah, I just saw her a few minutes ago.’ I said, ‘What?’ So I went on to tell him what had just happened, and he said to me, ‘It’s a scam.’”
A week before this scam, someone had called Lynette’s daughter claiming she’d won something. Lynette believes that was the scammer, and that during that bogus call they recorded her daughter’s voice. AI has made scams so sophisticated you can’t even trust your own ears. It’s always best to do what Lynette did. Hang up. Then call the person the caller claimed to be. It’s a new world.
We, as smart consumers, have to stay a step ahead. And that’s your consumer alert.