Scam Alert: AI Voice Deepfakes
The phone rings. You hear a panicked voice. It's your grandson. He says he's in trouble. He wrecked his car and he's sitting in jail. He needs your help, he says. He needs you to send money.
You're immediately concerned about your grandson, yet you're cautious. You've heard about grandparent scams, but this seems legitimate. The caller sounds just like your grandson. You recognize his voice.
Could this be a scam?
YES!
Thanks to voice cloning and other forms of artificial intelligence (AI), scammers have found new ways to defraud you. Their techniques get more sophisticated by the day.
The Federal Trade Commission has issued new warnings about con artists who use AI to clone the voice of a loved one. This technology, known as a deepfake, involves media that has been digitally fabricated or altered.
Criminals don't need much to produce a deepfake audio file of your grandson's voice. All they need is a short audio clip of him speaking, which is easy to find on his social media accounts. When the scammer calls you, he’ll sound just like your grandson.
It's scary, and it's the reality we live in today. Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We're living with it, here and now.
How can you tell if a family member is really the one calling you or if it’s a scammer using a cloned voice?
Don’t trust the voice.
Call the person directly and verify their story.
Use a number that you know is theirs.
If you can’t reach your loved one, try to get in touch with them through another family member or their friends.
If the caller asks for payment via gift card, money transfer, or cryptocurrency, it's probably a scammer. Scammers ask for these kinds of payments because they are much harder to trace.
Don’t be pressured into an immediate response. Scammers create a sense of urgency to keep you from thinking clearly.
If you spot a scam, report it to the FTC at ReportFraud.ftc.gov.
Source: ftc.gov