Artificial intelligence is now turbocharging a multibillion-dollar global criminal scheme known as the “impostor scam”.
In its original form, scammers call or text unsuspecting people while pretending to be someone they know who has a new phone number and a financial emergency.
But now, with the help of AI, scammers are cloning the actual voices of friends, family members and even children, according to a new McAfee cybersecurity artificial intelligence report.
Using three seconds of someone’s recorded voice, McAfee says AI can accurately replicate anyone’s voice and begin placing calls to unsuspecting victims.
McAfee cites the case of an Arizona mom who told the New York Post that scammers used AI to clone her teenage daughter’s voice and then demanded a $1 million ransom for the girl’s release.
McAfee recommends that people establish a codeword known only to their kids, family members and trusted close friends, and agree to always ask for it whenever someone calls, texts or emails asking for help.
The latest numbers from the Federal Trade Commission show impostor scams accounted for $2.6 billion in losses last year.
The Commission has also outlined its own set of measures people can take if they believe a scammer may be on the line.
- Resist the pressure to send money immediately. Hang up.
- Then call or message the family member or friend who (supposedly) contacted you.
- Call them at a phone number that you know is right, not the one someone just used to contact you. Check if they’re really in trouble.
- Call someone else in your family or circle of friends, even if the caller said to keep it a secret. Do that especially if you can’t reach the friend or family member who’s supposed to be in trouble. A trusted person can help you figure out whether the story is true.
According to McAfee, 25% of adults surveyed globally have experienced an AI voice scam.
One in 10 say they have been targeted personally, and 15% say somebody they know has been targeted.
Disclaimer: Opinions expressed at The Daily Hodl are not investment advice. Investors should do their due diligence before making any high-risk investments in Bitcoin, cryptocurrency or digital assets. Please be advised that your transfers and trades are at your own risk, and any losses you may incur are your responsibility. The Daily Hodl does not recommend the buying or selling of any cryptocurrencies or digital assets, nor is The Daily Hodl an investment advisor. Please note that The Daily Hodl participates in affiliate marketing.