Scammers are using AI to bilk victims out of millions: Here's how to protect yourself
Briefly

A sophisticated new scam targeting the elderly involves criminals using AI to clone the voices of grandchildren from social media platforms like TikTok. The cloned voices are used in phony emergency calls in which the scammer poses as a grandchild in distress who urgently needs money. Police have reported an increase in such scams, which exploit the emotional vulnerability of seniors. Officials advise families to establish 'safe words' for verification to help prevent financial losses to this deceptive scheme.
Police warned that scammers are leveraging AI to clone voices found on social media, specifically targeting elderly individuals with phony calls that appear to come from their grandchildren.
Victims have been tricked into believing they're helping relatives in distress, resulting in significant financial losses due to sophisticated voice replication technology.
Authorities have noted an increase in these scams recently, emphasizing the importance of vigilance and establishing 'safe words' within families to prevent fraud.
The Suffolk County Police Commissioner highlighted the effectiveness of the voice-mimicking technology criminals are using to emotionally manipulate seniors into sending money.
Read at New York Post