Grandparent scams have taken a terrifying twist with the rise of AI voice cloning technology. In recent incidents, scammers cloned the voices of victims’ grandchildren, convincing unsuspecting grandparents that they were speaking to distressed relatives in urgent need of money. The AI replicated not only the voices but also the accents, mannerisms, and even the nicknames specific to each family.
One victim, Jessica Di Palma’s 73-year-old mother, received a call from someone claiming to be a police officer informing her that her grandson, Milan, had been arrested on drug charges and needed $10,000 for bail. The supposed officer then handed the phone to Milan, or so it seemed, urging his grandmother to post bail. Although the voice on the phone sounded exactly like Milan, it was actually an AI-generated imitation.
In a panic, Di Palma’s mother withdrew $10,000 to save her grandson. Luckily, Jessica noticed the large withdrawal and questioned her mother about it, uncovering the scam. Even after speaking directly with both Milan and Jessica, who confirmed that the caller was not Milan, Di Palma’s mother struggled to comprehend how the imitated voice could sound so authentic. When the scammer realized the money was not coming, they resorted to threatening her life.
Another victim, Bruno Aiezza, described a similar scam in which his mother was asked for $7,000 in bail money for a supposedly arrested grandson. The imitated voice matched his son’s mannerisms perfectly and even switched between English and Italian, just as he would. However, Aiezza’s mother, who lived in a residence and had limited mobility, immediately sensed that something was amiss: her grandson would never ask her to go to the bank.
To pull off AI voice cloning, also known as voice replication, scammers first search the internet for voice samples of their intended victims. AI software then analyzes the samples and trains on them, learning to replicate the victim’s voice, accent, and other speech patterns, which lets scammers generate whatever audio they want. The technology has grown increasingly sophisticated and poses a serious threat precisely because of how convincingly it can imitate a specific person.
Experts urge families to learn the typical signs of grandparent scams, such as urgent requests for money or uncharacteristic behavior from a loved one. They recommend agreeing on a secret code word that can authenticate a caller’s identity or signal danger. If in doubt, hang up and call the loved one directly at their known phone number.
Unfortunately, not every family has escaped these scams without financial loss. It is crucial for victims to report such incidents to the authorities. Cybersecurity experts also advise victims to review their financial records, change their passwords, and contact their banks to ensure their accounts are secure, and they recommend installing antivirus software to detect unauthorized activity on personal devices.
As these scams continue to evolve, individuals must take cyber threats more seriously. Trusting one’s intuition and promptly reporting suspicious activity are crucial steps in avoiding them. AI voice cloning has elevated the grandparent scam to a new level of terror, underscoring the importance of vigilance and awareness in protecting oneself and one’s loved ones from such fraud.