Deepfake Scam Alert: Elon Musk and Celebs Exploited to Trick Users – Is Your Money at Risk?
Deepfake technology has become a serious concern, and recent investigations show it is no longer just a theoretical threat. According to a report by NBC News, at least 50 deepfake videos are circulating on social media platforms with the aim of scamming unsuspecting internet users out of their money. The videos primarily feature X owner Elon Musk, but other celebrities and influential figures, such as Tucker Carlson, have also been targeted.
In these deepfake videos, users are encouraged to invest funds in a fake platform. Given Elon Musk’s past endorsements of certain cryptocurrencies, many users may not immediately recognize the videos as scams, which makes them particularly vulnerable to these fraudulent schemes.
What is even more concerning is that the majority of these deepfake videos appear on Facebook, a platform with a large user base of people aged 45 and above. Statista reports that 36.5% of Facebook users fall within this age group, and there is a perception that older users may be less tech-savvy and therefore more susceptible to scams.
In response to these findings, a Facebook spokesperson stated that they are actively monitoring and tracking deepfakes, emphasizing that such content is strictly against their policy. However, YouTube has taken a different stance, with a spokesperson claiming that the deepfake videos discovered do not violate their terms and conditions.
So, what exactly are deepfakes, and what kind of damage can they cause? Oxford Languages defines a deepfake as a video in which a person’s face or body has been digitally altered so that they appear to be someone else. These videos are often spread maliciously or used to disseminate false information.
The growing availability of artificial intelligence (AI) tools, and generative AI in particular, has increased the risk of this technology being misused by malicious individuals and groups. Deepfakes have predominantly been used to scam people out of money or to spread false information, but they have also given rise to an alarming trend known as deepfake pornography.
Deepfake pornography involves superimposing a person’s face onto the body of a pornographic performer, usually without their consent. The resulting content is then shared online, causing significant harm and distress to the people targeted. Shockingly, the number of uploads to the top deepfake porn sites has nearly doubled in recent years, reaching 13,345 uploads on one site alone.
Victims of deepfake porn have spoken out about the devastating impact these videos have had on their lives. Individuals such as Jenny and Lucy have described a range of negative emotions, from hurt and fear to anxiety and panic. The violation of their privacy has led to feelings of isolation, changes in personal identity, and even thoughts of suicide. The psychological toll of deepfake porn cannot be ignored.
Interestingly, Israel has taken a stand against deepfakes by designating them as an invasion of privacy. The country’s Privacy Protection Authority (PPA) has stated that the distribution of deepfake photos or videos without consent, particularly when they contain humiliating or intimate content, constitutes a violation of privacy. Companies that produce fake content using deepfake technology are also required to comply with data protection regulations to safeguard individuals’ personal information.
However, the consequences for those who misuse deepfake technology remain unclear.
In conclusion, deepfake scams that exploit the personas of celebrities such as Elon Musk are a growing concern. These fraudulent videos can lead unsuspecting individuals to invest money in fake platforms, resulting in financial losses. Furthermore, the rise of deepfake pornography represents a horrifying violation of privacy and can have severe psychological consequences for victims. It is crucial for social media platforms and authorities to detect and remove deepfake content to protect users from falling victim to these scams.