South Africa’s retirees are increasingly under threat as cybercriminals leverage artificial intelligence (AI) to craft more sophisticated scams. With the advent of AI tools such as FraudGPT, it has become easier for criminals to deceive individuals, particularly retirees, by making fraudulent schemes appear more authentic and harder to detect.
The Rise of AI-Enabled Scams
In his recent research paper, actuary and damages expert Gregory Whittaker highlights that FraudGPT, which was launched in 2023, has become a preferred tool for cybercriminals. It allows scammers to create highly convincing phishing emails, build hacking tools, and exploit IT system weaknesses. Unlike the poorly written and easily identifiable phishing attempts of the past, AI now makes these scams far more credible.
Whittaker describes this development as “the beginning of a new era of cybercriminals at scale,” where AI is used to generate realistic communications that mirror legitimate financial service providers.
Targeting Retirees
Retirees, who often gain access to substantial capital after leaving the workforce, have become prime targets for cybercriminals. The 2023 FBI Internet Crime Report revealed that individuals over the age of 60 lost more than $3.4 billion (R58.5 billion) to cybercrime in 2023. While there is no comparable research on South Africa, local retirees are likely to face similar risks.
Several factors make retirees more vulnerable:
Increased digital activity: Many retirees now use smartphones and computers to manage finances.
Complex financial products: The growing complexity of retirement products can be overwhelming, making retirees more susceptible to scams.
Social media usage: Retirees may turn to social media to replace the daily contact they once had with colleagues, giving scammers more personal information to exploit.
Common AI-Powered Scams Targeting Retirees
Criminals use AI in various ways to deceive retirees. Here are the most common forms of cybercrime:
1. Phishing and Spear Phishing
Traditional phishing scams cast a wide net, sending fraudulent emails or text messages in the hope of tricking someone into revealing personal details. With AI, criminals now practice spear phishing, a more targeted version. Using tools like FraudGPT, they can analyze large datasets and personalize emails to retirees, making the scams more believable. This increases the chances of retirees unwittingly sharing sensitive information, such as passwords or bank details.
2. Deepfakes
AI-generated deepfake technology is another tool in a scammer’s arsenal. These scams often involve fake videos or images of celebrities or trusted public figures promoting bogus investments or financial services. Retirees hoping to boost their savings fall for these schemes, only to lose their money when they try to withdraw funds from their “investments.”
3. Grandparent Scam (Voice Cloning)
AI can also be used to clone a younger relative’s voice. In these scams, a retiree receives a call from someone who sounds like their grandchild, claiming to be in an emergency, such as a car accident or legal trouble, and needing money immediately. The caller usually asks for the incident to be kept secret, pressuring the retiree to act quickly. Retirees are advised to hang up and contact another family member to verify the situation.
Protecting Your Retirement Savings: What to Watch Out For
While AI scams are becoming more sophisticated, there are still ways retirees can protect themselves:
Mistrust first, verify later: Never share sensitive information over the phone, via email, or on social media. Contact the company you believe you are dealing with directly, using contact details from an official source, to confirm the request is legitimate.
Check with your financial adviser: Speak to your adviser before making any significant financial decisions.
Use official channels: If you are unsure about a financial service provider, check their credentials with the Financial Sector Conduct Authority.
For retirees, knowledge is the best defense. Being aware of these evolving cyber threats and maintaining a healthy level of suspicion when dealing with unsolicited communication can significantly reduce the risk of falling victim to AI-enabled scams.
—
Source: Daily Investor