Only a week ago, in an email exchange with a dear friend and sister-in-Christ who lives in a northern midwestern state (I will call her Ruth to protect her identity), I was reminiscing with her about how her email account had been hacked a few years ago, and how I had received an email from her—except it wasn’t from her.
It was a “phishing” scam. The message, purportedly from my friend Ruth, told how (these may not be the exact words, but you will get the idea) Ruth’s niece had been in a car accident in Virginia and had lost her purse in the accident. Her purse contained all her credit cards, etc. The “niece” now desperately needed cash and asked her “Aunt Ruth” to send her $500 immediately.
The “phishing” email went on, impersonating Ruth, to say that she (Ruth) was out of state and, for some legitimate-sounding reason, could not send her niece the money immediately. So would I, James, as a favor to her, go to Walmart (or wherever), buy a $500 gift card, and send it overnight to her niece at such-and-such address? (Nowadays, the request is usually for an electronic transfer.)
I thought that smelled really “fishy” (pun intended) and so I simply called Ruth to ask her if she had indeed sent me the email. She was glad I called and apparently everyone in her Contacts list had been hit with the same email, as others had also called her to ask the same question.
(Maybe some of you seeing this were in her Contacts list also, and could make the necessary corrections to my account of what the email said, but the point is…)
Scammers (thieves) are everywhere in cyberspace, and one cannot be too careful. Hence, we are passing along today’s post from Dr. Robert Malone. This will give you a better idea of just how sophisticated the scams have become, and his advice on how to avoid being robbed by them. QUOTE:
Beware: The Christmas Voicemail
Scams are everywhere, and the scammers are getting more sophisticated
AI voice cloning technology is now remarkably accessible and works well even with a small amount of voice data. Many programs can clone a voice from as little as 10 to 30 seconds of audio!
This means that a simple Instagram, TikTok, or Facebook video on your own profile page can be voice cloned by anyone.
This opens the door to a whole new generation of scams.
A recent scam targeted a Florida woman named Sharon Brightwell with an AI-generated clone of her daughter’s voice. The call featured the “daughter,” sobbing and claiming she’d caused a car crash that injured a pregnant woman, and that the police were confiscating her phone.
A fake “attorney” then demanded $15,000 in bail, instructed her to withdraw cash, put it in a box, and hand it to a driver. A follow-up call tried to push for an additional $30,000.
Other scammers in New York are scraping TikTok and other social media for young people’s voices, then using AI cloning and spoofed phone numbers to impersonate grandchildren calling grandparents in fake emergencies (like arrests or accidents), demanding urgent cash.
Fraudsters are now using scripted and interactive voicebots to conduct targeted attacks. Recent case studies show attackers posing as CEOs or finance chiefs, convincing staff to wire money, disclose claims, or bypass security checks.
For example, a UK energy company lost $243,000 after scammers used an AI-generated voice to impersonate its CEO and instruct employees to make an urgent transfer of funds.
Small and mid-sized organizations are more susceptible because they may lack the resources to authenticate every financial request. That said, larger corporations face larger losses, since large sums of money routinely change hands.
One of the newest variations is the political or NGO scam. You get a personalized voice message from an important political figure or famous actor, who directs you to a website that looks legitimate—perhaps a word is misspelled, or the domain name differs slightly from the real one. The need is urgent, and money is donated. You think you donated a hundred bucks to a good cause and never know the difference. And so it goes.
In a recent Substack article, Sharyl Attkisson details how she became the target of a highly sophisticated scam that used an advanced AI impersonation of President Trump and Chief of Staff Susie Wiles. She explains that scammers reached out pretending to be these prominent political figures and used convincingly crafted messages and audio to make the communication seem authentic.
The scheme involved asking her to join the TikTok board while pressuring her for a large upfront payment of $21,500, along with the purchase of supposed TikTok “shares” worth $100,000, which she was told was required of all board members. Throughout the interaction, Attkisson felt the manipulation was carefully designed to exploit trust and create a sense of urgency.
She notes that the experience was not only unsettling but also representative of a growing trend: AI-powered impersonation scams are becoming more common, more believable, and more invasive. Her account emphasizes that this is not an isolated incident but part of a broader and increasingly dangerous shift in online fraud.
These scams are scalable and likely to grow rapidly. Studies show that even advanced voice-authentication and anti-spoofing systems are vulnerable to specially crafted “adversarial” attacks.
How to Protect Yourself, and Your Loved Ones
Verify with a “code word” or shared secret. If someone calls you claiming to be a loved one in crisis — even if the voice sounds real — ask for a prearranged password or phrase that only they (and you) know.
If you don’t have a code word and you receive an urgent call, make up something that isn’t real and ask about it: “I need to ask first, did you hear how Uncle Charlie’s operation went?” or “Did you make it home OK the other night?” Or say, “I need to verify it is you,” and ask a personal question that only that person would know the answer to.
Call back on another channel: Don’t rely solely on the call number or the voice. Use a previously verified number, or call back through a separate method (text, video call, etc.).
Limit public exposure of voice recordings: Avoid posting videos/audio of you or family members talking on public social media (or restrict privacy). Many scammers harvest voice data from those sources.
Be skeptical of demands for urgent money — especially via wire transfer, gift cards, or courier: Scammers try to push you into acting before you think. Treat any “urgent emergency — send money now” request with suspicion (especially if they ask you to conceal the purpose from the bank).
Do not list family members on the profile page on Facebook or other social media accounts.
Consider making your social media accounts private, rather than public.
Do not play online games that ask for personal information.
Do not answer personal text messages from people or numbers that you don’t know. Often, these take the form of “want to go out to dinner tomorrow?” or something that implies a relationship.
Consider keeping a private email account for friends and family, and another for website activities (such as for sales, accounts, and queries) and public use.
Educate family members (especially elderly or less tech-savvy): These scams disproportionately target people who may be less aware of newer tech like AI-voice cloning. END QUOTE
Hope this saves you money and headaches!
~END~