
AI Scams Are Coming for Your Family's Money: 5 Things to Do This Month

Todd Sensing, CFA, CFP®, CEPA®, ChSNC® · Updated March 13, 2026

A woman in Colorado picked up the phone and heard her daughter screaming. The voice was unmistakable. Her daughter said she had been kidnapped and begged her mother to wire money immediately. The mother sent $2,000 before discovering the truth: she had never been speaking with her daughter at all. The voice was artificial, generated by software that needed only a few seconds of audio scraped from social media.

This is not a plot from a streaming thriller. It is a real case, and variations of it are happening thousands of times a day across the country. The difference between this scam and the clumsy phishing emails of five years ago is that AI has removed the skill barrier entirely. The tools are free, the results are convincing, and the targets are predictable: families with money.

The Numbers Are Hard to Ignore

The FBI's Internet Crime Complaint Center recorded $16.6 billion in cybercrime losses in 2024, a 33% jump from the prior year. Deepfake-related losses in the U.S. hit $1.1 billion in 2025, tripling from $360 million in 2024. Industry projections put AI-facilitated fraud losses at $40 billion by 2027.

The technology has reached a troubling inflection point. Three seconds of recorded audio is now enough to produce a voice clone that is 85% accurate. AI-generated phishing emails achieve click-through rates four times higher than those written by humans. And synthetic identity fraud, where AI constructs entirely fictional people from fragments of real data, jumped 378% last year.

The people most at risk are often the least prepared. Americans over 60 lost nearly $4.9 billion to cybercrime in 2024, up 43% from the year before. Only 1 in 44 cases of elder fraud gets reported, which means the real number is almost certainly much larger.

Why This Matters for Families Like Yours

Affluent families face a compounding problem. Wealth creates visibility, and visibility creates vulnerability. A public-facing career, an active social media presence, a family foundation, involvement in community organizations: all of these generate the kind of digital footprint that AI systems can harvest and weaponize.

The most common attack pattern is also the simplest. A scammer clones the voice of a grandchild or adult child, calls an elderly family member, and fabricates an emergency. A car accident, an arrest, a medical crisis. The urgency is the weapon. The emotional connection is the vulnerability.

But voice cloning is only one vector. Scammers also send AI-generated business emails from fake "clients" or "partners" requesting urgent transfers, use synthetic identities to open accounts in a family member's name, and stage deepfake video calls that impersonate trusted contacts. Each of these is happening now, not in some speculative future.

For families with a member who has a cognitive disability or who is on the autism spectrum, the risk amplifies further. Financial exploitation is among the fastest-growing forms of abuse targeting individuals with disabilities, and AI tools make these attacks harder for everyone to detect, not just those with additional vulnerabilities. If you are building a special needs financial plan, fraud prevention protocols should be part of the conversation.

Five Steps to Take This Month

The good news is that the most effective defenses are not technological. They are behavioral. Here are five things you can do in the next 30 days to meaningfully reduce your family's exposure.

1. Create a Family Verification Code

Pick a word or phrase that only your family members know. It should be something that would never appear in a public conversation or on social media. If anyone calls claiming to be a family member and asking for money, the first response is to ask for the code. No code, no action. This is the single most effective defense against voice cloning scams, and it costs nothing.

2. Shrink Your Voice Footprint

Scammers need audio samples to clone a voice, and they get them from social media videos, voicemail greetings, podcast appearances, and public recordings. Set social media profiles to private. Audit any publicly accessible video and audio content. Ask family members, especially older parents and college-age children, to do the same. You cannot eliminate your digital presence entirely, but you can make it meaningfully harder to harvest.

3. Build a Financial Circuit Breaker

A "circuit breaker" is a deliberate friction point between a request for money and the actual movement of funds. The most practical version: designate your financial advisor as a mandatory second verification step before transferring any significant amount. Set up verbal confirmation protocols with your custodian for large transfers. Enable transaction alerts on all accounts. The goal is to make it structurally difficult for anyone, including a convincing AI impersonation, to move your money with a single phone call.

4. Protect the Most Vulnerable People in Your Family

Have direct conversations with elderly parents about the current scam landscape. Be specific: explain voice cloning, show them what it sounds like, and establish the family code word with them. For families with special needs dependents, make sure guardians, trustees, and caregivers are briefed on AI fraud tactics and know the verification protocols. Consider adding a trusted contact designation on financial accounts so your advisor or custodian has another person to call before processing unusual requests.

5. Stay Current, and Report What You See

AI scam tactics evolve monthly. What works in March may be obsolete by June. Your advisory team should be a resource for staying informed, not just for investment decisions but for protecting the financial architecture your family has built. And if you encounter a scam attempt, report it to IC3.gov or FTC.gov. The massive underreporting gap (remember, 1 in 44) means the data used to fight these crimes is incomplete. Every report helps.

Your Advisor Should Be Part of the Defense

Comprehensive financial planning has always meant thinking beyond the portfolio. Tax coordination, estate design, insurance architecture, retirement projections: the value of a fiduciary advisor comes from seeing how all the pieces fit together. Fraud prevention is now one of those pieces.

At FamilyVest, the advisor relationship is designed to provide ongoing, unlimited access, which means you can call when something feels wrong without worrying about racking up fees for a "quick question." The circuit breaker concept works because the relationship already exists. You are not adding a new vendor. You are using the trusted advisor you already have in a new way.

The regulatory environment is catching up. FINRA's 2026 oversight priorities explicitly flag generative AI and cyber-enabled fraud. The SEC is examining AI-related claims across the advisory industry. But regulation follows the problem. Families that take proactive steps now are the ones least likely to become a case study later.


FamilyVest is a fee-only wealth management and comprehensive financial planning firm. We do not provide cybersecurity services. The information in this article is educational and should not be construed as specific financial or legal advice. If you believe you have been the victim of financial fraud, contact your local law enforcement and file a report at IC3.gov.

