"Mum, It's Me"... Or Is It? AI Voice-Clone & Deepfake Scams Explained
What are deepfakes and voice clones?
Why are parents targeted, and what red flags should I listen for?
The mindset shift that stops most scams
If a call sounds like my child and feels urgent, what should I do right away?
How can we reduce risk online without going off-grid?
Talk to kids without scaring them
If I think we were targeted or sent money, what should I do?
A few years ago, many scams were easy to spot because they looked suspicious. Today, the scary part is that they don't have to. With AI, scammers can generate fake videos ("deepfakes") and fake audio that mimics a real person's voice ("voice cloning"). For parents, that creates a new kind of risk: a call can sound like your child, a family member, or a trusted adult — panicked, emotional, and urgent — while none of it is real.
What are deepfakes and voice clones?
Deepfakes are AI-edited or AI-generated videos (and sometimes images) that make it look like a real person said or did something they never did. Voice cloning uses AI to imitate someone’s voice from short audio samples—often pulled from public videos, voice notes, or recordings. (Europol) Authorities in the US and Europe have warned that these synthetic media tools are accelerating fraud and social-engineering attacks. (European Parliament)

Why are parents targeted, and what red flags should I listen for?
Scammers exploit a parent's instinct to protect their child. They use stories that trigger fear and speed: a lost phone, a new number, an accident, "I can't talk," or a crisis that demands immediate help. Red flags include pressure to act fast, instructions to keep the call secret, and warnings not to hang up or call anyone else. Those tactics are designed to block independent verification, the step that usually stops the scam. (Federal Trade Commission)
The mindset shift that stops most scams
The key shift is simple: a familiar voice is no longer proof. Treat the voice as a clue, not confirmation. If a message or call creates panic and tries to rush you into action, that's exactly when you should slow down. Even if the voice sounds right, you're allowed to be "difficult" and verify. In real emergencies, verification is still possible. In scams, verification is what breaks the story.
If a call sounds like my child and feels urgent, what should I do right away?
Slow down and verify. A familiar voice is now a clue, not proof. (CBS News) Use these steps:
Ask for your family safe word or phrase; if they can’t provide it, stop the conversation.
Hang up and call back using the number you already have saved (or the school’s official number).
Use a personal verification question that can’t be guessed from social media (e.g., what you ate together last night).
Do not send money during the first contact; set this as a firm rule.
These habits break the scammer's control of the channel and expose most voice-clone scams quickly.

How can we reduce risk online without going off-grid?
Be intentional about what’s public. Limit long public videos where kids speak clearly, tighten who can see your posts, and avoid visible school identifiers or details that map routines and relationships. These simple privacy upgrades reduce the “raw material” available for impersonation without requiring you to stop sharing family memories entirely.
Talk to kids without scaring them
Kids should be part of the conversation, too, because deepfakes aren't only used for scams. They can also show up as impersonation, bullying, or "proof" in social drama that isn't real. The goal isn't to scare them; it's to give them a calm rule of thumb. You can frame it as a confidence skill: if something feels off, if a message is trying to rush them, or if they're unsure whether something is real, they pause and ask a trusted adult. Make it clear that coming to you won't get them in trouble. That promise matters more than any single warning.
If I think we were targeted or sent money, what should I do?
Act quickly but calmly. End the conversation, verify your child’s safety through a trusted route, and if any money was sent, contact your bank or payment provider immediately. Save messages, numbers, and voice notes so you can report the incident to the appropriate authorities in your country.
Takeaway
This isn't about living in fear of technology. At Logiscool, we see this as part of modern digital literacy: children (and adults alike) need more than screen-time limits and "be careful" reminders. They need practical habits like verification, critical thinking, and privacy awareness, so they grow into confident, safety-aware digital creators rather than passive users. The world is changing fast, but core family protection is still achievable. When the moment feels urgent, slow down, verify, and take back control.
