Roblox Introduces Stricter Chat Safety for Children
Roblox, one of the most popular online platforms among children, has announced new safety measures to better protect its younger users. To access chat features, players will now need to complete facial age verification, making Roblox one of the first major gaming platforms to require this for users of all ages. The new system enables age-based chat, helping ensure more age-appropriate interactions and limiting communication between adults and children under 16. (Roblox)
The update reflects a broader effort across the tech industry to make online spaces safer for young users.

Microsoft’s latest Global Online Safety Survey shows that online scams, phishing, and cyberbullying remain among the most common risks people face online. The eye-opener: confidence in spotting deepfakes dropped from 46% to 25%, and participants struggled to reliably distinguish real images from AI-generated ones when tested. The same report notes that over half of scam victims suspect AI played a role in the attack against them.
Want a practical, parent-friendly guide on what to watch for, and what to do if it happens to your family? Read our latest blog on AI deepfakes and voice-clone scams.
