Logan Skrzypek ‘27, Copy Editor
On Jan. 31, two young sisters from Indiantown, Florida—just 12 and 15 years old—vanished from their home after three months of communication with a 19-year-old man from Omaha, Nebraska. The relationship began on Roblox, the gaming platform that has become a digital playground for millions of children worldwide, before shifting to Snapchat, where the conversations grew more personal. Investigators say the suspect, Hser Mu Lah Say, drove nearly 24 hours from Omaha to pick them up. The girls slipped out to meet him, leaving behind only the small clues their family had noticed in the weeks prior—unexplained food deliveries, odd behavior, and the sense that something was happening out of view.
A multi-state search ended when the Georgia State Patrol located the suspect's vehicle and found the sisters safe inside. Law enforcement called it a "sobering case," one they believed could have ended far worse. Florida's Attorney General publicly criticized Roblox for enabling predators to groom children on the platform, pointing to the case as part of a broader pattern of safety failures. The incident also exposes deeper structural weaknesses in the platform where the relationship began: weaknesses that have allowed predators to find, contact, and manipulate children despite Roblox's public claims of safety.
Roblox now requires users to verify their age through AI facial estimation or a government ID, a system introduced in September 2025 and marketed as a major step forward in child protection. Parents and regulators initially believed it would reduce grooming and inappropriate contact, but investigations and user reports show the system is inaccurate, easily bypassed, and "sometimes counterproductive." Children can still access features meant for older users, and adults can still pose as minors. In some cases, children are locked out of communication while predators remain active. Age verification was supposed to be the fix, yet it only highlights how much of Roblox's safety strategy relies on systems that do not work as promised.
Gaps in Roblox's moderation compound the problem. Age verification alone is not enough; predators can still reach children through voice chat and messaging, both on the platform and by steering conversations to outside apps. The Texas Attorney General has noted that when children enter environments designed for general audiences, they are exposed to predators who can blend in easily. Roblox is a child-dominated space, and safety is difficult to maintain when predators can disguise themselves with avatars and pose as young children. Even users as young as five years old can access the platform with no restrictions whatsoever. With countless user-generated environments and limited oversight, many spaces remain effectively unmonitored. Mandatory chat verification has not meaningfully reduced risk, and technical flaws continue to surface.
Parents and officials have voiced their concerns directly. Florida Attorney General James Uthmeier warned of "concerning reports" that the lightly moderated platform, popular among children, is exposing them to sexual content and predators. Digital safety experts note that "children ages five to 12 are increasingly exposed to risks in digital spaces."
The Florida sisters' disappearance is not an isolated incident: nearly 13,000 child exploitation incidents were linked to Roblox in 2024 and 2025. That figure is the predictable outcome of weak age verification, ineffective moderation, unmonitored chat, and a platform design that prioritizes growth over safety. As digital spaces continue to evolve, experts say awareness and communication remain the strongest tools for prevention. The goal, they note, is not to limit children's access to games, but to ensure they can explore them safely. Child safety organizations recommend that parents:
- Enable parental controls and age-based restrictions.
- Keep gaming devices in shared spaces.
- Talk openly with children about online behavior and boundaries.
- Monitor friend lists, chat logs, and external messaging apps.
- Report suspicious behavior to both the platform and local authorities.
