Privacy Please
Welcome to "Privacy Please," a podcast for anyone who wants to know more about data privacy and security. Join your hosts Cam and Gabe as they talk to experts, academics, authors, and activists to break down complex privacy topics in a way that's easy to understand.
In today's connected world, our personal information is constantly being collected, analyzed, and sometimes exploited. We believe everyone has a right to understand how their data is being used and what they can do to protect their privacy.
Please subscribe and help us reach more people!
Privacy Please
Have a Seat in the Metaverse: The Chris Hansen vs. Roblox Investigation
For decades, Chris Hansen’s iconic catchphrase, "Why don't you have a seat?" was the prelude to exposing predators in the real world.
Now, his hunt has moved into the metaverse. His target is Roblox, the global gaming platform used by over 70 million people daily, most of whom are children. Hansen and his team allege the platform is a "cesspool" and a "hunting ground" for criminals, while Roblox maintains its safety systems are robust.
In this special report, "Privacy Please" goes beyond the headlines to investigate the clash. We explore the platform's design, from the "Avatar Loophole" that allows bad actors to bypass chat filters to the recommendation algorithm that can lead young users down dangerous paths.
Is this a simple case of a company needing to moderate more, or is the very business model that made Roblox a multi-billion dollar success also its greatest safety vulnerability?
Credited Resources & Further Reading
Primary Sources & Reporting:
- Takedown Across America with Chris Hansen: Official platform for Hansen's ongoing investigations and reporting.
- Roblox Corporate Statements & Community Standards: Official statements and policies from Roblox regarding their safety and moderation efforts.
- WIRED/Bloomberg Reporting: Recent articles from major tech publications that have investigated platform safety issues on Roblox and similar metaverse platforms.
- Common Sense Media: A non-profit organization that provides independent reviews and ratings for media and technology, often analyzing the safety features of platforms like Roblox.
(Note: As this is an ongoing investigation, it's recommended to reference the most current news articles and official press releases from the time of recording for the most up-to-date information.)
It looks like a child's drawing come to life. Blocky, cheerful avatars run through a digital park. The colors are bright, the sounds are playful. It's a universe built on imagination, creativity and connection, a place where over 70 million people, most of them children, log in every single day to build, to play and to be with their friends. This is Roblox, and for years it has been the virtual backyard for a generation. But look closer. In the shadowed corners of a private server, in a chat window designed to disappear, a different kind of connection is being made. The avatar may look like a teenager, but the person behind the keyboard is much older. The friendly questions about your favorite game slowly turn more personal, more probing. And in a production studio, miles away from the servers and the code, a familiar voice is watching, a voice that has haunted the nightmares of predators for two decades. I think you know who I'm talking about. His name is Chris Hansen. Remember that. "Why don't you have a seat right over there," Chris would say.
Speaker 1:Chris Hansen isn't in a suburban home anymore. He's not waiting for someone to walk through the door. His new hunting ground is the same one your kids call their playground, and he alleges it's a platform in crisis. So is he right? Is one of the most popular platforms for children fundamentally unsafe? Today, on Privacy Please, we investigate the battle for the soul of Roblox. All righty, then, ladies and gentlemen, welcome back to Privacy Please. I'm your host, Cameron Ivey, and what you just heard is not a hypothetical. It's the new reality on one of the biggest entertainment platforms on earth. For years, we've talked on this show about the privacy risks of social media, the dangers of data brokers and the surveillance baked into our smart devices. But what happens when the platform isn't just a place to share photos, but a world to inhabit, a world populated almost entirely by children? To understand the crisis Chris Hansen alleges is happening on Roblox, you first need to understand who he is.
Speaker 1:For a decade, Chris Hansen was a fixture of early-2000s television. His show To Catch a Predator was an uncomfortable must-watch cultural phenomenon. The formula was simple and brutally effective. In partnership with a watchdog group, Hansen's team would create sting operations. Posing as teenagers in online chat rooms, they'd lure suspected predators to a house filled with hidden cameras, and just as the confrontation began, Hansen would emerge from the shadows. "I'm Chris Hansen with Dateline NBC, and we're doing a story about adults who come to meet underage teenagers," he would say. The show ended years ago, but Hansen has continued his work independently, and now, through his project Takedown Across America, he and his team have turned their full attention from the chat rooms of the past to the virtual worlds of the present.
Speaker 1:Their primary target: Roblox. Now, if you don't have a child under the age of 16, you might still think of Roblox as just that blocky kids' game, but that's a profound misunderstanding of what it is. It's less like a single video game and more like a digital Lego set the size of a galaxy. It's an engine, a platform where users themselves create and share the games, or "experiences," as the company calls them. The scale is almost impossible to comprehend.
Speaker 1:As of today, August 27th, 2025, Roblox has over 70 million daily active users. They spend billions of hours a month on the platform and, according to the company's own data, more than half of its users are under the age of 13. More than half. It's an entire economy fueled by a virtual currency called Robux, a social network, a creative outlet and, according to Chris Hansen, a predator's paradise. That could be the name of, like, a band in high school. Not a very good band.
Speaker 1:I'm going to quote Chris Hansen verbatim from one of his recent interviews and podcasts. He said, quote: "It is a cesspool. It is a hunting ground. We're talking to predators every single day on this platform who are actively looking for children. Their safety systems are a joke." End quote. Hansen's allegations are specific and deeply troubling. He claims his team has uncovered rampant grooming in private chats and on third-party apps like Discord, which users are often lured to. He points to the ease with which bad actors can bypass the platform's chat filters using special characters or coded language. And, most disturbingly, he highlights the proliferation of what are known as "condo games," experiences designed explicitly to simulate adult and sexual interactions, using the game's own mechanics and animations to sexualize the childlike avatars in ways that automated moderation struggles to detect.
Speaker 1:So you have this massive clash: a child safety crusader whose name is synonymous with catching predators versus a $30 billion tech Goliath that insists it's doing whatever it can to protect its young users. But is it that simple? Is this a case of a company turning a blind eye? Or is this a problem of unprecedented scale that no amount of moderation can truly solve? To find out, we first need to look under the hood of Roblox itself. What are their defenses? And are the very features that make Roblox so popular also the ones that make it so dangerous? That's coming up after the break. Welcome back to Privacy Please.
Speaker 1:Before the break, we laid out the battlefield: Chris Hansen on one side, alleging that Roblox is a cesspool for predators, and, on the other, a tech giant that provides a creative outlet for over 70 million people a day. So, to be fair, we have to look at this from the company's perspective. If you were to ask Roblox, they would tell you that safety is their absolute top priority, and they have the numbers to back up that claim, at least on the surface. They employ an army of thousands of human moderators. They use protective AI systems designed to scan text, images, audio and even the 3D models of user-created items before they ever go live. And of course, there's the big red report-abuse button available to every user, which generates millions of reports a month. In a recent statement, a Roblox spokesperson said, and I quote: "We have a zero-tolerance policy for sexual contact or predatory behavior of any kind. Our team works tirelessly to act on any inappropriate content or behavior to protect our community." End quote.
Speaker 1:They also provide a suite of parental controls. A parent can set a PIN to lock certain settings. They can restrict who their child can chat with or turn off chat altogether. They can limit the types of experiences their child has access to. On paper, sure, it sounds robust: a multi-layered defense of AI, human vigilance and user empowerment. But in practice, this digital fortress has some serious cracks, and to understand them you have to understand the fundamental nature of the platform.
Speaker 1:The first issue is simply the scale. When you have 50 million user-created games and millions of people interacting every second, a reactive moderation system is playing a game of whack-a-mole. It can never win. For every condo game that gets taken down, two more can spring up with slightly different names. But the second, more subtle problem is something we'll call the avatar loophole. Most online moderation is built to catch bad words. It looks for text: slurs, threats, talk of harm or self-harm, or attempts to share personal information. But sophisticated groomers know this. They know that the chat is monitored, so they don't use words, they use actions. This is the secret behind these condo games. They are built to bypass moderation because nothing explicitly violating is ever typed into a chat box. Instead, they use the game's own physics and animation systems. Users can make their avatars perform gestures, lie down or interact with an in-game object in ways that are clearly intended to simulate sexual acts. There are no trigger words for an AI to catch when the violation is being communicated through the movement of a blocky digital puppet.
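The point about word-matching filters can be made concrete in a few lines. This is a minimal sketch, not Roblox's actual moderation system: the blocklist, function name, and example messages are all invented for illustration. It shows why any filter that matches known words is defeated by trivial character substitution, the kind of special-character trick described above.

```python
# Minimal sketch of a keyword blocklist filter (invented for
# illustration; not Roblox's real system).

BLOCKED = {"phone", "address", "snapchat"}

def naive_filter(message: str) -> bool:
    """Return True if the message should be blocked."""
    return any(word in BLOCKED for word in message.lower().split())

# A direct attempt is caught:
print(naive_filter("what is your phone number"))        # True

# Trivial substitutions pass straight through:
print(naive_filter("what is your ph0ne number"))        # False
print(naive_filter("what is your p h o n e number"))    # False
```

Real moderation systems use far more than word lists, but the arms race is the same shape: every normalization rule the defender adds (mapping "0" to "o", collapsing spaces) invites a new encoding from the attacker, and no text filter at all catches a violation expressed through avatar movement.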
Speaker 1:The third crack is one that should be familiar to anyone who's used the internet in the last decade: the algorithm. Like YouTube or TikTok, Roblox wants to keep you engaged, and it does this with a recommendation algorithm that learns what you like and shows you more of it. But this can create dangerous pathways. A child might play a game that's just a little edgy, but not technically rule-breaking. The algorithm sees this and concludes, okay, this user likes experiences that are a bit more mature. It then might suggest something that is over the line. It can unintentionally guide users from the well-lit town square of the platform to its darkest alleys.
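To make that feedback loop concrete, here's a toy model. Everything in it is hypothetical: the experience names, the "maturity" scores, and the small upward nudge are invented for illustration and are not drawn from Roblox's actual recommendation system. It shows how an engagement-driven recommender can ratchet a user toward progressively edgier content, one plausible suggestion at a time.

```python
# Toy model of recommendation drift (all names and scores invented).
# Each experience gets a hypothetical "maturity" score:
# 0 = tame, 1 = clearly over the line.

catalog = {
    "Pet Adoption Sim": 0.1,
    "Obstacle Course": 0.2,
    "Spooky Mansion": 0.5,
    "Edgy Hangout": 0.7,
    "Rule-Breaking Condo": 0.95,
}

def recommend(history, catalog, n=1):
    """Suggest the n unseen items closest to a target slightly above
    the user's current average engagement level."""
    level = sum(history.values()) / len(history)
    target = level + 0.15  # the "show them a bit more" nudge
    unseen = [k for k in catalog if k not in history]
    return sorted(unseen, key=lambda k: abs(catalog[k] - target))[:n]

# A child plays one game that's "a little edgy, but not rule-breaking"...
history = {"Obstacle Course": 0.2, "Spooky Mansion": 0.5}
step1 = recommend(history, catalog)
print(step1)  # ['Edgy Hangout']

# ...they engage with the suggestion, and the next round goes further.
history[step1[0]] = catalog[step1[0]]
step2 = recommend(history, catalog)
print(step2)  # ['Rule-Breaking Condo']
```

No single step here looks malicious; the drift emerges from the loop itself, which is exactly why it's hard to moderate after the fact.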
Speaker 1:And finally, all of this raises a thorny privacy question. Under a US federal law called COPPA, C-O-P-P-A, the Children's Online Privacy Protection Act, there are strict limits on how companies can collect and use the data of children under 13. But to police their platform, Roblox needs to monitor and log what kids are doing and saying. It's a classic security-versus-privacy dilemma. To catch the predators, are they creating a massive, sensitive database of millions of children's private conversations, one that could be a target for hackers or be used in other ways down the line? So we have safety systems that are impressive on paper but are constantly being outsmarted by bad actors exploiting the platform's core features: its scale, its avatars and its algorithms. This leads us to the billion-dollar question. Literally. Could Roblox truly fix this if they made it their one and only priority? Or are these safety failures not just bugs but unavoidable side effects of the very business model that has made them so successful? To answer that, we have to follow the money. That's next.
Speaker 1:Alrighty, then, ladies and gentlemen, we have come to the end of today's episode. If you like this content, let me know. I love digging into stories like this, and I found this one absolutely fascinating. We'll obviously keep following along as Chris and his team do their thing, because there are a lot of bad people out there, and I love the good work that Chris Hansen does to protect children. So, with that being said, if you haven't heard yet, we are launching our own network and website. You can go to it now: if you're a listener, head to theproblemlounge.com and check it out. You can listen to episodes there and contact us there. If you want to be a guest, have somebody who wants to be a guest, have questions, or want us to talk about certain things, go on there and leave us a message. We'd love to hear from you. And, as always, thank you for supporting Privacy Please and the Problem Lounge Network. We're so excited about what's ahead, and thank you again for listening. Cameron Ivey, over and out.