Fortnite, Discord, and a 14-Year-Old Girl. The Access Nobody Talks About
A bedroom music career that started with Fortnite montages. An online world with almost no guardrails. A child who was 11 when she allegedly met the man now charged with her murder. Here's what the research says about how this happens — and why nobody is naming the access points.
The Question Nobody Is Asking
By now, if you have been anywhere near the internet in the past two months, you know the broad outline of what prosecutors allege happened to Celeste Rivas Hernandez. You know she was 14 when she died. You know she was 11 when she allegedly met the man now charged with her murder. You know her remains were found in a Tesla. You know her family has asked for justice.
What almost nobody is writing is the part that comes before all of that. The structural part. How, in 2022, a child from Lake Elsinore and a teenager building a music career from his bedroom in Houston could end up in the same online space. How that connection gets made. Why it is so easy. And why, despite years of warnings from researchers, lawmakers, and child safety organisations, the platforms that make it possible are still operating largely the same way.
This is not a piece about the D4vd case specifically. We have reported on Celeste's story separately, sourced entirely to court documents and named journalism. This is a piece about the access. The infrastructure. The gap between the world parents think their children are inhabiting online and the world those children are actually in.
Celeste Rivas Hernandez is not the only child this has happened to. She is the one whose name you know.
The Numbers That Should Have Changed Everything
Let's start with what the data actually shows, because it is worse than most people realise and it has been getting worse every single year.
These numbers are not abstract. They represent children. They represent families who thought their kids were playing a game. In July 2025, two 14-year-old girls filed separate lawsuits in California alleging grooming and sexual assault via Roblox and Discord. A family in Galveston filed federal litigation after their 13-year-old daughter was groomed on Roblox, moved to Discord, and assaulted in her home, despite parental controls being in place. The complaints accused both platforms of negligent design and failure to implement basic safety protocols.
The lawsuits keep coming. The systems stay largely the same.
Fortnite, Discord, Twitch — How Each Platform Works as an Access Point
The playbook is consistent across platforms and has been documented for years. It begins in a gaming space where adults and children interact without this being considered unusual — and it moves, gradually, to private channels with no moderation, no record, and no oversight.
Fortnite's in-game communication runs primarily over voice, meaning adult and child voices share the same game lobbies without this being flagged as unusual. Over 120,000 player-created experiences exist within the game as of 2024. Voice-masking technology is available, but research suggests predators often don't need it: adult-child interaction in gaming is already normalised. Fortnite gaming content also feeds directly into YouTube and TikTok audiences that skew young.
Discord is where gaming relationships go private. It operates as a largely unmoderated messaging layer beneath the gaming ecosystem, a place to continue conversations that began in Fortnite or Twitch streams. Discord has no meaningful age verification. Direct messages are private. Servers can be entirely unmonitored. Multiple lawsuits describe the same pattern: contact begins on a gaming platform, moves to Discord, escalates there. Roblox made age verification mandatory in January 2026 after a wave of lawsuits. Discord has not.
Twitch livestreams create real-time access between creators and audiences in a format that feels personal and reciprocal. Young fans — particularly isolated or lonely ones — can develop intense parasocial relationships with creators who appear to be speaking directly to them. The move from public stream to private message is a very short step, especially when a creator actively engages with fan comments during a live broadcast.
Gaming content on TikTok and YouTube builds audiences that skew young. A teenage creator posting Fortnite montages develops a fanbase of children who follow their content. The creator's youth makes this feel peer-to-peer to young fans even when an age gap exists. As the creator ages and their audience does not, the structural differential widens while the parasocial relationship remains intact.
In case after documented case, the route is the same: gaming platform → content creator relationship or shared server → Discord DMs → private contact. Each step feels natural to the child because each prior step has already normalised the relationship. By the time communication is fully private, the groundwork has been laid over weeks or months of public, apparently innocent interaction.
This is not a loophole. This is how the system was built. Engagement drives revenue. Private messaging drives engagement. Age verification reduces user numbers. The incentives are not pointed toward child safety.
The Isolation Factor — What Homeschooling Has to Do With It
This section requires care. Homeschooling itself is not the problem, and the majority of homeschooled children are not in danger. What matters is a specific set of structural conditions that homeschooling can, in some cases, create or amplify — and that predators are documented to exploit.
David Burke was homeschooled from eighth grade onward. In a Pollstar profile from 2024, he described video games as being "his only way to socialize and find like-minded people." In another interview he described living "vicariously" through his online friends. These are his own words, in published journalism — describing a life in which online spaces were the primary social world. This context matters because it illuminates how those same spaces function: as the main arena of social connection for young people without traditional school-based peer networks.
The Coalition for Responsible Home Education notes that unlike children in public school, homeschooled children are not seen regularly by mandatory reporters — teachers and school staff legally required to report signs of abuse or exploitation. Fewer peer relationships also reduce the likelihood of a child disclosing concerning behaviour to someone who might act on it. And reduced structured social contact outside the home means online spaces often become the primary site of peer connection for homeschooled teenagers.
Predators know this. Research from Our Rescue and the APA documents that predators specifically target children who appear isolated, who seek emotional validation from adults, and who lack peer reference points to recognise grooming behaviour as unusual.
The Seven Stages — What Grooming Actually Looks Like Online
Grooming is not a single moment. It is a process designed to be invisible — to the child, to the parents, and to the platforms. Understanding the stages is the most practical tool parents and children have.
What Keeps Not Changing — And Why
The data has been available for years. The lawsuits have been filed. The congressional hearings have been held. The child safety organisations have published their reports. And yet the platforms that enable predatory access to children continue to operate with minimal age verification, limited moderation of private channels, and business models that reward engagement over safety.
The reason is not a mystery. It is incentive. Age verification reduces user numbers. Moderation is expensive. Parental controls reduce frictionless access, and frictionless access is the product. As long as the financial penalties for inadequate child protection are lower than the revenue generated by the features that enable it, the calculation stays the same.
Roblox implemented mandatory age verification in January 2026 — after a wave of lawsuits. Not before. Discord has not done the equivalent. The platforms that feature in the most documented cases of child exploitation are the platforms with the most to lose from safety measures that reduce their young user base.
We are careful here to stay within what is documented. Prosecutors allege the access to Celeste began when she was 11 years old. The full circumstances have not been detailed in public filings. What the documented timeline shows — sourced to court documents and the LA County DA — is that by the time Celeste was 13, the contact was already deeply established. By the time law enforcement explicitly told Burke she was a 13-year-old runaway, the relationship had been ongoing for two years.
The access point, whatever it was, had been operational for a very long time before any adult institution intervened. That is the pattern. That is what keeps not changing. Read the full documented timeline of Celeste's case here.
What Parents Can Actually Do — Beyond "Monitor Screen Time"
The standard advice — set parental controls, monitor screen time, keep computers in common areas — is not wrong, but it is insufficient. The children being targeted are not being reached through obvious channels. They are being reached through gaming platforms parents consider safe, by adults who have learned to move conversations to spaces parents don't know to check.