Curtis Blackwell found his nine-year-old daughter JackLynn in their Stephenville, Texas yard in a quiet moment that changed his family forever. She was not playing. She had been attempting what millions of children know as the blackout challenge, a trend that circulates on TikTok and encourages users to choke themselves until they lose consciousness.
JackLynn is not alone. Her parents believe her death is among the roughly 80 the CDC has documented from self-strangulation "games" of this kind. The tragedy raises hard questions about platform responsibility, the limits of parental supervision, and whether a child's developing brain can reasonably be expected to resist the pull of a billion-dollar algorithm designed to maximise engagement above all else.
The blackout challenge is not new. It is a repackaging of the "choking game," in which participants deprive the brain of oxygen for a brief euphoric high, and videos of people intentionally choking themselves have circulated online for years. The trend gained widespread attention on TikTok in 2021, primarily among children. What has changed is the scale and precision of distribution. TikTok's "For You" algorithm does something those earlier videos never could: it learns who is most vulnerable and delivers the content directly to them.
Curtis Blackwell said something that will resonate with any parent struggling to protect a child from what they cannot see: "My mom told me that JackLynn had shown her a video before of a guy doing that with the cord, my mom told her, 'Don't you ever do that.'" The warning was given. The child knew better. She attempted it anyway, alone, in a moment of impulsive decision-making whose consequences her developing brain could not fully evaluate.
That vulnerability is precisely what makes JackLynn's death part of a broader pattern. Most of the children involved are 9 to 14 years old, ages at which the brain is still developing and highly susceptible to influence. The prefrontal cortex, which governs rational judgment, does not fully mature until a person's mid-20s, so children and teens are naturally impulsive and often act without stopping to weigh consequences.
TikTok's response to these deaths has been to insist the challenge "predates" the platform and to point out that searching for it returns a warning page. But a federal appeals court last year took a different view of the company's role. The court ruled that TikTok's algorithm constitutes the platform's own "expressive conduct" and therefore may not enjoy blanket immunity under Section 230 of the Communications Decency Act, the law that has long shielded internet platforms from liability for content users upload.
The distinction matters. TikTok does not create the videos. But TikTok's customised algorithm places the videos on users' "For You" page. If a child has never searched for the challenge, never sought it out, yet it appears in their feed because the algorithm calculated they were the right audience, the platform has moved beyond passive hosting into active promotion.
Multiple families are now suing TikTok over these deaths. The challenge has been linked to the deaths of between 15 and 20 children, and the several lawsuits filed against TikTok so far have ended in dismissal on legal-immunity grounds. The appeals court decision may change that calculus, but litigation moves slowly. In the meantime, JackLynn's parents are left with a tragedy they believe could have been prevented.
The harder truth is that no parent can fully supervise a child's exposure to algorithmic content. You cannot watch every video your child encounters. You cannot predict which trending challenge will next capture their attention. And you cannot reason with a nine-year-old's brain about risks that feel distant compared to the immediate social reward of joining a viral moment.
What TikTok can do, what any platform can do, is choose not to amplify the most dangerous content to the most vulnerable users. Experts argue that platforms will not make that choice voluntarily. But the law may soon demand it, and JackLynn's case may force that reckoning.