More than a dozen states and the District of Columbia filed lawsuits against TikTok, the immensely popular short-form video app, on Oct. 8, arguing that the app's design and features are deliberately addictive, especially to children, and cause serious mental health harms. The lawsuits grew out of a national investigation into TikTok's practices opened in March 2022 by a bipartisan group of attorneys general from states including New York, California, Kentucky and New Jersey. Each state filed its suit separately in its own courts, underscoring how seriously officials across the country view the concerns.
Central to the complaints is TikTok's recommendation algorithm, which curates content on each user's "For You" feed based on their individual interests. The suits also single out design features the states say are engineered to keep users hooked: the app's infinite scroll, attention-grabbing push notifications that create a sense of urgency, and face filters that can promote unrealistic beauty standards. Together, the states argue, these elements drive a pattern of excessive engagement that can harm young users.
“They’ve chosen profit over the health and safety, well-being, and future of our children,” California Attorney General Rob Bonta stated. “And that is not something we can accept. So, we’ve sued.” His words reflect growing frustration among state officials over what they see as major tech companies neglecting children's welfare in pursuit of profit.
In its filing, the District of Columbia called the algorithm “dopamine-inducing” and said it was intentionally designed to be addictive so the company could trap young users into compulsive use and keep them on the app for hours on end. TikTok does this, the district said, despite knowing that such behavior leads to profound psychological and physiological harms, including anxiety, depression, body dysmorphia and other long-lasting problems.
This wave of lawsuits against TikTok follows similar actions taken against Meta Platforms Inc., the parent company of Instagram. Over the past year, various states have targeted Meta, accusing the company of knowingly designing addictive features that contribute to a youth mental health crisis. District of Columbia Attorney General Brian Schwalb highlighted the financial motivations behind TikTok’s operational strategies, stating, “Keeping people on the platform is how they generate massive ad revenue. But unfortunately, that’s also how they generate adverse mental health impacts on the users.”
The challenges against TikTok are part of a broader reckoning with social media companies and their influence on the lives of young people. This legal response has been compared to the historic legal battles fought against the tobacco and pharmaceutical industries, where public health concerns took precedence over corporate interests.
TikTok also faces a more significant threat than these lawsuits. Under a federal law enacted earlier this year, the app could be banned in the U.S. by mid-January unless its China-based parent company, ByteDance, sells it. TikTok is challenging the law before a panel of judges in a federal appeals court in Washington, and the case could ultimately reach the U.S. Supreme Court.
In response to the lawsuits, TikTok expressed disappointment, saying it had been working with the attorneys general for two years to address these concerns. Alex Haurek, a spokesperson for TikTok, said, “We strongly disagree with these claims, many of which we believe to be inaccurate and misleading. We’re proud of and remain deeply committed to the work we’ve done to protect teens, and we will continue to update and improve our product.”
Despite TikTok’s assurances that the platform is safe for younger users, the states argue that children can easily circumvent its age restrictions and gain access to harmful content. The District of Columbia’s lawsuit also accuses TikTok of running an “unlicensed virtual economy” by letting users purchase TikTok Coins and send “Gifts” to streamers, who can convert those gifts into real money, a practice that has raised questions about financial regulation and the potential exploitation of minors.
The lawsuits also point to the potential for exploitation on TikTok’s LIVE streaming feature, which they allege operates as a “virtual strip club” without appropriate age restrictions. The attorneys general are asking the courts to stop these practices and are seeking financial penalties and damages for the alleged harm to users.
Social media use among U.S. teenagers is nearly universal: almost all teens aged 13 to 17 report using at least one platform, and about a third say they use social media “almost constantly,” according to the Pew Research Center. High school students who use social media frequently are more likely to report persistent feelings of sadness and hopelessness, according to a survey by the Centers for Disease Control and Prevention.
In a parallel legal effort, 22 states, including Alabama, Colorado and Florida, filed an amicus brief supporting Tennessee’s ongoing investigation into TikTok, another sign of a concerted push among state officials to hold social media companies accountable. That push builds on earlier cases in which states challenged TikTok over harming children’s mental health and exposing them to inappropriate content.
As the legal landscape around TikTok continues to evolve, the outcomes of these lawsuits could set significant precedents for how much responsibility social media platforms bear for protecting young users and for addressing mental health concerns in the digital age.