Big Tech’s Big Tobacco moment arrives with landmark child harm verdict

The ruling from the Los Angeles trial is noteworthy because the case used tort law to hold social media companies liable for mental health harms suffered by an individual plaintiff, referred to as “Kelly” at trial. The jury found Meta and YouTube negligent for designing and operating an addictive product that was harmful to children, and for failing to warn users of those harms. Never before has a case like this gone to trial, let alone ended in a verdict against the platforms.

For more than a decade, children, teens, and parents have suffered countless harms caused by social media, including suicide, self-harm, eating disorders, anxiety, and depression, but have been unable to obtain justice. If any other product, such as a defective toy or tainted food, had harmed children, parents would have had their day in court long ago.

Not so for social media. Tech companies hid behind the massive immunity shield of a law called Section 230. It states that online platforms are not liable for harms caused by third-party speech they host, so for years, claims against platforms for harms to children from social media were dismissed because of Section 230.



Family members of victims speak to reporters outside Los Angeles Superior Court on March 25, after a jury found Meta and YouTube negligent in a lawsuit alleging their platforms contributed to harmful behaviors among young users. (Kayla Bartkowski/Los Angeles Times via Getty Images)

Not anymore. This tort case took a new legal approach, focusing exclusively on social media product design – recommendation algorithms, “likes,” autoplay, infinite scrolling, and notifications – as addictive and harmful to children, regardless of content.

The strategy paid off. The jury saw the evidence for what it was. For example, when Facebook co-founder and CEO Mark Zuckerberg took the stand, he was asked about his decision to allow beauty filters that mimic plastic surgery on Instagram after 18 of Meta’s in-house experts warned they were harmful to teenage girls and could contribute to body dysmorphia.

He tried to wave it off, saying, “I think a lot of times telling people they can’t express themselves that way is arrogant.”


The jury saw emails and internal presentations saying things like “younger is the best,” “omg, IG is dope,” and “we’re basically pushers.” Jurors could clearly see that these platforms were designed to be addictive, that these companies were intentionally harming children, and that they failed to warn users. As Mark Lanier, the plaintiff’s lead attorney, said at a news conference after the verdict: “We’ve sent the message that you can be held liable for the addictive features alone.”

Thousands of other cases are waiting in the wings, including 3,000 in California alone. With this positive first result, companies will be motivated to settle those cases rather than face trial again. Meta, YouTube, and the other platforms named in the pending lawsuits, such as TikTok and Snapchat, should all be prepared to pay. Take the $6 million in damages awarded in this one verdict and multiply it by thousands. This is Big Tech’s Big Tobacco moment.

Allies and sympathizers of Big Tech argue that this ruling shifts responsibility away from parents for raising healthy children. They point to the plaintiff herself. As FIRE Executive Vice President Nico Perrino tweeted, “Kelly says she started using YouTube at age 6 and Instagram at age 9 and told the jury she was on social media ‘all day long’ as a child.” He added: “Where were the parents?”


They are asking the wrong question. The problem is not absent parents, but addictive products without meaningful parental controls or robust age verification. As I explained in my book “The Tech Exit,” social media platforms actively work around parents to reach their children: they recruit young users and, as became clear in this Los Angeles trial, they do not effectively age-verify users or require any parental consent.

So the best outcome for these pending cases is not simply huge payouts to victims, but a restructuring of the way social media companies do business. One major pending case, multidistrict litigation brought by 40 state attorneys general that goes to trial this summer, could do just that.


In 1998, 52 state and territory attorneys general signed a Master Settlement Agreement (MSA) with the four largest U.S. tobacco companies to settle dozens of government lawsuits filed to recover billions of dollars in health care costs associated with treating smoking-related diseases.


This agreement changed the industry forever: it prevented tobacco companies from targeting young people in their advertising, prohibited the use of cartoons (which appeal to children) in advertising or packaging, banned paid tobacco promotion in media such as movies, television, music, and video games, provided money to states to fund smoking-prevention campaigns, and more.


As part of a potential master settlement agreement, attorneys general could similarly require strong age verification measures to keep underage users out, require parental consent for social media accounts, or even force platforms to raise the minimum age for their accounts from 13 to 16.

The settlement agreement could also require companies to disable addictive features for minors, such as recommendation algorithms, infinite scrolling, autoplay, and “likes.” Social media doesn’t have to be addictive. This first positive verdict is important because it signals that the pending multidistrict litigation could lead to a settlement as sweeping as Big Tobacco’s, one that changes the social media industry forever.

Click here to read more from Claire Morrell
