
California sues Facebook parent Meta, alleging harm to young people

California and 32 other states on Tuesday accused Facebook’s parent company, Meta, of knowingly designing harmful features on its flagship social network and its Instagram platform despite being aware of the mental health risks to young people. Meta “designed and introduced” those features deliberately, the suit said.

“Meta has leveraged its extraordinary innovation and technology to engage adolescents and teens and maximize their use of its products,” Atty. Gen. Rob Bonta said at a press conference in San Francisco. “In an effort to increase profits, Meta has repeatedly misled the public about the serious dangers of its products.”

The 233-page lawsuit, filed in federal court in Northern California, alleges the social media giant violated state consumer protection laws and federal laws aimed at protecting the privacy of children under 13. Other states, including Florida, Utah and Vermont, filed separate lawsuits. In total, 41 states and Washington, D.C., have taken legal action against Meta.

The legal action highlights how states are trying to address potential mental health risks exacerbated by social media platforms, including body image issues, anxiety and depression. At a separate press conference with a bipartisan group of state attorneys general from Colorado, Tennessee, New Hampshire and Massachusetts, Meta was compared to the tobacco industry.

“This appears to be part of a corporate strategy in which harm to the public is known, hidden and covered up with lies,” Bonta said.

State attorneys general from across the United States began investigating Meta in 2021 over its promotion of Instagram, a photo- and video-sharing platform, to children and young people. Advocacy groups, lawmakers and parents have criticized Meta, saying the multibillion-dollar company isn’t doing enough to combat eating disorders, suicide and other content that could harm users.

As part of the investigation, the state attorneys general examined Meta’s strategies for getting young people to spend more time on its platforms. Those tactics include letting users scroll infinitely through posts, enticing teens to log in with periodic notifications, and nudging users to check content again by making it disappear within 24 hours. The lawsuit also alleges that Meta knew through internal research that its platforms were potentially dangerous to teens but failed to address the harms. Features such as the “like” button can cause teens to compare the popularity of their posts with those of others, and beauty filters can promote body dysmorphic disorder, the lawsuit alleges.

Meta said it is committed to keeping teens safe, noting it has deployed more than 30 tools to support youth and families.

“We are disappointed that, instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” a Meta spokesperson said in a statement.

In 2021, scrutiny of Meta’s potential damage to young people’s mental health intensified after former Facebook product manager Frances Haugen leaked tens of thousands of company documents. Among those documents was internal research, reported by the Wall Street Journal in 2021, showing that the platform was harmful to teenage girls and exacerbated body image issues and suicidal thoughts. Meta said the research had been “mischaracterized,” noting that teens also reported that Instagram made them feel better about issues such as loneliness and sadness.

That year, executives from social media companies, including Instagram chief Adam Mosseri, testified before Congress. Instagram has since paused development of a version of its app for kids and introduced more controls to help parents limit the amount of time their teens spend on the app. Social media apps like Instagram require users to be at least 13 years old, but children access the platforms by lying about their age.

Families from various states have sued Meta, accusing Instagram of worsening eating disorders and increasing suicidal thoughts among teenage girls. But those legal actions face an obstacle: Section 230 of the Communications Decency Act of 1996 shields online platforms from liability for content posted by their users. In California, technology companies and industry groups are suing to block a new state law aimed at protecting child safety and promoting transparency around content moderation. While other lawsuits are still ongoing, Bonta said this lawsuit could potentially provide financial relief to families.

California and other states hope litigation will change the practices of social media companies. Bonta said platforms such as Meta could change default settings to limit the amount of time young people spend on their apps. The companies could also tweak the way content is recommended to teens, which can drag them down a rabbit hole of harmful videos and images.

The lawsuit also accuses Meta of violating federal child privacy law. The complaint alleges that the company promotes content aimed at children and collects personal data from users it knows are under 13 without parental consent. For example, Meta launched an ad campaign to draw teens to Instagram and hosted “kid-friendly” content about Sesame Street, Lego and Hello Kitty on its platform, the complaint states.

While more young people are leaving Facebook, Instagram remains popular among U.S. teens, according to a Pew Research Center survey released this year. In 2022, approximately 62% of teens reported using Instagram. TikTok and Snapchat are also popular among teens.

Approximately 22 million teens in the United States log into Instagram every day, according to the complaint.

There are growing concerns about the amount of time teens spend on social media, especially as platforms use algorithms to recommend content that users are likely to want to view. In 2022, attorneys general across the country launched an investigation into the potential harms of TikTok to young people as well.

Technology continues to evolve rapidly even as social media platforms face years of litigation. Meta has pushed into virtual reality and artificial intelligence that can generate content.

Bonta said the state attorneys general may consider whether the complaint needs to be amended in the future.

“We are focused on the well-documented practices that have caused the harm that brought us to this day,” he said.
