Ga., S.C. sue social media giant, alleging harm to children

AUGUSTA, Ga. - Georgia and South Carolina are among 33 states suing Meta Platforms Inc. for allegedly harming young people’s mental health by knowingly designing features on Instagram and Facebook that addict children to its platforms.
The states allege the company knowingly designed and deployed harmful features on Instagram and its other social media platforms that purposefully addict children and teens, while falsely assuring the public that those features are safe and suitable for young users.
The lawsuit filed in federal court in California also claims that Meta routinely collects data on children under 13 without their parents’ consent, in violation of federal law.
“Protecting our children is one of our most important jobs and that’s exactly what we’re trying to do with these lawsuits,” South Carolina Attorney General Alan Wilson said. “We can’t stand by and do nothing while Big Tech continues to engage in behavior that knowingly harms our children and breaks the law.”
Wilson says Meta’s business practices violate state consumer protection laws and the federal Children’s Online Privacy Protection Act.
While much of the complaint relies on confidential material not yet available to the public, publicly available sources, including documents previously released by former Meta employees, show that Meta profited by purposely making its platforms addictive to children and teens.
Its platform algorithms push users into descending “rabbit holes” in an effort to maximize engagement, Wilson said. Features like infinite scroll and near-constant alerts were created with the express goal of hooking young users, he said.
Meta knew these addictive features harmed young people’s physical and mental health, including undermining their ability to get adequate sleep, but did not disclose the harm nor did the company make meaningful changes to minimize the harm, Wilson said.
States joining the federal lawsuit are Arizona, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Illinois, Indiana, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nebraska, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Virginia, Washington, West Virginia, and Wisconsin. Florida is filing its own federal lawsuit in the U.S. District Court for the Middle District of Florida.
Filing lawsuits in their own state courts are the District of Columbia, Idaho, Massachusetts, Mississippi, New Hampshire, Oklahoma, Tennessee, Utah, and Vermont.
The action follows damning newspaper reports, first by The Wall Street Journal in the fall of 2021, based on Meta’s own research showing that the company knew about the harms Instagram can cause teenagers — especially teen girls — when it comes to mental health and body image issues. One internal study found that 13.5% of teen girls said Instagram makes thoughts of suicide worse, and 17% of teen girls said it makes eating disorders worse.
Following the first reports, a consortium of news organizations, including The Associated Press, published their own findings based on leaked documents from whistleblower Frances Haugen, who has testified before Congress and a British parliamentary committee about what she found.
The use of social media among teens is nearly universal in the U.S. and many other parts of the world. Up to 95% of youth ages 13 to 17 in the U.S. report using a social media platform, with more than a third saying they use social media “almost constantly,” according to the Pew Research Center.
To comply with federal regulation, social media companies ban kids under 13 from signing up for their platforms — but children have been shown to easily get around the bans, both with and without their parents’ consent, and many younger kids have social media accounts.
Other measures social platforms have taken to address concerns about children’s mental health are also easily circumvented. For instance, TikTok recently introduced a default 60-minute time limit for users under 18. But once the limit is reached, minors can simply enter a code to keep watching.
In May, U.S. Surgeon General Dr. Vivek Murthy called on tech companies, parents and caregivers to take “immediate action to protect kids now” from the harms of social media.
Copyright 2023 WRDW/WAGT. All rights reserved.