Dexter schools sue social media giants, citing child mental health crisis
by Isabel Lohman (Bridge Michigan)
What can school districts do to change how social media companies interact with children and teens?
It’s a question that may be answered in a federal lawsuit targeting the most popular and ubiquitous social media platforms used by children and adolescents.
On Monday, the Dexter Community Schools board unanimously approved a resolution to join a nationwide lawsuit against several social media giants, including Meta (parent company of Facebook, Instagram and WhatsApp), Snap, TikTok, YouTube and others.
Earlier this year, Seattle Public Schools and Pittsburgh Public Schools joined the lawsuit, filed in the Northern District of California. The suit seeks to hold social media companies responsible for a mental health crisis among young people, alleging the companies designed manipulative algorithms that essentially “addict” young users, encouraging them to spend long hours on the sites.
Dexter Board President Mara Greatorex told Bridge Michigan the district also has suffered from the spread of viral social media trends like the one where students rip soap dispensers off the walls in school bathrooms.
“As a school district our duty is to keep our kids safe,” Greatorex said. “Our duty is to keep our kids engaged in the educational setting and when you have those outside forces, it’s harder for us to reach our goal.”
There are about 3,400 students in the Washtenaw County district.
“Even the best of parents struggle to be able to support their kids on being a positive user of social media,” Superintendent Chris Timmis told Bridge.
He said it would be ideal if the lawsuit could lead to some “guardrails” on how social media companies interact with younger users.
“The legislative branch has been unable to put protections for our kids, someone needs to do it,” he told Bridge Thursday.
School board member Daniel Alabré said that, personally speaking, any individual attempting to hold social media companies accountable could get bogged down by the companies’ lawyers. He hopes a large collective effort that includes schools could force companies to adjust their algorithms and “stop targeting kids” in their advertisements.
While parents and lawmakers have long bemoaned the influence of social media on young users, Facebook came under intense scrutiny in 2021 when a former employee, Frances Haugen, shared thousands of internal documents showing the company’s own research indicated its products harmed the mental health of adolescents, particularly young girls, including leading to unhealthy views on body image.
The Dexter district will work with Frantz Law Group, APLC, the same group representing at least 125 Michigan school districts in a nationwide lawsuit against Juul, the vaping company. The firm would only collect money from the district if the district receives money from the lawsuit.
The district uses Thrun Law Firm, P.C. on retainer and that firm referred the district to Frantz Law Group.
At least 11 school districts in Michigan have signed on to be a part of this lawsuit against the social media companies, Thrun Attorney Piotr Matusiak told Bridge in an email Thursday. Matusiak declined to name the other districts.
In a letter, the Thrun Law Firm said the parties are seeking past and future damages stemming from social media use, including property damage from students following social media trends along with funding for “counselors or educational programming” to handle issues that stem from social media.
“I think once you say ‘this is hurting kids,’ people start to listen,” Greatorex, the board president, told Bridge.
At the Center for Democracy and Technology, Caitlin Vogus promotes law and policy to support people’s free expression rights on the internet. She told Bridge that these types of lawsuits raise some important questions, including:
How do we address the harms people claim social media companies are causing young people?
And how do we balance that with protecting young people’s access to seek out information and express themselves online?
“We can’t ignore the positive effects it can have, too,” Vogus said, noting how students often use social media to fight for political change, citing, for instance, students leading school walkouts to protest gun violence.
Instead of instituting bans on certain social media apps or creating an age verification process, state and federal lawmakers could instead require companies to provide more transparency about how their platforms work, Vogus said.
Elise Bruderly, a Dexter mother of two high school juniors and vice president of the school board, said the lawsuit is a way for parents in the community to speak out and address their issues with social media.
Personally speaking, Bruderly said she hopes the suit could provide more consumer protection and influence how new products are designed.
She said she is trying to make rules that help her teenagers navigate social media while also creating off-ramps for some of those rules as they near adulthood.
“They’re going to be 18 in the next year and it’s my job to teach them to live with these devices.”
But another Dexter parent said it’s up to parents to set guidelines for how young people use social media.
“I understand the concern from educators, but I think ultimately it is a parent’s responsibility to set limitations for their kids when it comes to usage and time spent on social media,” Jennifer DeGregorio, a mother of a 15-year-old son, said in an email. “Social media is not going away, so it’s important to engage kids in other, healthier ways.”
Sarita Schoenebeck, an associate professor of information at the University of Michigan whose research includes the study of online harassment, online design and algorithms, said the issue is complicated.
There’s research showing the benefits of social media and other research that shows the harm. And it can be difficult to know exactly what people are doing on social media or how much time they are spending because the research is based on self-reported data.
“There’s such a collective interest from so many different bodies…that it’s clear to me that there’s enough desire to see more accountability and responsibility for how social media companies maintain experiences for children or enable experiences for children,” Schoenebeck said.
But what those changes would look like is trickier. For example, she said she has concerns about users’ privacy rights if they are required to show some form of identification to prove their age.
(Schoenebeck has had research previously funded by Meta and is currently conducting research funded by Google.)
Melissa Svastisalee, a Dexter mom of a 16-year-old daughter and 18-year-old son, is supportive of her district joining the lawsuit. She said there are some children and teens who seem to always be on social media and that could foster a sense of not belonging.
Whereas “the kids that are socially grounded, I feel they look at social media as something fun to do,” she said.
The social media companies targeted in the suit dispute their portrayal as uncaring corporations motivated only by profit, saying they are undertaking significant efforts to protect the health of young users.
“We want to reassure every parent that we have their interests at heart in the work we’re doing to provide teens with safe, supportive experiences online,” Meta Head of Safety Antigone Davis said in a statement.
“We’ve developed more than 30 tools to support teens and their families, including tools that allow parents to decide when, and for how long, their teens use Instagram, age verification technology, automatically setting accounts belonging to those under 16 to private when they join Instagram, and sending notifications encouraging teens to take regular breaks.”
Davis added that Meta also invests “in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us. These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.”
Snap Inc., developer of Snapchat, released a statement saying “nothing is more important to us than the wellbeing of our community.” It noted its efforts to curate content and use human moderation to review material before it reaches a wide audience.
“We also work closely with leading mental health organizations to provide in-app tools for Snapchatters and resources to help support both themselves and their friends,” the statement said.