by Ty Tagami | Capitol Beat News Service
ATLANTA — When Meta announced this week that it will implement a filter to make Instagram safer for children, the news drew reactions ranging from skeptical to cynical among Georgians who want to rein in the platform.
The social media company said Tuesday that it would implement a PG-13 filter similar to the ratings system used by the movie industry.
The filter will hide, or decline to recommend, posts with strong language, risky stunts and other content that could encourage potentially harmful behavior, such as posts showing marijuana paraphernalia, the company said.
Georgia lawmakers have been studying ways to restrain the industry ever since a bipartisan law they passed last year hit a wall in the courts.
A federal judge issued a preliminary injunction against enforcement of the Protecting Georgia’s Children on Social Media Act in June after an industry group that represents Meta and other tech companies sued.
Sharon Winkler, who has testified twice to a Senate committee that has been studying other strategies to protect children online, said Meta has failed to fix “known safety issues” for years.
Her son, Alex Peiser, was 17 when he died by suicide in 2017 after breaking up with his girlfriend. Winkler blames Instagram’s algorithm for sending him down a dark hole.
“I’m afraid that this latest announcement is another cynical attempt to lull parents and other concerned adults into a false sense of security about Instagram’s safety for teens,” Winkler said in an email.
Fairplay, an online safety advocacy group, cited research by whistleblower Arturo Béjar and said most of the safety tools Instagram has promoted for teens have not worked.
“Splashy press releases won’t keep kids safe, but real accountability and transparency will,” Josh Golin, the group’s executive director, said in a statement. He said Meta should stop lobbying against the Kids Online Safety Act, which is stalled in Congress and would mandate reporting on the effectiveness of safety measures.
Meta disputes the study cited by Fairplay; in a September report on the research by the news outlet Reuters, the company called it misleading.
Laura Ladefian, a certified professional counselor in Atlanta who works with children, said Instagram use is pervasive among her clients, starting around fifth grade and certainly by the end of middle school.
Ladefian testified to the Senate study committee about the addictive power of platforms.
A PG-13 filter would catch some harmful content, but teen users could still bypass it with "coded" language, and bullying carried out that way can expose other children to a higher risk of anxiety, depression and poor self-image, Ladefian said in an interview. Content is a concern, she said, but the algorithms that drive addictive behavior are the core problem.
“None of that is necessarily accounted for in a filter. It’s the way that peers use and misuse the platforms for social currency,” she said. “My impression is that this is an attempt to calm parents. ‘Hey, we’re doing the thing. We are hearing your concern and here’s how we’re protecting your kids.’ But that would be like telling parents ‘OK, we’ve put these filters on a slot machine, but your kids are still welcome to come.’”
New Mexico Attorney General Raúl Torrez called the Instagram PG-13 filter announcement “a reactive and half-baked PR strategy” that he said doesn’t address harmful algorithms and ineffective user age verification.
Torrez’s office has sued Meta for design choices that he said put children at risk of sexual abuse, human trafficking and mental health harms. The lawsuit, which goes to trial Feb. 2, cites Béjar’s research.
Georgia Attorney General Chris Carr is defending Georgia’s law in court. His office had no comment about the new PG-13 filter but said Carr will “push for commonsense measures” that empower parents and keep kids safe online.
Lt. Gov. Burt Jones, a Republican like Carr, championed that law. But Jones did not respond to a request for comment about the new filter policy. Neither did Sen. Shawn Still, R-Johns Creek, the co-chair of the study committee that is exploring an alternative to that law.
Sen. Sally Harrell, D-Atlanta, the other co-chair, reacted with skepticism to Meta’s announcement, saying by email that the tech industry has a habit of releasing eye-catching headlines and then falling short.
Harrell said her bipartisan committee is focused on taming the algorithms.
“Digital companies want their products to be addictive, because the longer kids stay on their phones, the more profit companies make,” she said. “Our committee is more interested in removing these addictive features than eliminating an occasional bad word.”