by Ty Tagami | Capitol Beat News Service
ATLANTA — A generation of children is reaping what the technology industry sowed when it merged smartphones with social media, and the results have spurred calls for regulation.
That is what motivated state senators on Wednesday to hold the first of a series of hearings about the impact of social media and artificial intelligence on children and what to do about it.
The bipartisan committee was authorized by Lt. Gov. Burt Jones, a Republican running for governor. He backed a new law that sought to limit social media companies’ access to children, but it is tied up in court.
So children remain targets for social media platforms and application designers, said several experts and lawmakers who spoke at the hearing at the Capitol.
The committee is jointly helmed by a Democrat and a Republican. The Democratic co-chair, Sen. Sally Harrell, D-Atlanta, opened the meeting with an anecdote about raising her own kids as “guinea pigs” in the new technology environment, and how they had become glued to their devices.
It disrupted family dinners and caused them to lose interest in going outdoors and reading books, she said. But the impact she saw was comparatively minor: three people testified Wednesday that their children committed suicide because of social media.
Sharon Winkler said her son left a note explaining why he had taken his own life: he had gone to an online platform for solace after a breakup with his girlfriend but was instead met with bullying.
“We have to hold these technology companies accountable,” Winkler said.
The question is how.
Georgia’s General Assembly has already tried. The Protecting Georgia’s Children on Social Media Act sailed into law with broad bipartisan support last year. It was a top priority for Jones, but tech companies sued and have stopped the measure for now.
A federal judge for the Northern District of Georgia ruled in June that the industry-backed plaintiff in the case was likely to prevail on claims that the law violates the First Amendment’s speech protections.
The plaintiff, a group called NetChoice, represents a who’s who of social media companies, including Instagram, YouTube and X. The group contended the law went too far by requiring everyone to prove their age and identity to access the platforms, forcing adults to turn over personal information.
The law required companies to “make commercially reasonable efforts” to verify that users are at least 16 or to obtain parental consent. It also regulated advertising to children.
Experts who testified Wednesday said rulings by other courts suggest a path forward, with several saying states have the authority to require companies to set privacy defaults at the strictest level rather than leaving them wide open from the start. Parents generally are too busy or lack the know-how to fiddle with such settings, they said.
Winkler was among many who testified that tech companies face no significant regulatory limits on application and platform designs that addict children, even toddlers. The lawmakers were given copies of the 2024 book, “The Anxious Generation,” by psychologist Jonathan Haidt, who dissects such designs and the resulting impact.
The committee will meet several more times through the rest of this year before formally recommending new legislation. Harrell said the next meeting, on Sept. 17, will offer a detailed look at age-verification laws and explore regulatory approaches to “manipulative” algorithms.
Her co-chair, Sen. Shawn Still, R-Johns Creek, said the issue was non-partisan, which makes some kind of legislation likely.
“It’s a societal issue,” he said, “and we’ve got to solve this together.”