Grieving parents urge self-protection as lawmakers struggle to rein in social media


Date: September 08, 2025

by Ty Tagami | Capitol Beat News Service

ATLANTA — Vincent LaBella did not see signs that his daughter was suffering until police obtained a warrant that allowed him to see the social media activity on her phone.

He had noticed that his young teen was glued to the device, but so were her friends. They would sit in the car together tapping out messages rather than talking.

Then, in early February, a day after Amaya LaBella hosted a small party at her family’s Buckhead home, she died by suicide.

Her Snapchat account told her parents why.

“Her whole friend group turned on her and there was a text chain one night that was going on for four or five hours,” Vincent LaBella said.


TikTok’s algorithm started feeding her sad songs and memorials of children who had died, he said. “And it just kept pushing her deeper and deeper in a hole.”

He called her phone “poison.”

LaBella was among several parents who testified at a recent legislative hearing about the impact of social media on children.

The Georgia General Assembly has already tried to rein in the platforms, passing a 2024 law that limited their access to children.

It had bipartisan support, but the industry sued and, in June, a federal judge in Atlanta issued a preliminary injunction against enforcement.

So lawmakers are looking for another way to restrain them.

They may succeed, but the corporations have power and influence, and are sure to fight back, said Ben Pargman, whose son, Manny, was 18 when he died by suicide in December.

It happened nine days after the college freshman had turned in an English paper about smartphone dependency leading to anxiety and depression, Pargman told lawmakers at that same hearing.

He encouraged them to impose limitations on the industry but also tempered expectations.

“I applaud you for this undertaking and what you’re about to do,” he said, adding, “You’re going to get pushback. The industry is well funded. They are well organized, and you will hear from them.”

Pargman said in an interview that he expects it will take years for regulations to take hold, much like the lengthy battles to restrict smoking and to require seatbelts and air bags in automobiles.

In the meantime, he advised parents to learn about the industry’s practices and to talk about it with their children.

Like many who testified at the first hearing of the state Senate study committee, Pargman urged parents to read the book “The Anxious Generation,” by Jonathan Haidt, who teaches ethical leadership at New York University’s business school. Haidt reports that the marriage of smartphones and social media around 2010 birthed a generation of anxious and depressed adolescents, as mental illness diagnoses for undergraduates skyrocketed.

Pargman’s son graduated from high school in Sandy Springs, so he gets invited to talk to schools there. He tells students they are not broken if they feel depressed. They should talk with friends, parents or teachers about it. And if a friend discloses suicidal feelings, violate that trust and report it to an adult who can intervene.

It is better to lose a friendship than a friend, he coaches them, warning that they are “crash test dummies” for social media platforms and device makers.

But the main responsibility lies with parents, he said. In this era, knowing the risks, they should tell their kids to come to them if they are having suicidal thoughts.

“We don’t have those conversations with our children,” he said. “And in the absence of that conversation and in that silence our kids don’t know how to handle it. And where do they go? They go to their phones. And once they got locked into social media — and now AI — they’re on a downward spiral whirlpool into the pits of hell.”

That is what happened to Alex Peiser, said his mother, Sharon Winkler. 

He was a happy kid, active in band, theater and his church group, she said, until a breakup with a girlfriend in 2014. He died a few days later.

Winkler had seen no hint of his spiral to suicide until she read the note he left behind.

He had gone online for support but found discouraging responses on Instagram. He also came across dark memes, such as images of corpses.

“He was sad because he had a breakup, so he would linger on these and the algorithm would say ‘oh good he wants more of this content’,” Winkler said in an interview after she testified at the legislative hearing. “So even though he didn’t type in ‘I want to see suicide content,’ he was getting fed it anyway just because he lingered on it.”

After three days of being “pummeled” with this he couldn’t take it anymore, she said. She told the lawmakers that the social media business model is to blame. Companies need to deliver users to advertisers, and their algorithms can zero in on what each person demonstrates they want to consume, she said.

Winkler said she learned from books such as “Subprime Attention Crisis,” by Tim Hwang of Georgetown University, who dissected the business model; and “Robin Hood Math: Take Control of the Algorithms That Run Your Life,” by Noah Giansiracusa, who teaches math at Bentley University.


Now, whenever her granddaughter by another son visits, she grabs the girl’s devices and tightens all the privacy settings on her apps.

Most parents probably do not know their way around these settings, though.

Schools do what they can to educate children and parents. That 2024 state law also required schools to teach students about the dangers of social media.

Samantha King, the director of technology and media services for the Savannah-Chatham Public School System, said the district is complying and offers such training for parents, as well.

So do some nonprofit organizations, she said, adding that it is not enough to address the problem.

“It’s not reaching the masses,” she said. “It’s only very small little pockets of people who take the initiative to learn more.”

She understands what parents go through. After her daughter changed schools, she and her husband decided to get her an iPhone. The girl is only 9, but King said her daughter wanted to stay in touch with friends.

So King used parental controls to block her daughter’s access to apps between 8:30 a.m. and 6 p.m., so the girl can only play on the device when her parents are watching. King also cranked up the privacy settings to limit what her daughter can access and who can see her on interactive games such as Roblox, an app that experts warned lawmakers about.

It takes time, but anyone can do it, King said. Open the “Tips” app in iOS and search for “child,” then dive in.

She knows the day will come when her daughter has full control of her life online, so she plays Roblox alongside her and coaches her through other platforms, hoping to instill some wisdom.

“If I can keep sheltering her, I can maintain that level of innocence,” King said. “Once we open them up to so much, we kind of take that away unintentionally.”

Pargman’s son was exposed to platforms and apps before the risks were as well known. Pargman said he advises kids to treat mental illness like one treats the flu. Do not ignore it; get help.

“Talk about it. It’s okay. It doesn’t mean you’re broken,” he said. “But you can be if you don’t address it.”

EDITOR’S NOTE — This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988. There is also an online chat at 988lifeline.org.
