Shetal Shah, MD, FAAP, is a neonatologist at Maria Fareri Children’s Hospital and the Immediate Past President of the NYS AAP-Chapter 2. He is the Chair of the Pediatric Policy Council and an Executive Committee Member of the AAP Section on Neonatal-Perinatal Medicine. This op-ed was published in the Tampa Bay Times on June 3, 2023.
At restaurants, grocery stores, even at stoplights – everywhere, children and teens are online, exploring social media, interactive video games and apps that do everything from tracking school buses to telling you what Taylor Swift had for breakfast. According to a report released last week by the surgeon general, 95% of teens use social media, a third of them “almost constantly.”
The nation’s top physician is confirming what pediatricians have known for years: Social media, as currently used, can be addictive. These virtually unregulated digital spaces, run by for-profit companies, can pose legitimate threats to the mental health of our teens and young children.
As a pediatrician, I often discuss a child’s digital diet during clinical visits, instructing parents to refrain from using a screen as an electronic babysitter, a steep uphill battle when children as young as 2 use mobile devices daily. For teens, I ask which social media platforms they use, whom they follow and how they use their accounts.
For many adolescents, these apps allow them to forge real friendships. LGBTQ+ kids can connect with like-minded and supportive allies. Sexually active teens can access medically vetted information on topics they would feel uncomfortable disclosing to parents. These platforms allow isolated children to find communities. There are groups for children with cancer and those with parents with cancer. This is social media at its best.
Teens spend an average of 3½ hours on social media daily, and as the surgeon general’s report describes, excessive time on these apps negatively affects their health. Adolescents who spend more than 3 hours a day on social media are twice as likely to display symptoms of anxiety or depression, fueled by an unrelenting menu of 30-second videos strategically designed to hypnotize them.
And that is social media at its worst.
In the attention economy, time on a platform equals money, so technology companies create precise algorithms to keep you screen-bound. And they are very good at it. Unsurprisingly, one-third of 11- to 15-year-old girls feel addicted to social media. These algorithms are the reason videos you didn’t select start playing automatically within seconds, and why you’re served clickbait and intentionally provocative messages that prolong your time on the app. It’s why I’m served up messages saying the moon landing was faked, vaccines contain microchips and children don’t “deserve” health care, baiting me to pass hours responding to misinformation and crackpots.
Since every scroll, click, tap and comment is tracked, algorithms serve you content you didn’t even know you wanted. As a child of the ‘80s, I have a feed replete with nostalgic TV clips of Saturday-morning cartoons intent on holding me hostage. Once a platform discerns that a user is a body-conscious teenager, it offers up unrealistic posts featuring unhealthy exercise regimens, crash diets and extreme fasting, amplifying potentially harmful content. Forty-six percent of 13- to 17-year-olds say social media negatively impacts their body image. Pairing these manipulative design practices with an impressionable and still-developing adolescent brain – neurologically primed for social approval but without an understanding of long-term consequences – renders teens vulnerable to digital influence. These harms come at a time when our children’s mental health is so poor that the American Academy of Pediatrics has declared it a national state of emergency, and Montana has banned TikTok.
Even worse, no protections exist to prevent kids from being targeted with inappropriate advertising for things like vapes and alcohol. Children deserve a safe digital landscape that gives them the supportive environments they need without exploiting them for profit.
This spring, the Senate introduced the Kids Online Safety Act, a bipartisan bill to prevent the promotion of social media content harmful to child well-being. Companies would have to de-prioritize content related to self-harm, suicide and eating disorders. Apps would also have to stop suggested-content feeds and minimize the autoplay and endless-scrolling features that keep teens chained to their devices.
Another bill, the Children and Teens Online Privacy Protection Act, would update privacy laws and prevent the wholesale harvesting of information like search habits, which factors heavily into personalized advertising. It’s time for an update. The last law addressing online protections for children passed in 1998 – an eternity given the pace of technological innovation.
The bill also closes a crater-sized loophole in the quarter-century-old law, which allowed companies to exempt themselves from regulations by burying a disclaimer deep in their user agreements. In 2019, TikTok reached a $5.7 million settlement with the U.S. Federal Trade Commission over allegations it violated children’s privacy laws – pocket change for a company valued at almost $200 billion.
No pediatrician, parent, school board or state can undo the ever-present influence of social media. We need strong federal laws that create safe digital spaces. Social media at its best.
Now back to resisting the urge to click on “Best Soundtrack from ‘80s Movies.”