“Social Media’s Accountability Without Alienating Teens”

Teen social media users are in for a year of frustrating surprises. That’s because both the U.S. Senate and social media platform founders are facing pressure to make 2024 a year of remediation. YouTube expanded parental controls back in 2021, while TikTok set a 60-minute daily screen time limit for teen users in 2023. Meta and Substack, however, are just beginning to heed the call to protect teen users from destructive content. The key issue is whether social media can increase accountability without alienating teens.

Although social media can be used by brands and retailers as a force for good, it’s taken a turn to the dark side. Teen behavior is anything but immune to outside influence and, as teen social media usage increases, disturbing trends are beginning to emerge. Let’s take a look at the factors leading governments and parents to call for remediation. We’ll also review the various ways that social media platforms are altering their policies, and how teens are reacting to those strategies.

Disturbing Effects for Teen Social Media Users

It isn’t an amazing era to be young. Modern teens have to worry about pandemics, global warming, and the disintegrating middle class. In the last five years, retail theft in America has increased by 94 percent. And while social media can be used to foster connections and community among affinity group members, we’re still learning about the negative implications of being constantly plugged in.

Many teens spend the lion’s share of their free time on social media: According to a recent Pew Research Center report, nearly half of teen social media users say they are on their preferred platforms “almost constantly.” And Gallup reports that teens spend an average of 4.8 hours on social media each day.

A research study of American teens aged 12 to 15 found that those who used social media for more than three hours each day had double the risk of depression and anxiety symptoms. And teen social media users are subject to the same algorithms as everyone else. “If a teen searches for any kind of mental health condition, such as depression or suicide, it’s going to feed them information about those things,” says Linda Mayes, MD, chair of the Yale Child Study Center.

But social media use goes beyond just boosting depression and anxiety among the general teen population; it also worsens a bevy of mental and emotional disorders. Think about its impact on vulnerable, body-conscious teens when algorithms show them pro-eating-disorder content, for instance.

Here’s a recent example: the viral (and recently banned) “leggings legs” challenge on TikTok. The trend had teens filming themselves in the mirror and verbally dissecting their perceived body imperfections. A macabre evolution of the once-viral thigh gap trend, #leggingslegs shows how social media use can exacerbate a wide variety of mental illnesses, body dysmorphia included.

Meta, Substack, and Venmo Rollbacks

Social media platforms have historically prioritized profit over the mental health of their users. There isn’t much that social media platforms haven’t been accused of, from influencing elections and empowering Nazis to causing depression and facilitating cult recruitment. In January of this year, a former Meta employee testified before the Senate that the company was aware of the “harassment and harm” that teens face on its platforms, but that Meta leaders chose not to act on this information.

Meta is now limiting the types of content that Facebook and Instagram users under 18 can see. Certain search terms, like #leggingslegs, will also be banned from Instagram. And rather than feeding teens whatever type of content they seek out, Meta will now redirect teen searches for certain topics, including suicide, self-harm, and eating disorders, to resources like the National Alliance on Mental Illness.

Substack, a publishing platform that has long been accused of enabling Nazi publishers, is cracking down, too. The platform recently removed five publishers for violating content rules. (However, CNN was quick to highlight that none of the nixed newsletters had paid subscribers or readership numbers above a hundred.) Substack has historically taken an “anything goes” approach to harmful content, so the company’s recent gesture looks like the last of the old guard folding.

Even Venmo is hopping aboard the bandwagon to monitor and restrict teen social media use. The company recently launched a Venmo Teen Account option that gives parents enhanced control over their teens’ access and spending.

The Difference Between Protection and Alienation

In the ongoing conflict between protecting our youth and protecting profit, profit historically wins the battles. But the impact of social media platforms on teen mental health is becoming undeniable. More undeniable (and unbelievable) is the claim that social media companies aren’t aware of the impact their platforms are having. Now that former Meta employees are testifying before the Senate about the harm faced by teens who use the platform, denial is no longer an option; accountability is a necessity.

With that said, it’s a thin line between protection and alienation. Guarding teens from harmful content is essential, but it’s equally essential not to stifle their voices or isolate them from beneficial online communities. Let’s be honest: teens aren’t quick to forgive when authority figures implement restrictions “for their own good.” Whether that authority is a parent, a mall security guard, or a social media platform is irrelevant. The reality is that platforms do face pushback when they roll out restrictions for teens.

Social media isn’t all bad. For every teen with body dysmorphia who’s exposed to pro-anorexia content, there’s an isolated gay kid in Arkansas connecting with a like-minded community. We’re living in a critical time when social media policy is being shaped from the ground up, which comes with both responsibility and opportunity. Social media platforms can use this time to take accountability for their roles in shaping teen mental health. Platform leaders can foster environments that empower rather than exploit by embracing transparency, responding quickly to data about negative use cases, creating community, and spreading the message of moderation. The data on social media’s impact on teen mental health is right in front of us: these platforms can be an impetus for a more unified world, or a tool to keep young people distracted and confused. The choice is ours.


About The Robin Report

The Robin Report provides insights and opinion on major topics in the retail apparel and related consumer product industries. It delivers provocative, unbiased analysis on retail, brands and consumer products, and covers industry-wide issues, trends and consumer behavior throughout the retail-related industries. TRR is delivered exclusively on TheRobinReport.com. Additionally, TRR produces executive briefings and industry events.

