Children and young people today turn primarily to social media to interact and express themselves. However, the use of these platforms has become addictive, and they appear to be deliberately designed that way. In the US, lawsuits are now being brought against major tech companies.
For years, we have taken measures to protect our youth from big industries like tobacco, alcohol, and pharma. However, social media platforms, such as those run by Meta (Facebook, Instagram, WhatsApp, Messenger) and TikTok, can be just as harmful. The risk stems mainly from addictive “feeds.”1 Ninety-seven percent of teenagers report being online daily.2 This use among teens has been linked to long-term developmental damage, such as poor sleep quality and mental health issues.3 Teens who spend more than three hours a day on social media are twice as likely to experience mental health problems. Among young girls, the link between mental distress and social media use is stronger than the link between poor mental health and binge drinking, sexual assault, obesity, or hard drug use.4 In addition, children’s activity is being tracked, and their data is shared and sold online.5
A Whistleblower Testifies
Against this backdrop, more than forty US states have filed a lawsuit against Meta. They accuse the social media company of harming young people’s mental health by getting them addicted and of deceiving the public about the safety of its platforms. Arturo Béjar, a Meta whistleblower, testified at a Senate hearing on Nov. 7, 2023. He later commented to the Associated Press: “I can safely say that Meta’s executives knew the harm that teenagers were experiencing, that there were things that they could do that are very doable, and that they chose not to do them. . . . We can’t trust them with our children.”6 The indictment alleges that Meta’s platforms “profoundly altered the psychological and social realities of a generation of young Americans” and used “powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens.”7 Béjar further stated in an interview with the BBC: “I can speak firsthand about how easy it is to build a button and a counter. . . . I think the reason they don’t do it is because there is no transparency about the damage . . . .”8 The situation is dire. Big Tech’s voluntary self-policing has failed: it wasn’t until 2021 that Instagram accounts for users under 16 became private by default and older users could message teens only if the teens follow them. So much for the lawsuit. But can we steer this modern medium between the Scylla of anarchy and the Charybdis of surveillance?
The Responsibility of Adults Is Growing
In the US, the Kids Online Safety Act (KOSA) has now been introduced, and it was quickly linked to Béjar’s testimony. The bill stipulates that social media platforms must offer minors options to protect their data, disable addictive product features, and opt out of algorithmic recommendations. Platforms would be required to activate the strongest settings by default. Parents would also be given new controls to support their children and identify harmful behaviour, and a dedicated channel would be made available for reporting harm to children directly to the platform.
But doesn’t this play right into the hands of these companies, for whom data is everything? The bill would require social media platforms to prevent and mitigate harm to minors, such as the promotion of suicide, eating disorders, substance abuse, sexual exploitation, and activities that are illegal for minors (e.g., gambling and drinking alcohol). Social media platforms would have to undergo an annual independent audit assessing the risks to minors, their compliance with the legislation, and whether they are taking meaningful steps to prevent such harm. To this end, they would have to develop metrics that allow both the organization and outsiders to assess and track the harm experienced by users. Academic and nonprofit organizations would be given access to important data sets from social media platforms to promote research into harm to the safety and well-being of minors.
Beyond the legal level (making9 and repealing10 laws), there are initiatives everywhere to make the Internet safer for children, including international11 and national12 support groups, as well as parents’ initiatives that keep the Internet from entering children’s lives too early through external13 and internal14 resolutions.
It was the author Yuval Noah Harari who claimed that Big Tech is getting better and better at hacking our thoughts, feelings, and behaviour,15 effectively taking over the human soul. An essential part of being human is to gain ever more freedom in shaping one’s own fate. So it seems all the more fateful if we surrender that freedom to the whims of Big Tech and the gauntlet it makes us run.
Translation Joshua Kelberman
Photo Robin Worral. Image source: Unsplash
Footnotes
1. An addictive feed is a social media feature designed to keep users engaged and coming back again and again. An algorithm typically curates content based on past behaviour and interests and presents it in a visually appealing, easy-to-consume way, encouraging scrolling, clicking, and spending more time on the platform.
2. Emily A. Vogels and Risa Gelles-Watnick, “Teens and Social Media: Key Findings from Pew Research Center Surveys,” Pew Research Center.
3. “Jugendliche sind schädlichen Inhalten zu Depressionen und Suizid ausgesetzt” [Young people are exposed to harmful content about depression and suicide], Amnesty International Schweiz [Switzerland].
4. J. M. Twenge, J. Haidt, J. Lozano, and K. M. Cummins, “Specification curve analysis shows that social media use is linked to poor mental health, especially among girls,” Acta Psychologica 224, 103512 (2022).
5. For example, “Governments Harm Children’s Rights in Online Learning,” Human Rights Watch (2022).
6. Barbara Ortutay, “A Meta engineer saw his own child face harassment on Instagram. Now, he’s testifying before Congress,” Associated Press (Nov. 7, 2023).
7. The full indictment is available online.
8. Zoe Kleinman, “‘I blew the whistle on Meta, now I won’t work again’,” BBC News (Nov. 7, 2023).
9. For example, the Kids Online Safety Act (KOSA) in the US, or similar laws in many other countries.
10. For example, in October 2023, Sweden repealed its law requiring daycare children to use the Internet; cf. P. Sigg, “Schweden streicht Bildschirmzwang für Kleinkinder” [Sweden abolishes compulsory screen time for small children], Infosperber (Nov. 3, 2023).
11. For example, WHO, UNICEF, End Violence, WePROTECT.
12. For example, guidelines for parents from the health ministries of Germany, Switzerland, and Austria.
13. For example, the European Alliance of Initiatives for Applied Anthroposophy (ELIANT), “For the right to screen-free day care institutions, kindergartens and primary schools,” ALLIANCE ELIANT.
14. For example, parents, classes, and schools that decide not to give children a smartphone until the 8th grade, e.g., Wait Until 8th.
15. N. Thompson, “When Tech Knows You Better Than You Know Yourself,” Wired (Oct. 4, 2018).