Social Media and Adolescent Health (2024)
274 pages | 6 x 9 | Paperback | ISBN 978-0-309-71316-0 | DOI 10.17226/27396
Sandro Galea, Gillian J. Buckley, and Alexis Wojtowicz, Editors; Committee on the Impact of Social Media on Adolescent Health; Board on Population Health and Public Health Practice; Health and Medicine Division; National Academies of Sciences, Engineering, and Medicine
Design Features
groups of experts that are neither fully independent of the platform nor
fully open about their process (Douek, 2019).
A general lack of transparency regarding social media operations has
bred public distrust of the platforms and the companies that run them.
Figure 5-1 shows results of a 2021 survey conducted by researchers at
George Mason University and The Washington Post indicating widespread
mistrust of the platforms. Transparency is a remedy for distrust, as it pro-
vides some assurance that the platform is conforming to public values and
expectations (MacCarthy, 2021).
Some of the companies’ reluctance to share information is well
founded. Platform algorithms are proprietary, which can make obliging
companies to share them seem unfair and anticompetitive. Social media plat-
forms also hold a great deal of information about ordinary people that
could, in the wrong hands, be used for surveillance or blackmail (Brad-
shaw and Barrett, 2022). Therefore, questions of data access and sharing
can be especially fraught. Not all researchers can trust that their work will
be free from government interference, nor can civil society organizations
always assume that their independence will be respected.
The need for more accountability and openness in social media has
attracted the attention of Congress.

FIGURE 5-1 Response to the question, “How much do you trust each of the following companies or services to responsibly handle your personal information and data on your internet activity?” Response options: trust a great deal/a good amount; trust not much/at all; no opinion.
SOURCE: Kelly and Guskin, 2021.

Dozens of pieces of legislation in the last two sessions alone have taken aim at advertising transparency, data privacy, protections for minors, oversight of mobile apps, targeted marketing, and other aspects of platform operations.2 There is clearly momentum and political will for a system of better oversight. In response to this
momentum, the social media companies, including Meta and Alphabet,
have come to accept data privacy legislation (Kang, 2018). A prompt,
coordinated effort to develop technical standards for platform operations,
transparency, and data use would be a meaningful step toward a better,
global system for platform accountability.
2 A Congress.gov search of the 117th and 118th Congresses for legislation about online advertising, social networks, social media, online privacy, or online data indicated more than 40 pieces of immediately relevant legislation, with many hundreds more tangentially relevant.
on request (user-level measure)

Measure: the public content moderation policy and the number of content moderators. Reporting: reports on material taken down and the proportion of moderation decisions appealed.

Measure: the amount of resources dedicated to ensuring advertising algorithms do not expose adolescents to harmful content. Reporting: reports on the amount of harmful content served to adolescents through ads.

Network-based health measures
Goal: to track the extent to which social connection on the platform is positive and the extent to which it is negative.
Measure: the amount and type of resources dedicated to discerning network quality.
Reporting: reports on the fraction of user connections that promote social connectedness.

Privacy and security
Goal: to better align privacy and security settings with user preferences; to allow users to state privacy preferences once.
Measure: privacy setting portability.
Reporting: reports on privacy and security that measure how users’ understanding of the
Example: the UK open banking initiative, wherein nine major banks developed an industry standard for customers to

Data use
Goal: to clarify what types of data algorithms can use.
Measure: predictive models to identify young people in mental health crisis.
Reporting: proxy indicators such as the proportion of young people in suspected mental health crisis seeing ads about support services.

Measure: a public database of advertising targeting criteria. Reporting: advertising algorithms audited and audit reports shared with FTC.

Operational transparency
Goal: to improve understanding of how the platforms work.
Measure: reports on the actions taken to make the platform’s operations more transparent.
Reporting: recommendation algorithms audited and audit reports shared with FTC.
fact that industry stakeholders are often in the best position to set out
operational policies underlies the prior recommendation’s specification
that industry should be part of the ISO technical work group. There is also
reason to believe that companies will have an interest in monitoring one
another against the standards the ISO group develops. For this reason,
the social media companies should formally adopt these standards and
reference them in their public documents.
The companies would do well to adopt such standards to forestall
more sweeping regulatory action (Cusumano et al., 2021). The UK’s pro-
posed Online Safety Bill, for example, put significant demands on plat-
forms, even specifying the type of content moderation technology they
must use (Church and Pehlivan, 2023). Such restrictions can be impractical, and they divert time and resources that platforms could otherwise devote to product improvement or to developing better tools for content moderation.
A public statement that a platform will comply with all the measures included in the standard, together with a commitment to the standard in its terms of service, would be a meaningful step toward an enforceable legal structure for social media. Section 5 of the Federal Trade Commission Act gives the
FTC the authority to penalize firms that engage in unfair or deceptive
business practices, although this provision includes an exception enacted
in 1980 prohibiting the FTC from using its unfairness authority to pro-
mulgate rules governing children’s advertising.3 Using this authority,
the agency has brought enforcement actions against companies that have
failed to honor commitments made in their privacy policies and other
similar documents (FTC, 2023).
Failure to honor basic cybersecurity standards may also represent an
unfair business practice (FTC, 2021). Unlike deception, which is judged
against a firm’s affirmative statement, unfairness can be seen as a more
general failure to meet society’s expectations, including standards of
industry practice (Pertschuk et al., 1980). Though applied more spar-
ingly, unfairness can be the basis for enforcement actions even against
egregious conduct by companies that have not actively incorporated those
standards into their terms of use (FTC, 2003).
The FTC’s ability to characterize business practices as unfair depends
on the agency giving firms sufficient notice of what is necessary to meet
4 Federal Trade Commission v. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015).
5 Federal Trade Commission, “16 CFR Chapter I: Trade Regulation Rule on Commercial Surveillance and Data Security,” Federal Register 87, no. 202 (October 20, 2022): 63738.
REFERENCES
Allyn, B. 2021. Here are 4 key points from the Facebook whistleblower’s testimony on Capitol
Hill. https://www.npr.org/2021/10/05/1043377310/facebook-whistleblower-frances-
haugen-congress (accessed September 18, 2023).
Bradshaw, S., and B. Barrett. 2022. Civil society organizations’ data, access, and tooling
needs for social media research. https://informationenvironment.org/wp-content/
uploads/2022/09/RP5-Civil-Society-Organizations-Data-Access-and-Tooling-Needs-
for-Social-Media-Research.pdf (accessed September 18, 2023).
Brown, I. 2022. The UK’s Midata and Open Banking Programmes: A case study in data
portability and interoperability requirements. Technology and Regulation 2022:113-123.
Brown, N. 2021. Regulatory Goldilocks: Finding the just and right fit for content moderation on social platforms. Texas A&M Law Review 8(3).
https://scholarship.law.tamu.edu/cgi/viewcontent.cgi?article=1219&context=lawreview
(accessed September 18, 2023).
Center for Social Media Responsibility. 2022. Iffy Quotient. University of Michigan, School
of Information. https://csmr.umich.edu/projects/iffy-quotient (accessed February 23,
2024).
Church, P., and C. N. Pehlivan. 2023. The Digital Services Act (DSA): A new era for online
harms and intermediary liability. Global Privacy Law Review 4(1):53-59.
Collinson, J., and J. Persson. 2022. What does the ‘best interests of the child’ mean for protect-
ing children’s digital rights? A narrative literature review in the context of the ICO’s
age appropriate design code. Communications Law 27(3):132-148.
Cranor, L. F. 2003. P3P: Making privacy policies more useful. IEEE Security & Privacy (Nov/
Dec):50-55. https://users.ece.cmu.edu/~adrian/630-f05/readings/cranor-p2p.pdf (ac-
cessed September 18, 2023).
Cusumano, M. A., A. Gawer, and D. B. Yoffie. 2021. Social media companies should self-regulate.
Now. Harvard Business Review. https://hbr.org/2021/01/social-media-companies-should-
self-regulate-now (accessed December 31, 2023).
Daily Business Review. 2015. FTC consent decrees are best guide to cybersecurity policies.
https://www.bsfllp.com/news-events/ftc-consent-decrees-are-best-guide-to-cyber
security-policies.html (accessed September 18, 2023).
Douek, E. 2019. Verified accountability: Self-regulation of content moderation as an answer
to the special problems of speech regulation. Hoover Working Group on National Secu-
rity, Technology, and Law. https://www.hoover.org/sites/default/files/research/docs/
douek_verified_accountability_aegisnstl1903_webreadypdf.pdf (accessed September
18, 2023).
Farthing, R., R. Abbas, K. Michael, and G. Smith-Nunes. 2021. Age appropriate digital
services for young people: Major reforms. IEEE Consumer Electronics Magazine (July/
August):40-48.
Franqueira, V., J. Annor, and O. Kafali. 2022. Age appropriate design: Assessment of TikTok,
Twitch, and YouTube Kids. https://doi.org/10.48550/arXiv.2208.0263.
FTC (Federal Trade Commission). 2003. The FTC’s use of unfairness authority: Its rise, fall, and
resurrection. https://www.ftc.gov/news-events/news/speeches/ftcs-use-unfairness-
authority-its-rise-fall-resurrection (accessed September 18, 2023).
FTC. 2021. Federal Trade Commission 2020 privacy and data security update. https://www.ftc.
gov/system/files/documents/reports/federal-trade-commission-2020-privacy-data-
security-update/20210524_privacy_and_data_security_annual_update.pdf (accessed
July 12, 2023).
FTC. 2023. Privacy and security enforcement. https://www.ftc.gov/news-events/topics/
protecting-consumer-privacy-security/privacy-security-enforcement (accessed July
12, 2023).
Gebru, T., J. Morgenstern, B. Vecchione, J. W. Vaughan, H. Wallach, H. Daumé III, and K.
Crawford. 2021. Datasheets for datasets. Communications of the ACM 64(12):86-92.
Goldman, E. 2022. California legislators seek to burn down the internet—for the children. https://
www.techdirt.com/2022/06/29/california-legislators-seek-to-burn-down-the-internet-
for-the-children (accessed September 18, 2023).
Henderson, R., A. Migdal, and T. He. 2016. Note: Industry self-regulation sustaining the com-
mons in the 21st century? Harvard Business School Background Note 315-074, March
2015. (Revised March 2016.)
IEEE (Institute of Electrical and Electronics Engineers). 2021. IEEE standard for an age ap-
propriate digital services framework based on the 5Rights principles for children. https://doi.
org/10.1109/IEEESTD.2021.9627644.
Information Commissioner’s Office. 2023. Introduction to the children’s code. https://ico.org.uk/
for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-
code-guidance-and-resources (accessed September 18, 2023).
ISO (International Organization for Standardization). 2021. ISO strategy 2030. Geneva: International Organization for Standardization.
ISO. 2022. ISO/IEC 27000 family: Information security management. https://www.iso.org/
standard/iso-iec-27000-family (accessed June 1, 2023).
ISO. 2023. Standards. https://www.iso.org/standards.html (accessed October 17, 2023).
ISO. n.d. About us. https://www.iso.org/about-us.html (accessed July 12, 2023).
Kang, C. 2018. Tech industry pursues a federal privacy law, on its own terms. The New York
Times. https://www.nytimes.com/2018/08/26/technology/tech-industry-federal-
privacy-law.html (accessed September 18, 2023).
Kelly, H., and E. Guskin. 2021. Americans widely distrust Facebook, TikTok and Instagram
with their data, poll finds. The Washington Post. https://www.washingtonpost.com/
technology/2021/12/22/tech-trust-survey (accessed September 18, 2023).
Klonick, K. 2020. The Facebook Oversight Board: Creating an independent institution to
adjudicate online free expression. Yale Law Journal 129:2418-2499.
Knight First Amendment Institute. 2021. Researchers, NYU, Knight Institute condemn Facebook’s
effort to squelch independent research about misinformation. https://knightcolumbia.
org/content/researchers-nyu-knight-institute-condemn-facebooks-effort-to-squelch-
independent-research-about-misinformation (accessed September 18, 2023).
Krass, P. 2022. Transparency: The first step to fixing social media. https://ide.mit.edu/insights/
transparency-the-first-step-to-fixing-social-media (accessed May 31, 2023).
MacCarthy, M. 2021. How online platform transparency can improve content moderation and
algorithmic performance. The Brookings Institution. https://www.brookings.edu/
blog/techtank/2021/02/17/how-online-platform-transparency-can-improve-content-
moderation-and-algorithmic-performance (accessed September 18, 2023).
MacCarthy, M. 2022. Transparency recommendations for regulatory regimes of digital platforms.
Centre for International Governance Innovation. https://www.cigionline.org/
publications/transparency-recommendations-for-regulatory-regimes-of-digital-
platforms (accessed September 18, 2023).
Maroni, M. 2019. Some reflections on the announced Facebook Oversight Board. https://cmpf.eui.
eu/some-reflections-on-the-announced-facebook-oversight-board (accessed September
18, 2023).
Mitchell, M., S. Wu, A. Zaldivar, P. Barnes, L. Vasserman, B. Hutchinson, E. Spitzer, I. D. Raji,
and T. Gebru. 2019. Model Cards for model reporting. Paper presented at the Conference
on Fairness, Accountability, and Transparency, Atlanta, GA.
Napoli, P. M. 2018. What social media platforms can learn from audience measurement:
Lessons in the self-regulation of “black boxes.” Paper presented at 2018 Telecommuni-
cations Policy Research Conference, Washington, DC.
Panditharatne, M. 2022. Law requiring social media transparency would break new ground.
https://www.brennancenter.org/our-work/research-reports/law-requiring-social-
media-transparency-would-break-new-ground (accessed September 18, 2023).
Pertschuk, M., P. R. Dixon, D. A. Clanton, R. Pitofsky, and P. P. Bailey. 1980. FTC policy
statement on unfairness. December 17. https://www.ftc.gov/legal-library/browse/ftc-
policy-statement-unfairness (accessed July 12, 2023).
Rideout, V., A. Peebles, S. Mann, and M. B. Robb. 2021. The common sense census: Media use by
tweens and teens. Common Sense Media. https://www.commonsensemedia.org/sites/
default/files/research/report/8-18-census-integrated-report-final-web_0.pdf (accessed
September 18, 2023).
Rosch, J. T. 2011. Consent decrees: Is the public getting its money’s worth? https://www.ftc.gov/
sites/default/files/documents/public_statements/consent-decrees-public-getting-its-
moneys-worth/110407roschconsentdecrees.pdf (accessed September 18, 2023).
Shahriari, K., and M. Shahriari. 2017. IEEE standard review - ethically aligned design: A vi-
sion for prioritizing human wellbeing with artificial intelligence and autonomous sys-
tems. 2017 IEEE Canada International Humanitarian Technology Conference (IHTC):197–201.
Wong, R. Y., M. A. Madaio, and N. Merrill. 2023. Seeing like a toolkit: How toolkits envi-
sion the work of AI ethics. Proceedings of the ACM on Human–Computer Interaction
7(CSCW1):Article 145.