
This PDF is available at http://nap.nationalacademies.org/27396

Social Media and Adolescent Health (2024)

DETAILS
274 pages | 6 x 9 | PAPERBACK
ISBN 978-0-309-71316-0 | DOI 10.17226/27396

CONTRIBUTORS
Sandro Galea, Gillian J. Buckley, and Alexis Wojtowicz, Editors; Committee on the Impact of Social Media on Adolescent Health; Board on Population Health and Public Health Practice; Health and Medicine Division; National Academies of Sciences, Engineering, and Medicine

SUGGESTED CITATION
National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. https://doi.org/10.17226/27396.


Design Features

As Chapter 2 explained, social media algorithms influence a user’s experience of social media in ways that are both complicated and highly
variable. Young people’s experiences with social media can also be influ-
enced by exposure to violent or otherwise toxic content, by harassment, or
through introduction to bad actors, such as adults interested in grooming for sexual predation or incitement to political radicalization. The committee recognizes that perfect control over what users see is not a realistic or necessarily desirable expectation for social media companies. But there
are provisions that can be incorporated into the design of apps, games,
and websites that limit the personal information companies collect, the
types of content available, and the prompts to extend time on a platform.
There will always be a place for a knowledgeable consumer to make
informed decisions about risks faced online. In the same way health lit-
eracy can allow patients to have more knowledgeable, better prepared
conversations with their health providers, media literacy, a topic dis-
cussed in detail in the next chapter, can allow for a more informed under-
standing about decisions made online. At the same time, the complexity
and pace of the online environment far exceed what adolescents—or any
layperson—could be reasonably expected to understand. This chapter
recommends steps at the level of platform design that would help tip the
balance of transparency to the users who support the platforms and the
government agencies that monitor the fairness of their operations.


AGE-APPROPRIATE DESIGN CODE


The central mission of age-appropriate design is to make technology
safer for young people through “a set of processes for digital services
when end users are children [that] aids in the tailoring of the services
that are provided” (IEEE, 2021, p.11). Age-appropriate design can extend
to the way vendors collect and use information about minors and how
schools promote educational technology to students. It includes enhanced
privacy protections, whether in products specifically designed for minors
(e.g., children’s programming on YouTube) or products they are likely
to access (e.g., search engines). The rights and developmental needs of
children are central to the determination of the age appropriateness of a
product or platform.
Creating or modifying a product to meet child-friendly design standards starts with a full review of how the product’s features may influence children and a plan for how to mitigate those risks and test new
changes. The next steps, undertaken simultaneously, involve auditing
the features, identifying risks, and mitigating them. Steps to minimize
targeted advertising to children, for example, begin with an overview
of corporate policies on data privacy and the shareholders’ views on the
matter (IEEE, 2021).
A growing interest in designing digital technology for children led the Institute of Electrical and Electronics Engineers (IEEE), an international association for electronic and electrical engineering, and the 5Rights Foundation to release a standard for age-appropriate design in 2021 (Shahriari and Shahriari, 2017). This design puts the burden of establishing users’ age on the producer of the technology. Knowledge of users’
age in turn allows companies to present terms of use that reflect adoles-
cents’ progressively growing capacity for understanding and independent
decision making (IEEE, 2021).
Age-appropriate design emphasizes the protection of young people’s
online privacy. The code requires platforms to collect only the necessary
information and use it in the way that the child (or, in some cases, the
responsible adult) had agreed to and not for commerce. It also discour-
ages the persuasive design features (i.e., features intended to extend the
time spent on a platform such as push notifications and tones when new
content is posted) that extend use, especially at night, and promotes a
high standard for content moderation. The standard also stipulates that
it is the technology developer’s responsibility to reduce the automated
recommendation of violent, offensive, or harmful content and misinfor-
mation (IEEE, 2021).
For product developers and vendors, age-appropriate design may
seem to impose burdensome restrictions, especially if they work in different jurisdictions with varying levels of relevant legal or regulatory controls. In such a situation, the IEEE guidance encourages full review of
all applicable laws and regulations and, when in doubt as to the standard
required, to proceed with the service that more conservatively reflects the
best interests of the child (IEEE, 2021). For example, if a user declines to
enter a birthdate or if age cannot be verified, the technology developer
should not offer nudges to stay longer on the platform or push notifica-
tions at night.
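The conservative-default principle described here is easy to state in code. The following is a minimal sketch only, assuming a hypothetical settings layer; the class and function names are invented for illustration and are not taken from the IEEE standard or any real platform API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EngagementDefaults:
    allow_extended_use_nudges: bool  # autoplay prompts, streaks, "keep scrolling" cues
    allow_push_at_night: bool        # push notifications during late-night hours

def defaults_for(verified_age: Optional[int]) -> EngagementDefaults:
    """Treat a declined birthdate or unverifiable age the same as a minor."""
    if verified_age is None or verified_age < 18:
        # Most protective profile: no nudges to stay longer, no nighttime pushes.
        return EngagementDefaults(allow_extended_use_nudges=False,
                                  allow_push_at_night=False)
    # Verified adults get the platform's ordinary defaults.
    return EngagementDefaults(allow_extended_use_nudges=True,
                              allow_push_at_night=True)

print(defaults_for(None))  # the protective profile, because age is unknown
```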
The California Age-Appropriate Design Code Act,1 like similar legis-
lation in the UK, is primarily concerned with young people’s data privacy
(Information Commissioner’s Office, 2023). The IEEE standard goes slightly
further, including guidance on limiting features that encourage extended
use of platforms. But even the IEEE standard offers few specifics on content moderation: its guidance encourages companies to make investments in moderation proportionate to risk, to make the terms of content moderation clear, and to give parents and children a means of redress (IEEE, 2021).
Age-appropriate design guidance can be technically vague and hard
to enforce, and assessment of compliance has been described as subjective
(Farthing et al., 2021; Franqueira et al., 2022; Goldman, 2022). Whether
other states follow the age-appropriate design code lead set by California
remains to be seen, and even if they do, critics of age-appropriate design
have seen it as infantilizing of children, especially older adolescents, as
the emphasis on acting in the best interest of the child presupposes that
the child is incapable of discerning what their best interests may be (Col-
linson and Persson, 2022). What is more, some of the most serious risks
to the mental and physical health of young people come from overuse
and algorithms that present unhealthy content. These are not necessarily
problems that age-appropriate design code aims to solve.
The age-appropriate design movement put concrete parameters on
what had been an abstract discussion about children’s privacy. Its empha-
sis on both the inputs to and outputs of a functional privacy system gives
researchers and companies a guideline against which to measure the data
collection risks that children encounter online. Yet threats to the mental
and physical health of young people are often traced to failures of content
moderation, algorithms that promote toxic content, and overuse. Social
media platforms would benefit from a similar standard to guide assess-
ment of how their algorithms influence well-being.

1 California Civil Code §§ 1798.99.28–1798.99.40.


GREATER TRANSPARENCY AND ACCOUNTABILITY


Social media is an important source of entertainment and connection for many people, especially adolescents. Given the importance these platforms have in people’s lives, there is growing momentum for more openness in and oversight of their operation. Much of the public outrage elicited by Frances Haugen’s revelations stemmed from the perception of secrecy, the idea that harms known to executives inside the company were kept from the public (Allyn, 2021). Giving researchers and civil society watchdogs access to social media data and the ability to review platform algorithms would allow for a better understanding of how social media platforms influence young people, for better or worse.
It is difficult to determine what effect social media has on well-being
or the extent to which companies are doing due diligence to protect young
people from the more habit-forming affordances of their platforms, as
companies retain extremely tight control on their data and algorithms
(Krass, 2022). Publicly available data can support some research. The
University of Michigan’s Iffy Quotient, for example, aims to monitor the
extent to which Facebook and Twitter amplify disinformation (Center for
Social Media Responsibility, 2022). But even this is vulnerable. In 2021
Facebook sued researchers attempting to study political advertising—
using publicly available information—because the data scraping tools
they used violated the platforms’ terms of service, a topic discussed more
in Chapter 8 (Knight First Amendment Institute, 2021; Panditharatne,
2022). The tools Facebook authorizes for researchers, including a search-
able advertisement library called Ad Library API and a network data
analysis tool called CrowdTangle, provide “tightly circumscribed and
spotty information” (Panditharatne, 2022). Civil society groups requesting
access to social media data have reported an arbitrary lottery-like process,
highly dependent on personal relationships and subject to favoritism
(Bradshaw and Barrett, 2022).
In the same way, there can be a seemingly arbitrary approach to the
enforcement of content moderation guidelines. Participation in social
media has become an important part of modern life. When a platform’s
decisions seem unfair, aggrieved users may take the position that they are
a victim of corporate overreach, being denied access to a public venue,
in a manner similar to being disallowed entry to a movie theater or store
(MacCarthy, 2021). While such concerns are reasonable, there is also deep
public ambivalence regarding outside interference, especially from the
government, in determining which public statements should be amplified
and which ones should be silenced. In the balancing of trade-offs, a sys-
tem for content moderation has emerged that relies on oversight boards,


groups of experts that are neither fully independent of the platform nor
fully open about their process (Douek, 2019).
A general lack of transparency regarding social media operations has
bred public distrust of the platforms and the companies that run them.
Figure 5-1 shows results of a 2021 survey conducted by researchers at
George Mason University and The Washington Post indicating widespread
mistrust of the platforms. Transparency is a remedy for distrust, as it pro-
vides some assurance that the platform is conforming to public values and
expectations (MacCarthy, 2021).
Some of the companies’ reluctance to share information is well
founded. Platform algorithms are proprietary, which can make obliging
companies to share seem unfair and uncompetitive. Social media plat-
forms also hold a great deal of information about ordinary people that
could, in the wrong hands, be used for surveillance or blackmail (Brad-
shaw and Barrett, 2022). Therefore, questions of data access and sharing
can be especially fraught. Not all researchers can trust that their work will
be free from government interference, nor can civil society organizations
always assume that their independence will be respected.
FIGURE 5-1 Response to the question, “How much do you trust each of the following companies or services to responsibly handle your personal information and data on your internet activity?” Values are the percentage of respondents answering “trust not much/at all,” “trust a great deal/a good amount,” and “no opinion,” respectively.

Facebook: 72% / 20% / 8%
TikTok: 63% / 12% / 25%
Instagram: 60% / 19% / 20%
WhatsApp: 53% / 15% / 32%
YouTube: 53% / 35% / 12%
Google: 47% / 48% / 4%
Microsoft: 42% / 43% / 15%
Apple: 40% / 44% / 16%
Amazon: 40% / 53% / 7%

SOURCE: Kelly and Guskin, 2021.

The need for more accountability and openness in social media has attracted the attention of Congress. Dozens of pieces of legislation in the


last two sessions alone have taken aim at advertising transparency, data
privacy, protections for minors, oversight of mobile apps, targeted mar-
keting, and other aspects of platform operations.2 There is clearly momentum and political will for better oversight. In response, the social media companies, including Meta and Alphabet, have come to accept data privacy legislation (Kang, 2018). A prompt, coordinated effort to develop technical standards for platform operations, transparency, and data use would be a meaningful step toward a better, global system for platform accountability.

2 A Congress.gov search of the 117th and 118th Congresses for legislation about online advertising, social networks, social media, online privacy, or online data indicated more than 40 pieces of immediately relevant legislation, with many hundreds more tangentially relevant.

Recommendation 5-1: The International Organization for Standardization should convene an ongoing technical working group
including industry representatives, civil society, and academic
stakeholders to develop standards for social media platform design,
transparency, and data use.

Social media is a widely diverse set of tools, used differently by different people. The extent to which particular platforms are committed to
maximizing the beneficial uses and curtailing the harmful ones is not clear
to anyone. The development of uniform standards is an essential precur-
sor to any transparent reporting, benchmarking of progress, or regulation.
Without such standards outside auditors cannot judge the effectiveness of
content moderation or the role of a platform’s advertising or recommen-
dation algorithms in promoting harmful content. Harmonized standards
are also the basis of comprehensible public disclosures, such as those gov-
erning terms and conditions of using an online service, or measures taken
to counter harassment or hate speech. A standard format for data at the application programming interface also greatly eases the study of confidential algorithms (MacCarthy, 2022).
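What a harmonized, machine-readable record at such an interface might look like is sketched below. The schema and field names are hypothetical, invented for this example rather than drawn from MacCarthy (2022) or any existing standard; the point is only that a shared format would let outside auditors consume disclosures from different platforms with the same tooling.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ModerationActionRecord:
    # Hypothetical shared schema for one content-moderation action.
    platform: str
    action: str             # e.g., "remove", "demote", "label"
    policy_category: str    # e.g., "harassment", "self-harm", "misinformation"
    surface: str            # e.g., "feed", "ads", "search"
    automated: bool         # decided by an algorithm rather than a human reviewer
    appealed: bool
    reversed_on_appeal: bool
    timestamp_utc: str      # ISO 8601

record = ModerationActionRecord(
    platform="example-platform",
    action="demote",
    policy_category="harassment",
    surface="feed",
    automated=True,
    appealed=True,
    reversed_on_appeal=False,
    timestamp_utc="2024-01-15T08:30:00Z",
)

# An auditor or regulator could parse this the same way regardless of
# which platform produced it.
print(json.dumps(asdict(record), indent=2))
```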
The International Organization for Standardization (known as ISO)
is an international, nongovernmental organization with a long history of successfully setting and maintaining international standards. Its members include the national standard-setting bodies of 168 countries, and its process is described as “voluntary, consensus-based and market relevant”
(ISO, n.d.). Given the worldwide reach of social media platforms and the
companies’ need to operate across borders and cultures, such interna-
tional support and buy-in are crucial. ISO also has long experience and
well-defined processes for updating standards to ensure their continued relevance, something that will be necessary given the pace of change in this field (ISO, 2021). ISO has considerable experience in similarly thorny
and technical topics. The ISO/IEC 27000 family of standards, for example,
provides a model for information security management and data protec-
tion (ISO, 2022). The committee envisions a similarly inclusive process
guiding the development of platform standards for social media.
The recommended standard setting process would be iterative and
dynamic, given the rapid pace of change in social media technology and
in society’s perception of threats. The ISO process is also designed to
include the full range of stakeholders needed to comment on the manage-
ment of technical processes, including “people such as manufacturers,
sellers, buyers, customers, trade associations, users or regulators” (ISO,
2023).

The Types of Standards Needed


The standards for social media operations and platform design would,
like the IEEE standard for age-appropriate design, articulate both inputs
and outputs of a functional system. Inputs refer to actions taken by the
platform, while outputs are partially driven by the platform but are also
shaped by the behaviors of users. Inputs can include processes for content
moderation or data use, content of privacy agreements, and mandatory
disclosures to users, all reflective of decisions largely within the plat-
form’s control. Outputs could include platform health measures, such as
the amount of toxicity on a platform. A platform’s content
moderation and take-down policies will influence measures of toxicity,
but the platform cannot fully control something driven by the behavior of
its users. The distinction is key, as adherence to input standards requires
little if any margin for reaction time on the part of the platform.
At the platform level, measures should move beyond simple aggre-
gates and provide informative percentile summaries. The reported per-
centiles would aim to capture the harm experienced by those most vulner-
able, such as the amount of cyberbullying experienced by the most bullied
decile of adolescents. Since the association between social media use and
health outcomes varies across groups, standards should allow quantifica-
tion at the group or community level. Finally, aligned with algorithmic
transparency standards, on request, platforms should provide summaries
at the user level.
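A small worked example of this kind of percentile reporting appears below. The incident counts, community labels, and function names are all invented for illustration; a real standard would define the incident taxonomy, reporting period, and percentile method precisely.

```python
import statistics
from collections import defaultdict

# Invented data: cyberbullying incidents experienced by each adolescent user
# in one reporting period, and a made-up community assignment for each user.
incidents_per_user = {
    "user_001": 0, "user_002": 1, "user_003": 0, "user_004": 7,
    "user_005": 2, "user_006": 0, "user_007": 14, "user_008": 1,
    "user_009": 3, "user_010": 0,
}
community_of = {u: ("group_a" if i % 2 == 0 else "group_b")
                for i, u in enumerate(incidents_per_user)}

counts = sorted(incidents_per_user.values())

def percentile(sorted_values, p):
    """Nearest-rank percentile of an already sorted list."""
    k = max(0, min(len(sorted_values) - 1, round(p / 100 * (len(sorted_values) - 1))))
    return sorted_values[k]

# Platform-level measure: harm experienced by the most-bullied users,
# not just the average user.
platform_summary = {
    "median_incidents": statistics.median(counts),
    "p90_incidents": percentile(counts, 90),  # the most-bullied decile
    "p99_incidents": percentile(counts, 99),
}

# Community-level measure: the same statistic within each subcommunity.
by_community = defaultdict(list)
for user, n in incidents_per_user.items():
    by_community[community_of[user]].append(n)
community_summary = {c: statistics.median(v) for c, v in by_community.items()}

# User-level measure: returned to an individual user on request.
def user_report(user_id):
    return {"user": user_id, "incidents": incidents_per_user.get(user_id, 0)}

print(platform_summary)
print(community_summary)
print(user_report("user_004"))
```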
To better illustrate how this recommendation would work, Table 5-1
gives examples of key measures for which the ISO working group would
develop standards together with examples of input and output measures
that could be tracked. As discussed in Chapter 2, platform algorithms
cover ranking, ad-targeting, and content moderation. As such, Table 5-1


gives examples of standards for each algorithmic type. As noted earlier, the standards should support comprehensive measurement, so Table 5-1 gives examples of concrete platform-, community-, and user-level outcomes. When
applicable, past work with significant overlap is also presented, although
the many empty cells in this column illustrate the importance of building
these standards and encouraging their adoption.
One important measure that companies should report is their efforts
to remediate youth mental health problems. This information, like certain
audit and systemic risk reports, should be available on request to the
Federal Trade Commission (FTC). Better clarity on and tracking of the standardized indicators would eventually allow for comparisons across platforms and over time, giving both the public and the FTC a better understanding of the risks these platforms pose.
It is important to note that the examples provided in Table 5-1 are
proposed for illustrative purposes, not as a definitive list of needs. Part of the value of a recurring convening via ISO is that the standards could develop in line with a growing body of scientific consensus on the ways social media influences adolescents. Consider, for example, the influence
of image sharing platforms on body image discussed in Chapter 4. Given
the strength of this association, the recommended standards may do well
to include measures related to the amount of such content seen on a given
platform.

ADOPTING THE STANDARDS


Critics of the previous recommendation may maintain that such steps
are not necessary as the social media industry already has relevant rules
in place. Self-regulation has long been relied on in the media: television,
movies, videogames, and music all make use of industry standards for
content rating (Napoli, 2018). Recent years have seen greater effort at
industry self- and third-party regulation of social media, exemplified by
Facebook’s Oversight Board (Klonick, 2020). This oversight board could
help protect users from unfair treatment (Maroni, 2019). At the same
time, there will always be a suspicion that the real goal of such a board,
or any effort at self-regulation, is to bolster the platform’s market position
or authority (Maroni, 2019). Social media platforms’ success depends on
engaging as many users as possible, something controversy and emotion
can do (Brown, 2021). Asking companies to moderate the more sensa-
tional voices on their platform could be asking them to act against their
business interests (Brown, 2021).
Skepticism of self-regulation aside, enacting a regulatory framework across jurisdictions on global companies is not always a legally or logistically viable option (Henderson et al., 2016).



TABLE 5-1 Operationalizing Standards for Social Media Operations, Transparency, and Data Use

Content-based health measures
Aim: To analyze the nature of the content with implications for users’ health and well-being.
- Input: the amount and type of resources dedicated to ensuring harmful content is identified and demoted. Outputs: reports on the amount of cyberbullying experienced by the 25%, 10%, 5%, and 1% most bullied users, tracked over time (platform-level measure); reports on the amount of cyberbullying found in a specific subcommunity, e.g., a Facebook group (community-level measure); reports on the amount of cyberbullying attacks experienced by a user, reported to that user on request (user-level measure).
- Input: the public content moderation policy, the number of content moderators. Output: reports on material taken down and the proportion of moderation decisions appealed.
- Input: the amount of resources dedicated to ensuring advertising algorithms do not expose adolescents to harmful content. Output: reports on the amount of harmful content served to adolescents through ads.

Network-based health measures
Aim: To track the extent to which social connection on the platform is positive and the extent to which it is negative.
- Input: the amount and type of resources dedicated to discerning network quality. Output: reports on the fraction of user connections that promote social connectedness.

Privacy and security
Aim: To better align privacy and security settings with user preferences; to allow users to state privacy preferences once and deploy them across apps and platforms.
- Input: privacy setting portability. Output: reports on privacy and security that measure how users’ understanding of the privacy and security policy evolves over time. Transferable work: the UK open banking initiative, wherein nine major banks developed an industry standard for customers to transfer their financial data (a).
- Input: privacy policies written in a standard machine-readable format that can be read automatically by a web browser, reducing burden on the user (see the sketch after this table). Output: reports on the number of users that benefit from the use of machine-readable privacy policies (users that port their privacy settings to use elsewhere). Transferable work: the Platform for Privacy Preferences (P3P) tool, intended to enable users to limit their exposure to websites with privacy policies that do not match their preferences (b).

Data use
Aim: To clarify what types of data algorithms can use.
- Input: predictive models to identify young people in mental health crisis. Output: proxy indicators such as the proportion of young people in suspected mental health crisis seeing ads about support services.
- Input: a public database of advertising targeting criteria. Output: advertising algorithms audited and audit reports shared with FTC.

Operational transparency
Aim: To improve understanding of how the platforms work.
- Input: reports on the actions taken to make the platform’s operations more transparent. Output: recommendation algorithms audited and audit reports shared with FTC.

NOTE: (a) Brown, 2022; (b) Cranor, 2003.
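As a rough illustration of the machine-readable privacy policy input listed in Table 5-1, the sketch below encodes a policy and a user’s stated preferences as simple key-value structures and checks one against the other. The fields are invented for this example; P3P itself defined an XML vocabulary, and any standard emerging from the ISO process would be far more detailed.

```python
# Hypothetical machine-readable policy; the schema is invented for illustration.
policy = {
    "platform": "example-platform",
    "data_collected": ["contact_info", "usage_activity", "approximate_location"],
    "used_for_advertising": False,
    "shared_with_third_parties": False,
    "retention_days": 90,
    "minimum_age": 13,
}

# A user (or a parent) states preferences once...
user_preferences = {
    "allow_advertising_use": False,
    "allow_third_party_sharing": False,
    "max_retention_days": 180,
}

def policy_matches(policy: dict, prefs: dict) -> bool:
    """True if the published policy is at least as protective as the user asks."""
    return (
        (not policy["used_for_advertising"] or prefs["allow_advertising_use"])
        and (not policy["shared_with_third_parties"] or prefs["allow_third_party_sharing"])
        and policy["retention_days"] <= prefs["max_retention_days"]
    )

# ...and a browser or app could check every service automatically.
print(policy_matches(policy, user_preferences))  # True for this example
```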
An acknowledgement of the fact that industry stakeholders are often in the best position to set out
operational policies underlies the prior recommendation’s specification
that industry should be part of the ISO technical work group. There is also
reason to believe that companies will have an interest in monitoring one
another against the standards the ISO group develops. For this reason,
the social media companies should formally adopt these standards and
reference them in their public documents.
The companies would do well to adopt such standards to forestall
more sweeping regulatory action (Cusumano et al., 2021). The UK’s pro-
posed Online Safety Bill, for example, put significant demands on plat-
forms, even specifying the type of content moderation technology they
must use (Church and Pehlivan, 2023). Such restrictions can be impractical and detract from the time and resources platforms could devote to product improvement or even to developing better tools for content moderation.

Recommendation 5-2: Social media providers should adopt the standards referenced in the previous recommendation as a matter
of policy and as specific provisions in their terms of service.

A public statement that platforms will comply with all the measures
included in the standard and a commitment to the standard in their terms of
service would be a meaningful step toward an enforceable legal structure
on social media. Section 5 of the Federal Trade Commission Act gives the
FTC the authority to penalize firms that engage in unfair or deceptive
business practices, although this provision includes an exception enacted
in 1980 prohibiting the FTC from using its unfairness authority to pro-
mulgate rules governing children’s advertising.3 Using this authority,
the agency has brought enforcement actions against companies that have
failed to honor commitments made in their privacy policies and other
similar documents (FTC, 2023).
Failure to honor basic cybersecurity standards may also represent an
unfair business practice (FTC, 2021). Unlike deception, which is judged
against a firm’s affirmative statement, unfairness can be seen as a more
general failure to meet society’s expectations, including standards of
industry practice (Pertschuk et al., 1980). Though applied more sparingly, unfairness can be the basis for enforcement actions against egregious conduct even by companies that have not actively incorporated those standards into their terms of use (FTC, 2003).
The FTC’s ability to characterize business practices as unfair depends
on the agency giving firms sufficient notice of what is necessary to meet their legal obligations.4 The agency’s proposed new rule on commercial surveillance and data security has identified the “extent [to which] com-
mercial surveillance practices or lax data security measures harm chil-
dren, including teenagers” as an area of particular concern.5 An industry
standard on data security and advertising could facilitate the agency’s
oversight of these practices.

3 15 U.S. Code § 57a.
4 Federal Trade Commission v. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015).
5 Federal Trade Commission, “16 CFR Chapter I: Trade Regulation Rule on Commercial Surveillance and Data Security,” Federal Register 87, No. 202 (October 20, 2022) 63738.
The creation of a standard would also support the FTC’s use of
consent decrees as a regulatory tool. The agency will negotiate consent
decrees with companies that fail to meet expected standards, as it has
done for data protection (Daily Business Review, 2015). Once a company
agrees to a consent decree, the terms of the decree determine its obliga-
tions to remediate, regardless of whether or not those terms are strictly
within the FTC’s authority (Rosch, 2011).
The creation of industry standards for social media would inform
the FTC’s governance by consent decree, even for social media providers that do not explicitly adopt the standard into their terms of service.
Nevertheless, it is the committee’s hope that the standards development
process described in Recommendation 5-1 would trigger a virtuous cycle
of compliance. International standards can be a marker of good business
practice and even a badge of pride, a dynamic that would be analogous
to companies seeking green building certification in the absence of any
legal obligation to do so. The normative pressure of industry standards
could serve as a signal to the public of a company’s sincere and meaning-
ful steps to mitigate the harms associated with its product.

USING THE STANDARDS


A similar process is underway in artificial intelligence (AI) and
machine learning. Ethical AI tool kits are designed to enable more open
communication among technology developers, researchers, policy mak-
ers, and civil society (Wong et al., 2023). Tools such as Model Cards, which
provide short explanations of how, and against what, machine learning tools are benchmarked, are a step toward transparency in AI (Mitchell
et al., 2019). Similarly, public documentation on the provenance of the
datasets used to calibrate machine learning models is gaining traction as
a way to mitigate the harms a biased model can cause (Gebru et al., 2021).
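For a sense of what such documentation contains, the sketch below lays out a Model Card-style summary as a simple data structure. The section names loosely paraphrase the categories proposed by Mitchell et al. (2019); the model and all of the values are invented for a hypothetical feed-ranking system.

```python
# Invented example of a Model Card-like summary for a hypothetical model.
model_card = {
    "model_details": {
        "name": "example-feed-ranker",        # hypothetical model
        "version": "0.1",
        "type": "gradient-boosted ranking model",
    },
    "intended_use": "Ordering posts in a feed for users aged 13 and older",
    "out_of_scope_uses": ["mental-health screening", "age verification"],
    "training_data": "2023 engagement logs (described in an accompanying datasheet)",
    "evaluation_data": "Held-out 2024 logs, stratified by age group and region",
    "metrics": ["ranking quality", "rate of policy-violating items shown"],
    "disaggregated_results": {
        "ages_13_15": {"violating_items_per_1k": 4.1},
        "ages_16_17": {"violating_items_per_1k": 3.2},
        "adults": {"violating_items_per_1k": 2.7},
    },
    "ethical_considerations": "Younger teens see more violating items; mitigation planned",
}

for section, content in model_card.items():
    print(section, ":", content)
```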
As part of the ethical AI movement, IEEE has set out standards and
guidelines to ensure that AI systems prioritize human well-being in
design (Shahriari and Shahriari, 2017). The standards developed from the implementation of Recommendation 5-1 could draw on these principles, evaluating the platform’s transparency about its policies and practices
and its accountability for data breaches or violations of user privacy.
The standards could evaluate whether the platform has age-verification
processes, data encryption, and robust privacy policies in place, along
with efforts to educate parents and other stakeholders on cyberbullying
and reporting and blocking mechanisms. The standards could shine a
light on the extent to which platforms are performing due diligence to
enforce their age minimums. In 2021, Common Sense Media found that
38 percent of children between ages 8 and 12 have used social media,
for example (Rideout et al., 2021). Standards could also clarify whether
a social media platform’s content is suitable for children and teens based on age-appropriate criteria and whether the design of the platform’s features and affordances for young people is developmentally informed or evidence based.
Practically speaking, such standards could form the basis of a rat-
ing system or a checklist assessment of items that enumerate respon-
sible design. Such a checklist could be used to create a library of ranked
social media platforms or apps wherein included apps have some level
of endorsement for children or teens. The library could even provide
clear language information to parents and guardians about the specific
purposes and affordances of each app—something particularly valuable
given the dynamic and changing landscape of new social media platforms.
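A checklist-based rating of this kind could be as simple as the sketch below. The items loosely restate criteria discussed in this chapter, while the weights and the example answers are invented for illustration and are not part of any proposed standard.

```python
# Invented checklist items and weights; a real standard would define both.
CHECKLIST = [
    ("verifies_user_age", 2),
    ("collects_only_necessary_data", 2),
    ("no_targeted_ads_to_minors", 2),
    ("persuasive_design_off_by_default_for_minors", 1),
    ("clear_reporting_and_blocking_tools", 1),
    ("publishes_moderation_transparency_reports", 1),
]

def rate(platform_answers):
    """Return (score, maximum) for one platform's yes/no checklist answers."""
    score = sum(weight for item, weight in CHECKLIST if platform_answers.get(item))
    maximum = sum(weight for _, weight in CHECKLIST)
    return score, maximum

# Hypothetical answers for one app being considered for a school's library.
example_app = {
    "verifies_user_age": True,
    "collects_only_necessary_data": True,
    "no_targeted_ads_to_minors": True,
    "clear_reporting_and_blocking_tools": True,
}

score, maximum = rate(example_app)
print(f"{score}/{maximum}")  # 7/9 for this hypothetical app
```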
The committee recognizes that greater transparency and accountabil-
ity in the design of social media do not necessarily prevent young people
from accessing inappropriate content or taking risks online. Many young
people are tech-savvy and can find ways to bypass age restrictions or
privacy settings. Nevertheless, an objective quality benchmark could be
invaluable to parents trying to determine which platforms could provide
the most positive experience for their children. Complying with certain
standards would be an important indicator to anyone in a position to
authorize a platform or app for personal or institutional use, as in a school
system. Some form of app benchmarking could help school districts better interpret the market for educational technology.
Social media operations are remarkably poorly understood, especially
for products so influential and widely used. Accessible and comparable
standards would be an aid to consumers who want a valid indicator of
various platforms’ commitment to data privacy, content moderation, and
other important aspects of the user experience. This important first step
toward product benchmarking could introduce greater transparency and
ultimately more fair competition into an opaque market.


REFERENCES
Allyn, B. 2021. Here are 4 key points from the Facebook whistleblower’s testimony on Capitol
Hill. https://www.npr.org/2021/10/05/1043377310/facebook-whistleblower-frances-
haugen-congress (accessed September 18, 2023).
Bradshaw, S., and B. Barrett. 2022. Civil society organizations’ data, access, and tooling
needs for social media research. https://informationenvironment.org/wp-content/
uploads/2022/09/RP5-Civil-Society-Organizations-Data-Access-and-Tooling-Needs-
for-Social-Media-Research.pdf (accessed September 18, 2023).
Brown, I. 2022. The UK’s Midata and Open Banking Programmes: A case study in data
portability and interoperability requirements. Technology and Regulation 2022:113-123.
Brown, N. 2021. Regulatory Goldilocks: Finding the just and right fit for content moderation on social platforms. Texas A&M Law Review 8(3).
https://scholarship.law.tamu.edu/cgi/viewcontent.cgi?article=1219&context=lawreview
(accessed September 18, 2023).
Center for Social Media Responsibility. 2022. Iffy Quotient. University of Michigan, School
of Information. https://csmr.umich.edu/projects/iffy-quotient (accessed February 23,
2024).
Church, P., and C. N. Pehlivan. 2023. The Digital Services Act (DSA): A new era for online
harms and intermediary liability. Global Privacy Law Review 4(1):53-59.
Collinson, J., and J. Persson. 2022. What does the ‘best interests of the child’ mean for protect-
ing children’s digital rights? A narrative literature review in the context of the ICO’s
age appropriate design code. Communications Law 27(3):132-148.
Cranor, L. F. 2003. P3P: Making privacy policies more useful. IEEE Security & Privacy (Nov/
Dec):50-55. https://users.ece.cmu.edu/~adrian/630-f05/readings/cranor-p2p.pdf (ac-
cessed September 18, 2023).
Cusumano, M. A., A. Gawer, and D. B. Yoffie. 2021. Social media companies should self-regulate.
Now. Harvard Business Review. https://hbr.org/2021/01/social-media-companies-should-
self-regulate-now (accessed December 31, 2023).
Daily Business Review. 2015. FTC consent decrees are best guide to cybersecurity policies.
https://www.bsfllp.com/news-events/ftc-consent-decrees-are-best-guide-to-cyber
security-policies.html (accessed September 18, 2023).
Douek, E. 2019. Verified accountability: Self-regulation of content moderation as an answer
to the special problems of speech regulation. Hoover Working Group on National Secu-
rity, Technology, and Law. https://www.hoover.org/sites/default/files/research/docs/
douek_verified_accountability_aegisnstl1903_webreadypdf.pdf (accessed September
18, 2023).
Farthing, R., R. Abbas, K. Michael, and G. Smith-Nunes. 2021. Age appropriate digital
services for young people: Major reforms. IEEE Consumer Electronics Magazine (July/
August):40-48.
Franqueira, V., J. Annor, and O. Kafali. 2022. Age appropriate design: Assessment of TikTok,
Twitch, and YouTube Kids. https://doi.org/10.48550/arXiv.2208.0263.
FTC (Federal Trade Commission). 2003. The FTC’s use of unfairness authority: Its rise, fall, and
resurrection. https://www.ftc.gov/news-events/news/speeches/ftcs-use-unfairness-
authority-its-rise-fall-resurrection (accessed September 18, 2023).
FTC. 2021. Federal Trade Commission 2020 privacy and data security update. https://www.ftc.
gov/system/files/documents/reports/federal-trade-commission-2020-privacy-data-
security-update/20210524_privacy_and_data_security_annual_update.pdf (accessed
July 12, 2023).
FTC. 2023. Privacy and security enforcement. https://www.ftc.gov/news-events/topics/
protecting-consumer-privacy-security/privacy-security-enforcement (accessed July
12, 2023).
Gebru, T., J. Morgenstern, B. Vecchione, J. W. Vaughan, H. Wallach, H. Daumé III, and K.
Crawford. 2021. Datasheets for datasets. Communications of the ACM 64(12):86–92.


Goldman, E. 2022. California legislators seek to burn down the internet—for the children. https://
www.techdirt.com/2022/06/29/california-legislators-seek-to-burn-down-the-internet-
for-the-children (accessed September 18, 2023).
Henderson, R., A. Migdal, and T. He. 2016. Note: Industry self-regulation sustaining the com-
mons in the 21st century? Harvard Business School Background Note 315-074, March
2015. (Revised March 2016.)
IEEE (Institute of Electrical and Electronics Engineers). 2021. IEEE standard for an age ap-
propriate digital services framework based on the 5Rights principles for children. https://doi.
org/10.1109/IEEESTD.2021.9627644.
Information Commissioner’s Office. 2023. Introduction to the children’s code. https://ico.org.uk/
for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-
code-guidance-and-resources (accessed September 18, 2023).
ISO (International Organization for Standardization). 2021. ISO strategy 2030. Geneva: International Organization for Standardization.
ISO. 2022. ISO/IEC 27000 family: Information security management. https://www.iso.org/
standard/iso-iec-27000-family (accessed June 1, 2023).
ISO. 2023. Standards. https://www.iso.org/standards.html (accessed October 17, 2023).
ISO. n.d. About us. https://www.iso.org/about-us.html (accessed July 12, 2023).
Kang, C. 2018. Tech industry pursues a federal privacy law, on its own terms. The New York
Times. https://www.nytimes.com/2018/08/26/technology/tech-industry-federal-
privacy-law.html (accessed September 18, 2023).
Kelly, H., and E. Guskin. 2021. Americans widely distrust Facebook, TikTok and Instagram
with their data, poll finds. The Washington Post. https://www.washingtonpost.com/
technology/2021/12/22/tech-trust-survey (accessed September 18, 2023).
Klonick, K. 2020. The Facebook Oversight Board: Creating an independent institution to
adjudicate online free expression. Yale Law Journal 129:2418-2499.
Knight First Amendment Institute. 2021. Researchers, NYU, Knight Institute condemn Facebook’s
effort to squelch independent research about misinformation. https://knightcolumbia.
org/content/researchers-nyu-knight-institute-condemn-facebooks-effort-to-squelch-
independent-research-about-misinformation (accessed September 18, 2023).
Krass, P. 2022. Transparency: The first step to fixing social media. https://ide.mit.edu/insights/
transparency-the-first-step-to-fixing-social-media (accessed May 31, 2023).
MacCarthy, M. 2021. How online platform transparency can improve content moderation and
algorithmic performance. The Brookings Institution. https://www.brookings.edu/
blog/techtank/2021/02/17/how-online-platform-transparency-can-improve-content-
moderation-and-algorithmic-performance (accessed September 18, 2023).
MacCarthy, M. 2022. Transparency recommendations for regulatory regimes of digital platforms.
Centre for International Governance Innovation. https://www.cigionline.org/
publications/transparency-recommendations-for-regulatory-regimes-of-digital-
platforms (accessed September 18, 2023).
Maroni, M. 2019. Some reflections on the announced Facebook Oversight Board. https://cmpf.eui.
eu/some-reflections-on-the-announced-facebook-oversight-board (accessed September
18, 2023).
Mitchell, M., S. Wu, A. Zaldivar, P. Barnes, L. Vasserman, B. Hutchinson, E. Spitzer, I. D. Raji,
and T. Gebru. 2019. Model Cards for model reporting. Paper presented at the Conference
on Fairness, Accountability, and Transparency, Atlanta, GA.
Napoli, P. M. 2018. What social media platforms can learn from audience measurement:
Lessons in the self-regulation of “black boxes.” Paper presented at 2018 Telecommuni-
cations Policy Research Conference, Washington, DC.
Panditharatne, M. 2022. Law requiring social media transparency would break new ground.
https://www.brennancenter.org/our-work/research-reports/law-requiring-social-
media-transparency-would-break-new-ground (accessed September 18, 2023).


Pertschuk, M., P. R. Dixon, D. A. Clanton, R. Pitofsky, and P. P. Bailey. 1980. FTC policy
statement on unfairness. December 17. https://www.ftc.gov/legal-library/browse/ftc-
policy-statement-unfairness (accessed July 12, 2023).
Rideout, V., A. Peebles, S. Mann, and M. B. Robb. 2021. The common sense census: Media use by
tweens and teens. Common Sense Media. https://www.commonsensemedia.org/sites/
default/files/research/report/8-18-census-integrated-report-final-web_0.pdf (accessed
September 18, 2023).
Rosch, J. T. 2011. Consent decrees: Is the public getting its money’s worth? https://www.ftc.gov/
sites/default/files/documents/public_statements/consent-decrees-public-getting-its-
moneys-worth/110407roschconsentdecrees.pdf (accessed September 18, 2023).
Shahriari, K., and M. Shahriari. 2017. IEEE standard review - ethically aligned design: A vi-
sion for prioritizing human wellbeing with artificial intelligence and autonomous sys-
tems. 2017 IEEE Canada International Humanitarian Technology Conference (IHTC):197–201.
Wong, R. Y., M. A. Madaio, and N. Merrill. 2023. Seeing like a toolkit: How toolkits envi-
sion the work of AI ethics. Proceedings of the ACM on Human–Computer Interaction
7(CSCW1):Article 145.
