Five Online Safety Task Forces Agree (PFF - Adam Thierer)
Public policy debates about online child safety have raged since the earliest days of the
Internet. Concerns about underage access to objectionable content (specifically pornography)
drove early “Web 1.0” efforts to regulate the Internet, and that concern continues to be a topic of much
discussion today. With the rise of the more interactive “Web 2.0” environment, however,
objectionable contact and communications (harassment, cyberbullying, predation, etc.) have
become a more significant concern and are now driving many regulatory proposals. 1
Over the past decade, five major online safety task forces or blue ribbon commissions have
been convened to study these concerns, determine their severity, and consider what should be
done to address them. Two of these task forces were convened in the United States and issued
reports in 2000 and 2002. Another was commissioned by the British government in 2007 and
issued a major report in March 2008. Finally, two additional online safety task forces were
formed in the U.S. in 2008 and concluded their work, respectively, in January and July of 2009.
Altogether, these five task forces heard from hundreds of experts and produced thousands of
pages of testimony and reports on a wide variety of issues related to online child safety. While
each of these task forces had different origins and unique membership, what is striking about
them is the general unanimity of their conclusions. Among the common themes or
recommendations of these five task forces:
• Education is the primary solution to most online child safety concerns. These task
forces consistently stressed the importance of media literacy, awareness-building
efforts, public service announcements, targeted intervention techniques, and better
* Adam Thierer (athierer@pff.org) is a Senior Fellow with PFF and the Director of The Progress & Freedom Foundation (PFF) Center for Digital Media Freedom. He served as a member of the Internet Safety Technical Task Force and the “PointSmart.ClickSafe.” task forces highlighted in this report. He also served as an advisor to the “Byron Commission” task force discussed herein. Finally, he was recently appointed as a member of the new Online Safety and Technology Working Group (OSTWG), which was created by Congress to study these same issues. The views expressed here are his own, and are not necessarily the views of the PFF board, other PFF fellows or staff, or any of the task forces on which he has served.
1 For a more extensive discussion of how these debates have played out over the past decade, see Adam Thierer, The Progress & Freedom Foundation, Parental Controls and Online Child Protection: A Survey of Tools and Methods, Special Report, Ver. 3.1, Fall 2008, www.pff.org/parentalcontrols/index.html.
The consistency of the findings across these five previous task forces is important, and it should
guide future discussions among policymakers, the press, and the general public regarding
online child safety. 2
The findings are particularly relevant today since Congress and the Obama Administration are
2 Importantly, this is also the general approach that many other child safety experts and authors have taken when addressing these issues. For example, see Nancy E. Willard, Cyber-Safe Kids, Cyber-Savvy Teens (San Francisco, CA: Jossey-Bass, 2007), www.cskcst.com; Larry Magid and Anne Collier, MySpace Unraveled: A Parent’s Guide to Teen Social Networking (Berkeley, CA: Peachtree Press, 2007), www.myspaceunraveled.com; Sharon Miller Cindrich, e-Parenting: Keeping Up with Your Tech-Savvy Kids (New York: Random House Reference, 2007), www.pluggedinparent.com; Jason Illian, MySpace, MyKids: A Parent's Guide to Protecting Your Kids and Navigating MySpace.com (Eugene, OR: Harvest House Publishers, 2007); Linda Criddle, Look Both Ways: Help Protect Your Family on the Internet (Redmond, WA: Microsoft Press, 2006), http://look-both-ways.com/about/toc.htm; Gregory S. Smith, How to Protect Your Children on the Internet: A Road Map for Parents and Teachers (Westport, CT: Praeger, 2007), www.gregoryssmith.com.
actively studying these issues. For example, three federal agencies are currently exploring
various aspects of this debate:
• NTIA (OSTWG): The “Protecting Children in the 21st Century Act,” which was signed into
law by President Bush in 2008 as part of the “Broadband Data Services Improvement
Act,” 3 authorized the creation of an Online Safety and Technology Working Group
(OSTWG). The National Telecommunications and Information Administration (NTIA) at
the U.S. Department of Commerce, which is overseeing the effort, has appointed 35
members to serve 15-month terms to study the status of industry efforts to promote
online safety, best practices among industry leaders, the market for parental control
technologies, and assistance to law enforcement in cases of online child abuse. The U.S.
Department of Justice, the U.S. Department of Education, the Federal Communications
Commission, and the Federal Trade Commission all have delegates serving on the
working group. OSTWG began its work in early June 2009 and is due to report back to
Congress one year later. 4
• FTC: The same bill that created the OSTWG also requires that the Federal Trade
Commission (FTC) “carry out a nationwide program to increase public awareness and
provide education” to promote safer Internet use. “The program shall utilize existing
resources and efforts of the Federal Government, State and local governments,
nonprofit organizations, private technology and financial companies, Internet service
providers, World Wide Web-based resources, and other appropriate entities, that
includes (1) identifying, promoting, and encouraging best practices for Internet safety;
(2) establishing and carrying out a national outreach and education campaign regarding
Internet safety utilizing various media and Internet-based resources; (3) facilitating
access to, and the exchange of, information regarding Internet safety to promote up-to-date
knowledge regarding current issues; and, (4) facilitating access to Internet safety
education and public awareness efforts the Commission considers appropriate by
States, units of local government, schools, police departments, nonprofit organizations,
and other appropriate entities.”
• FCC: Pursuant to the requirements set forth in the Child Safe Viewing Act of 2007, 5 the
Federal Communications Commission (FCC) launched a Notice of Inquiry in March 2009
3 Broadband Data Services Improvement Act of 2008, P.L. 110-385, 110th Congress.
4 See Leslie Cantu, Newest Online Safety Group Will Report on Industry Efforts, Washington Internet Daily, Vol. 10, No. 107, June 5, 2009; Larry Magid, Federal Panel Takes a Fresh Look at Kids’ Internet Safety, San Jose Mercury News, www.mercurynews.com/business/ci_12522370?nclick_check=1; Adam Thierer, The Progress & Freedom Foundation, Online Safety Technology Working Group (OSTWG) Is Underway, PFF Blog, June 4, 2009, http://blog.pff.org/archives/2009/06/online_safety_technology_working_group_ostwg_is_un.html.
5 Child Safe Viewing Act of 2007, P.L. 110-452, 110th Congress. Also see Adam Thierer, The Progress & Freedom Foundation, “Child Safe Viewing Act” (S. 602) Signed by President Bush, PFF Blog, Dec. 2, 2008, http://blog.pff.org/archives/2008/12/child_safe_view.html.
to survey the parental controls marketplace. 6 Specifically, the Act requires the FCC to
examine: (1) the existence and availability of advanced blocking technologies that are
compatible with various communications devices or platforms;
(2) methods of encouraging the development, deployment, and use of such technology
by parents that do not affect the packaging or pricing of a content provider's offering;
and, (3) the existence, availability, and use of parental empowerment tools and
initiatives already in the market. The proceeding prompted a diverse assortment of
filings from industry and non-profit groups discussing the technologies and rating
systems on the market today. 7 The Act requires that the FCC issue a report to Congress
about these technologies no later than August 29, 2009. 8
As these agencies, future task forces, academics, and others continue to study these issues,
they should keep the findings of past online safety task forces in mind. What follows is an
expanded chronological discussion of the major findings of each of the five major online safety
task forces that have been convened since 2000.
6 Federal Communications Commission, Notice of Inquiry In the Matter of Implementation of the Child Safe Viewing Act; Examination of Parental Control Technologies for Video or Audio Programming, FCC 09-14, MB Docket No. 09-26, March 2, 2009 (hereinafter FCC, Child Safe Viewing Act Notice).
7 See Adam Thierer, The Progress & Freedom Foundation, Major Filings in FCC's ‘Child Safe Viewing Act’ Notice of Inquiry, PFF Blog, Apr. 20, 2009, http://blog.pff.org/archives/2009/04/major_filings_in_fccs_child_safe_viewing_act_notic.html.
8 For more discussion of the possible implications of this proceeding, see Adam Thierer, The Progress & Freedom Foundation, Dawn of Convergence-Era Content Regulation at the FCC? ‘Child Safe Viewing Act’ NOI Launched, PFF Blog, March 3, 2009, http://blog.pff.org/archives/2009/03/dawn_of_convergence-era_content_regulation_at_the.html.
9 COPA Commission, Report to Congress, October 20, 2000, www.copacommission.org.
10 See Adam Thierer, The Progress & Freedom Foundation, Closing the Book on COPA, PFF Blog, Jan. 21, 2009, http://blog.pff.org/archives/2009/01/closing_the_boo.html.
reduce access by minors to certain sexually explicit material online. Congress directed the
Commission to evaluate the accessibility, cost, and effectiveness of protective technologies and
methods, as well as their possible effects on privacy, First Amendment values and law
enforcement. The Commission was chaired by Donald Telage, then Executive Advisor for Global
Internet Strategy for Network Solutions Inc., and it had 18 members from academia,
government, and industry. After hearing from a diverse array of parties and considering a wide
range of possible solutions, 11 the COPA Commission concluded that:
no single technology or method will effectively protect children from harmful
material online. Rather, the Commission determined that a combination of public
education, consumer empowerment technologies and methods, increased
enforcement of existing laws, and industry action are needed to address this
concern. 12
The COPA Commission also made specific recommendations concerning education, law
enforcement and industry action, which are listed in Exhibit 1. 13 The clear conclusion of the
COPA Commission was that a layered, multi-faceted approach to online safety was essential.
Education, empowerment, and targeted law enforcement strategies were the key. Finally, the
COPA Commission helped highlight for policymakers “the unique characteristics of the Internet
and its impact on the ability to protect children”:
The Internet’s technical architecture creates new challenges as well as
opportunities for children and families. Material published on the Internet may
originate anywhere, presenting challenges to the application of the law of any
single jurisdiction. Methods for protecting children in the U.S. must take into
account this global nature of the Internet. In addition, thousands of access
providers and millions of potential publishers provide content online. Methods to
protect children from content harmful to minors must be effective in this diverse
and decentralized environment, including the full range of Internet activity such as
the Web, email, chat, instant messaging, and newsgroups. The Internet is also
rapidly changing and converging with other, more traditional media. Effective
protections for children must accommodate the Internet’s convergence with other
media and extend to new technologies and services offered on the Internet, [since]
… unlike one-way broadcast media, the Internet is inherently multi-directional and
interactive. 14
11 The Commission evaluated: filtering and blocking services; labeling and rating systems; age verification efforts; the possibility of a new top-level domain for harmful-to-minors material; “green” spaces containing only child-appropriate materials; Internet monitoring and time-limiting technologies; acceptable use policies and family contracts; online resources providing access to protective technologies and methods; and options for increased prosecution against illegal online material. Id. at 14.
12 Id. at 9.
13 Id. at 9-10.
14 Id. at 13.
15 Title IX, Sec. 901 of The Protection of Children from Sexual Predators Act of 1998, Pub. Law 105-314.
16 Computer Science and Telecommunications Board, National Research Council, Youth, Pornography and the Internet (Washington, DC: National Academy Press, 2002), www.nap.edu/html/youth_internet/.
17 Id. at 224.
18 Id. at 221.
19 Id. at 222.
The Thornburgh Commission also found that “Technology-based tools, such as filters, provide
parents and other responsible adults with additional choices as to how best fulfill their
responsibilities.” 20 In other words, technological tools and approaches could supplement
educational strategies. 21 The report also concluded, however, that “there is no single or
simple answer to controlling the access of minors to inappropriate material on the Web.” 22
Thus, the Thornburgh Commission advocated a layered approach to the issue:
Though some might wish to think otherwise, no single approach—technical, legal,
economic, or education—will be sufficient. Rather, an effective framework for
protecting our children from inappropriate materials and experiences on the
Internet will require a balanced composite of all these elements, and real progress
will require forward movement on all these fronts. 23
20 Id. at 12.
21 “While technology and public policy have important roles to play, social and educational strategies that impart to children the character and values to exercise responsible choices about Internet use and the knowledge about how to cope with inappropriate material and experiences is central to promoting children’s safe Internet use.” Id. at 388.
22 Id. at 12.
23 Id. at 13.
24 Safer Children in a Digital World: The Report of the Byron Review, March 27, 2008, www.dcsf.gov.uk/byronreview. The complete final report can be found at: www.dcsf.gov.uk/byronreview/pdfs/Final%20Report%20Bookmarked.pdf.
managing the risks. At worst they can be dangerous – lulling parents into a false
sense of security and leaving children exposed to a greater level of risk than they
would otherwise be. 25
The Byron Review also emphasized the importance of education and building resiliency:
Just like in the offline world, no amount of effort to reduce potential risks to
children will eliminate those risks completely. We cannot make the internet
completely safe. Because of this, we must also build children’s resilience to the
material to which they may be exposed so that they have the confidence and skills
to navigate these new media waters more safely. 26
[And] crucial and central to this issue is a strong commitment to changing behavior
through a sustained information and education strategy. This should focus on
raising the knowledge, skills and understanding around e-safety of children, parents
and other responsible adults. 27
The Byron Review recommended a comprehensive information and education strategy through
a partnership of government, schools, child safety experts, and industry. It also recommended
that government policy be more tightly coordinated by a new UK Council for Child Internet
Safety, which would report to the Prime Minister. Finally, the Byron Review outlined a variety
of industry best practices that could help parents and children achieve greater online safety.
25 Id. at 81.
26 Id. at 5.
27 Id. at 7.
28 MySpace and Attorneys General Announce Joint Effort to Promote Industry-Wide Internet Safety Principles, News Corp., Press Release, January 14, 2008, www.newscorp.com/news/news_363.html.
29 Adam Thierer, The Progress & Freedom Foundation, The MySpace-AG Agreement: A Model Code of Conduct for Social Networking? Progress on Point 15.1, Jan. 2008, www.pff.org/issues-pubs/pops/pop15.1myspaceAGagreement.pdf.
law professor John Palfrey, the Co-Director of Harvard’s Berkman Center for Internet & Society,
included representatives from many child safety groups, non-profit organizations, and Internet
companies.
The ISTTF convened a Research Advisory Board (RAB), which brought together leading
academic researchers in the field of child safety and child development, and a Technical
Advisory Board (TAB), which included some of America’s leading digital technologists and
computer scientists, who reviewed child safety technologies submitted to the ISTTF. The RAB’s
literature review 30 and the TAB’s assessment of technologies 31 were the most detailed assessments
of these issues to date. They both represent amazing achievements in their respective arenas.
On December 31, 2008, the ISTTF issued its final report, Enhancing Child Safety & Online
Technologies. 32 Consistent with previous task force reports, the ISTTF found that “there is no
one technological solution or specific combination of technological solutions to the problem of
online safety for minors.” 33 And, while the ISTTF was “optimistic about the development of
technologies to enhance protections for minors online and to support institutions and
individuals involved in protecting minors,” it ultimately “caution[ed] against overreliance on
technology in isolation or on a single technological approach”: 34
Instead, a combination of technologies, in concert with parental oversight,
education, social services, law enforcement, and sound policies by social network
sites and service providers may assist in addressing specific problems that minors
face online. All stakeholders must continue to work in a cooperative and
collaborative manner, sharing information and ideas to achieve the common goal of
making the Internet as safe as possible for minors. 35
Finally, the ISTTF recognized the importance of providing adequate resources to law
enforcement, schools, and social service organizations so they can better deal with child safety
concerns:
To complement the use of technology, greater resources should be allocated: to
schools, libraries, and other community organizations to assist them in adopting risk
management policies and in providing education about online safety issues; to law
30 http://cyber.law.harvard.edu/sites/cyber.law.harvard.edu/files/ISTTF_Final_Report-APPENDIX_C_Lit_Review_121808.pdf
31 http://cyber.law.harvard.edu/sites/cyber.law.harvard.edu/files/ISTTF_Final_Report-APPENDIX_D_TAB_and_EXHIBITS.pdf
32 Internet Safety Technical Task Force, Enhancing Child Safety & Online Technologies: Final Report of the Internet Safety Technical Task Force to the Multi-State Working Group on Social Networking of State Attorneys General of the United States, Dec. 31, 2008, at 10, http://cyber.law.harvard.edu/pubrelease/isttf.
33 Id. at 6.
34 Id.
35 Id.
36 Supra note 32 at 6.
37 www.pointsmartclicksafe.org
38 Adam Thierer, The Progress & Freedom Foundation, Cable’s Commitment to Online Safety, Progress Snapshot 3.7, June 2007, www.pff.org/issues-pubs/ps/2007/ps3.7cablecodeconduct.pdf.
39 www.pointsmartreport.org
• Modified for a specific service or application (e.g. ISP, blog, chat, social
network),
• Scaled based on the number of intended or actual users,
• Designed and created as part of the product development cycle, and
• Continuously updated to reflect growth and change in the application or
service.
The task force then outlined several tools and strategies that industries could use to accomplish
these goals. These “Recommendations for Best Practices” are summarized in Exhibit 2.
1.1 Provide access to information that will educate parents, educators, and children about media literacy and
ethical digital citizenship, and help them think critically about the content consumed and created on the
Internet.
1.2 Make safety information for users, parents, and caregivers prominent, easily accessible, and clear.
1.3 Provide information that is easy to find and access from the home page, available during registration, and
that can also be found in other appropriate places within the Web site or service.
1.4 Include specific information or FAQs about the services offered by the provider, especially safety tools and
how to use them (e.g., conducting a safe search, setting filtering options, defining and setting appropriate
privacy levels).
1.5 Provide links to additional resources that offer relevant safety and security information.
1.6 To make messages about online safety clear and easily recognizable to a variety of users, consider using
consistent themes, and common words and phrases. Provide messages in multiple languages as
appropriate.
1.7 Consider display of an icon on Web sites or services that denotes meaningful participation in Best Practice
efforts for children's online safety.
2.1 Provide a clear explanation of how information collected at registration and set up will be used, what is
public vs. private on the site, and a user’s ability to modify, hide, and prevent access to user information.
2.2 Make safety information available during the registration process, prominent from the homepage and in
appropriate places within the service (e.g. welcome email/message, point of sale information).
2.3 Provide information in the terms and conditions and elsewhere that defines acceptable behavior, states
that users are not anonymous and can be traced, and details the consequences of violating the standards
of behavior.
2.4 Provide notice that violating terms or conditions will result in specific consequences, including legal ones if
required.
3.1 Continue to explore age-verification and identity-authentication technologies and work to develop better
safety and security solutions and technologies.
4. Content screening
4.1 Initially set defaults at a moderate level as a minimum, but instruct users in how to customize settings for
their own needs.
4.2 Information should be provided about company policy on filtering, including the default settings,
explanations of the meanings of different safety, security and filtering options (e.g., what is blocked by
certain levels of filtering), how to make adjustments, and when settings might need to be reapplied (e.g., a
new version).
4.3 Consider carefully the placement and highlighting of sites belonging to and designed by children and youth
(e.g., a child’s profile page could become a “safe zone,” don’t locate children’s content near ads for adult-targeted materials).
4.4 Consider a “walled garden” approach when relevant with products aimed at children eight years of age
and younger.
5. Safe searching
5.1 Include specific information about how to conduct a safe search, how to set filtering options, and an
explanation of privacy settings.
6.1 Have in place a robust procedure, backed by appropriate systems and resources, to handle complaints.
Ideally, each company should have an Internet-safety staff position or cross-functional team charged with
supervising the procedures and resources and given the authority and resources to be effective.
6.2 Provide a reporting mechanism visible from all relevant pages or sections of a site or service.
6.3 Consider providing a designated page with relevant information and instructions about how to submit a
report or complaint including:
• How users can determine the appropriate individual or agency to contact when reporting a problem
(e.g., customer service, law enforcement, or safety hotline) and links to these services.
• What types of content and behaviors should be reported, the reporting procedure, and what
supporting information might need to be included.
• How to remove unwanted content or information from a user’s page or profile.
• How to cancel an account.
6.4 Cooperate with law enforcement, where applicable, and follow all relevant statutes.
Of these various strategies, however, education is the one with the most lasting impact.
Education teaches lessons and builds resiliency, providing skills and strength that can last a
lifetime. Specifically, education can help teach kids how to behave in—or respond to—a wide
variety of situations. 40 The focus should be on encouraging “digital citizenship” 41 and “social
media literacy.” 42
If policymakers convene additional task forces or working groups in coming years, it would be
wise to have them focus on devising and refining online safety educational methods and digital
literacy efforts. In particular, focusing on how to integrate such education and literacy programs
into existing K-12 education (including curriculum and professional development) would be a
worthwhile undertaking. Of course, many groups are already busy studying how to do this, but
if lawmakers feel compelled to bring together experts once more to study these issues, this sort
of targeted focus on education and media literacy implementation would be welcome.
Importantly, such education and media literacy-based approaches have
the added benefit of remaining within the boundaries of the Constitution and the First
Amendment. By adopting education and awareness-building approaches, government would
not be seeking to restrict speech, but simply to better inform and empower parents regarding
the parental control tools and techniques already at their disposal.43 The courts have shown
themselves to be amenable to such educational efforts, and not just in the case of online
40 See Nancy Willard, A Web 2.0 Approach to Internet Safety, Education Week, Aug. 21, 2007, www.education-world.com/a_tech/columnists/willard/willard008.shtml.
41 See Common Sense Media, Digital Literacy and Citizenship in the 21st Century: Educating, Empowering, and Protecting America's Kids, June 2009, www.commonsensemedia.org/sites/default/files/CSM_digital_policy.pdf; Nancy Willard, Center for Safe and Responsible Internet Use, Comprehensive Layered Approach to Address Digital Citizenship and Youth Risk Online, Nov. 2008, www.cyberbully.org/PDFs/yrocomprehensiveapproach.pdf.
42 See Anne Collier, Net Family News, Social Media Literacy: The New Online Safety, Feb. 27, 2009, www.netfamilynews.org/labels/new%20media%20literacy.html.
43 “Although government’s ability to regulate content may be weak, its ability to promote positive programming and media research is not. Government at all levels should fund the creation and evaluation of positive media initiatives such as public service campaigns to reduce risky behaviors and studies about educational programs that explore innovative uses of media.” Jeanne Brooks-Gunn and Elisabeth Hirschhorn Donahue, “Introducing the Issue,” in Children and Electronic Media, The Future of Children, Vol. 18, No. 1, Spring 2008, p. 8.
44 In the video game context, courts have noted that education typically provides the more sensible, and constitutional, method of dealing with concerns about access to objectionable content.
45 See Berin Szoka & Adam Thierer, The Progress & Freedom Foundation, Cyberbullying Legislation: Why Education is Preferable to Regulation, Progress on Point 16.12, June 19, 2009, www.pff.org/issues-pubs/pops/2009/pop16.12-cyberbullying-education-better-than-regulation.pdf; Adam Thierer, The Progress & Freedom Foundation, Two Sensible, Education-Based Approaches to Online Child Safety, Progress Snapshot 3.10, Sept. 2007, www.pff.org/issues-pubs/ps/2007/ps3.10safetyeducationbills.pdf.
46 See Adam Thierer, The Progress & Freedom Foundation, Congress, Content Regulation, and Child Protection: The Expanding Legislative Agenda, Progress Snapshot 4.4, Feb. 6, 2008, www.pff.org/issues-pubs/ps/2008/ps4.4childprotection.html; Adam Thierer, The Progress & Freedom Foundation, Is MySpace the Government’s Space?, Progress Snapshot 2.16, June 2006, www.pff.org/issues-pubs/ps/2006/ps_2.16_myspace.pdf.
47 See Berin Szoka & Adam Thierer, The Progress & Freedom Foundation, COPPA 2.0: The New Battle over Privacy, Age Verification, Online Safety & Free Speech, Progress on Point 16.11, May 2009, available at http://pff.org/issues-pubs/pops/2009/pop16.11-COPPA-and-age-verification.pdf; Adam Thierer, The Progress & Freedom Foundation, Social Networking and Age Verification: Many Hard Questions; No Easy Solutions, Progress on Point No. 14.5, Mar. 2007, www.pff.org/issues-pubs/pops/pop14.8ageverificationtranscript.pdf; Adam Thierer, The Progress & Freedom Foundation, Statement Regarding the Internet Safety Technical Task Force’s Final Report to the Attorneys General, Jan. 14, 2009, www.pff.org/issues-pubs/other/090114ISTTFthiererclosingstatement.pdf; Nancy Willard, Center for Safe and Responsible Internet Use, Why Age and Identity Verification Will Not Work—And is a Really Bad Idea, Jan. 26, 2009, www.csriu.org/PDFs/digitalidnot.pdf; Jeff Schmidt, Online Child Safety: A Security Professional’s Take, The Guardian, Spring 2007, www.jschmidt.org/AgeVerification/Gardian_JSchmidt.pdf.
48 See Adam Thierer, The Progress & Freedom Foundation, Closing the Book on COPA, PFF Blog, Jan. 21, 2009, http://blog.pff.org/archives/2009/01/closing_the_boo.html.
The Progress & Freedom Foundation is a market-oriented think tank that studies the digital revolution and its
implications for public policy. Its mission is to educate policymakers, opinion leaders and the public about issues
associated with technological change, based on a philosophy of limited government, free markets and civil liberties.
Established in 1993, PFF is a private, non-profit, non-partisan research organization supported by tax-deductible
donations from corporations, foundations and individuals. The views expressed here are those of the authors, and do not
necessarily represent the views of PFF, its Board of Directors, officers or staff.
The Progress & Freedom Foundation 1444 Eye Street, NW Suite 500 Washington, DC 20005
202-289-8928 mail@pff.org www.pff.org