Letter To Tim Cook With Signatures
Tim Cook
CEO, Apple
Apple Park
Cupertino, CA
Dear Tim,
Two years ago, under your strong leadership, Apple made a landmark announcement
that it would detect child sexual abuse images and videos in iCloud.
We were energized by the public comments from your leadership about this decision,
including those of your SVP of Software Engineering, Craig Federighi, who was quoted saying,
“We wish this had come out a little more clearly for everyone because we feel very
positively and strongly about what we’re doing.” This is why we were shocked and
discouraged by your decision to reverse course and not institute these important
changes.
The public agrees with us. 90% of Americans say Apple has a responsibility to detect,
report, and remove child sexual abuse images and videos.1 The technological solution
you announced would allow you to maintain your commitment to being the world
leader in user privacy while simultaneously eliminating millions of child sexual abuse
images and videos from iCloud. It would also honor the privacy of those who have
experienced this horrific crime firsthand. They deserve privacy.
This is why we are asking you to honor your original intention to:
● Detect, report, and remove child sexual abuse images and videos from iCloud.
● Create a robust reporting mechanism for users to report child sexual abuse
images and videos to Apple.
We understand that ensuring people's right to privacy is crucial. But Apple does not have to
choose between meeting this obligation and protecting the rights of survivors
and millions of children worldwide. Child sexual abuse images and videos must be
treated differently than other content and require a different solution, because such
images and videos are illegal and are evidence of the rape or molestation of children. 60%
of all reported child sexual abuse material features children under the age of 12.2
In our recent research, we have come across hundreds of cases of child sexual abuse
that have been documented and spread specifically on Apple devices and stored in
iCloud. Had Apple been detecting these images and videos, many of these children
would have been removed from their abusive situations far sooner.
That is why, the day you choose to start detecting such harmful content,
children will be identified and will no longer have to endure sexual abuse. Waiting
continues to put children in harm's way and prevents survivors, those with lived
experience, from healing.
We know you can lead here, and we urge you to make this choice. This immensely
positive action could be your legacy at Apple.
Sincerely,
1 Bellwether Research. Full citation coming.
2 INTERPOL (2018). Towards a Global Indicator on Unidentified Victims in Child Sexual Exploitation Material, Summary Report: https://ecpat.org/wp-content/uploads/2021/05/TOWARDS-A-GLOBAL-INDICATOR-ON-UNIDENTIFIED-VICTIMS-IN-CHILD-SEXUAL-EXPLOITATION-MATERIAL-Summary-Report.pdf