Abstract
In the summer of 2021, Apple announced it would release a Child Safety Feature (CSF) aimed at reducing Child Sex Abuse Material (CSAM) on its platform. The CSF would scan every image a user uploaded to iCloud for known CSAM, and Apple would report any account with 30 or more flagged images to the National Center for Missing and Exploited Children (NCMEC). Despite Apple's good intentions, the company faced intense backlash, with many critics arguing the proposed CSF eroded users' privacy. This article explores the technology behind Apple's CSF and compares it to similar features used by other prominent tech companies. The article then examines how Fourth Amendment doctrine has evolved alongside technology yet struggles to balance protecting children from online exploitation against privacy rights. Finally, the article proposes three possible solutions to this balancing problem: Supreme Court action, a uniform circuit test, or Congressional action.
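The hash-matching model the abstract describes can be sketched in a few lines of code. Apple's actual proposal relied on NeuralHash, a perceptual hash designed to survive resizing and re-encoding, combined with cryptographic threshold techniques so that accounts below the 30-match limit could not be inspected. The minimal Python sketch below instead substitutes an ordinary cryptographic hash (SHA-256), which matches only exact byte-for-byte copies; the hash list, directory name, and function names are hypothetical illustrations, not Apple's implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hash values for known CSAM images, analogous to
# the hash lists NCMEC supplies to providers (real lists are not public).
KNOWN_HASHES: set[str] = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

REPORT_THRESHOLD = 30  # Apple's proposed reporting threshold


def file_hash(path: Path) -> str:
    """Compute a SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flagged_count(upload_dir: Path) -> int:
    """Count uploaded images whose hash matches a known-CSAM hash."""
    return sum(
        1
        for image in upload_dir.iterdir()
        if image.is_file() and file_hash(image) in KNOWN_HASHES
    )


if __name__ == "__main__":
    uploads = Path("icloud_uploads")  # hypothetical local staging directory
    if uploads.is_dir() and flagged_count(uploads) >= REPORT_THRESHOLD:
        print("Account exceeds threshold; refer to NCMEC after human review.")
```

Because a cryptographic hash changes completely when even one byte of the image changes, production systems such as PhotoDNA and NeuralHash use perceptual hashing instead, which is what allows them to flag copies that have been cropped, recompressed, or resized.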
Recommended Citation
McGarvie, Jessica (2023) "From Hashtag to Hash Value: Using the Hash Value Model to Report Child Sex Abuse Material," Seattle Journal of Technology, Environmental, & Innovation Law: Vol. 13: Iss. 2, Article 4.
Available at: https://digitalcommons.law.seattleu.edu/sjteil/vol13/iss2/4