Abstract

In the summer of 2021, Apple announced it would release a Child Safety Feature (CSF) aimed at reducing Child Sexual Abuse Material (CSAM) on its platform. The CSF would scan all images a user uploaded to iCloud for CSAM, and Apple would report any account with 30 or more flagged images to the National Center for Missing and Exploited Children. Despite Apple’s good intentions, the company received intense backlash, with many critics arguing that the proposed CSF eroded user privacy. This article explores the technology behind Apple’s CSF and compares it to similar features used by other prominent tech companies. The article further examines how Fourth Amendment doctrine has evolved alongside technology yet struggles to balance protecting children from online exploitation against privacy rights. Finally, the article proposes three solutions to this balancing problem: Supreme Court action, a uniform circuit test, or Congressional action.
