Apple, Surveillance and CSAM
Apple has recently released its plans for on-device detection of Child Sexual Abuse Material (CSAM). For me, as for many others, this has raised some flags, since it has the potential to greatly impact the privacy of Apple users. I will not comment on the overall security of the solution put forward by Apple; I will just summarize how it is described to work and highlight my concerns with it.

Apple intends to roll out the detection in three stages. First, images shared through the Messages app (a.k.a. iMessage) will be screened for CSAM content. If such content is detected (whether sent or received), the user and, where applicable, their parent will be warned about the detection. The second stage covers iCloud Photos, where every photo will be matched on-device against known CSAM before being uploaded. If the number of matches meets a threshold, Apple will be notified, and after manual validation the case will be forwarded to the relevant (American) authorities. The last step is to improve Siri and Search, adding better support for reporting CSAM and for finding help if you, or someone you know, is subject to sexual abuse. ...
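To make the iCloud Photos stage a bit more concrete, here is a purely conceptual sketch of threshold-based matching against a set of known hashes. This is not Apple's implementation: the real system relies on a perceptual hash and cryptographic machinery so that individual matches below the threshold are not revealed to anyone. The hash function, database contents, and threshold value below are all made up for illustration.

```python
import hashlib

# Hypothetical database of hashes of known CSAM images (placeholder values, not real data).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    "b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c",
}

# Made-up threshold for illustration only.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real system would not use SHA-256,
    since it cannot match visually similar but non-identical images."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: list[bytes]) -> int:
    """Count how many images in the upload queue match the known-hash set."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)


def should_notify(images: list[bytes]) -> bool:
    """A report would only be raised once the match count meets the threshold."""
    return count_matches(images) >= MATCH_THRESHOLD
```

The point of the sketch is simply the shape of the mechanism: matching happens on the device against a database of known material, and nothing is reported until a threshold of matches is reached.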