FAQs

  1. Does Apple scan all photos on my device?

    • No, only photos that are part of the iCloud Photo Library syncing system are checked.
  2. How does Apple scan photos?

    • By computing a fingerprint (hash) of each photo and checking it against a database of fingerprints of known child abuse imagery; see the sketch after this list.
  3. Why is Apple implementing this scanning system?

    • To combat child abuse imagery.
  4. Is my privacy at risk with this scanning system?

    • Apple insists their system is privacy-protecting because scans happen on devices, not servers.
  5. Can outside experts audit the scanning system?

    • Yes, Apple will publish a hash of its database that outside experts can audit.
  6. What is the “hash” of Apple’s database?

    • A cryptographic fingerprint that identifies the database’s contents and can only be generated with the help of child safety organizations.
  7. Is the scanning feature related to Apple’s plan to alert children about explicit images?

    • No, the scanning feature is separate.
  8. How will Apple’s retail employees respond to questions about the scanning system?

    • They will refer customers to an FAQ and to an independent auditor’s review of the system.
  9. How does Apple’s scanning system compare to other companies?

    • Apple’s system scans photos on the device before they are uploaded, whereas most other companies scan photos on their servers after upload.
  10. What is the National Center for Missing and Exploited Children (NCMEC)?

    • A U.S. nonprofit organization that maintains the database of known child abuse imagery against which Apple’s system checks photos.
  11. What is the “multiple levels of auditability” that Apple mentions?

    • The safeguards Apple says will let outsiders verify the scanning system is not misused, such as the published database hash that independent experts can audit.
  12. How does Apple’s scanning system advance privacy protections?

    • Apple claims that performing scans on devices rather than on its servers increases both privacy and transparency.
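
To make item 2 concrete: at its core, this kind of check computes a fingerprint of each photo and tests it for membership in a set of known fingerprints. The Python sketch below is a deliberately simplified illustration with hypothetical values; Apple’s announced design uses a perceptual hash (NeuralHash) and cryptographic private set intersection rather than the plain file hashing shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprints of known abuse imagery. In Apple's real system
# these come from child safety organizations such as NCMEC and are perceptual
# "NeuralHash" values; plain SHA-256 file hashes stand in here only to
# illustrate the matching idea.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo: Path) -> str:
    """Fingerprint a photo's bytes (illustrative stand-in for a perceptual hash)."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def matches_known_imagery(photo: Path) -> bool:
    """Check a photo against the known-fingerprint set.

    Apple's announced design runs this check on the device and wraps the
    result in cryptography so that non-matches reveal nothing to Apple;
    none of that machinery is modeled here.
    """
    return fingerprint(photo) in KNOWN_FINGERPRINTS

# Example: matches_known_imagery(Path("photo.jpg")) would be evaluated
# on-device before the photo is uploaded to iCloud Photos.
```

The important property is that matching operates on fingerprints of known images, not on an analysis of what a photo depicts.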

Summary

In an effort to combat child abuse imagery, Apple is implementing a scanning system that will check photos on iPhones, iPads, and Mac computers against a database of known child abuse images maintained by the NCMEC. This scanning system has raised concerns among privacy advocates about potential misuse or surveillance. However, Apple maintains that the system is privacy-protecting and has multiple levels of auditability to prevent misuse.

Apple has positioned itself as a leader in privacy and security, emphasizing its reliance on device sales rather than advertising revenue. This new scanning system, though intended to improve child safety, has called Apple’s commitment to user privacy into question. The company’s handling of the announcement has also been criticized for lacking clarity, leading to misunderstandings and concerns.

Apple asserts that the scanning happens on the device, not on its servers, and that the system is designed to identify only exact fingerprint matches of known child sexual abuse images. The company also highlights that outside experts will be able to audit the database through its published hash, as sketched below.
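
One way to picture that audit, assuming the published hash is simply a digest computed over the database’s contents (Apple has not spelled out the exact construction, so the scheme and values below are illustrative):

```python
import hashlib

def database_hash(fingerprints: list[str]) -> str:
    """Reduce the whole fingerprint database to one auditable value."""
    digest = hashlib.sha256()
    for fp in sorted(fingerprints):  # sort so record order cannot change the result
        digest.update(bytes.fromhex(fp))
    return digest.hexdigest()

# Hypothetical database of image fingerprints (hex strings).
database = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9",
]

# Apple would publish database_hash(database); an auditor handed the same
# database recomputes the value and checks that the two match.
print(database_hash(database))
```

Because any change to the database would change the published hash, auditors could confirm that every device received the same database derived from the child safety organizations’ source lists.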

Regarding the separate feature of alerting children about explicit images in Messages, Apple emphasizes that this does not involve scanning against its database of child abuse imagery. It focuses on educating parents and children.

Overall, Apple’s scanning system highlights the tension between child protection and privacy. The company insists on the system’s privacy-protecting design and auditability, but questions remain about its potential impact on user privacy.