Apple’s Photo Scanning Plan Sparks Outcry From Policy Groups


More than 90 policy groups from the US and around the world signed an open letter urging Apple to drop its plan to have Apple devices scan photographs for child sexual abuse material (CSAM).

“The undersigned organizations committed to civil rights, human rights, and digital rights around the world write to urge Apple to abandon the plans it announced on Aug. 5, 2021, to build surveillance capabilities into iPhones, iPads, and other Apple products,” the letter addressed to Apple CEO Tim Cook says. “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

The Center for Democracy and Technology (CDT) announced the letter, with CDT Security and Surveillance Project co-director Sharon Bradford Franklin saying, “We can expect governments to take advantage of the surveillance capability Apple is building into iPhones, iPads, and computers. They will demand that Apple scan for and block images of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society.”

The open letter was signed by groups from six continents (Africa, Asia, Australia, Europe, North America, and South America). Some of the U.S. signatories are the American Civil Liberties Union, the Electronic Frontier Foundation, Fight for the Future, the LGBT Technology Partnership and Institute, New America’s Open Technology Institute, STOP (Surveillance Technology Oversight Project), and the Sex Workers Project at the Urban Justice Center. The signatories also included groups from Argentina, Belgium, Brazil, Canada, Colombia, Dominican Republic, Germany, Ghana, Guatemala, Honduras, Hong Kong, India, Japan, Kenya, Mexico, Nepal, Netherlands, Nigeria, Pakistan, Panama, Paraguay, Peru, Senegal, Spain, Tanzania, and the UK. The full list of signatories is available here.

Scanning Photos and Messages in iCloud

Apple announced two weeks ago that devices with iCloud Photos enabled would scan images before uploading them to iCloud. An iPhone uploads each photo to iCloud shortly after it is taken, so scanning happens almost immediately if a user has previously enabled iCloud Photos.

Apple says its technology “analyzes an image and converts it to a unique number specific to that image” and flags an image when its hash is identical or nearly identical to the hash of any image in a database of known CSAM. An account can be reported to the National Center for Missing and Exploited Children (NCMEC) once about 30 CSAM photos are detected, a threshold Apple set to ensure “less than a one in one trillion chance per year of incorrectly flagging a given account.” The threshold may be adjusted in the future to maintain that one-in-one-trillion false-positive rate.
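To make the matching step concrete, here is a minimal, hypothetical Swift sketch of threshold-based perceptual-hash matching. The `PerceptualHash` type, the 64-bit representation, the Hamming-distance tolerance, and the `MatchCounter` helper are all illustrative assumptions; Apple’s actual NeuralHash format and matching protocol are not reproduced here.

```swift
import Foundation

// Hypothetical sketch: a perceptual hash modeled as a 64-bit value.
// Apple's real NeuralHash format and matching protocol differ.
struct PerceptualHash: Hashable {
    let bits: UInt64

    // Hamming distance: how many bits differ between two hashes.
    func distance(to other: PerceptualHash) -> Int {
        (bits ^ other.bits).nonzeroBitCount
    }
}

// Counts matches against a known-hash database and reports only after
// a threshold is crossed, mirroring the roughly 30-image threshold
// described in the article.
struct MatchCounter {
    let knownHashes: [PerceptualHash]   // stand-in for the known-CSAM hash database
    let maxDistance = 2                 // assumed tolerance for "nearly identical"
    let reportThreshold = 30
    var matchCount = 0

    mutating func record(_ photoHash: PerceptualHash) {
        if knownHashes.contains(where: { $0.distance(to: photoHash) <= maxDistance }) {
            matchCount += 1
        }
    }

    var shouldReport: Bool { matchCount >= reportThreshold }
}
```

According to Apple’s published technical summary, the comparison itself happens under a cryptographic private set intersection protocol, so the device never learns which photos, if any, matched; the sketch above omits that layer entirely.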

Apple is also adding a tool to the Messages application that will “analyze image attachments and determine if a photo is sexually explicit” without giving Apple access to the messages. The system will be optional for parents and, if enabled, will “warn children and their parents when receiving or sending sexually explicit photos.”
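As a rough illustration of that opt-in flow, the hypothetical Swift sketch below gates a warning on a parent-enabled setting and an on-device classifier passed in as a closure. The names (`FamilyMessageSettings`, `isSexuallyExplicit`, `handleIncomingImage`) are placeholders, not Apple APIs.

```swift
import Foundation

// Hypothetical sketch of the opt-in warning flow described above.
// Apple's on-device classifier and Messages integration are not public APIs.
struct FamilyMessageSettings {
    var communicationSafetyEnabled: Bool   // parent opt-in
    var notifyParent: Bool                 // whether parents are also warned
}

enum AttachmentAction {
    case deliverNormally
    case warnChild(notifyParent: Bool)
}

func handleIncomingImage(_ imageData: Data,
                         settings: FamilyMessageSettings,
                         isSexuallyExplicit: (Data) -> Bool) -> AttachmentAction {
    // If the feature is off, or the classifier does not flag the image,
    // the attachment is delivered as usual.
    guard settings.communicationSafetyEnabled, isSexuallyExplicit(imageData) else {
        return .deliverNormally
    }
    // Otherwise the child is warned; parents may optionally be notified too.
    return .warnChild(notifyParent: settings.notifyParent)
}
```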

Apple says the new systems will roll out later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and will initially be available only in the United States.

Both scanning systems concern the letter’s signers. On the Messages scanning that parents can enable, the letter says:

Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected by the UN Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organizer of the account, and the consequences of parental notification could threaten the child’s safety and well-being. LGBTQ+ youth on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend the notification to other accounts and to detect images that are objectionable for reasons other than being sexually explicit.


