What Apple Can Do Next to Combat Child Sexual Abuse


In May 2019, Melissa Polinsky, director of Apple’s global investigations and child safety team, faced investigators working on the UK’s inquiry into child sexual abuse. During two hours of questioning, Polinsky acknowledged that Apple employed just six people on its global team responsible for investigating child abuse images. She also said the technology Apple uses to scan for existing images of child abuse online was “effective.”

Fast-forward two years, and Apple’s approach to tackling child abuse material has come off the rails. On September 3 the company made a rare public U-turn, pausing plans to introduce a system that detects known child sexual abuse material, or CSAM, on the iPhones and iPads of people in the US. “We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a statement, citing the “feedback” it had received.

What will Apple do next? The company is unlikely to win over or please everyone with whatever it does next, and the fallout from its plans has created an almighty mess. The technical complexities of Apple’s proposals have reduced much of the public debate to blunt, for-or-against statements, and explosive language has, in some instances, polarized opinion. The fallout comes as the European Commission prepares child protection legislation that could make it mandatory for technology companies to scan for CSAM.

“The move [for Apple] to do some kind of content review was long overdue,” said Victoria Baines, a cybersecurity expert who has worked with both Facebook and Europol on child safety investigations. Technology companies are required by US law to report any CSAM they find online to the National Center for Missing and Exploited Children (NCMEC), a US nonprofit child safety organization, but Apple has historically lagged behind its rivals.

In 2020, NCMEC received 21.7 million CSAM reports, up from 16.9 million in 2019. Facebook topped the list, making 20.3 million reports last year. Google made 546,704; Dropbox 20,928; Twitter 65,062; Microsoft 96,776; and Snapchat 144,095. Apple made just 265 CSAM reports to NCMEC in 2020.

There are several “legitimate” reasons for the differences, Baines said. Not all technology companies are the same. Facebook, for instance, is built around sharing and connecting with new people. Apple focuses primarily on its hardware, and most people use the company’s services to communicate with people they already know. Or, to put it more bluntly, nobody can search iMessage for children to send abusive messages to. Another issue at play here is detection. The number of reports a company sends to NCMEC can depend on how much effort it puts into finding CSAM. Better detection tools can also mean that more abusive material is found. And some tech companies have done more than others to root out CSAM.

Detecting existing child sexual abuse material mostly involves scanning what people are sending, or uploading, when that piece of content reaches a company’s servers. Codes, known as hashes, are generated for photos and videos and are compared against existing hashes for previously identified child sexual abuse material. Hash lists are created by child protection organizations, such as NCMEC and the UK’s Internet Watch Foundation. When a positive match is found, technology companies can take action and report the finding to NCMEC. Most commonly the process is done through PhotoDNA, which was developed by Microsoft.
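To make that matching flow concrete, here is a minimal sketch of a server-side check under heavy simplification. It is illustrative only: PhotoDNA is proprietary and uses perceptual hashing that survives resizing and re-encoding, so the plain SHA-256 digest and the placeholder hash list below are stand-ins, and the function names are hypothetical.

```python
import hashlib

# Illustrative only: real systems such as PhotoDNA use perceptual hashes that
# tolerate resizing and re-encoding; a plain SHA-256 digest stands in for that here.

# Hash list supplied by a child protection organization (e.g. NCMEC or the IWF).
# The entry below is a placeholder, not a real value.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_image(image_bytes):
    """Generate a hash (digital fingerprint) for an uploaded photo or video."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes):
    """Compare the upload's hash against the known list once it reaches the server.

    A True result is what would trigger action and a report to NCMEC.
    """
    return hash_image(image_bytes) in KNOWN_CSAM_HASHES
```

The important point is where this runs: the comparison happens on the company’s servers, after the content has been uploaded.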

Apple’s plan to scan for CSAM uploaded to iCloud turned this approach on its head and, using some clever cryptography, moved part of the detection onto people’s phones. (Apple has scanned iCloud Mail for CSAM since 2019, but it does not scan iCloud Photos or iCloud backups.) The proposal proved controversial for a number of reasons.
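The sketch below shows, in heavily simplified form, what moving part of the detection onto the phone could look like. Everything here is a stand-in: Apple’s actual proposal relied on a perceptual hash (NeuralHash), a blinded on-device hash database, private set intersection, and threshold secret sharing, so that neither the device nor the server learns about individual matches below a reporting threshold. None of that cryptography is reproduced here, and the function and field names are hypothetical.

```python
import hashlib

# Heavily simplified and hypothetical sketch of on-device matching before upload.
# The real design encrypts the match information so the phone itself cannot read it.

MATCH_THRESHOLD = 30  # placeholder; Apple said human review would trigger only above a match threshold

def device_side_voucher(image_bytes, on_device_hash_db):
    """Runs on the phone before upload: hash the photo and attach a match flag.

    In Apple's design this would be an encrypted 'safety voucher' rather than
    a plain boolean; a readable flag is used here only for clarity.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()  # stand-in for a perceptual hash
    return {"photo": image_bytes, "matched": digest in on_device_hash_db}

def server_side_review(vouchers):
    """Runs on iCloud: act only once the number of matching vouchers crosses the threshold."""
    matches = sum(1 for voucher in vouchers if voucher["matched"])
    return matches >= MATCH_THRESHOLD
```

The architectural difference from the previous sketch is that the comparison runs on the handset, before the photo ever reaches iCloud, with the server only tallying the results.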


