The child safety group Heat Initiative plans to launch a campaign pressing Apple to detect child sexual abuse material in iCloud and give users tools to report it. The company issued a rare, detailed response on Thursday.
In December 2022, Apple said it was ending development of a privacy-preserving iCloud Photos scanning tool designed to detect child sexual abuse material (CSAM) on the platform. The project, first announced in August 2021, had been controversial from the start; Apple paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited, compromising the privacy and security of all iCloud users. This week, a new child safety group called the Heat Initiative told Apple it is organizing a campaign to demand that the company “detect, report, and remove” child sexual abuse material from iCloud and offer users more tools to report CSAM to the company.
In a rare response to the Heat Initiative, Apple on Thursday laid out its reasons for abandoning the iCloud CSAM scanning project and focusing instead on a set of on-device tools and resources for users known collectively as Communication Safety features. Apple shared the response with WIRED this morning, and it offers a rare look not only at the company’s rationale for pivoting to Communication Safety, but also at its broader views on building mechanisms that circumvent user privacy protections, such as encryption, in order to monitor data. The stance is relevant to the wider encryption debate, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.
“Child sexual abuse material is abhorrent, and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote in the company’s response to the Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.
“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit,” Neuenschwander wrote. “It would also have unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”
WIRED could not immediately reach the Heat Initiative for comment on Apple’s response. The group is led by Sarah Gardner, a former vice president of external affairs at Thorn, a nonprofit that builds technology to combat online child exploitation and sex trafficking. In 2021, Thorn praised Apple’s plan to develop an iCloud CSAM scanning feature. In an email to CEO Tim Cook on Wednesday, which Apple also shared with WIRED, Gardner said the Heat Initiative found Apple’s decision to kill the feature “disappointing.”
Gardner wrote to Cook: “We firmly believe that the solution you unveiled not only positioned Apple as a global leader in user privacy, but also promised to eradicate millions of child sexual abuse images and videos from iCloud. Child sexual abuse is an issue no one wants to talk about, which is why it gets silenced and left behind. We are here to make sure that doesn’t happen.”
Apple maintains that, ultimately, even its own well-intentioned design could not be adequately safeguarded in practice, and that on-device nudity detection for features like Messages, FaceTime, AirDrop, and the photo picker is a safer alternative. Apple has also begun offering an application programming interface (API) for its Communication Safety features so that third-party developers can incorporate them into their apps. Apple says the messaging platform Discord is integrating the features, and that app makers have broadly been enthusiastic about adopting them.
“We decided not to proceed with the hybrid client-server approach to CSAM detection for iCloud Photos that we proposed a few years ago,” Neuenschwander wrote to the Heat Initiative. “We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users.”
In response to the Heat Initiative’s request that Apple create a CSAM reporting mechanism for users, the company told WIRED that its focus is on connecting vulnerable or victimized users directly with local resources and law enforcement in their area that can help them, rather than positioning Apple as an intermediary for processing reports. The company said offering such an intermediary service could make sense for interactive platforms like social networks.
Still, the challenge of protecting children from online sexual abuse looms large, and Apple’s resolve to resist data scanning will continue to be tested as those concerns become intertwined with the broader encryption debate.
Read the full exchange between the Heat Initiative and Apple below. To protect the privacy of senders and recipients, WIRED has redacted sensitive personal information: