Apple’s child protection features spark concern within its own ranks

Apple says it will scan only in the United States and other countries to be added one by one, only when images are set to be uploaded to iCloud. (File/AFP)
Updated 13 August 2021


  • Apple employees join critics of the company's move to scan US customers' phones for child abuse images
  • Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests

SAN FRANCISCO: A backlash over Apple’s move to scan US customer phones and computers for child sex abuse images has grown to include employees speaking out internally, a notable turn in a company famed for its secretive culture, and has provoked intensified protests from leading technology policy groups.
Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.
Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.
Though coming mainly from employees outside of lead security and privacy roles, the pushback marks a shift for a company where a strict code of secrecy around new products colors other aspects of the corporate culture.
Slack was rolled out a few years ago and has been more widely adopted by teams at Apple during the pandemic, two employees said. As workers used the app to maintain social ties during the work-from-home era by sharing recipes and other light-hearted content, more serious discussions have also taken root.
In the Slack thread devoted to the photo-scanning feature, some employees have pushed back against criticism, while others said Slack wasn’t the proper forum for such discussions.
Core security employees did not appear to be major complainants in the posts, and some of them said that they thought Apple’s solution was a reasonable response to pressure to crack down on illegal material.
Other employees said they hoped that the scanning is a step toward fully encrypting iCloud for customers who want it, which would reverse Apple’s direction on the issue a second time.

PROTEST
Last week’s announcement is drawing heavier criticism from past outside supporters who say Apple is rejecting a history of well-marketed privacy fights.
They say that while the US government can’t legally scan wide swaths of household equipment for contraband or make others do so, Apple is doing it voluntarily, with potentially dire consequences.
People familiar with the matter said a coalition of policy groups is finalizing a letter of protest to send to Apple within days demanding a suspension of the plan. Two groups, the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT), released newly detailed objections to Apple’s plan in the past 24 hours.
“What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in,” CDT project director Emma Llanso said in an interview. “It seems so out of step from everything that they had previously been saying and doing.”
Apple declined to comment for this story. It has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.
Outsiders and employees pointed to Apple’s stand against the FBI in 2016, when it successfully fought a court order to develop a new tool to crack into a terrorism suspect’s iPhone. Back then, the company said that such a tool would inevitably be used to break into other devices for other reasons.
But Apple was surprised its stance then was not more popular, and the global tide since then has been toward more monitoring of private communication.
With less publicity, Apple has made other technical decisions that help authorities, including dropping a plan to encrypt widely used iCloud backups and agreeing to store Chinese user data in that country.
A fundamental problem with Apple’s new plan to scan for child abuse images, critics said, is that the company is making cautious policy decisions that it can be forced to change now that the capability exists, in exactly the way it warned would happen if it broke into the terrorism suspect’s phone.
Apple says it will scan only in the United States and other countries to be added one by one, only when images are set to be uploaded to iCloud, and only for images that have been identified by the National Center for Missing & Exploited Children and a small number of other groups.
But any country’s legislature or courts could demand that any one of those elements be expanded, and some of those nations, such as China, represent enormous and hard-to-refuse markets, critics said.
Police and other agencies will cite recent laws requiring “technical assistance” in investigating crimes, including in the United Kingdom and Australia, to press Apple to expand this new capability, the EFF said.
“The infrastructure needed to roll out Apple’s proposed changes makes it harder to say that additional surveillance is not technically feasible,” wrote EFF General Counsel Kurt Opsahl.
Lawmakers will build on it as well, said Neil Brown, a UK tech lawyer at decoded.legal: “If Apple demonstrates that, even in just one market, it can carry out on-device content filtering, I would expect regulators/lawmakers to consider it appropriate to demand its use in their own markets, and potentially for an expanded scope of things.”


MenaML hosts 2026 Winter School in Saudi Arabia to boost AI education, collaboration in region

Updated 16 January 2026


  • Second edition of Winter School will be hosted in partnership with KAUST

DUBAI: The Middle East and North Africa Machine Learning Winter School will host its second edition in Saudi Arabia this year, in partnership with the King Abdullah University of Science and Technology.

The non-profit held its inaugural edition in Doha last year in partnership with the Qatar Computing Research Institute.

The initiative began when like-minded individuals from Google DeepMind and QCRI came together to launch a platform connecting a “community of top-tier AI practitioners with a shared interest in shaping the future of the MENA region,” Sami Alabed, a research scientist at Google DeepMind and one of the co-founders of MenaML, told Arab News.

Along with Alabed, the core team includes Maria Abi Raad and Amal Rannen-Triki from Google DeepMind, as well as Safa Messaoud and Yazan Boshmaf from QCRI.


Messaoud said that the school has three goals: building local talent in artificial intelligence, enhancing employability and connection, and reversing brain drain while fostering regional opportunity.

AI has dominated boardrooms and courtrooms alike globally, but “AI research and education in MENA are currently in a nascent, yet booming, stage,” she added.

Launched at a pivotal moment for the region, the initiative was timed to ensure “regional representation in the global AI story while cultivating AI models that are culturally aligned,” said Rannen-Triki.

The school’s vision is to cultivate researchers capable of developing “sophisticated, culturally aligned AI models” that reflect the region’s values and linguistic and cultural diversity, said Messaoud.

This approach, she added, enables the region to contribute meaningfully to the global AI ecosystem while ensuring that AI technologies remain locally relevant and ethically grounded.

MenaML aims to host its annual program in a different city each year, partnering with reputable institutions in each host location.

“Innovation does not happen in silos; breakthroughs are born from collaboration that extends beyond borders and lab lines,” said Alabed.

“Bringing together frontier labs to share their knowledge echoes this message, where each partner brings a unique viewpoint,” he added.

This year, MenaML has partnered with KAUST, which “offers deep dives into specialized areas critical to the region, blending collaborative spaces with self-learning and placement programs,” said Abi Raad.

The program, developed in partnership with KAUST, brings together speakers from 16 institutions and focuses on four key areas: AI and society, AI and sciences, AI development, and regional initiatives.

“These themes align with the scientific priorities and research excellence pillars of KAUST as well as the needs of regional industries seeking to deploy AI safely and effectively,” said Bernard Ghanem, professor of electrical and computer engineering and computer science at KAUST and director of the Center of Excellence in Generative AI.

The program will also highlight efficiency in AI systems, with the overall goal of equipping “participants with the conceptual and practical understanding needed to contribute meaningfully to next-generation AI research and development,” he told Arab News.

For KAUST, hosting the MenaML Winter School aligns with Saudi Arabia’s ambition to become a global hub for AI research under Vision 2030.

By attracting top researchers, industry partners, and young talent to the Kingdom, it helps cement the Kingdom’s position as a center for AI excellence, Ghanem said.

It also aligns closely with Vision 2030’s “goals of building human capital, fostering innovation, and developing a knowledge-based economy” and “contributes to the long-term development of a world-leading AI ecosystem in Saudi Arabia,” he added.

Although the program accepts students from around the world, participants must demonstrate a connection to the MENA region, Abi Raad said.

The goal is to build bridges between those who may have left the region and those who remain, enabling them to start conversations and collaborate, she added.

A certain percentage of spots is reserved for participants from the host country, while a small percentage is allocated to fully international students with no regional ties, with the objective of offering them a glimpse into the regional AI ecosystem.

Looking ahead, MenaML envisions growing from an annual event into a sustainable, central pillar of the regional AI ecosystem, inspired by the growth trajectory of global movements like TED or the Deep Learning Indaba, a sister organization supporting AI research and education in Africa.

Boshmaf said MenaML’s long-term ambition is to evolve beyond its flagship event into a broader movement, anchored by local MenaMLx chapters across the region.

Over time, the initiative aims to play a central role in strengthening the regional AI ecosystem by working with governments and the private sector to support workforce development, AI governance and safety education, and collaborative research, while raising the region’s global visibility through its talent network and international partnerships.

He added: “If TED is the global stage for ‘ideas worth spreading,’ MenaML is to be the regional stage for ‘AI ideas worth building.’”

The MenaML Winter School will run from Jan. 24 to 29 at KAUST in Saudi Arabia.