Brands blast Twitter for ads next to child pornography accounts

A promoted tweet on the Twitter app is displayed on a mobile phone near a Twitter logo in this illustration picture taken Sept. 8, 2022. (REUTERS)

  • Mazda, Forbes and Dyson are among the brands to suspend their marketing campaigns on the platform

Some major advertisers including Dyson, Mazda, Forbes and PBS Kids have suspended their marketing campaigns or removed their ads from parts of Twitter because their promotions appeared alongside tweets soliciting child pornography, the companies told Reuters.

DIRECTV and Thoughtworks also told Reuters late on Wednesday they have paused their advertising on Twitter.

Brands ranging from Walt Disney Co (DIS.N), NBCUniversal (CMCSA.O) and Coca-Cola Co (KO.N) to a children's hospital were among more than 30 advertisers that appeared on the profile pages of Twitter accounts peddling links to the exploitative material, according to a Reuters review of accounts identified in new research about child sex abuse online from cybersecurity group Ghost Data.

Some of the tweets included keywords related to “rape” and “teens” and appeared alongside promoted tweets from corporate advertisers, the Reuters review found. In one example, a promoted tweet for shoe and accessories brand Cole Haan appeared next to a tweet in which a user said they were “trading teen/child” content.

“We’re horrified,” David Maddocks, brand president at Cole Haan, told Reuters after being notified that the company’s ads appeared alongside such tweets. “Either Twitter is going to fix this, or we’ll fix it by any means we can, which includes not buying Twitter ads.”

In another example, a user tweeted asking for content of “Yung girls ONLY, NO Boys,” which was immediately followed by a promoted tweet for Texas-based Scottish Rite Children's Hospital. Scottish Rite did not return multiple requests for comment.

In a statement, Twitter spokesperson Celeste Carswell said the company “has zero tolerance for child sexual exploitation” and is investing more in resources dedicated to child safety, including hiring for new positions to write policy and implement solutions.

She added that Twitter is working closely with its advertising clients and partners to investigate and take steps to prevent the situation from happening again.

Twitter’s challenges in identifying child abuse content were first reported in an investigation by tech news site The Verge in late August. The emerging pushback from advertisers that are critical to Twitter’s revenue stream is reported here by Reuters for the first time.

Like all social media platforms, Twitter bans depictions of child sexual exploitation, which are illegal in most countries. But it permits adult content generally and is home to a thriving exchange of pornographic imagery, which comprises about 13 percent of all content on Twitter, according to an internal company document seen by Reuters.

Twitter declined to comment on the volume of adult content on the platform.

Ghost Data identified more than 500 accounts that openly shared or requested child sexual abuse material over a 20-day period this month. Twitter failed to remove more than 70 percent of the accounts during the study period, according to the group, which shared the findings exclusively with Reuters.

Reuters could not independently confirm the accuracy of Ghost Data’s findings in full, but reviewed dozens of accounts that remained online and were soliciting materials for “13+” and “young looking nudes.”

After Reuters shared a sample of 20 accounts with Twitter last Thursday, the company removed about 300 additional accounts from the network, but more than 100 others still remained on the site the following day, according to Ghost Data and a Reuters review.

Reuters then shared on Monday the full list of more than 500 accounts furnished by Ghost Data; Twitter reviewed the accounts and permanently suspended them for violating its rules, Carswell said on Tuesday.

In an email to advertisers on Wednesday morning, ahead of the publication of this story, Twitter said it “discovered that ads were running within Profiles that were involved with publicly selling or soliciting child sexual abuse material.”

Andrea Stroppa, the founder of Ghost Data, said the study was an attempt to assess Twitter’s ability to remove the material. He said he personally funded the research after receiving a tip about the topic.

Twitter’s transparency reports on its website show it suspended more than 1 million accounts last year for child sexual exploitation.

It made about 87,000 reports to the National Center for Missing and Exploited Children, a government-funded non-profit that facilitates information sharing with law enforcement, according to that organization's annual report.

“Twitter needs to fix this problem ASAP, and until they do, we are going to cease any further paid activity on Twitter,” said a spokesperson for Forbes.

“There is no place for this type of content online,” a spokesperson for carmaker Mazda USA said in a statement to Reuters, adding that in response, the company is now prohibiting its ads from appearing on Twitter profile pages.

A Disney spokesperson called the content “reprehensible” and said they are “doubling-down on our efforts to ensure that the digital platforms on which we advertise, and the media buyers we use, strengthen their efforts to prevent such errors from recurring.”

A spokesperson for Coca-Cola, which had a promoted tweet appear on an account tracked by the researchers, said it did not condone the material being associated with its brand and said “any breach of these standards is unacceptable and taken very seriously.”

NBCUniversal said it has asked Twitter to remove the ads associated with the inappropriate content.

CODE WORDS

Twitter is hardly alone in grappling with moderation failures related to child safety online. Child welfare advocates say the number of known child sexual abuse images has soared from thousands to tens of millions in recent years, as predators have used social networks including Meta’s Facebook and Instagram to groom victims and exchange explicit images.

Nearly all of the traders of child sexual abuse material identified by Ghost Data marketed the materials on Twitter, then instructed buyers to reach them on messaging services such as Discord and Telegram to complete payment and receive the files, which were stored on cloud storage services like New Zealand-based Mega and US-based Dropbox, according to the group’s report.

A Discord spokesperson said the company had banned one server and one user for violating its rules against sharing links or content that sexualize children.

Mega said a link referenced in the Ghost Data report was created in early August and soon after deleted by the user, which it declined to identify. Mega said it permanently closed the user's account two days later.

Dropbox and Telegram said they use a variety of tools to moderate content but did not provide additional detail on how they would respond to the report.

Still, the reaction from advertisers poses a risk to Twitter’s business, which earns more than 90 percent of its revenue by selling digital advertising placements to brands seeking to market products to the service's 237 million daily active users.

Twitter is also battling Tesla CEO and billionaire Elon Musk in court, as he attempts to back out of a $44 billion deal to buy the social media company over complaints about the prevalence of spam accounts and their impact on the business.

A team of Twitter employees concluded in a report dated February 2021 that the company needed more investment to identify and remove child exploitation material at scale, noting the company had a backlog of cases to review for possible reporting to law enforcement.

“While the amount of (child sexual exploitation content) has grown exponentially, Twitter’s investment in technologies to detect and manage the growth has not,” according to the report, which was prepared by an internal team to provide an overview about the state of child exploitation material on Twitter and receive legal advice on the proposed strategies.

“Recent reports about Twitter provide an outdated, moment in time glance at just one aspect of our work in this space, and is not an accurate reflection of where we are today,” Carswell said.

The traffickers often use code words such as “cp” for child pornography and are “intentionally as vague as possible,” to avoid detection, according to the internal documents.

The more that Twitter cracks down on certain keywords, the more that users are nudged to use obfuscated text, which “tend to be harder for (Twitter) to automate against,” the documents said.

Ghost Data’s Stroppa said that such tricks would complicate efforts to hunt down the materials, but noted that his small team of five researchers, with no access to Twitter’s internal resources, was able to find hundreds of accounts within 20 days.

Twitter did not respond to a request for further comment.


MenaML hosts 2026 Winter School in Saudi Arabia to boost AI education, collaboration in region


  • Second edition of Winter School will be hosted in partnership with KAUST

DUBAI: The Middle East and North Africa Machine Learning Winter School will host its second edition in Saudi Arabia this year, in partnership with the King Abdullah University of Science and Technology.

The non-profit held its inaugural edition in Doha last year in partnership with the Qatar Computing Research Institute.

The initiative began when like-minded individuals from Google DeepMind and QCRI came together to launch a platform connecting a “community of top-tier AI practitioners with a shared interest in shaping the future of the MENA region,” Sami Alabed, a research scientist at Google DeepMind and one of the co-founders of MenaML, told Arab News.

Along with Alabed, the core team includes Maria Abi Raad and Amal Rannen-Triki from Google DeepMind, as well as Safa Messaoud and Yazan Boshmaf from QCRI.

Messaoud said that the school has three goals: building local talent in artificial intelligence, enhancing employability and connection, and reversing brain drain while fostering regional opportunity.

AI has dominated boardrooms and courtrooms alike globally, but “AI research and education in MENA are currently in a nascent, yet booming, stage,” she added.

Launched at a pivotal moment for the region, the initiative was timed to ensure “regional representation in the global AI story while cultivating AI models that are culturally aligned,” said Rannen-Triki.

The school’s vision is to cultivate researchers capable of developing “sophisticated, culturally aligned AI models” that reflect the region’s values and linguistic and cultural diversity, said Messaoud.

This approach, she added, enables the region to contribute meaningfully to the global AI ecosystem while ensuring that AI technologies remain locally relevant and ethically grounded.

MenaML aims to host its annual program in a different city each year, partnering with reputable institutions in each host location.

“Innovation does not happen in silos; breakthroughs are born from collaboration that extends beyond borders and lab lines,” said Alabed.

“Bringing together frontier labs to share their knowledge echoes this message, where each partner brings a unique viewpoint,” he added.

This year, MenaML has partnered with KAUST, which “offers deep dives into specialized areas critical to the region, blending collaborative spaces with self-learning and placement programs,” said Abi Raad.

The program, developed in partnership with KAUST, brings together speakers from 16 institutions and focuses on four key areas: AI and society, AI and sciences, AI development, and regional initiatives.

“These themes align with the scientific priorities and research excellence pillars of KAUST as well as the needs of regional industries seeking to deploy AI safely and effectively,” said Bernard Ghanem, professor of electrical and computer engineering and computer science at KAUST and director of the Center of Excellence in Generative AI.

The program will also highlight efficiency in AI systems, with the overall goal of equipping “participants with the conceptual and practical understanding needed to contribute meaningfully to next-generation AI research and development,” he told Arab News.

For KAUST, hosting the MenaML Winter School aligns with Saudi Arabia’s ambition to become a global hub for AI research under Vision 2030.

By attracting top researchers, industry partners, and young talent, it helps cement the Kingdom’s position as a center for AI excellence, Ghanem said.

It also aligns closely with Vision 2030’s “goals of building human capital, fostering innovation, and developing a knowledge-based economy” and “contributes to the long-term development of a world-leading AI ecosystem in Saudi Arabia,” he added.

Although the program accepts students from around the world, participants must demonstrate a connection to the MENA region, Abi Raad said.

The goal is to build bridges between those who may have left the region and those who remain, enabling them to start conversations and collaborate, she added.

A certain percentage of spots is reserved for participants from the host country, while a small percentage is allocated to fully international students with no regional ties, with the objective of offering them a glimpse into the regional AI ecosystem.

Looking ahead, MenaML envisions growing from an annual event into a sustainable, central pillar of the regional AI ecosystem, inspired by the growth trajectory of global movements like TED or the Deep Learning Indaba, a sister organization supporting AI research and education in Africa.

Boshmaf said MenaML’s long-term ambition is to evolve beyond its flagship event into a broader movement, anchored by local MenaMLx chapters across the region.

Over time, the initiative aims to play a central role in strengthening the regional AI ecosystem by working with governments and the private sector to support workforce development, AI governance and safety education, and collaborative research, while raising the region’s global visibility through its talent network and international partnerships.

He added: “If TED is the global stage for ‘ideas worth spreading,’ MenaML is to be the regional stage for ‘AI ideas worth building.’”

The MenaML Winter School will run from Jan. 24 to 29 at KAUST in Saudi Arabia.