Thousands of fake Facebook accounts shut down by Meta were primed to polarize voters ahead of 2024

Attendees visit the Meta booth at the Game Developers Conference 2023 in San Francisco on March 22, 2023. (AP/File)
Updated 30 November 2023


  • The network of nearly 4,800 fake accounts hints at serious threats posed by online disinformation 
  • National elections will occur in the US, Pakistan, India, Ukraine, Taiwan and other nations next year 

WASHINGTON: Someone in China created thousands of fake social media accounts designed to appear to be from Americans and used them to spread polarizing political content in an apparent effort to divide the US ahead of next year’s elections, Meta said Thursday. 

The network of nearly 4,800 fake accounts was attempting to build an audience when it was identified and eliminated by the tech company, which owns Facebook and Instagram. The accounts sported fake photos, names and locations as a way to appear like everyday American Facebook users weighing in on political issues. 

Instead of spreading fake content as other networks have done, the accounts were used to reshare posts from X, the platform formerly known as Twitter, that were created by politicians, news outlets and others. The interconnected accounts pulled content from both liberal and conservative sources, an indication that the network's goal was not to support one side or the other but to exaggerate partisan divisions and further inflame polarization. 

The newly identified network shows how America’s foreign adversaries exploit US-based tech platforms to sow discord and distrust, and it hints at the serious threats posed by online disinformation next year, when national elections will occur in the US, India, Mexico, Ukraine, Pakistan, Taiwan and other nations. 

“These networks still struggle to build audiences, but they’re a warning,” said Ben Nimmo, who leads investigations into inauthentic behavior on Meta’s platforms. “Foreign threat actors are attempting to reach people across the Internet ahead of next year’s elections, and we need to remain alert.” 

Meta Platforms Inc., based in Menlo Park, California, did not publicly link the network to the Chinese government, but it did determine the network originated in that country. The content spread by the accounts broadly complements other Chinese government propaganda and disinformation that has sought to inflate partisan and ideological divisions within the US. 

To appear more like normal Facebook accounts, the network would sometimes post about fashion or pets. Earlier this year, some of the accounts abruptly replaced their American-sounding user names and profile pictures with new ones suggesting they lived in India. The accounts then began spreading pro-Chinese content about Tibet and India, reflecting how fake networks can be redirected to focus on new targets. 

Meta often points to its efforts to shut down fake social media networks as evidence of its commitment to protecting election integrity and democracy. But critics say the platform’s focus on fake accounts distracts from its failure to address its responsibility for the misinformation already on its site that has contributed to polarization and distrust. 

For instance, Meta will accept paid advertisements on its site claiming the 2020 US election was rigged or stolen, amplifying the lies of former President Donald Trump and other Republicans whose claims about election irregularities have been repeatedly debunked. Federal and state election officials and Trump’s own attorney general have said there is no credible evidence that the presidential election, which Trump lost to Democrat Joe Biden, was tainted. 

When asked about its ad policy, the company said it is focusing on future elections, not ones from the past, and will reject ads that cast unfounded doubt on upcoming contests. 

And while Meta has announced a new artificial intelligence policy that will require political ads to bear a disclaimer if they contain AI-generated content, the company has allowed other altered videos that were created using more conventional programs to remain on its platform, including a digitally edited video of Biden that claims he is a pedophile. 

“This is a company that cannot be taken seriously and that cannot be trusted,” said Zamaan Qureshi, a policy adviser at the Real Facebook Oversight Board, an organization of civil rights leaders and tech experts who have been critical of Meta’s approach to disinformation and hate speech. “Watch what Meta does, not what they say.” 

Meta executives discussed the network’s activities during a conference call with reporters on Wednesday, the day after the tech giant announced its policies for the upcoming election year — most of which were put in place for prior elections. 

But 2024 poses new challenges, according to experts who study the link between social media and disinformation. Not only will many large countries hold national elections, but the emergence of sophisticated AI programs means it’s easier than ever to create lifelike audio and video that could mislead voters. 

“Platforms still are not taking their role in the public sphere seriously,” said Jennifer Stromer-Galley, a Syracuse University professor who studies digital media. 

Stromer-Galley called Meta’s election plans “modest” but noted it stands in stark contrast to the “Wild West” of X. Since buying the X platform, then called Twitter, Elon Musk has eliminated teams focused on content moderation, welcomed back many users previously banned for hate speech and used the site to spread conspiracy theories. 

Democrats and Republicans have called for laws addressing algorithmic recommendations, misinformation, deepfakes and hate speech, but there’s little chance of any significant regulations passing ahead of the 2024 election. That means it will fall to the platforms to voluntarily police themselves. 

Meta’s efforts to protect the election so far are “a horrible preview of what we can expect in 2024,” according to Kyle Morse, deputy executive director of the Tech Oversight Project, a nonprofit that supports new federal regulations for social media. “Congress and the administration need to act now to ensure that Meta, TikTok, Google, X, Rumble and other social media platforms are not actively aiding and abetting foreign and domestic actors who are openly undermining our democracy.” 

Many of the fake accounts identified by Meta this week also had nearly identical accounts on X, where some of them regularly retweeted Musk’s posts. 

Those accounts remain active on X. A message seeking comment from the platform was not returned. 

Meta also released a report Wednesday evaluating the risk that foreign adversaries including Iran, China and Russia would use social media to interfere in elections. The report noted that Russia’s recent disinformation efforts have focused not on the US but on its war against Ukraine, using state media propaganda and misinformation in an effort to undermine support for the invaded nation. 

Nimmo, Meta’s chief investigator, said turning opinion against Ukraine will likely be the focus of any disinformation Russia seeks to inject into America’s political debate ahead of next year’s election. 

“This is important ahead of 2024,” Nimmo said. “As the war continues, we should especially expect to see Russian attempts to target election-related debates and candidates that focus on support for Ukraine.” 


Teenage preacher to alleged mass killer: Bondi attack suspect’s background emerges



SYDNEY/MANILA: Standing in the rain outside a suburban Sydney train station, 17-year-old Naveed Akram stares into the camera and urges those watching to spread the word of Islam.
“Spread the message that Allah is One wherever you can ... whether it be raining, hailing or clear sky,” he said.
Another since-deleted video posted in 2019 by Street Dawah Movement, a Sydney-based Islamic community group, shows him urging two young boys to pray more frequently.
Authorities are now trying to piece together what happened in the intervening six years that led a teenager who volunteered to hand out pamphlets for a non-violent community group to allegedly carry out Australia’s worst mass shooting in decades.
Akram, who remains under heavy guard in hospital after being shot by police, was briefly investigated by Australia’s domestic intelligence agency in 2019 for links to individuals connected to Islamic State, but authorities found he did not have extremist tendencies at the time.
“In the years that followed, that changed,” Home Affairs Minister Tony Burke said on Tuesday.
Police have not formally identified Naveed Akram, 24, as one of the alleged gunmen who killed 15 people at a Jewish event on a Sydney beach on Sunday. His father, Sajid Akram, 50, was the other gunman and was shot and killed by police, local media reported.
Officials have said the second gunman is the deceased man’s son and is in a critical condition in hospital.

MOTIVATED BY DAESH
Prime Minister Anthony Albanese said on Tuesday the attack was likely motivated by the ideology of Daesh, but that the two men appeared to have acted alone.
Homemade Daesh flags were found in the suspects’ car after Sunday’s attack, and police said on Tuesday the pair had last month visited the Philippines, where offshoots of the militant group have a presence.
A spokesperson for the Philippines Bureau of Immigration said Akram, an Australian national, arrived in the country on November 1 with his father, who was traveling on an Indian passport.
Both reported Davao, the main city on Mindanao island, as their final destination. The island has a history of Islamist insurgency: a months-long conflict there in 2017 between armed forces and two militant groups linked to Daesh left over a thousand dead and a million displaced, though the country’s military says these groups are now fragmented and weakened.
The pair left the Philippines on November 28, two weeks before Sunday’s attack, which was carried out with high-powered shotguns and rifles.

‘NEVER DID ANYTHING UNUSUAL’
Local media reported that Akram, an unemployed bricklayer, attended high school in Cabramatta, a suburb around 30 kilometers by road from Sydney’s central business district and close to the family’s current home in Bonnyrigg, which was raided by police after the attacks.
“I could have never imagined in 100 years that this could be his doing,” former classmate Steven Luong told The Daily Mail.
“He was a very nice person. He never did anything unusual. He never even interrupted in class.”
After leaving school, Akram showed a keen interest in Islam, seeking tutoring and attending several Street Dawah Movement events. The group confirmed he appeared in the videos.
“We at Street Dawah Movement are horrified by his actions and we are appalled by his criminal behavior,” the group said in a statement, adding Akram had attended several events in 2019 but was not a member of the organization.
Months after the videos were posted, Akram approached tutor Adam Ismail seeking tuition in Arabic and the Qur'an, studying with him for a combined period of one year.
Ismail’s language institute posted a photo in 2022, since deleted, showing Akram smiling while holding a certificate in Qur'anic recitation.
“Not everyone who recites the Qur'an understands it or lives by its teachings, and sadly, this appears to be the case here,” Ismail said in a video statement late on Monday.
“I condemn this act of violence without hesitation.”

EARLIER TIES TO DAESH NOT PROVEN
Two of the people he was associated with in 2019 were charged and jailed, but Akram was not considered a person of interest at that time, Albanese said.
However he came to be radicalized, Akram’s journey from a teenager interested in Islam to one of Australia’s worst alleged killers has taken not just the public but also law enforcement by surprise.
“We are very much working through the background of both persons,” New South Wales Police Commissioner Mal Lanyon told reporters on Monday.
“At this stage, we know very little about them.”