Australia should force Meta to pay for news, News Corp. executive says

Updated 05 June 2024

  • Meta said in March it would stop paying Australian news publishers for content
  • Australian government now considering whether to apply a 2021 law that would force it to do so

CANBERRA: Australia should force Meta Platforms to pay news companies for content that appears on Facebook and impose broader regulation on social media firms, a senior News Corp. executive said.
Meta said in March it would stop paying Australian news publishers for content. The government is now considering whether to apply a 2021 law that would force it to do so.
“Meta must be designated under the Media Bargaining Code and challenged to negotiate in good faith,” News Corp. Australia executive chairman Michael Miller said in a speech in Canberra, using the jargon of the 2021 legislation.
“We had a deal — and they walked away. I believe they have an obligation to renew the agreements, and honor our laws,” he said.
“We can’t let ourselves be bullied.”
Asked for comment, Meta referred Reuters to a company blog post that said interest in news was declining on its platforms and cast those platforms as free distribution channels that media companies could use to expand their audiences.
Publishers argue that Facebook and other Internet giants profit unfairly from advertising revenue when links to news articles appear on their platforms.
Meta struck payment deals with Australian media firms in 2021, most of which lapse this year.
If the government tries to enforce the 2021 law, Meta could block users from reposting news articles, as it did briefly in Australia in 2021 and has done in Canada since 2023. Canada has similar laws, and academics there have noted an increased spread of misinformation as a result.
Meta has been reducing its promotion of news and political content to drive traffic and has said it will discontinue a tab on Facebook promoting news in Australia.
In his speech, Miller also decried the impact of social media on mental health and its amplification of scams and social ills such as misogyny.
He proposed a regulatory framework for tech firms such as Meta, TikTok and X, formerly known as Twitter, that he said would protect Australians.
This would include making companies liable for all content on their platforms, competition laws for digital advertising, better handling of consumer complaints and donations to mental health programs.
Companies that do not abide by these rules should be barred from the Australian market, he said.
A spokesperson for Meta said: “The suggestion that Meta doesn’t respect Australian laws or community standards is preposterous.”
The company has restricted access to content in line with Australian laws, worked with law enforcement to prevent real-world harm and trained thousands of young Australians in online safety, the spokesperson said.


Disinformation the new enemy in disaster zones, says Red Cross

Updated 05 March 2026

  • “Harmful information and dehumanizing narratives” undermine humanitarian aid and put the lives of aid workers at risk
  • Between 2020 and 2024, disasters affected nearly 700 million people, displaced over 105 million, and killed more than 270,000 — doubling the number in need of humanitarian aid

GENEVA: The rise of disinformation is undermining humanitarian aid and putting lives at risk, while disasters are affecting ever more people, the Red Cross warned Thursday.
“Between 2020 and 2024, disasters affected nearly 700 million people, caused more than 105 million displacements, and claimed over 270,000 lives,” the International Federation of Red Cross and Red Crescent Societies said.
The number of people needing humanitarian assistance more than doubled in the same timeframe, the IFRC said in its World Disasters Report 2026.
But the world’s largest humanitarian network said that “harmful information and dehumanizing narratives” were increasingly undermining trust, putting the lives of aid workers at risk.
“In polarized and politically charged contexts, humanitarian principles such as neutrality and impartiality are increasingly misunderstood, misrepresented or deliberately attacked online,” it said.
The IFRC has more than 17 million volunteers in 191 countries.
“In every crisis I have witnessed, information is as essential as food, water and shelter,” said the Geneva-based federation’s secretary general Jagan Chapagain.
“But when information is false, misleading or deliberately manipulated, it can deepen fear, obstruct humanitarian access and cost lives.”
He said harmful information was not a new phenomenon, but it was now moving “with unprecedented speed and reach.”
Chapagain said digital platforms were proving “fertile ground for lies.”
The IFRC report said the challenge nowadays was no longer about the availability of information but its reliability, noting that the production and spread of disinformation was easily amplified by artificial intelligence.

- ‘Life and death’ -

The report cited numerous recent examples of harmful information hampering crisis response.
During the 2024 floods in Valencia, false narratives online accused the Spanish Red Cross of diverting aid to migrants, which in turn fueled “xenophobic attacks on volunteers,” the IFRC said.
In South Sudan, rumors that humanitarian agencies were distributing poisoned food “caused people to avoid life-saving aid” and led to threats against Red Cross staff.
In Lebanon, false claims that volunteers were spreading Covid-19, favoring certain groups with aid and providing unsafe cholera vaccines eroded trust and endangered vulnerable communities, the IFRC said.
And in Bangladesh, during political unrest, volunteers faced “widespread accusations of inaction and political alignment,” leading to harassment and reputational damage, it added.
Similar events were registered by the IFRC in Sudan, Myanmar, Peru, the United States, New Zealand, Canada, Kenya and Bulgaria.
The report underlined that around 94 percent of disasters were handled by national authorities and local communities, without international intervention.
“However, while volunteers, local leaders and community media are often the most trusted messengers, they operate in increasingly hostile and polarized information environments,” the IFRC said.
The federation called on governments, tech firms, humanitarian agencies and local actors to recognize that reliable information “is a matter of life and death.”
“Without trust, people are less likely to prepare, seek help or follow life-saving guidance; with it, communities act together, absorb shocks and recover more effectively,” said Chapagain.
The organization urged technology platforms to prioritize authoritative information from trusted sources in crisis contexts, and transparently moderate harmful content.
And it said humanitarian agencies needed to make preparing to deal with disinformation “a core function” of their operations, with trained teams and analytics.