Christchurch Call CEO Paul Ash responds to Meta’s decision to ditch fact-checkers, among other changes that come just ahead of Trump’s return, and to the recent activity of Elon Musk.
One of the most resounding of New Year resolutions this month came from Mark Zuckerberg, CEO and chair of the board at social media behemoth Meta. Just in time for the return of Donald Trump to the White House, Zuckerberg – who along with a procession of tech leaders has paid personal homage to the president-elect at Mar-a-Lago – announced the dismantling of fact-checking teams across Facebook and other services, in favour of user-generated “community notes”.
That wasn’t the only announcement sure to have made music in Trump’s ears. Diversity programmes at the social media giant have also been disestablished. Nonbinary and transgender flag options have been wiped from the Messenger app. The “hateful conduct” policy has been tweaked so that posts calling gay people mentally ill are now acceptable. Zuckerberg took aim at his own fact-checking team, calling them biased, and fired shots at the Biden presidency and the mainstream media. He also went on the Joe Rogan podcast, where he declared corporate America had become “culturally neutered” and needed more “masculine energy”.
Personnel have changed, too. Nick Clegg, a former UK deputy prime minister, has been replaced as head of Meta global affairs by Joel Kaplan, a “Trumpian bulldog”. UFC boss and close Trump ally Dana White has been appointed to the board. Zuckerberg is, according to US reports, set to be seated prominently at the inauguration ceremony alongside a pair of tech owners even richer than he is: Jeff Bezos and Elon Musk.
While many have cheered the grand Zuckerberg pivot, others have raised the alarm, suggesting, for example, that Facebook’s record in Myanmar – where the platform was implicated in amplifying hatred against the Rohingya – shows the decision to end existing fact-checking processes “could have disastrous consequences”. The head of advocacy group Accountable Tech, Nicole Gill, condemned the move as “a gift to Donald Trump and extremists around the world.” She added: “Zuckerberg is re-opening the floodgates to the exact same surge of hate, disinformation, and conspiracy theories that caused January 6th – and that continue to spur real-world violence.” A spokesperson for the group Free Press said the Meta boss was “saying yes to more lies, yes to more harassment, yes to more hate” as part of a “chilling new era of big tech capitulation”.
The Christchurch Call and Meta
I was curious to know whether those concerns were shared by the Christchurch Call, the initiative that emerged after the Christchurch terrorist attack of 2019, when Jacinda Ardern and Emmanuel Macron brought together governments, tech companies and civic groups to tackle online manifestations of terrorism and violent extremism. Paul Ash is CEO of the Christchurch Call, which has operated as a charitable foundation with a secretariat based in New Zealand – with Ardern now patron – since it was untethered from the government in mid-2024. Ash said he’d been watching the changes at Meta with interest, but was confident that the company remained committed to the core mission of eliminating terrorist and violent extremist content.
Ash, who took time out of his summer break to speak to The Spinoff, said it was “assuring that Meta has said it will continue to monitor and act on terrorist content as it has been doing. It has not changed its Dangerous Organisations and Individuals policy and that’s the core of policy that underpins the work that the Christchurch Call does – dealing with terrorist and violent extremist content.”
Ash was confident that Meta, together with other members of the Christchurch Call “multistakeholder community”, remained committed to the goal of eliminating terrorist and violent extremist content. “We haven’t seen explicit signs that tech firms are moving away from that,” he said. “I think the community as a whole would be extremely concerned if they saw signs that social media firms who participate in the Call were looking to make significant changes in areas that impact on the commitments relating to terrorist and violent extremist content.”
The principles of the Christchurch Call, Ash said, remained relevant in this period of turmoil. “If anything, in the face of changes in technology, and in particular, in AI, and changes in the global political environment, it’s probably more needed than ever.”
Ash said: “The Christchurch Call community as a whole – and it is a large global community, of 56 countries, 19 tech firms and more than 50 civil society organisations – is paying really close attention to this pattern, both by social media companies and, in fact, governments as they seek to regulate in this area, which is one of the most complex areas to regulate that I think you could imagine.” There was an ongoing commitment, he said, “to check that as they do that, both the company steps and the governmental steps align with the Christchurch Call commitments around making sure that terrorist and violent extremist content isn’t available online and doing so in a way that doesn’t constrain public discourse and freedom of expression.”
As to whether the decisions taken at Meta might expose people to more disinformation and dangerous extremist ideas, and potentially embolden would-be terrorist actors, Ash said: “I think there’s a distinction to be made between extreme views and extreme views that advocate violence or dehumanising behaviour. I think any platform that lifts a set of controls can expect to see some more extreme views popping up. That’s human nature, that’s how people behave. Where that starts to go towards dehumanising behaviour and advocacy of either violence or discrimination, then the set of tools required to respond to that start to kick in, whether that’s community moderation, whether it’s company moderation, whether it’s user moderation using ‘middleware’ – all of those are options.”
He added: “I do think you can expect – and Mark Zuckerberg himself said this – to see a range of content that is more controversial or potentially more toxic in terms of discourse. And in that sense, it wouldn’t surprise me to see that giving encouragement to extremists as they look to test the boundaries of the new policy settings at Facebook and Meta products more broadly. Is that a concern for the Christchurch Call? It is if it leads to radicalisation towards violence and towards terrorist content or activity. And that’s a discussion that the community will have – it’s one that it’s been having for the five-and-a-half years since the Christchurch Call was established. It’s right at the core of the biggest challenges it’s dealing with around TVEC [terrorist and violent extremist content]: working out where those lines are between violent extremism and radicalisation towards it. Changes to content moderation policies or fact checking inevitably give rise to discussion around that.”
The Musk factor
Elon Musk, of course, was aboard the Trump train long before the Meta boss, bringing funds and visibility, not to mention the force of X, the social network he purchased in 2022, when it was still Twitter. Never one to shy from publicity, Musk wields front-of-stage and behind-the-scenes influence such that some commentators have suggested he has more power than vice-president-elect JD Vance, while at least one political opponent has cast him as Trump’s “co-president”.
For all Musk’s commitments to government and across several multibillion-dollar businesses, he finds plenty of time to tweet. In recent months on X he has shared his unvarnished opinions well beyond the US border, spanning Canada, Denmark and Germany, where he is an enthusiastic cheerleader for the far-right AfD. He has become especially vociferous on British domestic politics, issuing a flurry of tweets peddling falsehoods over investigations into grooming gangs and accusing the prime minister, Keir Starmer, of having been “complicit in the rape of Britain”. Britain’s independent terrorism and state threat watchdog has suggested Musk could be “involved in the crime of foreign interference under the National Security Act”. In Australia, prime minister Anthony Albanese has beseeched Musk not to interfere in this year’s election. X has previously clashed with Australian authorities over requests to remove footage of an alleged terrorist attack.
Back in 2022, when Musk purchased what was then Twitter, Jacinda Ardern said it heralded “unknown territory” as far as the site was concerned. “Social media and platforms like Twitter have a huge responsibility,” the then prime minister told The Spinoff. “They can be a force for democracy, a force for connection and for good. But also if misused they can do a huge amount of harm.”
On the subject of X – another Christchurch Call member – Ash again chooses his words carefully. “A platform owner or any individual shareholder in a company will inevitably have their own political views … When they spill over into how the platform is conducted, that’s, I guess, an issue [similar to those] over many generations, the development of mores around the consolidation of media and the ability of media moguls, for instance, to influence political discourse. That’s a matter for Mr Musk and for the environments and governments in which he chooses to operate.”
As far as the Christchurch Call is concerned, he said, “the issue is whether there’s a proliferation of violent extremist and terrorist content on the platform. And we have seen with the reduction in the number of moderation staff, and some of the moderation capabilities at X, particularly through that initial period of ownership, a little bit of increased presence of the Christchurch terrorist video, for instance. When that has happened, we have engaged directly with X, and we’ve found them responsive in getting rid of that content when it occurs. We’re still in regular and active contact as needed when that video crops up.” The use by X of community moderation – the model for the Meta switch – had proved “reasonably successful”, he said. “I think community notes or community moderation per se is not necessarily problematic, as long as the expectation is not that the community becomes a large unpaid workforce for the social media firm.”
“But I do think there’s a need to try and distinguish between the political views of an owner and the question of TVEC on the platform. And there’s a bigger question there, I guess, for democracies to look at as they think about media consolidation and political activity.”
How the platforms have changed
Ash said he was confident that recent trajectories did not portend backsliding by Christchurch Call signatories. “I would hope that those core commitments remain intact and that we will continue to see them being respected. I think any serious platform owner that saw the sort of live-streaming from Christchurch or similar activity on their platform today would be really concerned about that happening … There have been many, many attempts since Christchurch to live stream [terrorist acts]. One of the least pleasant strands of activity online is those that look at what happened in Christchurch and seek to emulate or better it,” he said.
“And for the most part, those have been successfully dealt with in the time since. We’ve never seen anything like Christchurch since, and there have been many, many efforts to have a go at doing so. So I think we can take real heart from that as a positive outcome. And I think the fact that we’ve still got continued engagement in the work of the Christchurch Call from a broad swath of industry, governments and civil society is indicative of both the relevance of the work and the importance of continuing it. It’s always a bit complicated and messy doing things in that multi-stakeholder way. On the one hand, companies could unilaterally set their terms and conditions, or governments could unilaterally regulate. The experience has been that where that happens, it’s not as good an outcome, generally, either for freedom of expression or for dealing with the content, as when it’s done with a range of multi-stakeholder participants feeding in their perspectives and looking for best practice.”
One of the “interesting and a little unusual” characteristics of the Christchurch Call, Ash said, was that “it not only sets out commitments for tech firms, but also for governments. That reflects the fact that the challenges of terrorist and violent extremist content in the online environment aren’t just online challenges. They often come from societal conditions or from a range of other factors. And that is something that I think the Christchurch Call community has turned its mind to increasingly over the last three to four years.” That remained one of the most pressing issues participants confronted: “the centrality of technology to people’s lives these days and the fact that conditions offline also have a real input into how people operate online. The broader trend indicates a need to really keep a focus on that holistic approach to dealing with some of these challenges.”