MPs to summon Elon Musk to testify about X’s role in UK summer riots

MPs are to summon Elon Musk to testify about X’s role in spreading disinformation, in a parliamentary inquiry into the UK riots and the rise of false and harmful AI content, the Guardian has learned.

Senior executives from TikTok and from Meta, which runs Facebook and Instagram, are also expected to be called for questioning as part of the Commons science and technology select committee's inquiry into social media.

The first hearings will take place in the new year, amid rising concern that UK online safety laws risk being outpaced by rapidly advancing technology and the politicisation of platforms such as X.

The MPs will investigate the consequences of generative AI, which was used to create widely shared images, posted on Facebook and X, inciting people to join Islamophobic protests in August after the killing of three schoolgirls in Southport. They will also investigate Silicon Valley business models that “encourage the spread of content that can mislead and harm”.

“[Musk] has very strong views on multiple aspects of this,” said Chi Onwurah, the Labour chair of the select committee. “I would certainly like the opportunity to cross-examine him to see … how he reconciles his promotion of freedom of expression with his promotion of pure disinformation.”

Musk, the owner of X, fumed when he was not invited to a UK government international investment summit in September. Onwurah told the Guardian: “I’d like to make up for that by inviting him to attend.”

Former Labour minister Peter Mandelson, tipped to become the next UK ambassador to Washington, this week called for an end to the “feud” between Musk and the UK government.

“He is a sort of technological, industrial, commercial phenomenon,” Mandelson told the How to Win an Election podcast. “And it would be unwise, in my view, for Britain to ignore him. You cannot pursue these feuds.”

X did not respond when asked if Musk would testify in the UK, although it appears unlikely. The world’s richest man is preparing to take on a senior role in the Trump White House and has been highly critical of the Labour government, including weighing in on changes to inheritance tax on farms by saying on Monday that “Britain is going full Stalin”. During the riots that followed the Southport killings he said: “Civil war is inevitable.”

The Commons inquiry comes amid fresh turbulence in the social media landscape as millions of X users move to Bluesky, a rival platform, with many migrating in protest at misinformation, the presence of once-banned users such as Tommy Robinson and Andrew Tate, and updated terms of service that allow X to train its AI models on user data.

Keir Starmer said on Tuesday he had “no plans” to join Bluesky personally, or for government departments to open official accounts. The prime minister told reporters at the G20 summit in Brazil: “What’s important for a government is that we’re able to reach as many people and communicate with as many people as possible, and that’s the sole test for any of this as far as I’m concerned.”

After Musk was not invited to the UK government investment summit, he said: “I don’t think anyone should go to the UK when they’re releasing convicted paedophiles in order to imprison people for social media posts.”

One person jailed after the riots was Lucy Connolly, who posted on X: “Mass deportation now, set fire to all the fucking hotels full of the bastards for all I care.” She was convicted under the Public Order Act for publishing material intending to stir up racial hatred. X found the post did not violate its rules against violent threats.

Onwurah said the inquiry would attempt to “get to the bottom of the links between social media algorithms, generative AI, and the spread of harmful or false content”.

It will also look at the use of AI to supplement search engines such as Google, which was found recently to be regurgitating false and racist claims about people in African countries having low average IQs. Google said the AI overviews containing the claims had violated its policies and had been removed.

After the Southport killings on 29 July, misinformation swept through social media, with accounts with more than 100,000 followers falsely naming the alleged attacker as a Muslim asylum seeker.

Ofcom, the UK communications regulator, has already concluded some platforms “were used to spread hatred, provoke violence targeting racial and religious groups, and encourage others to attack and set fire to mosques and asylum accommodation”.

Next month, Ofcom will publish the rules on illegal harms under the Online Safety Act, which are expected to require social media companies to prevent the spread of illegal material and mitigate safety risks, including policing activity that provokes violence or stirs up hatred, and false communications intended to cause harm.

Firms will have to remove illegal material once they are aware of it and address safety risks in the design of their products.
