Tech firms poised to mass hire factcheckers before EU elections

Social media firms including TikTok, X, Facebook and Instagram will be required to put in place an army of factcheckers and moderators with collective knowledge of the EU's 24 official languages, amid fears that the European parliamentary elections will be a prime target for disinformation campaigns run by Russia and others, including the far right.

The new rules flow from the EU's Digital Services Act (DSA), which regulates content on social media, and follow a public consultation with civil society and election observation groups in February.

“The European parliament elections are particularly vulnerable and they need to put special measures in place,” said an EU official unveiling the new processes.

Under the guidelines, tech companies will be required to take six actions, including installing native-language factchecking teams and risk assessment units.

Content generated by AI will have to be labelled as such.

Escalation protocols and rapid alert systems to combat potential spikes in disinformation before and after voting, which takes place between 6 June and 9 June, must also be in place.

These systems will be stress-tested by the EU next month.

Unveiling the new guidelines, officials listed examples of fake news and weaknesses in moderation observed in elections over the past nine months, as the DSA was coming into force.

During the last Dutch election campaign, claims on Facebook incorrectly told voters who couldn't choose between Geert Wilders' far-right party PVV and a rival that "this year" they could tick two boxes instead of one. Factcheckers pointed out that doing so would invalidate the ballot paper.

After the Netherlands went to the polls in November, far-right groups looked to emulate Donald Trump and claimed the general election was “stolen” after hashtags such as #voterfraud and #stopthesteal began to appear online.

In Spain, fake accounts spread claims that bombs had been planted in voting booths on the day of the general election last July.

Platforms are not legally obliged to adopt the new measures, but if they do not have alternative "mitigation systems in place" that are equally "effective" or "better", they can face fines of up to 6% of their revenues, or daily penalties if they still fail to comply with the DSA, which requires them to crack down on disinformation, fake news and harmful content.

The EU says being alert to local disinformation is just as important as guarding against campaigns orchestrated by the Kremlin or any other foreign bad-faith actor.

“We have to be on our toes regardless where the disinformation campaign comes from, whether it’s domestic or whether it’s foreign, given the geopolitical context we are in in the EU, with the two war zones in our neighbourhood,” said a senior EU official.

The European elections will be a big test for social media companies and the effectiveness of the new laws.

"An important kind of power of the DSA will be the transparency that it will provide. That is a big gamechanger compared to where we come from, the self-regulatory world where … we never got to the data of the platforms, and that is a fundamental change brought about by the DSA," said a senior EU official.