The EU tells Twitter to hire more human content moderators amid concerns over a rise in illegal content
The European Union and Twitter boss Elon Musk are currently butting heads over the social media giant’s content moderation policies.
According to the Financial Times, EU regulators have asked Musk to hire more people to fact-check and review illegal content and disinformation.
All major social media platforms will have to comply with the EU’s Digital Services Act expected to come into full effect in 2024. It’s going to force platforms to put in place measures to fight illegal content or face sweeping fines.
But there are concerns over whether Twitter will be able to comply with these new and strict rules after Elon Musk fired more than half of Twitter’s staff, including entire content moderation teams.
Melissa Ingle, a data scientist on Twitter’s civic integrity team, was one of them. Before she was laid off, her job was to moderate the platform for political misinformation.
In November, a notification popped up on her phone, saying she had been locked out of her work emails. “That’s how I knew I was fired. I wasn’t directly told. I was just locked out of the system,” she told Euronews.
She says that out of her team of 30 people, only eight remain at the San Francisco office. “They’re under a lot of pressure and they talk about working long hours with not a lot of work-life balance,” Ingle claimed.
Twitter’s new owner says he wants to use artificial intelligence and volunteers to police content, instead of more costly human moderators.
Spike in hate speech after Twitter layoffs
Like other critics, Melissa Ingle believes human moderators are essential to intercepting disinformation and hateful content.
“You really need the data scientists because there’s 30 million tweets every hour or 500 million a day. You need humans because as good as I would love to say our algorithms are, there’s still a lot that gets through,” she explained.
Since the mass layoffs, many analysts have noticed a spike in hate speech and disinformation on the platform.
“One of the biggest sources of misinformation were state governments trying to push through political propaganda. So we’re seeing more of that as well as all the hate speech,” she said.
“Number two: we’re seeing an increase in site outages. We’ve seen the site go down. So I’m very concerned with these rises in hate speech and these website problems”.
Other concerns have been raised that Twitter, unlike Wikipedia, does not have a large pool of volunteer moderators, and that it has a poor record of policing content that is not in English.
“The lack of human moderators, insufficient training in human rights, and also their content moderation systems that are being predominantly trained for English language or more Western speaking audience in contrast to minority languages or languages of the global South. That’s the prevalent issue that Twitter and other platforms face,” said Elishka Pirkova, a policy analyst at digital rights group Access Now.
What’s next?
The largest social media platforms will have until September 1, 2023, to comply with the EU’s Digital Services Act or face a penalty of up to 6 per cent of their yearly revenue.