Meta’s Top Executive Admits to Removing Too Much Content Across Apps
Nick Clegg, Meta’s president of global affairs, told reporters on Monday that the company’s moderation “error rates are still too high” and pledged to “improve the precision and accuracy with which we act on our rules.”
Moderation Failures and Error Rates
“We know that when enforcing our policies, our error rates are still too high, which gets in the way of the free expression that we set out to enable,” Clegg said during a press call. “Too often, harmless content gets taken down, or restricted, and too many people get penalized unfairly.”
Regrets Over Aggressive Removals
Clegg said the company regrets aggressively removing posts about the COVID-19 pandemic. CEO Mark Zuckerberg recently told the Republican-led House Judiciary Committee that the decision was influenced by pressure from the Biden administration.
Over-Enforcement and Mistakes
“We had very stringent rules removing very large volumes of content through the pandemic,” Clegg said. “No one during the pandemic knew how the pandemic was going to unfold, so this really is wisdom in hindsight. But with that hindsight, we feel that we overdid it a bit. We’re acutely aware because users quite rightly raised their voice and complained that we sometimes over-enforce and we make mistakes and we remove or restrict innocuous or innocent content.”
Examples of Moderation Failures
Clegg’s comments suggest that, after years of ramping up to what is now billions of dollars in annual spend on moderation, Meta’s automated systems have become too ham-fisted. Examples of “moderation failures” were recently trending on Threads, which has been plagued with takedown errors in recent months. The company publicly apologized after its systems suppressed photos of President-elect Donald Trump surviving an attempted assassination. And its own Oversight Board recently warned ahead of the US presidential election that its moderation errors risk the “excessive removal of political speech.”
Changes to Content Rules
Meta has not announced any major changes to its content rules since the election, though big changes appear to be in the works. Clegg referred to the rules as “a sort of living, breathing document” during the call with reporters.
Clegg’s Comments on Zuckerberg’s Dinner with Trump
“I can’t give you a running commentary on conversations I was not part of,” he said of Zuckerberg’s recent dinner with Trump. “The administration is still being assembled and the inauguration has not happened, so the conversations at this stage are clearly fairly high level. Mark is very keen to play an active role in the debates that any administration needs to have about maintaining America’s leadership in the technological sphere, which, of course, is tremendously important given all the geostrategic uncertainties around the world and particularly the pivotal role that AI will play in that scenario.”
Conclusion
Meta’s top executive has acknowledged that the company’s moderation has been too aggressive, sweeping up too much harmless content across its apps. The company has pledged to improve the precision and accuracy of its enforcement, but it has not yet said what specific changes it will make.
FAQs
Q: What are Meta’s error rates for moderation?
A: According to Nick Clegg, Meta’s president of global affairs, the company’s error rates for moderation are still too high.
Q: What has Meta done to address moderation failures?
A: Meta has apologized for its mistakes and is pledging to improve the precision and accuracy of its moderation efforts.
Q: Will Meta make changes to its content rules?
A: It is unclear what specific changes Meta will make to its content rules, but the company’s president of global affairs referred to the rules as “a sort of living, breathing document” and suggested that big changes could be coming.
Q: What is the significance of Meta’s moderation efforts?
A: Meta’s moderation decisions affect billions of users, and errors cut both ways: over-enforcement suppresses harmless speech, as Clegg acknowledged, while the company’s Oversight Board has warned that such mistakes risk the “excessive removal of political speech.”