Photo by Dima Solomin on Unsplash
Meta’s company-funded oversight body is planning to trim its workforce, a downsizing effort that could affect the board’s ability to police the world’s largest social media network.
Meta’s Oversight Board, an independent collection of academics, experts and lawyers who oversee the social media giant’s thorny content decisions, told some employees last week that their jobs were at risk of being cut, according to people familiar with the matter who spoke on the condition of anonymity to discuss internal matters.
The job eliminations will probably add to the challenges confronting the Oversight Board, which has drawn criticism for moving too slowly and cautiously in issuing the decisions and opinions that significantly shape how Meta handles contentious free speech debates. The cuts also could undermine the Oversight Board’s longtime quest to be seen by regulators, civil society groups and the general public as a viable model of governance for the social media industry.
Stephen Neal, the chairperson of the Oversight Board Trust, said in a statement that the “targeted cuts” will allow the Oversight Board “to further optimize our operations by prioritizing the most impactful aspects of our work.”
“The Board continues to be one of the leading content moderation organizations in the world and Meta remains committed to its ongoing success and the Board is fully confident that the company will provide additional funding in the future,” Neal added.
Meta spokesman Andy Stone said in a statement that the company “remains committed to the Oversight Board” and “continues to strongly support its work.”
“We value the Board’s perspective, have implemented all their binding content decisions to date and will continue to update our policies and practices in response to their feedback,” Stone added.
Meta conceived the Oversight Board as an experiment in 2019 as regulators around the world were attempting to craft uniform rules governing social media platforms. Meta CEO Mark Zuckerberg had argued that his company should not be in the position of making “so many important decisions” about how internet platforms should balance their desire to support free expression with their desire to protect users from harmful content.