Meta’s Oversight Board Expands Its Role: Regulating Threads’ Content Moderation
Meta, the social media giant, has announced an expansion of its Oversight Board's role. The independent group, which has long weighed in on content moderation decisions for Facebook and Instagram, will now have the power to influence policies for Meta's newest app, Threads.
The Oversight Board’s expansion comes at a time when Threads, a Twitter-like app launched last summer, has seen significant growth. With over 130 million users and Mark Zuckerberg’s prediction of potential future growth to a billion users, the app’s influence is undeniable.
The addition of Threads’ content moderation to the board’s scope is a significant development. According to the Oversight Board, user appeals on Threads will function similarly to how they do on Instagram and Facebook. When users have exhausted Meta’s internal process, they will be able to request a review from the Oversight Board. The board’s decisions regarding specific posts are binding on Meta, but the company is not obligated to adhere to the board’s policy recommendations.
The expansion of the Oversight Board’s role underscores the growing influence of Threads and the importance of independent accountability for a new app. Helle Thorning-Schmidt, co-chair of the Oversight Board, stated, “Having independent accountability early on for a new app such as Threads is vitally important.”
Meta has already faced pushback from users over its content recommendation policies on Threads. The app currently blocks search terms related to COVID-19 and other potentially sensitive topics, and Meta's decision not to recommend accounts that post heavily about politics unless users opt in to such suggestions has also raised eyebrows.
Regardless of whether the board ends up weighing in on these specific choices, Threads users will likely see some changes as a result of the board's recommendations. However, it may take time for those changes to arrive. The Oversight Board accepts only a tiny fraction of user appeals, and it can take weeks or months for the group to reach a decision. Meta must then respond to the board's policy guidance and implement any recommendations it chooses to adopt, which can take additional months.
In expanding the Oversight Board's remit to cover Threads, Meta adds an independent layer of accountability and transparency for the app's users and a meaningful check on its content moderation decisions. As Threads continues to grow toward the scale Zuckerberg envisions, that external oversight will only become more important.