Francesca Scapolo: “Oversight Board Protects Facebook and Instagram Users’ Rights”

Exploring online dynamics and their impact on offline life forms the foundation of our interview with Francesca Scapolo, Content and Social Media Lead at the Oversight Board. Established by Meta in 2020, this independent body provides oversight and accountability for content moderation decisions on Meta’s platforms, Facebook and Instagram.

In collaboration with CRO, Francesca Scapolo sits down with Silvia Semenzin, CRO’s Head of Research and Advocacy, for a wide-ranging conversation on the ethical and technological challenges that digital platforms face in today’s landscape, and on how those challenges touch vital aspects of our daily lives.

What is the Oversight Board, and when was it born?

The Board started operating in 2020, but it was conceived in 2018, when it was first announced by Meta (then Facebook) CEO Mark Zuckerberg. The growing importance of the online civic space prompted the realisation that a platform of such magnitude should not unilaterally make decisions that profoundly impact people. One of our initial cases involved the suspension of Donald Trump’s accounts. Meta sought the Board’s guidance on whether its decision to suspend Trump’s profiles had been appropriate. We confirmed that disabling the account was justifiable, but found that it could not remain deactivated indefinitely, and recommended that Meta impose a time limit.

In addition to our binding decisions, the Board also issues policy recommendations. While these are non-binding, Meta has followed them in 60% of cases to date. The Board consists of 23 members from diverse backgrounds around the world, including digital rights activists, human rights lawyers, and even a former prime minister of Denmark. We aspire to build a body that embraces diverse viewpoints, ideas, and geographies, thereby reflecting a wide range of perspectives.

Do you believe the Oversight Board can serve as a precedent, inspiring other platforms to establish similar bodies? Or is an external push from institutions necessary?

Platforms such as Spotify and Koo have already established similar entities, although we were the pioneers in this endeavour. Interest in such initiatives is real and increasingly evident. There is an ongoing debate around the idea that social media companies should not make the defining decisions on content moderation on their own.

Facebook and Instagram users were yearning for an independent avenue to challenge Meta’s content moderation decisions, which points to a significant level of demand. Given the enormous power these platforms wield over people’s lives, an independent entity like the Oversight Board marks a significant stride forward in safeguarding user rights and human rights. All our decisions reflect that international human rights standards are an important source of authority for the Board.

What areas have you worked on, and what are the most important results you have achieved?

We have just released our 2022 Annual Report, which highlights our most important achievements and the impact the Board has had on Meta over the past year. Let me give you a few concrete examples: Meta now tells users when their access to content has been restricted due to local law following a government request, a practice that was previously non-existent. Additionally, Meta now informs users whether human or automated review led to their content being removed.

The Board constantly emphasises the importance of fairness, particularly in cases where Facebook and Instagram remove content without explaining which rules have been violated. Users also sometimes do not receive equal treatment and are not given proper recourse for mistakes made by Meta. One of our most significant achievements in this area relates to our “shared Al Jazeera post” case. In this instance, a user reshared a post from an Al Jazeera page discussing the Israeli-Palestinian conflict in May 2021, adding a comment expressing astonishment. Meta deleted the post even though it shared news from a reputable outlet.

The Board brought this matter to Meta’s attention, urging it to engage an independent entity to examine whether its content moderation relating to the May 2021 conflict in Israel and Palestine had been biased. The resulting report, published in September 2022, found that Meta’s content moderation during that period appeared to have had an adverse human rights impact on Palestinian users’ right to freedom of expression and on their ability to share information and insights about their experiences as they occurred. Much of the bias identified related to a lack of in-language proficiency and guidance among Meta’s content moderators.

Frequently, the rules these platforms implement are intrinsically tied to the privileged vantage points of specific places, such as the United States, Silicon Valley, and the Global North. It is intriguing to observe how these discussions evolve.

Although most appeals we receive come from America and Europe, two-thirds of the cases the Board selects come from the Global Majority. We pay particular attention to these regions.

For instance, we addressed issues such as the elections in Brazil and Cambodia, and ongoing events in the Tigray region of Ethiopia, where, according to Amnesty International and Human Rights Watch, ethnic cleansing is occurring. In that case, the Board found that Meta has a human rights responsibility to establish a principled, transparent system for moderating content in conflict zones, to reduce the risk of its platforms being used to incite violence or violations of international law.

Regarding the recommendations you provide, what pressing issues do you believe warrant public attention?

Last year, we outlined seven strategic priorities that will increase our impact in the areas where we can make the most significant difference in how people experience Facebook and Instagram.

  • Elections and civic space: we’ll be looking at Meta’s responsibilities in elections, protests, and other key moments for civic participation;
  • Crisis and conflict situations: we’ll explore Meta’s preparedness for potential harms its products can contribute to during armed conflicts, civil unrest, and other emergencies;
  • Gender: the Board will explore gendered obstacles that women and LGBTQI+ people face in exercising their rights to freedom of expression, including gender-based violence and harassment, and the effects of gender-based distinctions in content policy;
  • Hate speech targeting marginalized groups: we’ll be exploring how Meta should protect members of marginalized groups while ensuring that its enforcement does not incorrectly target those challenging hate;
  • Governments’ use of Meta’s platforms: the Board will look at how Meta should respond to requests from national law enforcement. The Board is interested in exploring how state actors use Meta’s platforms and the implications for content moderation;
  • Treating users fairly: we will be exploring how Meta can do more to treat its users as customers by providing more specific user notifications, ensuring that people can always appeal Meta’s decisions to the company, and being more transparent in areas such as ‘strikes’ and cross-check;
  • Automated enforcement of policies and curation of content: the Board will be exploring how automated enforcement should be designed, the accuracy of automated systems, and the importance of greater transparency in this area.

For all strategic priorities, we will continue to work with stakeholders to understand the policies and enforcement practices that Meta most urgently needs to improve and what kind of cases could provide the opportunity to address these.

How do you believe platforms can influence the European debate on automation and the Digital Services Act?

In the upcoming year, tech companies will need to adhere to new regulations that will prompt them to alter their strategies. In the past, discussions about social media regulation often suggested that either governments or the industry itself should be responsible for it. There is now a trend towards “co-regulation”, where elements of a self-regulatory mechanism are underpinned by legislation.

The Digital Services Act proposes processes that align closely with the strategies we have already begun implementing. It is encouraging to see researchers being granted greater access to data, and we wholeheartedly welcome this push toward transparency. While numerous entities will participate in this evolving regulatory environment, we believe that our transparent, user-oriented, and independent methods could contribute to resolving the challenges it poses.

Regarding the issue of algorithmic biases, how do you anticipate the debate will evolve in the coming years?

I believe that the Digital Services Act in Europe represents an exciting opportunity to gain deeper insights into the inner workings of algorithms. It has the potential to shed light on why certain content takes precedence over others and the effects this content has on users. As this topic becomes more prevalent in our discourse, it will inevitably play a significant role in shaping the trajectory of the digital landscape.

One of the most pressing challenges social media platforms will face is the rise of generative AI. As AI becomes more sophisticated, it can generate ever more complex narratives, making it harder for content moderation systems to keep up. We need careful deliberation about how to tackle this issue so that social media platforms continue to serve as safe and constructive spaces for discourse.

Overall, the Digital Services Act is an important step forward in grappling with the challenges of the digital world. By providing a framework for greater transparency and accountability, it has the capacity to empower individuals and organisations to make more informed decisions about how they engage with the digital space.

In the realm of online rights violations, do you believe platforms can self-regulate, or do broader practices need to be implemented?

I firmly believe in a multistakeholder approach: we do not hold the sole solution, but one among many. It is crucial to consider independent regulatory systems, governmental intervention, the involvement of associations, and user interests.

Our Board comprises individuals with diverse viewpoints from various regions of the world, who contribute valuable perspectives that enrich our debates. This reminds us that the solutions we identify may apply to only a fraction of the global population and may pose challenges for others. Hence, multiple perspectives are essential, and we must strike the appropriate balance by initiating these discussions.

Regarding nudity policies, could you provide an update on the progress made?

Yes. In January this year, the Board published our “gender identity and nudity” decision, in which we found that Meta’s policies on adult nudity create greater barriers to expression for women, trans, and non-binary people on Facebook and Instagram. We recommended that Meta align its practices with human rights standards. Meta responded to our recommendations a month ago, and these exchanges are publicly available in our Quarterly Transparency Reports and on Meta’s website.

Meta also recently gave an update on our recommendations from the “breast cancer symptoms and nudity” decision. We suggested that Instagram improve its detection of text overlay and breast cancer content to prevent posts from being wrongly flagged. Meta’s implementation team adjusted Instagram’s identification techniques for breast cancer content, and these changes have been in place since July 2021. As a result, 2,500 pieces of content were sent for human review instead of being removed. This success shows that Meta is committed to independent governance and to working with the Board to understand its impact on the platform.
