How COVID Changed Content Moderation: Year in Review 2020 - EFF

In a year that saw every facet of online life reshaped by the coronavirus pandemic, online content moderation and platform censorship were no exception.

After a successful Who Has Your Back? campaign in 2019 to encourage large platforms to adopt best practices and endorse the Santa Clara Principles, 2020 was poised to be a year of more progress toward transparency and accountability in content moderation across the board. The pandemic changed that, however, as companies relied even more on automated tools in response to disrupted content moderator workforces and new types and volumes of misinformation.

At a moment when online platforms became newly vital to people’s work, education, and lives, this uptick in automation threatens freedom of expression online. That makes the Santa Clara Principles on Transparency and Accountability in Content Moderation more important than ever—and, like clockwork, transparency reporting later in the year demonstrated the pitfalls and costs of automated content moderation.

As the pandemic wore on, new laws regulating fake news online led to censorship and prosecutions across the world, including notable cases in Cambodia, India, and Turkey that targeted and harmed journalists and activists.

In May, Facebook announced its long-awaited Oversight Board. We had been skeptical from day one, and were disappointed to see the Board launch without adequate representation from the Middle East, North Africa, or Southeast Asia, and without advocates for LGBTQ or disability communities. Although the Board was designed to identify and decide Facebook’s most globally significant content disputes, the Board’s composition was and is directed more at parochial U.S. concerns.

In June, Facebook disabled the accounts of more than 60 Tunisian users with no notice or transparency. We reminded companies how vital their platforms are to speech; while the current PR storm swirls around whether or not they fact-check President Trump, we cannot forget that those most impacted by corporate speech controls are not politicians, celebrities, or right-wing provocateurs, but some of the world’s most vulnerable people, who lack the access to corporate policymakers to which states and Hollywood have become accustomed.

As the EU’s regulation to curb “terrorist” or violent extremist content online moved forward in the latter half of the year, the Global Internet Forum to Counter Terrorism (GIFCT) took center stage as a driving force behind the bulk of allegedly terrorism-related takedowns and censorship online. And in September, we saw Zoom, Facebook, and YouTube cite U.S. terrorism designations when they refused to host Palestinian activist Leila Khaled.

At the same time, throughout 2020 EFF put forward EU policy principles that would give users, not content cartels like GIFCT, more control over and visibility into content decisions.

The United States’ presidential election in November drove home the same problems we saw with Facebook’s Oversight Board in May and the string of disappeared accounts in June: tech companies and online platforms have focused on American concerns and politics to the detriment of addressing problems in—and learning from—the rest of the world. While EFF made clear what we were looking for and demanding from companies as they tailored content moderation policies to the U.S. election, we also reminded companies to first and foremost listen to their global user base.

Looking ahead to 2021, we will continue our efforts to hold platforms and their policies accountable to their users. In particular, we’ll be watching developments in AI and automation for content moderation, how platforms handle COVID vaccine misinformation, and how they apply election-related policies to significant elections coming up around the world, including in Uganda, Peru, Kyrgyzstan, and Iran.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2020.
