Jessica Goodfellow
Aug 12, 2020

Facebook removed 7 million Covid-19 misinformation posts in 3 months

The social-media network has shed light on the scale of the coronavirus misinformation problem, with posts related to fake cures among the millions it has removed over recent months.

Facebook has said that it removed 7 million posts for spreading harmful misinformation about Covid-19, and labelled an additional 98 million posts which were deemed false by fact checkers, between April and June this year.

Removed posts included those peddling fake preventative measures or exaggerated cures that health organisations deem dangerous. Other forms of coronavirus-related misinformation have been labelled by the company's cohort of independent fact checkers.

Facebook vice president of integrity Guy Rosen provided the misinformation statistics in a press call as the company published its Community Standards Enforcement Report, a quarterly report that details the kinds and volumes of content the company has removed across Facebook and Instagram. The report is split into 12 categories on Facebook and 10 on Instagram, but misinformation is not among the reported categories.

Rosen said that in the past few months the platform has "prioritised work around harmful content related to Covid-19 that could put people at risk", as well as misinformation related to the upcoming US presidential election. From March to July, the company said it removed more than 110,000 pieces of content in the US for violating policies against misleading people about voting or trying to intimidate them out of voting.

But the company has been operating with a reduced workforce over the past three months. In March it sent its content reviewers home due to Covid-19 and has since relied more heavily on technology to review content. As we discussed in an earlier feature, Facebook's machine-learning systems are imperfect and prone to error.

Rosen said the past few months have demonstrated that content enforcement is "not an either/or approach" between human reviewers and AI, but that sophisticated systems need both people and technology. For example, content about suicide and self-injury and content about child nudity and sexual exploitation are two areas where Facebook relies on people both to review content and to train its technology. Accordingly, the company took action on fewer pieces of content in these areas over the past few months.

As it prioritised harmful content, Facebook said the amount of hate speech it took action on increased from 9.6 million posts in Q1 to 22.5 million in Q2. It claimed 95% of that content was detected proactively before anyone reported it, up from 89% in Q1. On Instagram, it took action on 3.3 million pieces of hate speech content. The amount of terrorism-related content it removed increased from 6.3 million to 8.7 million posts.

The company said it would invite external experts to independently audit the metrics used in the report, beginning in 2021.

(This article first appeared on CampaignAsia.com)

Source:
Campaign India
