Amid an armed invasion of the US Capitol building as Congress met to affirm Joe Biden's election, social-media platforms including Facebook, Twitter and YouTube were compelled overnight to act against posts from outgoing US president Donald Trump.
The posts in question seemingly sought to stoke action by Trump-aligned protestors and continued to claim the presidential election had been "stolen". Facebook banned Trump from posting for 24 hours, while Twitter locked his account for 12 hours, provided he removed certain Tweets.
Facebook declared an "emergency situation" and removed one of Trump's videos, as did YouTube.
Over the last few years, these platforms have had to keep pace with an explosion of misinformation from many sources, including Trump and his supporters, and have reacted in varying ways to try to preserve the neutrality of their platforms. Twitter has added disclaimers to some Tweets, and Trump's far-right supporters have seen content taken down and accounts banned in the run-up to an acrimonious US presidential election. Throughout Trump's term, however, Twitter chose not to suspend or revoke his personal account, even though his posts regularly violated the company's policies.
These new moves mark some of the most stringent responses from the platforms, which have previously been accused of allowing the president too much leeway.
"We are appalled by the violence at the Capitol today ... our Elections Operations Center has already been active in anticipation of the Georgia elections and the vote by Congress to certify the election, and we are monitoring activity on our platform in real time," Guy Rosen, VP of integrity, and Monika Bickert, VP of global policy management for Facebook said in a blog post. "As a part of this, we removed from Facebook and Instagram the recent video of President Trump speaking about the protests and his subsequent post about the election results. We made the decision that on balance these posts contribute to, rather than diminish, the risk of ongoing violence."
"We will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 US Presidential election," YouTube also announced.
Facebook's strong statement didn't stop prominent voices, including Alex Stamos, its former security chief, from laying into the platform for being complicit in allowing the radical flames to spread in the first place.
(This article first appeared on CampaignAsia.com)