Facebook tightened its policies on election-related disinformation Thursday, limiting the reach of live videos and misleading posts in the latest effort by the world’s largest social network to tamp down an onslaught of baseless vote-rigging allegations and premature victory claims.
The move comes as Facebook and other social networks face mounting criticism from the left that they are not doing enough to stop the spread of false claims that could undermine faith in the election and results when they’re declared. Some prominent Democrats have called for the platforms to suspend the account of President Donald Trump, whose posts alleging electoral fraud have received warning labels on both Facebook and Twitter.
Facebook said it will now restrict posts on both Instagram and Facebook that its systems flag as misinformation so that they are seen by fewer users. It is also limiting the distribution of election-related live videos on the Facebook platform.
"As vote counting continues, we are seeing more reports of inaccurate claims about the election," Facebook said in a statement. "While many of these claims have low engagement on our platform, we are taking additional temporary steps, which we’ve previously discussed, to keep this content from reaching more people."
President Donald Trump and his allies have been churning out unsubstantiated claims of voter fraud and improper conduct at polling places in a bid to convince the public that the election is being stolen from him. Those messages have then been amplified in posts and videos by conservative influencers and supporters, who in some instances have gone on to organize offline protests.
Misinformation researchers say livestreamed videos have been a blind spot for major internet companies, in part because they are more difficult to review with artificial intelligence software and thus more challenging to moderate en masse.
Renée DiResta, the technical research manager at the Stanford Internet Observatory, said some conservative influencers use live videos to spread conspiracy theories and rile up supporters. Others use deceptive editing of live videos to make a protest or an interaction with election officials look like evidence for a baseless narrative.
"Certain types of content or certain types of conspiratorial allegations, when they’re just simply stated in a live stream, it’s very difficult to moderate that in real time," DiResta said.
Facebook prepared ahead of the election with a range of policies and tools to combat election-related disinformation, including a banner message at the top of users’ feeds letting them know ballot counting was still ongoing and a series of labels applied to posts making false claims about fraud or prematurely naming a winner. Those banner messages will display the name of the projected winner once the race has been called by reputable news outlets, Facebook said.
Despite those efforts, misleading claims continue to proliferate on the platform. And while some such posts do have low engagement, others posted by Trump and conservative firebrands have been among the social network’s most liked and shared content in recent days.