Can you trust what you read on Facebook? No. And why not? Because Facebook has now explicitly said that it will obey an executive order from President Trump and refuse to fact-check misinformation and disinformation as America heads into the 2020 election.
In April 2017, responding to accusations that misinformation had helped influence the 2016 U.S. election, Facebook published a white paper acknowledging the spread of “information operations” aimed at dividing and deceiving Americans. In September 2017, Facebook chief security officer Alex Stamos acknowledged that some of the accounts and Pages disseminating that misinformation originated within Russia. Common Cause, a watchdog group, filed suit. Then Facebook joined Twitter and Google in telling Congress that they would do better.
On Thursday, in response to a request by the presidential campaign of Senator Joe Biden to stop the spread of misinformation, Facebook threw in the towel. The company claimed that a recent executive order by President Trump tied its hands.
The Biden campaign asked Facebook “to proactively stem the tide of false information by no longer amplifying untrustworthy content and promptly fact-checking election-related material that goes viral.” The campaign asked for “clear rules—applied to everyone, including Donald Trump—that prohibit threatening behavior and lies about how to participate in the election.” It also asked for a two-week period before the election during which all political advertisements would be fact-checked.
Facebook declined. “We live in a democracy, where the elected officials decide the rules around campaigns,” the company wrote in a short, unsigned statement. “Two weeks ago the President of the United States issued an executive order directing Federal agencies to prevent social media sites from engaging in activities like fact-checking political statements. This week, the Democratic candidate for President started a petition calling on us to do the exact opposite. Just as they have done with broadcast networks—where the US government prohibits rejecting politicians’ campaign ads—the people’s elected representatives should set the rules, and we will follow them. There is an election coming in November and we will protect political speech, even when we strongly disagree with it.”
Facebook’s statement referred to the executive order Trump signed, which would propose new regulations under Section 230 of the Communications Decency Act, the provision that shields platforms from liability for content their users post. It would be up to the Commerce Department and the FCC to implement the new rules.
On Friday, though, Michael O’Rielly, a Republican member of the Federal Communications Commission, told Bloomberg that he’s not even sure that the FCC has the legal power to grant Trump’s request.
Facebook chief executive Mark Zuckerberg has said that Facebook would not police clear untruths or fact-check politicians. The company has struggled to remove hate speech, even with its own detection algorithms. Yet it committed to providing accurate information about the COVID-19 outbreak in its first-quarter earnings release, in which it reported profits of $4.9 billion on revenue of $17.7 billion. Recent reporting has found that misinformation still circulates on Facebook regarding the efficacy of hydroxychloroquine in treating the coronavirus.
Facebook at least has publicly stated that it will try to combat misinformation around the coronavirus. But the company’s recent statement about political fact-checking is its clearest declaration yet that it has given up. Facebook helped sway the 2016 election, and it looks like it won’t even try to prevent a repeat in 2020.