
Facebook Clamps Down On Posts, Ads That Could Undermine U.S. Presidential Election

Facebook CEO Mark Zuckerberg testifies before the House Energy and Commerce Committee in the Rayburn House Office Building in Washington.
Chip Somodevilla / Getty Images

Updated at 10:55 a.m. ET

Facebook says it won't accept any new political ads in the week leading up to the presidential election, one of several policies that CEO Mark Zuckerberg said will help ensure a fair election in November. Among the other measures, the company will delete posts that claim people will get COVID-19 if they vote.

"This election is not going to be business as usual," Zuckerberg said Thursday about the vote that is now two months away. "We all have a responsibility to protect our democracy."

The new policies are also Facebook's latest attempt to respond to long-held criticisms over its handling of political content. Critics say the company allows a free hand to those who use the platform to mislead voters. And they accuse Facebook of applying inconsistent standards for members of the public, political figures and advertisers.

Facebook is already working to help people register to vote and to clarify how the election will work during a pandemic, Zuckerberg said. And in light of the intense political antagonism in the U.S., he added that the social media company will act "to reduce the chances of violence and unrest."

The company recently joined Twitter in removing accounts that spread "false stories about racial justice, the Democratic presidential campaign of Joe Biden and Kamala Harris and President Trump's policies," as NPR's Bobby Allyn reported.

Those removed accounts were linked to Russian state actors, suggesting Russia is once again seeking to influence U.S. elections with misinformation and by amplifying the differences among Americans.

But there is a new wrinkle for the Nov. 3 vote, Zuckerberg said. He noted that while Facebook has removed dozens of international networks that spread bogus stories, "we're increasingly seeing attempts to undermine the legitimacy of our elections from within our own borders."

Facebook has faced intense criticism for how it approaches political content and ads. Most famously, Russian operatives used the platform to spread disinformation ahead of the 2016 U.S. presidential election.

New criticisms emerged early this year, and on many fronts. When Facebook said in January that it would continue to allow political advertisers to target users, Federal Election Commissioner Ellen Weintraub responded by saying Facebook's "weak plan suggests the company has no idea how seriously it is hurting democracy."

Under the policy that was announced Thursday, Facebook would still allow advertisers to "adjust the targeting" for ads that were accepted before it institutes a one-week quiet period in late October.

Critics also said the company fell short in its response to an inflammatory post by President Trump in late May, in which he revived the phrase "when the looting starts, the shooting starts."

Twitter hid its version of Trump's post behind a warning label, but Facebook declined to take such action. In the weeks that followed, members of the public and Facebook's own employees complained about its handling of racist and hateful rhetoric, leading a number of large corporations to pause their advertising on Facebook.

By late June, Facebook reversed its position somewhat, saying it would "put warning labels on posts that break its rules but are considered newsworthy," as NPR's Shannon Bond reported.

Hoping to address concerns about this year's election, the company and its CEO outlined several changes in how it handles ads, posts and other content, saying they want to preserve fair play in the closing months of the 2020 presidential campaign.

If a post aims to undermine the legitimacy of the election – for instance, by "claiming that lawful methods of voting will lead to fraud," Facebook says – the company will attach an informational label. A similar approach will cover any content whose goal is to delegitimize the election's outcome.

Other posts would be removed outright, if they "claim that people will get COVID-19 if they take part in voting," the company says. If a post doesn't go quite that far – but still seeks to use COVID-19 to suppress voter participation – Facebook will attach a link to verified information about the coronavirus.

The policy restricting new political or issue ads in the final week of the campaign season does not mean Facebook users won't see any political ads in that time. Instead, advertisers can continue running ads that have already been published and scrutinized.

"It's important that campaigns can run get out the vote campaigns, and I generally believe the best antidote to bad speech is more speech," Zuckerberg said, "but in the final days of an election there may not be enough time to contest new claims."

The company's plan also includes potential measures that could be taken to rein in candidates or campaign accounts on Facebook that claim victory before final election results are in. In those cases, Facebook says, it will tack on a label that sends readers to authoritative election results, from either Reuters or the National Election Pool.

The company will also limit the ability to forward content on its Messenger platform, in hopes of "reducing the risk of misinformation and harmful content going viral," Zuckerberg said.

"We've already implemented this in WhatsApp during sensitive periods and have found it to be an effective method of preventing misinformation from spreading in many countries," he added.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2020 NPR. To see more, visit https://www.npr.org.

Bill Chappell is a writer and editor on the News Desk in the heart of NPR's newsroom in Washington, D.C.