Tuesday, June 30, 2020

Zuckerberg once wanted to sanction Trump. Then Facebook wrote rules that accommodated him. Washington Post long read by Elizabeth Dwoskin

By Elizabeth Dwoskin
washingtonpost.com
June 29, 2020


To Facebook’s executives in Washington, President Trump’s late-May post warning Minneapolis protesters that “when the looting starts, the shooting starts” didn’t appear to violate the company’s policies, which allow leaders to post about government use of force if the message is intended to warn the public, but it came right up to the line. Facebook’s deputies had already contacted the White House earlier in the day with an urgent plea to tweak the language of the post or simply delete it, according to people familiar with the discussions.

Eventually, Trump posted again, saying his comments were supposed to be a warning after all. Zuckerberg then went online to explain his rationale for keeping the post up, noting that Trump’s subsequent explanation helped him make his decision.

The frenzied push-pull was just the latest incident in a five-year struggle by Facebook to accommodate the boundary-busting ways of Trump. The president has not changed his rhetoric since he was a candidate, but the company has continually altered its policies and its products in ways certain to outlast his presidency.

Facebook has constrained its efforts against false and misleading news, adopted a policy explicitly allowing politicians to lie, and even altered its news feed algorithm to neutralize claims that it was biased against conservative publishers, according to more than a dozen former and current employees and previously unreported documents obtained by The Washington Post. One of the documents shows the accommodation began as far back as 2015, when candidate Trump posted a video calling for a ban on Muslims entering the United States. Facebook’s executives declined to remove it, setting in motion an exception for political discourse.

The concessions to Trump have led to a transformation of the world’s information battlefield. They paved the way for a growing list of digitally savvy politicians to repeatedly push out misinformation and incendiary political language to billions of people. They have complicated public understanding of major events such as the pandemic and the protest movement, and they have contributed to polarization.

And as Trump grew in power, the fear of his wrath pushed Facebook into more deferential behavior toward its growing number of right-leaning users, tilting the balance of news people see on the network, according to the current and former employees.

Facebook is now confronting a mounting advertiser boycott that has pushed down its stock price as companies demand stricter policies against hate speech. Starbucks became the latest on Sunday to say it would hit pause on social media advertising.

Facebook is also facing a slow-burning crisis of morale, with more than 5,000 employees denouncing the company’s decision to leave up Trump’s post declaring that “when the looting starts, the shooting starts.”

Bowing to those pressures on Friday, Zuckerberg announced a raft of new policies aimed at better policing content on the site, including affixing labels to posts that violate hate-speech or other rules, even posts from political leaders.

But the company said Trump’s “looting, shooting” post would not have qualified for such a label.

As the United States heads into another presidential election while facing a pandemic and civil unrest, the latitude given to Trump may afford him an advantage. In recent months, he has used Facebook and other platforms to tout misleading information about coronavirus cures, election fraud and the motives of protesters, frequently targeting a left-wing movement as a cause of violence without citing evidence.

It also places Facebook in growing conflict with its counterparts in Silicon Valley. Twitter has labeled several presidential tweets as abusive and misleading, and social media platform Snapchat curtailed the reach of the president’s account.

“The value of being in favor with people in power outweighs almost every other concern for Facebook,” said David Thiel, a Facebook security engineer who resigned in March after his colleagues refused to remove a post he believed constituted “dehumanizing speech” by Brazil’s president.

Facebook contends the use of incendiary populist language predates social media. Nick Clegg, Facebook’s vice president for global affairs and communications, said in a statement that populism wasn’t invented in Silicon Valley, pointing to centuries of political history before social media companies’ existence.

“From the Arab Spring to local candidates challenging political incumbents, social media has also helped to open up politics, not favor one side over the other,” Clegg added. “Studies have shown the drivers of populism are complex and cannot be reduced to the use of social media, in fact political polarization has fallen in many countries with high internet use.”

Facebook declined to make Zuckerberg available for an interview, although it pointed out that Zuckerberg opposed Trump when his Muslim immigration ban went into effect. The White House declined to comment.

Zuckerberg talks frequently about making choices that stand the test of time, preserving the values of Facebook and subsidiaries WhatsApp and Instagram for all of its nearly 3 billion monthly users for many years into the future — even when those decisions are unpopular or controversial.

At one point, however, he wanted a different approach to Trump.
Setting the stage

Before the 2016 election, the company largely saw its role in politics as courting political leaders to buy ads and broadcast their views, according to people familiar with the company’s thinking.

But that started to change in 2015, as Trump’s candidacy picked up speed. In December of that year, he posted a video in which he said he wanted to ban all Muslims from entering the United States. The video went viral on Facebook and was an early indication of the tone of his candidacy.

Outrage over the video led to a companywide town hall, in which employees decried the video as hate speech, in violation of the company’s policies. And in meetings about the issue, senior leaders and policy experts overwhelmingly said they felt that the video was hate speech, according to three former employees, who spoke on the condition of anonymity for fear of retribution. Zuckerberg expressed in meetings that he was personally disgusted by it and wanted it removed, the people said. Some of these details were previously reported.

At one of the meetings, Monika Bickert, Facebook’s vice president for policy, drafted a document to address the video and shared it with leaders including Zuckerberg’s top deputy COO Sheryl Sandberg and Vice President of Global Policy Joel Kaplan, the company’s most prominent Republican.

The document, previously unreported and obtained by The Post, weighed four options. They included removing the post for hate speech violations, making a one-time exception for it, creating a broad exemption for political discourse, and even weakening the company’s community guidelines for everyone, allowing comments such as “No blacks allowed” and “Get the gays out of San Francisco.”

Facebook spokesman Tucker Bounds said the latter option was never seriously considered.

The document also listed possible “PR Risks” for each. For example, lowering the standards overall would raise questions such as, “Would Facebook have provided a platform for Hitler?” Bickert wrote. A carveout for political speech across the board, on the other hand, risked opening the floodgates for even more hateful “copycat” comments.

Ultimately, Zuckerberg was talked out of his desire to remove the post in part by Kaplan, according to the people. Instead, the executives created an allowance that newsworthy political discourse would be taken into account when making decisions about whether posts violated community guidelines.

That allowance was not formally written into the policies, even though it informed ad hoc decision-making about political speech for the next several years, according to the people. When a formal newsworthiness policy was announced in October 2016, in a blog post by Kaplan, the company did not discuss Trump’s role in shaping it.

In an interview, Bickert said the company ultimately made the call to keep Trump’s Muslim ban video up because executives interpreted Trump’s comment to mean that the then-candidate was not speaking about all Muslims but rather advocating a policy position on immigration as part of a newsworthy political debate. She said she did not recall the document in which the options were presented.

Facebook’s Bounds added that the “newsworthiness” policy was added in 2016 after content reviewers removed a photo of a naked girl fleeing a napalm attack during the Vietnam War. “Our goal was to recognize the essential public benefit of preserving content that in other contexts wouldn’t be allowed,” Bounds said. “In the case of elected officials, it also ensures that they will be held to account for their words,” so that people can judge for themselves.

In spring of 2016, Zuckerberg was also talked out of his desire to write a post specifically condemning Trump for his calls to build a wall between the United States and Mexico, after advisers in Washington warned it could look like choosing sides, according to Dex Torricke-Barton, one of Zuckerberg’s former speechwriters.

The political speech carveout ended up setting the stage for how the company would handle not only Trump but also populist leaders around the world who have posted content that tests these boundaries, such as Rodrigo Duterte in the Philippines, Jair Bolsonaro in Brazil and Narendra Modi in India.

“Though [Facebook] has cracked down on misinformation, the most problematic influencers are politicians,” said Claire Wardle, U.S. director of First Draft, an organization dedicated to fighting misinformation that has a partnership with Facebook. “You can do all the fact checking in the world, but these influencers have a disproportionate impact.”

Trump presented a unique challenge, she added. “Until then, no one would have considered a president who would have said those things.”
Protecting the right

After the election, it became clear Russia had used social media to sow disinformation. Facebook soon after became a frequent target of the president’s ire. He tweeted that the social media giant was “anti-Trump” and trying to undermine his victory.

At the same time, GOP leaders stepped up criticism that platforms such as Facebook and Twitter, with leadership ranks full of liberals, sought to limit the reach of right-leaning voices.

“There’s no credible research supporting Trump’s claim that social platforms suppress conservative content, but he still succeeded in getting them to revise their rules for him,” said former Facebook spokesman Nu Wexler, who left the company in 2018.

As Facebook scrambled to tackle foreign interference and misinformation, its executives in the nation’s capital argued that caution and deference were necessary to survive the new political environment, according to three people familiar with the company’s thinking.

Facebook’s security engineers in December 2016 presented findings from a broad internal investigation, known as Project P, to senior leadership on how false and misleading news reports spread so virally during the election. When Facebook’s security team highlighted dozens of pages that had peddled false news reports, senior leaders in Washington, including Kaplan, opposed shutting them down immediately, arguing that doing so would disproportionately impact conservatives, according to people familiar with the company’s thinking. Ultimately, the company shut down far fewer pages than were originally proposed while it began developing a policy to handle these issues.

A year later, Facebook considered how to overhaul its scrolling news feed, the homepage screen most users see when they open the site. As part of an effort to limit misinformation, it changed the news feed algorithm to focus more on posts by friends and family than on posts by publishers.

In meetings about the change, Kaplan questioned whether the revamped algorithm would hurt right-leaning publishers more than others, according to three people familiar with the company’s thinking who spoke on the condition of anonymity for fear of retribution. When the data showed that it would (the company had found that conservative-leaning outlets were pushing more content that violated its policies), he successfully pushed for changes to make the new algorithm what he considered more evenhanded in its impact, the people said.
Isolated and divided

With the 2020 election on the horizon, Zuckerberg’s hands-off approach to free speech was leaving Facebook increasingly isolated in Silicon Valley.

In May 2019, Zuckerberg, citing free speech, refused to take down a doctored video of House Speaker Nancy Pelosi (D-Calif.) that made her appear drunk.

That summer, company leaders held a meeting to revisit the newsworthiness exception, which until then had been applied case by case, with the most controversial calls made by Zuckerberg himself. Internally, some employees were unsure how far that leeway extended, according to two people.

Clegg, the company’s new head of global affairs and communications and a former British deputy prime minister, announced the outcome of that meeting at a speech in Washington in September 2019. Aside from speech that causes violence or real-world harm, Facebook would allow politicians to express themselves virtually unchecked on social media. Facebook’s network of independent fact-checkers, which had been established as a key part of the company’s response to disinformation, would not evaluate their claims and the community guidelines would largely not apply to politicians.

Facebook did not want to be an arbiter of truth in political debate, he said, echoing Zuckerberg’s long-standing position.

The speech angered some employees, prompting more than 250 of them to sign a petition opposing the decision, which they argued gave politicians a pass.

One former executive, Yael Eisenstat, who worked to improve the political ads process, wrote in The Post that the controversy was “the biggest test of whether [Facebook] will ever truly put society and democracy ahead of profit and ideology.”

She said that she routinely experienced how the company’s efforts at integrity were often undermined by “the few voices who ultimately decided the company’s overall direction.”

Meanwhile, in October, as Facebook faced more potential regulation and political troubles, Zuckerberg and his wife, Priscilla Chan, went to the White House for a private dinner with Trump, part of the CEO’s effort to cultivate personal relationships in Washington.
Tweetstorm

As the pandemic and civil unrest dominated the first half of this year, Trump continued to turn to social media platforms to spread misinformation. He touted the unproven drug hydroxychloroquine as a possible cure for the coronavirus and claimed without evidence that the left-wing antifa movement was behind the violence at George Floyd protests.

Meanwhile, Facebook employees began challenging the company’s decisions.

Two months before Trump’s “looting, shooting” post, the Brazilian president posted about the country’s indigenous population, saying, “Indians are undoubtedly changing. They are increasingly becoming human beings just like us.”

Thiel, the security engineer, and other employees argued internally that the post violated the company’s guidelines against “dehumanizing speech.” They pointed to Zuckerberg’s own testimony before Congress in October, in which he said dehumanizing speech “is the first step toward inciting” violence. In internal correspondence, Thiel was told that the post didn’t qualify as racism and may even have been a positive reference to integration.

Thiel quit in disgust.

In May, after years of its own internal debate, Twitter chose to go in the opposite direction: it appended fact-check labels to two misleading tweets by Trump about mail-in ballots.

Trump responded two days later with an executive order that could hurt social media companies by removing a key exception that limits their liability for content posted on their sites.

The next day, Trump tweeted about the Minnesota protests. Twitter quickly labeled the tweet for violating rules about glorifying violence, and Snapchat stopped promoting Trump’s account the following week. YouTube told The Post that it holds politicians to the same standards as everyone else.

Facebook, on the other hand, chose to haggle with the White House, asking for a deletion or a change, according to people familiar with the discussions. Axios first reported the call, which Facebook’s Bounds confirmed to The Post.

As employees raged on internal message boards and externally on Twitter, Zuckerberg told workers that Facebook’s policies might change again in light of Trump’s post. The company had rules allowing for “state use of force,” he said, but they were vague and didn’t encompass the possibility that such pronouncements could signal harmful aggression. Bickert’s team planned a series of policy meetings for the weeks ahead.

In June, Facebook removed a swath of Trump campaign ads featuring Nazi symbolism, after an initial internal assessment found that the ads did not violate the company’s policies, according to documents viewed by The Post. In meetings, senior executives argued that not removing them would be perceived as pandering too much to the president, according to a person familiar with the discussions.

Last week, the advertiser boycott picked up steam. Hershey, Verizon, Unilever, Coca-Cola and others said they were temporarily pulling ads.

On Friday, Zuckerberg told employees in a live-streamed town hall that he was changing company policy to label problematic newsworthy content that violates its rules, as Twitter does, a major concession amid the rising tide of criticism. He also said, in his most explicit language yet, that the company would remove posts by politicians that incite violence or suppress voting. Still, civil rights leaders said his assertions didn’t go far enough.

“There are no exceptions for politicians in any of the policies that I’m announcing today,” Zuckerberg said.
