Monday, February 1, 2021

Facebook Knew Calls for Violence Plagued ‘Groups,’ Now Plans Overhaul. By Jeff Horwitz

The Wall Street Journal.

Jan. 31, 2021 at 5:16 pm ET

The social-media giant struggled to balance Mark Zuckerberg’s free-expression mantra against findings that rabid partisanship had overrun a feature central to its future

The social-networking giant is now clamping down on Groups. The effort began after Facebook’s own research found that American Facebook Groups had become a vector for the rabid partisanship and even calls for violence that inflamed the country after the election.


The changes, which Facebook escalated after the Jan. 6 riot at the U.S. Capitol, involve overhauling the mechanics of a product that was meant to be central to its future.


Facebook executives were aware for years that tools fueling Groups’ rapid growth presented an obstacle to their effort to build healthy online communities, and the company struggled internally over how to contain them.


The company’s data scientists had warned Facebook executives in August that what they called blatant misinformation and calls to violence were filling the majority of the platform’s top “civic” Groups, according to documents The Wall Street Journal reviewed. Those Groups are generally dedicated to politics and related issues and collectively reach hundreds of millions of users.


The researchers told executives that “enthusiastic calls for violence every day” filled one 58,000-member Group, according to an internal presentation. Another top Group claimed it was set up by fans of Donald Trump but it was actually run by “financially motivated Albanians” directing a million views daily to fake news stories and other provocative content.


Roughly “70% of the top 100 most active US Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment,” the presentation concluded. “We need to do something to stop these conversations from happening and growing as quickly as they do,” the researchers wrote, suggesting measures to slow the growth of Groups at least long enough to give Facebook staffers time to address violations.


“Our existing integrity systems,” they wrote, “aren’t addressing these issues.”


In response, Facebook ahead of the election banned some of the most prominent problem Groups and took steps to reduce the growth of others, according to documents and people familiar with its decisions. Still, Facebook viewed the restrictions as temporary and stopped short of imposing measures some of its own researchers had called for, these people said.


In the weeks after the election, many large Groups—including some named in the August presentation—questioned the results of the vote and organized protests against them, helping precipitate the demonstrations that preceded the Jan. 6 riot. After the Capitol riot, Facebook took down more of the Groups and imposed new rules as part of what it called an emergency response.


Facebook has canceled plans to resume recommending civic or health Groups, said Guy Rosen, Facebook’s vice president of integrity, a role that oversees the safety of users and discourse on the platform. Facebook will also disable certain tools that researchers argued had facilitated edgy Groups’ rapid growth and require their administrators to devote more effort to reviewing member-created content, he said.



Trump supporters swarm the Capitol on Jan. 6.

Photo: Carol Guzy/Zuma Press

“That helps us because we can then hold them accountable,” Mr. Rosen said, adding that the changes aren’t an admission that previous rules were too loose, but show Facebook adapting to emerging threats: “If you’d have looked at Groups several years ago, you might not have seen the same set of behaviors.”


Facebook, like some other tech giants, has caught criticism for banning certain content and people, including Mr. Trump. It is also under the close eye of the Biden administration, which has signaled its displeasure with Facebook’s handling of its platforms in the months leading to the election.


Mr. Zuckerberg said on an earnings call last Wednesday that Facebook’s users are tiring of the hyper-partisanship on the platform. “People don’t want politics and fighting to take over their experience on our services,” he said, adding that Facebook is also considering steps to reduce political content in its News Feed—the stream of baby photos, birthday reminders and rants from distant relatives that greets users when they log in.


Mr. Zuckerberg also said that the company was rethinking whether Groups could be redesigned to help people “grow as individuals” in the same way real-life communities can. “We can make it so that Groups on Facebook are not just a feed and a place where you post some content,” he said.


Groups pivot

Facebook’s 2019 renovations marked a strategic pivot away from its News Feed and were among its most significant platform alterations in years.


The redesign emphasized content from Groups, elevating it in the stream of material served to users. Giving priority to Groups, Facebook said, would help people make meaningful connections with like-minded friends. The shift came as Facebook faced criticism that News Feed was susceptible to foreign interference and other manipulation.


Groups, once a subsidiary feature, were made central to the app design, recommendation systems and dating features. Mr. Zuckerberg told the Journal at the time that Facebook had devoted six months’ work to ensuring it made the shift responsibly, mindful of its obligation not to use its algorithms to promote Groups containing dubious medical advice or unfounded conspiracies. “If people really seek it out on their own, fine,” he said.


Groups also became central to Facebook’s branding, as it battled criticism over issues from data privacy to Cambridge Analytica’s role in the 2016 election. In a 2020 Super Bowl ad, it celebrated amateur-rocketry buffs, bouldering clubs and rocking-chair enthusiasts—brought together through Groups.


Nina Jankowicz, a social media researcher at the Woodrow Wilson Center in Washington, D.C., said she became alarmed after hearing a Facebook representative advise a European prime minister’s social-media director that Groups were now the best way to reach a large audience on the platform.


“My eyes bugged out of my head,” said Ms. Jankowicz, who studies the intersection of democracy and technology. “I knew how destructive Groups could be.”


The problem, she said, was that Facebook wasn’t stepping up oversight along with its algorithmic promotion of Groups. Recommendations could take a user from an alternative-health Group to an anti-Covid-lockdown Group to a militia Group in a few clicks. (Facebook eventually banned militia groups entirely.) And Facebook’s tools to boost Group growth—such as letting hyperactive Group administrators issue thousands of invitations to new users daily and inserting “previews” of Group content into invitees’ news feeds before they joined—magnified those risks, she said.



Facebook CEO Mark Zuckerberg unveils the redesign centering on Groups, April 2019.

Photo: David Paul Morris/Bloomberg News

Last June, she wrote an essay, “Facebook Groups Are Destroying America,” for Wired magazine declaring that partisan publishers and foreign actors were harnessing Groups to peddle conspiracy theories and falsehoods. If Facebook didn’t rethink its approach, she warned, Groups could undermine democracy.


Extremist groups

In a 2016 presentation about Facebook’s halting efforts to combat polarization, which the Journal reported last year, a researcher noted that extremist content had swamped large German political Groups and that “64% of all extremist group joins are due to our recommendation tools.” The presentation concluded that “our recommendation systems grow the problem.”


In response to the article, Facebook said it had fixed the recommendation problems.


The August 2020 internal presentation showed other issues emerging as U.S. Groups tied to mercenary and hyperpartisan entities used Facebook’s tools to build large audiences. Many of the most successful Groups were under the control of administrators who tolerated or actively cultivated hate speech, harassment and graphic calls for violence, it said, noting that one top Group “aggregates the most inflammatory news stories of the day and feeds them to a vile crowd that immediately and repeatedly calls for violence.”


Administrators had designated most of the Groups as private, so only members could read them. Some were secret—people outside Facebook wouldn’t know they existed, much less that they were garnering millions of views a week.


Americans didn’t run some of the most popular Groups, the August presentation noted. It described a Group called “Trump Train 2020, Red Wave” as having “possible Macedonian ties” and as hosting the most hate speech taken down by Facebook of any U.S. Group. The Group grew to more than a million members within two months of its creation last summer, according to data archived by the fact-checking website Snopes, before Facebook took it down in September.


The Journal wasn’t able to contact the administrators of the Group, whose personal pages, some with dubious English, were also removed. A request for comment to a purported successor Group didn’t receive a response.


Most of the Groups were on the right end of the political spectrum, but “Suburban Housewives Against Trump” appeared near the top of the charts, too, the August presentation said. Conservative or liberal, the Groups shared a common thread: They had harnessed passionate super-users and Facebook recruitment tools to achieve viral growth.


Content from the top 10 civic Groups was seen 93 million times over the course of seven days late in the summer. Large Groups’ intent to break Facebook’s rules was often overt, the August presentation noted, with administrators coaching users on how to post offensive material in ways that would escape Facebook’s automated filters.


‘Toxic atmosphere’

“They are deliberately creating this toxic atmosphere,” Facebook’s researchers wrote of the administrators of the “Kayleigh McEnany Fan Club,” named after—but not associated with—the Trump administration press secretary. The researchers said the Group largely functioned as a distribution system for “low-quality, highly divisive, likely misinformative news content” from a handful of partisan publishers.


The comments in the Group included death threats against Black Lives Matter activists and members of Congress, researchers said, and Facebook had flagged the Group 174 times for misinformation within three months. Among the comments under one post about U.S. Rep. Ilhan Omar (D., Minn.) that the presentation included were:


“I hope someone shoots her but she lives and is paralyzed.”


“Maybe a bullet would do her good.”


“Bring back public hangings.”


The Journal contacted five club administrators—most of whom appear to be affiliated with for-profit right-wing digital-media sites—through Facebook and the contacts listed by those outlets when available but didn’t receive a response.


Facebook’s public-policy team balked at taking action against prominent conservative Groups, and managers elsewhere in the company questioned proponents of the proposed restrictions about the effects on growth, according to internal documents and the people familiar with the decisions. To try to overcome resistance to further crackdowns, Facebook integrity staffers began sending daily analyses to Mr. Rosen and other senior executives, showing how Facebook’s methods for policing major Groups were failing to catch obvious violations of the company’s community standards.


Facebook’s platform-wide rules forbid hate speech and speech that incites violence, and the company advises Group moderators on how to maintain community rules. But rather than fostering a civil tone, leaders of major politics-focused Groups encouraged members to break Facebook’s rules, threatened to ban anyone who reported such content and directed users to post their most outrageous material as comments on other posts—a tactic meant to confuse Facebook’s automated moderation systems.


Facebook declined to discuss the specifics of its handling of the researchers’ findings.


On Oct. 20, the Mozilla Foundation, which makes the Firefox browser and says it promotes a healthy internet, ran a full-page ad in the Washington Post calling for Facebook to disable its algorithmic Group recommendation systems. “Countless experts—and even some of your own employees—have revealed how these features can amplify disinformation,” said the letter, which also urged Twitter Inc. CEO Jack Dorsey to suspend the company’s algorithmically driven Trending Topics feature.


Twitter didn’t suspend the feature, though it has sought to add more context and has manually intervened to remove incendiary trends such as “Hang Mike Pence.” A Twitter spokesman said the company had moved quickly to take down calls for Mr. Pence’s death and begun adding factual context to its trending topics feature.


Ashley Boyd, Mozilla’s vice president for advocacy and engagement, said she had discussed the foundation’s concerns with employees from Facebook’s public policy, product development and communications staff before the letter’s publication. “They didn’t say we were crazy,” she said. “They said, ‘This is very similar to conversations we’re having internally.’ ”


Even before Mozilla published its letter, Facebook had temporarily stopped making algorithmic recommendations to Groups dedicated to political or civic issues, a Facebook spokesman said.


Facebook also halted showing previews of Group content to prospective new members, capped the number of invitations members could send each day, and began freezing comment threads when they repeatedly triggered automated filters for hate speech and violence, internal documents show. Mr. Rosen confirmed the pre-election moves.


The new rules, which Facebook designed to be temporary and largely didn’t announce publicly, couldn’t contain the viral growth of some Groups after the election. Most notably, a Group called “Stop the Steal” that was organizing election protests around the country grew to 361,000 members in less than 24 hours without any promotion from Facebook’s algorithms. When Facebook took it down Nov. 5, the company said it “was organized around the delegitimization of the election process, and we saw worrying calls for violence from some members of the group.”


In response to rising fears of political bloodshed, Mr. Zuckerberg that day approved additional “break glass” emergency measures, including further restrictions on Groups with a history of bad behavior, according to internal documents and people familiar with the decisions.


After violence related to ballot counting failed to materialize in the following days, Facebook began to loosen some of the restrictions on Groups, according to internal documents. It reminded employees and reporters that the measures had always been temporary.


On Jan. 6, after the rally organized by Amy and Kylie Kremer—the mother-daughter creators of the original “Stop the Steal” Group, which Facebook closed Nov. 5—a group of Trump supporters stormed the Capitol. The Kremers didn’t respond to requests for comment. In the wake of the riot, Facebook deleted other Groups that had cropped up using “Stop the Steal” in their names and espousing the same purpose.



Amy Kremer, at a Jan. 6 pro-Trump rally, co-founded the original ‘Stop the Steal’ Facebook Group.

Photo: Jacquelyn Martin/Associated Press

Mr. Zuckerberg approved instituting the break-glass measures Facebook had recently lifted and added more restrictions on Groups, internal documents show. In a public blog post, he blamed President Trump for trying to use Facebook “to incite violent insurrection.” Facebook required administrators to approve more posts in Groups with histories of violating its rules—a technique Facebook’s integrity staff had recommended in August but that the company hadn’t fully implemented.


Facebook Chief Operating Officer Sheryl Sandberg publicly cast blame for the riot’s organization on smaller social-media platforms, even as the company continued to rein in Groups; Ms. Sandberg declined to comment. The company has dissolved 40 of the top 100 Groups listed in the August presentation.


Beyond the permanent ban on algorithmic civic and health Group recommendations, Facebook will prevent Groups of any sort from being promoted within their first 21 days of existence. Other temporary measures—such as the freezing of comment threads classified as turning vile and daily limits on Group invitations—remain in place and may become permanent. 


Facebook itself restricted political discussions on its own internal message boards last year amid debate over the platform and the U.S. presidential election, handing oversight to professional moderators, according to two people familiar with the decision.


“Growing fast isn’t in and of itself an indication of something good or bad,” Mr. Rosen said. When it comes to managing the risks of Facebook products, he said, “the balance moves all the time.”


Write to Jeff Horwitz at Jeff.Horwitz@wsj.com

