Facebook Knew Calls for Violence Plagued ‘Groups’, Now Plans Overhaul

Today, the social media giant is cracking down on groups. The initiative was taken after Facebook’s own research revealed that American groups on Facebook have become a vehicle for the frenzied partisanship and even calls for violence that have thrown the country into turmoil since the election.

One of the changes Facebook made after the January 6 riot at the U.S. Capitol is an overhaul of the mechanics of a product that is central to its future.

Executives at Facebook have known for years that the tools fueling groups' rapid growth have hampered the company's efforts to build healthy online communities, and the company has struggled internally over how to rein them in.

Company researchers warned Facebook executives in August that what they described as blatant misinformation and calls to violence filled a majority of the platform's top civic groups, according to documents reviewed by the Wall Street Journal. Those groups, which typically focus on politics and related topics, collectively reach hundreds of millions of users.

According to an internal presentation, researchers told supervisors that one group of 58,000 members was inundated daily with enthusiastic calls to violence. Another prominent group claimed to be run by fans of Donald Trump but was in reality operated by financially motivated Albanians, who directed a million views a day to fake news and other provocative content.

Roughly 70% of the 100 most active civic groups in the United States were considered non-recommendable because of issues such as hate speech, misinformation, bullying and harassment, the presentation concluded. "We need to do something to stop these conversations from growing as quickly as they do," the researchers wrote, proposing measures to at least slow the groups' growth long enough to give Facebook employees time to address violations.

"Our existing integrity systems," they wrote, "do not address these problems."

In response, Facebook banned some of its top civic groups in the run-up to the election and took steps to curb the growth of others, according to documents and people familiar with the decisions. But Facebook regarded the restrictions as temporary and stopped short of applying the measures that some of its own researchers had demanded, they said.

In the weeks following the election, many large groups, some of them named in the August presentation, disputed the results, organized protests against them, and helped stoke the demonstrations that preceded the January 6 riot at the Capitol. In the aftermath of the riot, Facebook shut down more groups and imposed new rules as part of what it called an emergency response.

Facebook has scrapped its plans to resume recommending civic or health groups, said Guy Rosen, Facebook's vice president of integrity, a role that oversees user safety and the quality of discussion on the platform. Facebook will also disable some of the tools that researchers say contributed to edgy groups' rapid growth and will force administrators to do more work to review member posts, he said.

Trump supporters swarm outside the Capitol on January 6.

Photo:

Carol Goosey/Zuma Press

"It helps us because we can hold them accountable," said Mr. Rosen, who added that the changes are not an admission that the previous rules were too lax but a sign that Facebook is adapting to new threats: "If you had looked at groups a few years ago, you might not have seen the same set of behaviors."

Facebook, like other technology giants, has been criticized for banning certain content and individuals, including Mr. Trump. It is also under scrutiny from the Biden administration, which has expressed dissatisfaction with the way Facebook managed its platforms in the months leading up to the election.

Last Wednesday, Mr. Zuckerberg said on a conference call that Facebook users were tired of the hyper-partisanship on the platform. "People don't want politics and fighting to take over their experience on our services," he said, adding that Facebook is also considering measures to reduce political content in the News Feed, the stream of baby photos, birthday reminders and diatribes from distant relatives that greets users when they log in.

Mr. Zuckerberg also said the company is looking at whether groups can be reoriented to help people grow as individuals, the way real-world communities do. "We can make Facebook groups more than just a feed and a place to post content," he said.

Pivot to Groups

Facebook's 2019 redesign marked a strategic shift away from the News Feed and was one of the most significant changes to the platform in years.

The redesign put group content front and center in the stream of content shown to users. By prioritizing groups, Facebook aimed to help people form meaningful connections with like-minded users. The change came as Facebook drew criticism for its vulnerability to foreign interference and other manipulation.

Groups, once a utility feature, became central to the app's design, its recommendation systems and its dating feature. Mr. Zuckerberg told the Journal at the time that Facebook had spent six months working out how to make the change responsibly, and that it was mindful of its duty not to use its algorithms to promote groups pushing questionable medical advice or unfounded conspiracies. "If people seek it out themselves, that's fine," he said.

Groups also became central to Facebook's brand strategy as the company came under fire over issues such as privacy and Cambridge Analytica's role in the 2016 election. Its 2020 Super Bowl ad brought together rock-music fans, bouldering clubs and rocking-chair enthusiasts.

Nina Jankowicz, a social media researcher at the Woodrow Wilson Center in Washington, D.C., said she was alarmed to hear a Facebook representative advise a European prime minister's social media director that groups were now the best way to reach a broad audience on the platform.

"My eyes nearly fell out of my head," said Ms. Jankowicz, who studies the intersection of democracy and technology. "I knew how destructive groups could be."

The problem, she says, is that Facebook promoted groups algorithmically without stepping up its oversight of them. With a few clicks, a user could move from an alternative-health group to an anti-vaccine group to a militia group. (Facebook eventually banned the militias.) And Facebook's tools for accelerating group growth, such as allowing administrators of hyperactive groups to send thousands of invitations a day and showing previews of group content in non-members' news feeds before they join, compound those risks, she said.

Facebook CEO Mark Zuckerberg discusses the groups-focused redesign in April 2019.

Photo:

David Paul Morris/Bloomberg News

Last June, she wrote an essay for Wired magazine about Facebook groups destroying America, arguing that partisan publishers and foreign actors were using the groups to spread conspiracy theories and lies. If Facebook did not rethink its approach, she warned, the groups could undermine democracy.

Extremist groups

A 2016 presentation on Facebook's efforts to fight polarization, which the Journal reported on last year, noted that extremist content had flooded large German political groups and that 64% of all extremist group joins could be attributed to the company's recommendation tools. The presentation concluded that Facebook's recommendation systems were growing the problem.

In response to this article, Facebook stated that it had addressed the issues with recommendations.

An internal presentation in August 2020 flagged other problems with mercenary and hyperpartisan American groups using Facebook's tools to build large audiences. Many of the most successful groups were controlled by administrators who tolerated or actively cultivated hate speech, harassment and calls for violence, it said, noting that one of the largest groups gathered the most inflammatory news of the day and delivered it to an audience that immediately and repeatedly responded with calls for violence.

Administrators had marked most of the groups as private, so only members could read them. Some were secret, hidden from anyone outside the group, who had no way of knowing they existed even as the groups garnered millions of views a week.

Some of the most popular groups were not run by Americans, the August presentation showed. It said a group called Trump Train 2020, Red Wave may have had ties to Macedonia and hosted more hate speech than any other U.S. group detected by Facebook. The group attracted more than a million members within two months of its creation last summer, according to data archived by the site Snopes before Facebook removed the group in September.

The Journal was unable to reach the group's administrators, whose personal pages, some written in shaky English, have also been removed. A proposed successor group did not respond to a request for comment.

Most of the groups leaned right politically, but Suburban Housewives vs. Trump also climbed to the top of the rankings, according to the August presentation. The conservative and liberal groups shared a common denominator: they used passionate super-users and Facebook's recruitment tools to achieve viral growth.

By the end of the summer, content from the top ten civic groups had been viewed 93 million times over seven days. The August presentation said large groups' intent to violate Facebook's rules was often obvious, with administrators coaching users on how to post offensive content in ways that would slip past Facebook's automated filters.

Toxic atmosphere

"They intentionally create this toxic atmosphere," Facebook researchers wrote to executives about a fan club named after, but not affiliated with, Kayleigh McEnany, a spokeswoman for the Trump administration. The researchers said the group functioned largely as a distribution system for low-quality, highly divisive and likely misinformative content from a handful of partisan publishers.

According to the researchers, members of the group had made death threats against Black Lives Matter activists and members of Congress, and the group had been reported to Facebook 174 times in three months for misinformation. Comments attached to the presentation, left under a post about Ilhan Omar, a U.S. congresswoman (D., Minn.), included:

"I hope someone shoots her and she lives, paralyzed."

"Maybe a bullet will do her some good."

"Bring back public executions."

The Journal contacted five of the group's administrators, most of whom appeared, via their Facebook pages, to be connected to for-profit right-wing digital media sites, as well as the contacts listed by those outlets where available, but received no response.

Facebook's public policy team favored action against prominent conservative groups, while officials in other parts of the company questioned the proposed growth restrictions, according to internal documents and people familiar with the decisions. To overcome resistance to a broader crackdown, Facebook's integrity staff began sending daily analyses to Mr. Rosen and other senior executives showing how Facebook's methods of policing large groups failed to catch clear violations of the company's community standards.

Facebook's rules prohibit hate speech and incitement to violence, and the company gives group administrators guidance on how to keep their groups in compliance with those rules. But rather than fostering a civil tone, the leaders of politically oriented groups encouraged members to break Facebook's rules, threatened to ban anyone who reported such content, and urged users to post their most egregious material as comments on other posts, a tactic meant to evade Facebook's automated moderation systems.

Facebook declined to discuss the details of how it handled the researchers' findings.

On October 20, the Mozilla Foundation, which makes the Firefox browser and says it promotes a healthy internet, ran a full-page ad in the Washington Post asking Facebook to halt its algorithmic group recommendation systems. "Numerous experts, and even some of your own employees, have shown how these features can amplify misinformation," the letter said. It also called on Twitter Inc. Chief Executive Jack Dorsey to suspend the platform's algorithm-driven trending topics feature.

Twitter did not suspend the feature, though it has sought to add more context and to intervene manually against inflammatory trends such as "Hang Mike Pence." A Twitter spokesman said the company moved quickly against the calls for Mr. Pence's death and has begun adding factual context to its trending topics section.

Ashley Boyd, Mozilla's vice president of advocacy and engagement, said she discussed the foundation's concerns with Facebook's public policy, product development and communications staff before publishing the letter. "They didn't say we were crazy," she said. "They said: It's very similar to our internal conversations."

Even before Mozilla published its letter, Facebook had temporarily stopped algorithmically recommending groups dealing with political or civic issues, a Facebook spokesman said.

Facebook also stopped showing previews of group content to prospective members, capped the number of invitations members could send per day, and began freezing comment threads that repeatedly triggered its automated hate and violence filters, internal documents show. Mr. Rosen confirmed the election-related measures.

The new rules, which Facebook intended to be temporary and largely did not announce publicly, failed to stop the viral growth of some groups after the election. A group called Stop the Steal, which organized protest events across the country, grew to 361,000 members in less than 24 hours, without any promotion from Facebook's algorithms. When Facebook removed the group on November 5, it said the group was organized to delegitimize the election process and that "we saw worrying calls for violence from some members of the group."

Responding to growing fears of political bloodshed, Mr. Zuckerberg approved additional break-glass emergency measures that day, including further restrictions on groups with a history of bad behavior, according to internal documents and people familiar with the decisions.

When violence failed to materialize during the vote count in the days that followed, Facebook began easing some of the restrictions on groups, according to internal documents. It reminded staff and journalists that the measures were only temporary.

On January 6, after a rally organized by Amy and Kylie Kremer, the mother-daughter duo who created the original Stop the Steal group that Facebook took down on November 5, a crowd of Trump supporters stormed the Capitol. The Kremers did not respond to requests for comment. After the riot, Facebook removed other groups that used Stop the Steal in their names and pursued the same goal.

Amy Kremer, at the pro-Trump rally on January 6, is a co-founder of the original Stop the Steal Facebook group.

Photo:

Jacquelyn Martin/Associated Press

Mr. Zuckerberg supported reinstating the break-glass measures Facebook had recently rolled back and adding further restrictions on groups, internal documents show. In a public blog post, he accused President Trump of using Facebook to incite a violent insurrection. Facebook also began requiring administrators to approve more posts in groups that had previously broken its rules, a measure Facebook employees had recommended in August but that the company had not fully implemented.

Facebook Chief Operating Officer Sheryl Sandberg publicly pointed to smaller social media platforms as culprits in the turmoil, even as the company continued its own crackdown: it dissolved 40 of the top 100 groups named in its August presentation. Ms. Sandberg declined to comment.

In addition to permanently ending algorithmic recommendations of civic and health groups, Facebook will for the first time stop promoting any group, of any kind, during the first 21 days of its existence. Other temporary measures, such as freezing comment threads deemed harmful and daily limits on group invitations, remain in effect and may become permanent.

Last year, amid debate over the platform and the U.S. presidential election, Facebook itself curtailed political discussion on its own internal forums, handing control to professional moderators, according to two people familiar with the decision.

"Rapid growth alone is not a sign of good or bad," Mr. Rosen said. Of managing the risks of Facebook's products, he said: "The balance is constantly shifting."

Email Jeff Horwitz at [email protected].

Copyright ©2020 Dow Jones & Company, Inc. All rights reserved.