Facebook’s Internal Chat Boards Show Politics Often at Center of Decision Making

In June 2020, when America was rocked by protests over the death of George Floyd at the hands of a Minneapolis police officer, a Facebook employee posted a message on the company’s racial-justice chat board: “Get Breitbart out of News Tab.”

News Tab is a feature that aggregates and promotes articles from various publishers, chosen by Facebook. The employee’s message included screenshots of headlines on Breitbart’s website, such as “Minneapolis Mayhem: Riots in Masks,” “Massive Looting, Buildings in Flames, Bonfires!” and “BLM Protesters Pummel Police Cars on 101.”

The employee said they were “emblematic of a concerted effort at Breitbart and similarly hyperpartisan sources (none of which belong in News Tab) to paint Black Americans and Black-led movements in a very negative way,” according to written conversations on Facebook’s office communication system reviewed by The Wall Street Journal. Many other employees chimed in to agree.

In the same chat, a company researcher said any steps aimed at removing Breitbart—a right-wing publisher popular with supporters of former President Donald Trump—could face roadblocks internally because of the potential political blowback. “At best, it would be a very difficult policy discussion,” the researcher said.

Facebook chose to keep Breitbart on News Tab. A spokeswoman for the tech giant said the company makes a judgment based on the specific content published on Facebook, not the entire Breitbart site, and that the Facebook material met its requirements, including the need to abide by its rules against misinformation and hate speech.

Many Republicans, from Mr. Trump down, say Facebook discriminates against conservatives. The documents reviewed by the Journal didn’t render a verdict on whether bias influences its decisions overall. They do show that employees and their bosses have hotly debated whether and how to restrain right-wing publishers, with more-senior employees often providing a check on agitation from the rank and file. The documents viewed by the Journal, which don’t capture all of the employee messaging, didn’t mention equivalent debates over left-wing publications.

Other documents also reveal that Facebook’s management team has been so intently focused on avoiding charges of bias that it regularly places political considerations at the center of its decision making.

Facebook employees, as seen in a large quantity of internal message-board conversations, have agitated consistently for the company to act against far-right sites. In many cases, they have framed their arguments around Facebook’s enforcement of its own rules, alleging that Facebook is giving the right-wing publishers a pass to avoid PR blowback. As one employee put it in an internal communication: “We’re scared of political backlash if we enforce our policies without exemptions.”

A protest against police brutality and racism in June 2020 in Minneapolis. Photo: Kerem Yucel/Agence France-Presse/Getty Images

Facebook employees focused special attention on Breitbart, the documents show, criticizing Facebook for showcasing the site’s content in News Tab and for helping it to sell ads. They also alleged Facebook gave special treatment to Breitbart and other conservative publishers, helping them skirt penalties for circulating misinformation or hate speech.

Right-wing sites are consistently among the best-performing publishers on the platform in terms of engagement, according to data from research firm NewsWhip. That is one reason Facebook also is criticized by people on the left, who say Facebook’s algorithms reward far-right content.

Facebook says it enforces its rules equally and doesn’t consider politics in its decision making.

“We make changes to reduce problematic or low-quality content to improve people’s experiences on the platform, not because of a page’s political point of view,” said Facebook spokesman Andy Stone. “When it comes to changes that will impact public pages like publishers, of course we analyze the effect of the proposed change before we make it.”

The Facebook Files is a Wall Street Journal project based on reporting that includes a cache of documents and data reviewed earlier this year. The internal communications offer an unprecedented look at Facebook’s struggles to manage the products and systems at the heart of its business success.

Facebook is one of the most important outlets for publishers, with more than a third of Americans saying they regularly get their news from the platform, according to Pew Research Center.

In May 2016, the tech blog Gizmodo reported that Facebook’s “Trending Topics” list routinely suppressed conservative news. Facebook denied the allegations, but the ensuing controversy prompted claims of bias from Republicans that haven’t let up.

Some internal documents show employee antipathy toward conservative media. In 2018, an engineer who had claimed on a message board that Facebook was intolerant of conservatives left the company. When he took his critique to Tucker Carlson’s Fox News show, some Facebook employees criticized him for going on a network “so infamous and biased it can’t even call itself a news channel,” records from the message boards show. Various employees called Mr. Carlson a “white nationalist” and “partisan hack” who “looks as though he’s a Golden Retriever who has been consistently cheated out of a cache of treats.”

“Any dog comparison is a compliment as far as I’m concerned,” Mr. Carlson said in an interview.

Fox News declined to comment. Fox Corp. and Wall Street Journal parent News Corp share common ownership.

In many of the documents reviewed by the Journal, employees discussed whether Facebook was enforcing its rules evenly across the political spectrum. They said the company was allowing conservative sites to skirt its fact-checking rules, publish untrustworthy and offensive content and harm the tech giant’s relationship with advertisers, according to records from internal Facebook message boards.

‘Special exceptions’

In a farewell memo to colleagues in late 2020, a staffer in Facebook’s integrity team, which seeks to mitigate harmful behavior on the platform, said Breitbart was undermining the company’s efforts to fight hate speech.

“We make special exceptions to our written policies for them, and we even explicitly endorse them by including them as trusted partners in our core products,” the staffer said of Breitbart.

Ranking Trust

A study by Facebook researchers found that Breitbart was the least trusted news source, and also ranked as low quality, among several dozen it looked at across the U.S. and Great Britain.

[Chart: Trust ratings from a user survey plotted against Facebook’s internal trust rating; Breitbart is ranked least trusted of all publishers.]

Breitbart was included in News Tab, which was launched in 2019. The product contains a main tier with curated news from publishers including The Wall Street Journal, New York Times and Washington Post, which are paid for their content. Breitbart is part of a second tier designed to deliver news tailored to a user’s interests; publishers in that tier aren’t paid.

Facebook said it requires sites included on News Tab to focus on quality news reporting and bars those that repeatedly share what it deems misinformation or violate its public list of community standards.

Asked about the inclusion of Breitbart, Facebook Chief Executive Mark Zuckerberg said in an interview at the time of the launch that the aim was for News Tab to have a diversity of perspectives.

As the May 25 killing of Mr. Floyd inflamed political tensions across the country in 2020, one staffer wrote in the racial-justice chat that he understood “factual progressive and conservative leaning news organizations” both needed to be represented, but that could be done without including Breitbart.

A senior researcher wrote in the chat that it would be a problem for Facebook to remove Breitbart from News Tab for the way it framed news events, such as the protests after Mr. Floyd’s death, because “news framing is not a standard by which we approach journalistic integrity.”

He said if the company removed publishers whose trust and quality scores were going down, Breitbart might be caught in that net. But he questioned whether the company would do that for all publishers whose scores had fallen. “I can also tell you that we saw drops in trust in CNN 2 years ago: would we take the same approach for them too?” he wrote.

Breitbart, he wrote, had been hurt by algorithm changes that favored content considered trustworthy; those changes were defensible within Facebook because they applied to all publishers and could be tied to a clear goal of improving user experience.

An August 2019 study by Facebook researchers found that Breitbart was the least trusted news source of the several dozen it looked at across the U.S. and Great Britain, according to a chart from the study reviewed by the Journal. The study, which also ranked news sources by quality, classified Breitbart as “low-quality.”

A Breitbart spokeswoman said the company’s content was far more accurate and more popular with Facebook’s own users than the mainstream news media competitors that Facebook pays for content.

Demonstrators in support of Donald Trump and Breitbart in March 2017 in Huntington Beach, Calif. Photo: Eugene Garcia/EPA/Shutterstock

Facebook’s relationship with Breitbart has also come under fire from advertisers and the employees who work on ad sales. In 2018, one employee working on the Facebook Audience Network, a group of third-party publishers for whom Facebook sells advertising, argued that Facebook should drop Breitbart from the network.

“My argument is that allowing Breitbart to monetize through us is, in fact, a political statement,” the person wrote in an internal memo. “It’s an acceptance of extreme, hateful and often false news used to propagate fear, racism and bigotry.”

After the 2016 election, advertisers started looking to avoid Breitbart, which delighted in provoking the left with anti-PC rhetoric and nationalism that critics called racist. In the automated ad system, even if an advertiser didn’t specifically seek to advertise on Breitbart, its ads could appear there.

Many advertisers sought to ensure their ads didn’t appear on Breitbart by taking advantage of a Facebook Audience Network feature that allowed them to block specific websites, the employee wrote, but the tactic wasn’t proving effective.

“Breitbart tries to work around every control that we put in place, so we have to block at the platform level,” the employee quoted an unnamed advertiser as saying, conveying the client’s dissatisfaction.

A director of product management responded. “On a personal level, all you say resonates with me,” the director wrote. “That being said, and most importantly, we have to rely on our principles and policies when making decisions.”

The Breitbart spokeswoman said it was false that the site worked around those controls.

Breitbart remained in Facebook Audience Network until the spring of 2020, when all mobile web publishers were removed.

Targeting ‘hyperposters’

Facebook took steps to damp the spread of what it deemed misinformation in users’ feeds after the 2016 election. That included a tool called “Sparing Sharing,” which targeted “hyperposters,” or accounts that post very frequently. It reduced the reach of their posts, since data had shown these users disproportionately shared false and incendiary information.
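
The documents describe the tool only at this level of generality. As a minimal illustrative sketch, assuming an invented posting-rate threshold, falloff and function names (none of them from Facebook’s actual code), a ranking system might demote hyperposters like this:

# Hypothetical sketch of a "Sparing Sharing"-style demotion.
# The threshold, floor and falloff are invented for illustration;
# the article describes only the general idea, not real code.

def demotion_factor(posts_per_day: float,
                    threshold: float = 25.0,
                    floor: float = 0.2) -> float:
    """Scale a post's ranking score down as its author's posting rate
    climbs past a threshold; extreme hyperposters bottom out at `floor`."""
    if posts_per_day <= threshold:
        return 1.0
    # Linear falloff: twice the threshold maps to the floor.
    excess = (posts_per_day - threshold) / threshold
    return max(floor, 1.0 - excess * (1.0 - floor))

def rank_score(base_score: float, author_posts_per_day: float) -> float:
    """Apply the hyperposter demotion to a post's base ranking score."""
    return base_score * demotion_factor(author_posts_per_day)

print(rank_score(100.0, 5.0))   # 100.0 -- a typical account is unaffected
print(rank_score(100.0, 50.0))  # 20.0 -- a hyperposter is heavily demoted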

Facebook implemented the change even though Joel Kaplan, Facebook’s global head of public policy and a former deputy chief of staff to former President George W. Bush, had argued against rolling out the initiative too aggressively. Mr. Zuckerberg approved the change but ordered that its effects be weakened.

Another tool, called “Informed Engagement,” reduced the reach of posts that people were more likely to share if they hadn’t read them.
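
Here too the documents give only the general idea. A companion sketch, again with invented numbers and names, might compute a “blind share” rate and scale distribution down when most sharers never opened the link:

# Hypothetical sketch of an "Informed Engagement"-style signal.
# The cutoff and penalty are invented; the article says only that
# posts often shared without being read had their reach reduced.

def blind_share_rate(shares_without_click: int, total_shares: int) -> float:
    """Fraction of shares where the sharer never opened the article."""
    return shares_without_click / total_shares if total_shares else 0.0

def reach_multiplier(rate: float, cutoff: float = 0.6,
                     penalty: float = 0.5) -> float:
    """Halve distribution for posts mostly shared unread."""
    return penalty if rate > cutoff else 1.0

# 800 of 1,000 shares without a click would halve distribution
# under these invented numbers.
print(reach_multiplier(blind_share_rate(800, 1000)))  # 0.5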

The two tweaks successfully shifted the news stories users were likely to see toward a more mainstream, less volatile mix.

In 2019, Facebook data scientists studied the impact of the two tools on dozens of publishers based on their ideologies, according to the documents reviewed by the Journal.

The study, dubbed a “political ideology analysis,” suggested the company had been suppressing the traffic of major far-right publishers, even though that wasn’t its intent, according to the documents. “Very conservative” sites, it found, would benefit the most if the tools were removed, with Breitbart’s traffic increasing an estimated 20%, Washington Times’ 18%, Western Journal’s 16% and Epoch Times’ by 11%, according to the documents.

Political Test

When Facebook analyzed its ‘Sparing Sharing’ and ‘Informed Engagement’ tools, which were designed to reduce what it deemed misinformation, it found the two had a greater impact on far-right publishers.

[Chart: Likely increase in audience without the tools in place, ranging from 0% to 35%, by political ideology as classified by Facebook; circles sized by publisher audience size.]

Note: Audience size is based on Facebook internal metric VPV, or Viewport Views, which measures how many times content is seen by users.
Source: Internal document titled ‘Sparing Sharing + Informed Engagement Removal Political Ideology Analysis’

The study was designed to prepare Facebook for any fallout that might come from stopping the two initiatives, including accusations of bias, according to the documents and people familiar with the matter. “We could face significant backlash for having ‘experimented’ with distribution at the expense of conservative publishers,” one of the researchers wrote in an internal memo.

The company stopped the Informed Engagement program but kept Sparing Sharing.

Mr. Kaplan ordered more studies analyzing how enforcement efforts were implemented for different ideologies as Facebook increasingly faced the charge that it was suppressing conservative voices, one of the people familiar with the matter said.

“It is not news to us that Facebook has effectively suppressed our content,” the Breitbart spokeswoman said. “Still, we’ve been crushing our establishment news competitors in engagement for years, so imagine what would happen if Facebook treated Breitbart equally with other top news publishers.”

List of examples

In 2020, a Facebook engineer compiled a list of examples he said were evidence that Facebook routinely declines to enforce its own content-moderation rules for big far-right publishers like Breitbart, Charlie Kirk, PragerU and Diamond and Silk, according to the documents.

The controversy had escalated in July 2020, when Mr. Trump tweeted a Breitbart video claiming “you don’t need a mask” against Covid-19 and there was a cure for it that included the antimalarial drug hydroxychloroquine. The video, which featured a live news conference, was seen millions of times before Breitbart and social-media platforms including Facebook took it down.

According to Facebook’s fact-checking rules, pages can be punished if they acquire too many “strikes”—meaning they published content deemed false by third-party fact-checkers. It requires two strikes within 90 days to be deemed a “repeat offender,” which can result in a user being suspended from posting content. More strikes can lead to reductions in distribution and advertising revenue.
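
As a rough illustration of that rule as the article states it, the sketch below counts strikes in a trailing 90-day window; the function names and data structures are assumptions, not Facebook’s implementation:

# Hypothetical sketch of the repeat-offender rule described above:
# two fact-check strikes within 90 days. Names and structures are
# invented for illustration.
from datetime import date, timedelta

REPEAT_OFFENDER_STRIKES = 2
WINDOW = timedelta(days=90)

def strikes_in_window(strike_dates: list[date], today: date) -> int:
    """Count strikes that fall inside the trailing 90-day window."""
    return sum(1 for d in strike_dates if today - d <= WINDOW)

def is_repeat_offender(strike_dates: list[date], today: date) -> bool:
    """Two or more strikes in 90 days can trigger a posting suspension;
    further strikes can reduce distribution and ad revenue."""
    return strikes_in_window(strike_dates, today) >= REPEAT_OFFENDER_STRIKES

# One strike 80 days ago plus one today crosses the threshold.
today = date(2020, 8, 1)
print(is_repeat_offender([today - timedelta(days=80), today], today))  # True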

In a town hall, Mr. Zuckerberg said Breitbart wasn’t punished for the video because that was its only infraction in a 90-day period, according to internal chats describing the meeting.

According to the engineer’s list of examples, the content producers were “managed partners,” part of a program in which Facebook assigns internal handlers to prominent users. A side benefit for these users, the engineer alleged, was that their liaison at Facebook helped them avoid punishment over fact-checking strikes, according to the documents.

A strike would be escalated for review by senior Facebook executives, including the policy and public-relations teams, who would consider whether to overturn the punishment.

A Facebook spokesman said such inquiries come from entities on both sides of the political spectrum, as well as mainstream news organizations.

In an internal memo, the engineer said he based his assessment in part on a queue of three dozen escalations that he had stumbled onto, the vast majority of which were on behalf of conservative content producers. A summary of the engineer’s findings was posted to an internal message board.

Social-media influencers Lynnette ‘Diamond’ Hardaway, center, and Rochelle ‘Silk’ Richardson, right, spoke with Mr. Trump at a rally in March 2020. Photo: Al Drago/Bloomberg News

In one case he cited, regarding pro-Trump influencers Diamond and Silk, third-party fact-checkers rated as “false” a post on their page that said, “How the hell is allocating 25 million dollars in order to give a raised [sic] to house members, that don’t give a damn about Americans, going to help stimulate America’s economy?” After that rating, a Facebook staffer involved in the partner program argued there should be no punishment, noting the publisher “has not hesitated going public about their concerns around alleged [anti-]conservative bias on Facebook.”

Diamond and Silk was able to lobby the third-party fact-checker to lower the rating to “Partly False,” and, with the help of the managed-partner escalation process, all its strikes were removed, according to the posted summary and escalation documents.

BuzzFeed previously reported on the memo summarizing the engineer’s examples.

A Facebook spokesman said that the employee who flagged Diamond and Silk’s willingness to complain about bias was just gathering information to pass up to decision makers, not arguing what to do about the incident.

The chat conversations the Journal reviewed show that inside the company, Facebook employees demanded that higher-ups explain the allegations.

“We are apparently providing hate-speech-policy-consulting and consequence-mitigation services to select partners,” wrote one. “Leadership is scared of being accused of bias,” wrote another.

Facebook executives dropped into the chat to explain fact-checking policies and how the managed partner program worked but didn’t address the questions about bias, according to the chat records.

Write to Keach Hagey at [email protected] and Jeff Horwitz at [email protected]

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.
