In the run-up to the 2020 US presidential election, arguably the most competitive in US history, the most popular Facebook pages for Christian and African-American content were run by Eastern European troll farms. These pages were part of a larger network that, according to an internal company report, reached nearly half of all Americans. Particularly frightening: this reach was achieved not through user choices, but primarily through Facebook’s own platform design and its algorithms’ hunger for so-called engagement – liking, sharing and commenting.
The report, written back in October 2019 and made available to MIT Technology Review by a former Facebook employee who was not involved in its creation, states that after the 2016 election – in which Donald Trump became president – the social network failed to prioritize fundamental changes to the way its platform processes and disseminates information. Instead, the company pursued a whack-a-mole strategy: monitoring the activities of problematic actors and stopping them only once they had already taken part in political discourse, while introducing a few virtual guardrails to prevent “the worst of the worst”.
Troll farms still achieved great reach
But this approach did little to contain the real problem, the report says. Troll farms continued to build huge audiences by networking Facebook pages, reaching 140 million US users monthly with their content – 75 percent of whom, astonishingly, had never followed any of the pages. They saw the content because Facebook’s content recommendation system pushed it into their news feeds.
“Rather than users choosing to get content from these actors, it is our platform that is choosing to give [these troll farms] a tremendous reach,” writes the report’s author, Jeff Allen, a former senior data scientist at Facebook. Joe Osborne, a spokesman for Facebook, said in a statement that at the time of Allen’s report, the company “had already been investigating these issues.” “Since then, we have formed teams, developed new guidelines, and worked with industry peers to address these problematic networks.” “Aggressive enforcement measures” have been taken against these kinds of domestic and foreign “inauthentic groups,” with results reported regularly in quarterly reports.
But it’s not that simple. In reviewing these statements shortly before publication, MIT Technology Review found that five of the troll farm pages mentioned in the report are still active. The largest troll farm page targeting African Americans in October 2019 also remains active on Facebook. The report found that the “problematic actors” are reaching the same demographic groups targeted by the Kremlin-backed Internet Research Agency (IRA) during the 2016 US election – namely Christians, Black Americans and members of indigenous groups. A 2018 investigation by BuzzFeed News revealed that at least one member of the Russian IRA charged with alleged meddling in the 2016 US election had also visited North Macedonia – a country known for its troll farms – though no concrete evidence of a connection was found. (Facebook said its investigations had likewise found no link between the IRA and the North Macedonian troll farms.)
Bosses ignored the evidence
“This is not normal. This is not healthy,” wrote Allen. “We gave fake actors the opportunity to amass huge numbers of followers for largely unknown purposes.” Allen produced the report as the fourth and final installment of a year-and-a-half effort to understand troll farms. He left the company later that month, in part out of frustration that management had “effectively ignored” his work, the former Facebook employee said. Allen declined to comment.
The report reveals the alarming state in which Facebook’s leadership has left the platform for years. The US edition of MIT Technology Review is making the full report available as a PDF, with employee names redacted, because it is in the public interest.
The key revelations include:
- In October 2019, around 15,000 Facebook pages with a mostly US audience were being run out of Kosovo and North Macedonia, regions that became known for such “problematic actors” during the 2016 election.
- Collectively, these troll farm pages – treated as a single entity in the report for comparison purposes – reached 140 million US users monthly and 360 million users weekly worldwide. For comparison: the page of the supermarket chain Walmart reached the second-largest US audience, with 100 million.
- Together, the troll farm pages formed, among other things, the largest Christian-American page on Facebook – 20 times larger than the next largest – with a monthly reach of 75 million US users, 95 percent of whom had never followed any of the pages. They also ran the largest African-American page on Facebook, three times the size of the next largest, reaching 30 million US users monthly, 85 percent of whom had never followed any of the pages. Their second-largest page of its kind for indigenous users reached 400,000 users monthly, 90 percent of whom had not previously followed it. Among pages aimed at women, the troll farms ranked fifth largest, with 60 million US users per month; here, too, 90 percent of users had not previously followed the page.
- The troll farms primarily target the US, but also reach the United Kingdom, Australia, India, and countries in Central and South America.
- Facebook has conducted several studies confirming that content receiving a lot of engagement from users (likes, comments, and shares) is more likely to be of a kind known to be problematic. Still, the company has continued to rank content in users’ news feeds by engagement.
- Facebook prohibits pages from posting content that has merely been copied from elsewhere on the platform, but does not enforce this policy against known problematic actors. This makes it easy for foreign actors who do not speak the local language to post entirely stolen content and still reach a large audience. At one point, as many as 40 percent of views of US pages went to pages with largely unoriginal or minimally original content.
- Troll farms have also found their way into Facebook’s “Instant Articles” and “Ad Breaks” partnership programs, which are supposed to help real news organizations and publishers monetize their articles and videos. Due to a lack of basic quality controls, up to 60 percent of Instant Article reads went to content that had been plagiarized from other sites. This made it easy for troll farms to slip into the debate unnoticed – and even receive payments from Facebook for doing so.
Facebook rewards bad content
The report looks specifically at troll farms in Kosovo and North Macedonia run by people who do not necessarily understand American politics. However, due to the way in which Facebook’s “reward systems” are designed for the news feed, they could still have a significant impact on political discourse. In the report, Allen gives three reasons why these sites can reach such a large audience. First, Facebook does not penalize pages for publishing completely copied content. If something has gone viral before, it will likely go viral again when posted a second time.
Second, Facebook’s algorithm pushes “engaging” content from pages to people who do not follow them: if users’ friends comment on or share a post from one of these pages, those users will also see the content in their own news feeds.
Third, Facebook’s ranking system pushes content with a lot of engagement further up in users’ news feeds. In most cases, troll farm operators appear to have financial rather than political motives; they post whatever gets the most engagement, regardless of the actual content. But since misinformation, clickbait and politically divisive content tend to receive high engagement (as confirmed by Facebook’s own internal analyses), troll farms gravitate toward posting more of it over time, the report says.
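The dynamic Allen describes – engagement-based ranking surfacing recycled viral content above original material – can be illustrated with a minimal sketch. This is purely illustrative: the scoring function, weights and post data below are hypothetical and are not Facebook’s actual algorithm.

```python
# Illustrative sketch of an engagement-based feed ranker, as described
# in the report. All names and weights here are hypothetical; this is
# not Facebook's actual ranking system.

def engagement_score(post):
    # Rank purely on engagement signals -- likes, comments, shares --
    # with no regard for originality or accuracy, which is the core
    # problem the report identifies.
    return post["likes"] + 2 * post["comments"] + 3 * post["shares"]

def rank_feed(posts):
    # Highest-engagement content rises to the top of the feed,
    # even when it is copied from elsewhere on the platform.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "original_reporting", "likes": 120, "comments": 10, "shares": 5},
    {"id": "recycled_viral_meme", "likes": 900, "comments": 300, "shares": 400},
]

feed = rank_feed(posts)
print([p["id"] for p in feed])
# -> ['recycled_viral_meme', 'original_reporting']
```

Under such a scheme, a page that simply reposts previously viral material always outscores a page producing original work – which is why, absent enforcement against copied content, the copying strategy wins.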
As a result, as of October 2019, all 15 of the top 15 Facebook pages targeting Christian Americans, 10 of the top 15 pages targeting Black Americans, and four of the top 12 pages targeting indigenous Americans were run by troll farms. A frightening result.
“Our platform has given the biggest voice in the Christian American community to a handful of bad actors who, based on their content production practices, have never been to church,” wrote Allen. “And our platform has given the biggest voice in the African American community to a handful of bad actors who, based on their content production practices, have never had an interaction with an African American.”
(bsc)