
In the weeks leading up to the 2016 U.S. election, Facebook shut down 5.8 million fake accounts in the United States, according to a source familiar with the prepared testimony Facebook will deliver to the Senate Judiciary Committee Tuesday afternoon.

The accounts were removed in October 2016, but Facebook’s testimony will also acknowledge that the company’s automated tooling did not yet reflect today’s understanding that masses of fake accounts were being used to advance social and political causes, according to the source. The company credits updated tooling geared toward detecting politically and socially focused accounts for the 30,000 accounts it disabled before the French election and the tens of thousands of others removed before the German election.

CBS News correspondent Nancy Cordes reported that the company will also disclose that over the course of three years, approximately 29 million people were “delivered” or “served” content from pages related to the Internet Research Agency (IRA) – a Russian troll farm associated with fake campaign ads.

The company believes as many as 126 million people could have seen that content over the course of three years. The company will likely testify that such content represented just one out of every 23,000 posts the average user saw.

Before the November 8, 2016 election, the company’s security team also alerted U.S. law enforcement officials to indications that the group APT28 had targeted employees of American political parties. U.S. law enforcement believes APT28 is connected with Russian intelligence operations.

The testimony is the latest in a series of revelations related to fake users and flawed metrics on the social network. 

In September, researchers exposed a security loophole that allowed at least a million Facebook accounts, both real and fake, to generate at least 100 million “likes” and comments as part of “a thriving ecosystem of large-scale reputation manipulation.” CBS News tested the network and confirmed its ability to quickly generate likes. Facebook posts that quickly receive many likes are more likely to be placed higher in other users’ feeds, meaning accounts buoyed by fake likes can ultimately generate significantly more real attention and influence.

The Wall Street Journal first reported details of the October 2016 account purge. In April of this year, Facebook conducted another purge, removing tens of thousands of fake accounts that had liked media pages as part of a wider strategy to appear real while spamming users, and it removed 30,000 accounts — allegedly tied to Russian influence operations — in the run-up to the French national election in May.

In 2016, the company announced that it had undercounted the traffic of some publishers and, for more than a year, over-reported time spent on Facebook’s Instant Articles platform. It acknowledged issues affecting a range of metrics – including ad reach, streaming reactions, and likes and shares – and admitted that for two years it had reported overestimated figures to advertisers for the average time users spent watching videos on its platform.

The disclosures in 2016 led to a putative class action lawsuit, filed by a Facebook investor in January.

Got news tips about digital privacy, social media or online marketing? Email this reporter at, or for encrypted messaging, (PGP fingerprint: 4b97 34aa d2c0 a35d a498 3cea 6279 22f8 eee8 4e24).

© 2017 CBS Interactive Inc. All Rights Reserved.