U.K. Challenges Facebook Over Lies In Political Ads

Facebook’s controversial decision to allow “deceptive, false or misleading content” in political ads is being challenged in the U.K.

Damian Collins, chair of the Digital, Culture, Media and Sport (DCMS) Committee, has written to Facebook’s head of communications Sir Nick Clegg—a former deputy prime minister—demanding an explanation for the change.

The new policy sees the company dropping a ban on political ads with “deceptive, false or misleading content” and instead only banning ads that “include claims debunked by third-party fact-checkers.” In the U.K., this means content checked by Facebook partner Full Fact.

It’s already been challenged in the U.S., where Senator Elizabeth Warren has been taking out ads of her own claiming it gives “Donald Trump free rein to lie on his platform—and pay Facebook gobs of money to push out their lies to American voters.”

The change was justified by Facebook’s vice president of policy solutions, Richard Allan, in an article for the Daily Telegraph, in which he argues that it is not Facebook’s place to police what politicians say on the platform: that responsibility lies with the public.

“We do not believe it should be our role to fact check or judge the veracity of what politicians say—not least since political speech is heavily scrutinized by the media and our democratic processes,” he wrote.

However, Collins warns that the change will weaken Facebook’s efforts to combat disinformation online before elections, particularly with a possible U.K. general election looming.

He points to Volume 2 of the U.S. Senate Select Committee on Intelligence’s report on Russia’s use of social media.

“The report notes that Facebook provided the Committee with ‘information related to a number of [Internet Research Agency] IRA affiliated social media accounts, including advertisements purchased in connection with those accounts,’ as well as approximately 61,000 posts connected to 133 Instagram accounts,” he writes.

“The change in policy will absolve Facebook from the responsibility of identifying and tackling the widespread content of bad actors, such as Russia’s Internet Research Agency.”

Collins also cites Facebook’s former head of global elections integrity ops, Yael Eisenstat, who recently claimed that when she recommended scanning adverts to detect misinformation efforts, she was supported by engineers but ignored by upper management.

He asks what her proposals were, whether they were determined to be feasible and on what grounds they were rejected.