Facebook bans deepfake videos ahead of the 2020 US election

Facebook came under fire in the wake of the Cambridge Analytica scandal, in which user accounts were found to have been targeted and manipulated in a bid to interfere with the 2016 US Presidential election. Now, ahead of this year’s election, Facebook has issued a sweeping ban on deepfake videos.

“Deepfakes” – videos in which one person’s face and voice are superimposed onto another with convincing effect, thanks to deep learning and artificial intelligence – have caused concern in media circles, as they can be used to falsely attribute statements to politically exposed figures.

The move is likely the first of many Facebook will take as it sets up its ‘war room’ to make sure that its platform isn’t used to influence the outcome of this year’s US election. The company has confirmed that it will be actively monitoring its core platform for any signs of political interference.

The firm has come under increasing scrutiny in recent months from not only the general public but also the US government, with company CEO Mark Zuckerberg testifying before Congress on more than one occasion.

Late last year, Twitter made its own move by banning all paid political advertising on its core platform in a bid to prevent interference – whether the company plans to adopt a similar stance on deepfake videos remains to be seen.

What are your thoughts? What other steps could Facebook take in order to safeguard voters – and its platform – from electoral interference? Be sure to let us know your opinion in the comments below.