Now & Next: The danger of deepfakes

Posted on 24 October 2019

Videos can now be faked to a hyper-realistic standard. What does this mean for democracy, and how can we combat the spread of misinformation? Can you spot all the deepfake interviews in the film?

Could deepfakes weaken democracy?

The Economist

Supported by Mishcon de Reya

Democracy is easy.  It’s like stealing ice cream from a baby.

I genuinely love the process of manipulating people online for money.

Zuckerberg: We just want to predict your future behaviours.

These videos are all deepfakes.  Synthesised content created using artificial intelligence.

Fake, fake, disgusting news.

Deepfakes will make for even more complicated arguments about what is fake news and what is real. And if seeing is no longer believing, the very real question is: could deepfakes weaken democracy?

Aviv Ovadya

Democracy just doesn’t work if people don’t believe in it.

NOW&NEXT

Faking the future

Bill Posters

So the deepfake artworks used artificial intelligence and machine learning technologies to kind of hack the bodies, if you like, of famous celebrity influencers.

Bill Posters is the artist behind these deepfake videos known as the Spectre Project.

Spectre is almost too powerful to comprehend.

Bill Posters

Two of the main questions we wanted to explore with the Spectre Project are: what does it feel like when our personal data is used in unexpected ways by powerful tech companies, and how, as a result, can that change our understandings of today?

To test Facebook’s response, Bill posted the deepfake videos on Instagram, a social media platform owned by Facebook.  The company downgraded the videos’ visibility.

Zuckerberg: Spectre showed me how to manipulate you into sharing intimate data about yourself and all those you love for free.

But that didn’t stop this fake clip of Facebook boss Mark Zuckerberg going viral. It showed the potential for spreading disinformation online through deepfakes, a danger that is likely to increase as long as tech companies and politicians remain unsure how to deal with it.

Bill Posters

The power of deepfakes is an area of great concern whilst these technologies exist in what is essentially a regulatory black hole.

Image manipulation is already exploited by autocratic regimes. It is a dark art that goes back to Joseph Stalin, who made his enemies disappear. AI today is capable of making deepfake videos like this, where comedian Bill Hader morphs into Tom Cruise. As the technology advances, the danger is that deepfakes will be used to mislead voters in democratic countries.

Aviv Ovadya

If you take away those tools that enable us to be able to sort out what’s real from what’s not, you make very poor decisions.

Aviv Ovadya is the founder of the Thoughtful Technology Project. He worries about another problem: that deepfakes could be used as an excuse to help politicians escape scrutiny.

Aviv Ovadya

You have the corrupt politician being able to say “oh yeah that video of me – that was fake”.  That brings us into a world where people won’t know what they can trust.

He believes the ultimate threat from deepfakes could be that more and more people opt out of democratic politics; a phenomenon he calls “reality apathy”.

Aviv Ovadya

Reality apathy is when it is so hard to make sense of what’s happening.  People just sort of give up.  Democracy just doesn’t work if people don’t believe in it.

So what can be done to fight back?  A group of scientists at Cambridge University are having a go.  They have developed a computer game to teach people how to spot disinformation.

Dr Sander van der Linden

So in the game people essentially step into the shoes of a fake news producer, and you build your way up to a fake news empire by spreading fake content online.

Dr Sander van der Linden, the game’s designer, believes it will help people to distinguish fact from fiction.

Dr Sander van der Linden

So your goal is to get as many followers as possible while maintaining your online credibility, so you can’t be too ridiculous. And the first badge in the game is about impersonating other people online, and of course one example that we’ve talked about is deepfakes … So in the game we test people before and after, and at the beginning we found that people are duped by a lot of these techniques, but once they’ve played the game they become resistant and are able to identify them later on.

Dr van der Linden’s team have drawn inspiration from preventative medicine in their hunt for a cure for fake news.

Dr Sander van der Linden

So just as you inject someone with a severely weakened dose of a virus to trigger antibodies in the immune system, you can do the same with information. People can essentially create mental antibodies and become immune to fake news, and essentially everyone is their own bullshit detector.

Today I am President, not because I’m the greatest, though probably I am.

Deepfake technology means that faking videos is becoming as easy as faking words and photos. Until people learn to look at video with a more critical eye, there is a danger that deepfakes could be used to undermine democracy.

One of the interviews in this film was a deepfake.

But which one?

NOW & NEXT

Supported By

Mishcon de Reya

The Mishcon Academy offers outstanding legal, leadership and skills development for legal professionals, business leaders and individuals. Our learning experts create industry-leading experiences that deliver long-lasting change through live events, courses and bespoke learning.
