Distorting Reality: Deepfakes and the Rise of Deception

Emerging technologies are making it easier to manipulate information by creating realistic images, videos and audio that make a subject appear to say something they never actually said. Given our tendency to believe what we see, it is easy to understand the risks inherent in the misuse of such technology.

The New Zealand Law Foundation has funded a report, "Perception inception: Preparing for deepfakes and the synthetic media of tomorrow", which looks at the wide-ranging social, legal and policy issues around such technology. The full report can be read here.

While misinformation in the media is nothing new, technology's escalating sophistication makes manipulated videos (dubbed "deepfakes") ever more realistic and increasingly resistant to detection. Add to the mix the power of social media, where fake news can spread without any accountability, and the potential for harm becomes clear.

The term "deepfakes" first emerged in 2017, when a Reddit user called "deepfakes" published a series of fake pornographic videos featuring celebrities' faces superimposed on the bodies of the actresses. Superimposing a celebrity's face onto pornography was, of course, nothing new. The videos stood out because they marked the first instance in which high-quality, convincingly fake videos could be created by an individual, at home on their computer, in a couple of hours.

The New Zealand Law Foundation Report recognises the rapidly evolving technology in this space. It is no longer a case of fake images created in Photoshop. Machine learning technology is capable of studying the movement of a person's face and associating it, through a trained algorithm, with a word, phrase, attitude or feeling. With the assistance of advanced graphics technology, once sufficient learning has been achieved, an almost life-like effect can be attained.

It is this sophistication that has legislators the world over examining their own regulatory landscapes and how best to manage the risks associated with such technology. In the United States, policymakers are riding a wave of concern and introducing deepfake-specific legislation. In New Zealand, however, the Report warns against getting carried away.

The Report recognises that the threat of deception posed by synthetic media technologies is not a new one, as all digital media entails some degree of synthesis or manipulation. The authors recommend caution in developing any substantial new law without first understanding the complex interaction of existing legal regimes. The Report points to a number of laws already capable of dealing with many of the posed risks, including the Crimes Act, the Privacy Act and the Harmful Digital Communications Act. Rather than developing new legislation, the Report recommends that, where new law is necessary, nuanced amendments to existing statutes should suffice. The Report also sets out actions that agencies responsible for the existing laws could take, such as defining, and providing certainty to the public around, those agencies' responsibilities under the relevant laws.

In outlining the existing laws already capable of dealing with the emerging risks, the Report also highlights the range of considerations relevant to individuals or agencies that may be generating or disseminating synthetic media, or the technologies used to create it. While deepfake technology may be used playfully or as satire, individuals and agencies should take care to ensure their activities fall within the limits already imposed by existing law.

If you would like to understand more about the laws and regulations relevant to the use of synthetic media and deepfake technology, please get in touch with our technology law experts.

Business Law Team

Gerard Dale, Claire Evans, Graeme Crombie, Evelyn Jones, Anna Ryan, Joelle Grace, Nicola Hardy, Peter Orpin, Ellen Sewell, Matt Tolan, Kristina Sutherland, Caroline Cross, Jacob Nutt, Danita Ferreira, Whitney Moore, Alex Stone, Ben Cooper, Lisa Catto


Contact

Graeme Crombie
Partner, Lane Neave

t +64 3 372 6392
m +64 21 634 849

Whitney Moore
Senior Solicitor, Lane Neave

t +64 3 372 6376