About this project.
Perception Inception is a research project funded by the New Zealand Law Foundation’s Information Law and Policy Project, and affiliated with the Centre for Law and Policy in Emerging Technologies at the University of Otago Faculty of Law.
The project examines ethical, legal and social questions arising from state-of-the-art emerging technologies. These technologies are capable of producing audio-visual information in ways we have not seen before, and of kinds we are unfamiliar with.
A challenge of the project is drawing a line around the subject. New Zealanders consume huge amounts of audio-visual information, and much of it would not exist without some degree of augmentation. In fact, people actively seek out augmented and manipulated information in many contexts: from digital and audio effects in the film and gaming industries, to advertising products and services, to homemade music production, to Instagram filters, to deepfake technologies.
We’re frequently asked about “fake news”, artificial intelligence and cybersecurity. Our project will touch on these topics, but always in the context of asking, “How can we rely on audio-visual information?”
Our response to these developments has to consider the needs of New Zealanders as creators, citizens and consumers. We will be reaching out to industry, policy, legal and creative communities. Please don’t hesitate to get in touch.
A video emerges of Prime Minister Jacinda Ardern. It's a short clip, but it appears to depict her speaking with a radical activist associated with controversial positions on issues of national significance: perhaps te reo Māori, the rights of women, extractive industries and climate change, or plans around the ownership of state resources.
Requests for comment are met with an absolute denunciation of the meeting in question, but the public interest is undeniable. The video continues to circulate on social media.
Domestic and international media report on the fact of the video's emergence and circulation. Later, an audio recording emerges. It's only a minute long, but it clearly records the Prime Minister making comments that allude to a radical position on a range of political issues. It is wildly inconsistent with her previous statements. It causes massive disruption, invites questions and speculation from the opposition, and among some groups, undermines her future credibility.
In response, the Prime Minister points to the existence of technologies that can artificially manipulate audio-visual information: her voice can be "sampled" and her image can be imposed or recreated entirely. How does the public respond? Ardern's mannerisms are the same, along with her distinct facial profile, expressions and idiosyncrasies. When she speaks she uses her own distinctive voice.
Now imagine, perhaps, the Chief Executive of a publicly listed company, a large Government Ministry, or a Crown Entity. Perhaps a statement by the Chair of a Board of Directors leads to allegations of a conflict of interest. What about people without the same public standing, but with significant roles of a more private nature: can their positions continue in light of apparently damning evidence about them? What about similar material relating to candidates for significant positions, such as New Zealand's Chief Technology Officer? How should you respond to a call – perhaps even a video call – from a loved one seeking emergency financial assistance?
In the examples above, perhaps requests for comment continue to be met with denials and ultimately retractions are issued. But before then, share prices are affected, reputations are irreparably damaged, and there are lasting implications for public trust and confidence.
At the same time, content creators continue to push the boundaries of entertainment, research and art, driving billion-dollar industries. In recent years we have seen dead performers brought back to digital life, and even the creation of entirely new digital avatars with faces and voices of their own. Across film, gaming, television, marketing and business solutions, a new wave of audio-visual information is enriching our lives.
With the support of the New Zealand Law Foundation and the Centre for Law and Policy in Emerging Technologies, the "Perception Inception" project will explore the present state of the art in emerging audio-visual technologies, their ethical and social implications, and how they interact with existing legal norms.
Seeing is believing.
What could you do if you could impersonate anybody's face and voice? Could you fool human or technological biometric security measures? New technologies that can photo-realistically replicate faces, imitate voices and animate bodies pose a novel challenge.
Most of us consume audio-visual content on a massive scale. It helps us form beliefs and make important decisions. An audio clip or a video can change the way we feel about somebody, maybe even how we vote. With a blurred divide between the real and unreal, the fake and authentic, we are vulnerable to making decisions based on less accurate information, assuming we can define "accuracy" in the first place.
Creators of some of our most captivating content are using emerging audio-visual technologies to elevate their work. Avatars of real people can be made to do things they never did, or entirely new people can be created, moulded and animated, stretching the boundaries of reality and how we perceive it.
Artful marketing walks a line between legitimate and illegitimate deception. Products and services are presented in a way that will engage consumers. In considering how to respond to the threats of emerging technologies, how do we ensure legitimate commercial activity is respected?
"No one's ever really gone," said Luke Skywalker. In the world of film and art this is increasingly true. New technologies are bringing dead actors back to life on our screens. Their photorealism grows increasingly compelling. To what extent do the living or dead have legal claim over their distinct mannerisms, face or voice?
Audio-visual information can provoke real-world consequences. Words, appearances and actions matter in diplomacy. They can provoke armed conflict, have economic consequences and affect reputations, markets and stock prices.