Deepfakes – friend or foe?

Deepfakes are all over the news lately – whether it’s music artists being imitated against their wishes, a headteacher being wrongly accused of having racist views in a multicultural school, or a Prime Minister going through the courts to fight deepfake porn depicting her.

It’s pretty much all negative, and with good reason: deepfakes can cause real damage to people and property.

Many leading thinkers would like to see deepfakes banned, or at least subject to serious and effective regulation and control. But regulation is slow to arrive, and asking the companies investing in AI to self-regulate will result in conflicts of interest at the very least.

Prior to writing this, I hadn’t come across a single positive article about, or application for, deepfakes, or ‘synthetic media’. So I asked around, and it turns out some thought leaders are trying to piece together possible applications:

  • Education and training - learning could be made more engaging if you could have a conversation with Malcolm X about his motives for change, or if Michael Gove could explain how the changes to the Building Safety Bill affect your portfolio (...probably not for everyone, but you get the gist!). Personally, I'd love to be able to choose who voices the audiobook I'm listening to at any given time.

  • Customer service - an artificial concierge can consistently handle visitor interactions and queries, using detailed information about expected visitor and host preferences, to ensure a seamless experience.

  • Virtual tours - imagine virtual open houses where potential buyers can take immersive tours, guided by lifelike avatars of agents, who can answer questions in real time.

But this doesn’t negate any of the issues. How would Michael Gove feel, knowing his likeness was out there, saying things outside his control? What happens if an artificial concierge gets hacked and lets anyone into a top-security building? And how can you trust what you are buying if other aspects of the sales process are artificial?

Transparency will be key; for example, disclosing when synthetic media is being used can help maintain trust with clients.

Additionally, industry bodies should work with the sector to establish standards that ensure deepfake technology is used responsibly and that the rights of people and companies are protected.

Our hope is a world in which deepfakes can be successfully integrated whilst leaving music artists, headteachers and Prime Ministers unscathed.

But there is work to be done yet!