Our voices are just about as unique as our fingerprints – so how would you feel if your voice was cloned?
A new type of deepfake known as voice cloning has emerged in recent months, with hackers using artificial intelligence (AI) to simulate your voice.
Famous faces including Stephen Fry, Sadiq Khan and Joe Biden have already fallen victim to voice cloning, while one unnamed CEO was even tricked into transferring $243,000 to a scammer after receiving a fake phone call.
Voice cloning is an AI technique that allows hackers to take an audio recording of someone, train an AI tool on their voice, and recreate it.
Speaking to MailOnline, Dane Sherrets, a Solutions Architect at HackerOne, explained: ‘This was originally used to create audiobooks and to help people who have lost their voice for medical reasons.
‘But today, it’s increasingly used by Hollywood, and unfortunately scammers.’
When the technology first emerged back in the late 1990s, its use was limited to experts with an in-depth knowledge of AI.
However, over the years the technology has become more accessible and more affordable, to the point where almost anyone can use it, according to Mr Sherrets.
‘Someone with very limited experience can clone a voice,’ he said.
‘It takes maybe less than five minutes with some of the tools that are out there which are free and open source.’
‘Having a CEO’s voice makes it a lot easier to get a quick password, or access to a system. Companies and organisations need to be aware of that risk.’
Thankfully, Mr Sherrets says there are several key signs that indicate a voice is a clone.
‘There are key signs,’ he said.
‘There are the pauses, the issues where it doesn’t sound as natural, and there might be what you call “artefacts” in the background.
‘For example, if a voice was cloned in a crowded room and there’s a lot of other people chatting, then when that voice clone is used, you’re going to hear some garbage in the background.’
However, as the technology continues to evolve, these signs will become trickier to spot.
‘People need to be aware of this technology, and constantly be suspicious of anything asking them to act urgently – that’s often a red flag,’ he explained.
‘They should be quick to ask questions that maybe only the real person would actually know, and not be afraid to try and verify things before they take any action.’
Mr Sherrets recommends having a ‘safe word’ with your family and friends.
‘If you really are in an urgent situation, you can say that safe word and they’ll instantly know that this is really you,’ he said.
Finally, the expert advises being aware of your digital footprint, and keeping an eye on the amount you upload online.
‘Every time I upload now, it expands my audio attack surface and could be used to train AI later,’ he added.
‘There are trade-offs that everyone will need to make, but it’s something to be aware of – audio of yourself that’s floating out there can be used against you.’