‘Alarming’ Tom Cruise Deepfakes Spark Concern After Taking Over TikTok
If you’ve been on TikTok over the last few days, chances are you’ve seen Tom Cruise in a number of bizarre videos.
The actor has gone viral over the past week in clips that appear to show him playing golf, doing magic tricks, and taking a tumble before recalling the time he met former Soviet President Mikhail Gorbachev.
And yet, despite the scarily life-like nature of the clips, these aren’t actually videos of Cruise doing any of these things. They’re part of a viral trend that uses his face in a series of deepfake videos.
Last week, an account simply titled ‘deeptomcruise’ appeared on the video-sharing app, posting a number of deepfake videos of the Mission: Impossible star. The clips have since been taken down from the account, but not before they were shared all over the internet.
One of the videos appears to show Cruise performing a magic trick while wearing a funky Hawaiian shirt. The person says, ‘I want to show you some magic. It’s the real thing – I mean, it’s all real,’ hinting that the video is, of course, not real at all.
While these particular videos might be seen as harmless, experts have begun raising concerns surrounding realistic deepfake videos as a whole, with one saying that ‘seeing is no longer believing.’
Mckay Wrigley, founder of Bionic, an artificial intelligence app, shared one of the TikToks, writing: ‘See this video of Tom Cruise? Well, it’s not Tom Cruise. It’s AI generated synthetic media that portrays Tom Cruise onto a TikTok user using Deepfakes. Seeing is no longer believing.’
‘We are woefully unprepared for this,’ he added. ‘AI safety needs to be bumped up our list of priorities.’
The post was then shared by Sam Gregory, Program Director at Witness.org, a company which uses ‘video and technology to protect and defend human rights.’
‘So you’ve seen Tom Cruise deepfake, but what SHOULD worry us with deepfakes?’ he wrote.
‘Women are already being targeted by deepfakes and seeing is no longer believing rhetoric undermines real video.’
Others pointed out that, in addition to issues surrounding fake revenge porn and the like, deepfakes could potentially be used in cases of identity fraud.
Another raised concerns that fake videos of religious leaders, doctors, or anyone else we trust spreading potentially harmful information could be incredibly damaging to society.
If you have a story you want to tell, send it to UNILAD via [email protected]