Jimin lookalike’s death: How AI-generated Saint Von Colucci duped media houses worldwide
You may have heard the story of Saint Von Colucci, an actor who died after undergoing multiple surgeries to look like the BTS boy band star Jimin. The story was picked up by dozens of publications – both internationally and in India – and subsequently went viral on social media, sparking outrage and sympathy from lakhs of people. But there’s one problem: Colucci was only a figment of someone’s imagination.
An Al Jazeera report has revealed that the whole story was an elaborate hoax, likely produced using a generative AI program. Read on to discover the full story behind one of the most significant cases of AI-generated fake news in recent times.
Who is Saint Von Colucci?
The story of Saint Von Colucci, a Canadian actor who died after undergoing 12 plastic surgeries to resemble BTS singer Jimin, gained traction when Daily Mail reported it. The report claimed that Colucci had relocated from Canada to South Korea in 2019 with hopes of breaking into the K-pop scene, as his publicist Eric Blake allegedly confirmed.
Colucci had reportedly shelled out $220,000 on 12 cosmetic procedures, ranging from a facelift and a nose job to lip reduction and an eye lift. He was said to have succumbed to complications from these surgeries on April 23 in a South Korean hospital.
The report also featured ‘quotes’ from Eric Blake, describing how Colucci used to have dark blond hair and blue eyes and stood at 6 feet tall before his insecurities drove him to drastically alter his appearance. Of course, neither of the two men ever existed.
How did it all start?
The drama began when a press release announcing Von Colucci’s death was sent out to journalists around the world. The press release, which was written in clumsy English, was attributed to a PR agency called HYPE Public Relations.
But despite the awkward language, and despite most of his images online looking as if they had been generated by AI, the story was picked up by dozens of media outlets from the US, Canada, the UK, South Korea, India, Malaysia, and the Philippines.
The debacle has vindicated experts who have long warned about the role of artificial intelligence, including deepfakes and generative AI, in spreading misinformation. But this is perhaps the first instance of AI being used to trick media outlets on such a large scale.
What busted the news?
As Al Jazeera spotted, the press release contained several red flags: many links in the document would not load, including a link to Colucci’s supposed Instagram account. On top of that, the hospital where Colucci had supposedly died was found to be fictitious.
HYPE did have an actual website, but it appeared unfinished and had been registered only weeks before Colucci’s reported death. When the media outlet texted the listed number, it received an amusing response: “W*f do u want.”
Then there were clues surrounding the man, Saint Von Colucci, himself. Despite being described as a songwriter for multiple K-pop stars, Colucci had barely any online presence, and no one came forward to mourn his death. Moreover, the images of him available online are blurry, and some of them bear the biggest telltale sign of AI use – deformed hands.
Finally, to top it all off, South Korean media have reported that the police received no case report about a Canadian actor dying of complications from plastic surgery.
This case only deepens fears of AI being used to spread misinformation on a large scale, and highlights how the task of fact-checking and verifying information is set to become increasingly difficult.