The Age of Realtime Deepfake Fraud Is Here
www.404media.co
"At least now I saw you're way more gorgeous and more beautiful than you were in the photo you sent me," an older white man with a greying beard says during a Skype video call. He is talking to an elderly woman who appears to be in her car, staring into her phone's front-facing camera.

She laughs at the compliment, and the smiling man keeps going. "I think I should send security to keep you safe, so no one comes," he says. To that, the woman laughs even more. "I'll be okay," she reassures the man.

The bearded man, however, doesn't really exist. Instead, he is a realtime deepfake created by a fraudster, likely to lure the woman as part of a romance scam and have her send money. Someone filming the interaction captures what is really happening: a young Black man is sitting in front of a laptop and webcam, and software is automatically transforming his appearance into that of the much older white man and feeding it into Skype, all live.

Other realtime deepfakes from fraudsters include a Black man making himself appear as a white woman, and many cases of scammers holding objects during the live call without breaking the illusion. One deepfake scammer even included an American flag in the background of their live video call.

A video shared on Telegram. Redacted by 404 Media to protect the victim's identity.

This is the reality of fraud today, where scammers are able to digitally manipulate their appearance in real time to match a photo on a driver's license or dating profile, or the likeness of a celebrity. Like a chameleon, these scammers, who can be involved in everything from romance scams like the one in the video to fraudulent tax refunds, can hide their true appearance with just a laptop and a phone and produce very realistic results.

"It's way better now," Format Boy, a self-described and high-profile Yahoo Boy, told me in a Telegram chat, referring to deepfakes. Yahoo Boys are fraudsters typically based in Nigeria who traditionally used Yahoo emails as part of their scams, but have now broadened to all manner of schemes. "Before we couldn't do some things; we can do now."

Do you know anything else about realtime deepfake fraud? Do you carry it out yourself, or investigate it? I would love to hear from you. Using a non-work device, you can message me securely on Signal at joseph.404 or send me an email at joseph@404media.co.

It is now very common to see spammy video ads on social media that use a deepfaked version of a celebrity or public figure. 404 Media previously reported on a rampant scam on YouTube in which fake versions of Taylor Swift, Steve Harvey, and Joe Rogan pitched Medicare scams. 404 Media also reported on another set of deepfake ads, also on YouTube, featuring the likenesses of Arnold Schwarzenegger, Sylvester Stallone, Mike Tyson, and Terry Crews selling erectile dysfunction supplements.

But those were pre-rendered, static files. They were recorded beforehand, likely tweaked and perfected over time, and then uploaded to YouTube. The promise of realtime deepfakes for fraudsters is that they can use the tech to engage with a victim in the moment. Rather than a scripted video that may or may not be tailored to the victim, realtime deepfakes allow a scammer to talk directly to their mark and improvise on video calls or livestreams.
They can appear just as human as the person they are impersonating, potentially fooling not only people but also the automated systems that require someone to prove their identity to open an account with a financial institution, for example.

For months 404 Media has monitored the spread of deepfake technology throughout fraud-focused Telegram channels. For much of that time, the results were not impressive. Some involved using AI to animate a photo in an attempt to bypass cryptocurrency exchanges' identity verification processes, and the videos were stilted and unnatural. Others looked more realistic, but it was unclear whether the advertisements themselves were scams: fraudsters on Telegram asked for hundreds of dollars for access to their tool that allegedly bypassed know-your-customer (KYC) verification checks. Some fraudsters also advertised access to tools that let a phone user replace their camera's input with a file from their phone's gallery, meaning they could upload the deepfake video to services that ask for a selfie. 404 Media has also seen Instagram accounts where a real person consistently deepfakes themselves to appear as a different gender in order to catfish people.

Images from a fraud-focused Telegram channel.

But recently the quality has improved massively, with fraudsters demonstrating they're able to hold entire realtime video calls for extended periods of time while maintaining their deepfaked persona, according to videos viewed by 404 Media.

In another example, a handsome man in a black baseball cap says "I love your smile" over a WhatsApp video call while giggling. He's chatting with an elderly woman with an American accent.

"I didn't know this was going to be a video call," she says after a pregnant pause. The man laughs flirtatiously and grins.

"I'm barely awake and I'm not dressed," she says.

A camera filming the interaction then pans away and zooms out, and shows the scammer's setup. A laptop sits atop a PC tower, with a giant screen showing various pieces of software.

Format Boy spends much of his time telling other Yahoo Boys how to perform realtime deepfakes in videos he uploads to YouTube and Instagram. On his YouTube channel he claims his content is for pranks and educational purposes. But on Telegram his channel bio says he has been a Yahoo Boy for more than seven years. Comments on his YouTube videos indicate he has viewers from Ukraine, Ghana, and Cameroon.

One realtime deepfake method Format Boy explains on YouTube involves a phone, a laptop, a piece of screen mirroring software, the livestreaming tool OBS to set up a virtual camera, NDI tools to route the video feed, and a faceswap app called Amigo AI. Amigo allows users to upload a photo of their choosing to base the deepfake on.

Another, similar method involves a face swap tool called Magicam, which also allows a user to upload an arbitrary photo, and a ring light. Format Boy says in the video that having a ring light is important to make the resulting deepfake more detailed; multiple videos and images 404 Media has seen of fraudsters creating realtime deepfakes include ring lights.

A video shared on YouTube. Redacted by 404 Media to protect the victim's identity.

A representative for Magicam told 404 Media in a Discord message that "we were not aware that Magicam had been used in any fraudulent activity. At this time, we do not have built-in mechanisms to detect or prevent malicious use."

"However, in light of recent developments, we recognize the urgency of this issue and are actively exploring solutions that can help us better safeguard our technology while still respecting user privacy," they added.

Amigo AI did not respond to a request for comment.

A third technique uses tools called DeepFaceLive and DeepLiveCam, which require a beefier laptop with a modern GPU to power the realtime deepfake. Fraudsters seem to move from one app to another when options that produce higher quality videos more cheaply become available.

With their setup in place, the fraudster can then make a video call over WhatsApp, Skype, Zoom, Google Hangouts, Telegram, or another service while presenting themselves as whoever they want. Scammers also turn to voice changers, either ones already baked into the deepfake app or small pieces of hardware that can make them sound like someone else, according to other videos.
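Whatever the specific app, the setups described above share the same basic plumbing: frames come off a real webcam, a face-swap model transforms them, and the result is published through a virtual camera that the video call software treats as ordinary hardware. The short Python sketch below is a minimal, hypothetical illustration of that routing pattern only; it is not taken from any of the tools named in this piece. It assumes the third-party opencv-python and pyvirtualcam packages plus a virtual camera backend such as the one OBS installs, and the swap_face() function is a placeholder standing in for whatever model a commercial face-swap app supplies.

import cv2
import pyvirtualcam

def swap_face(frame):
    # Placeholder for the face-swap step. A real tool would run a trained model
    # here to replace the caller's face with an uploaded target photo; this
    # sketch simply passes the frame through unchanged.
    return frame

cap = cv2.VideoCapture(0)  # the physical webcam
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# pyvirtualcam publishes frames to a virtual camera device (for example the one
# OBS installs), which video-call apps then list as if it were real hardware.
with pyvirtualcam.Camera(width=width, height=height, fps=30) as cam:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = swap_face(frame)                      # transform the frame
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # pyvirtualcam expects RGB
        cam.send(rgb)                                 # push to the virtual camera
        cam.sleep_until_next_frame()                  # pace output to the target fps

Once a virtual camera is registered this way, calling apps list it alongside physical webcams, which is why the swapped face survives an ordinary live video call rather than only a pre-recorded clip.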
Format Boy has posted in his Telegram channel about deepfakes since late 2023, through 2024, and multiple times this year too. Over the months, and then years, it is possible to see how much easier creating deepfakes has become for fraudsters, while their realism has simultaneously gone up.

Format Boy told 404 Media that the current deepfake tech works for romance scams, but he didn't think it was good enough for bypassing KYC checks. "I think we're almost there, but not quite yet," he said.

Stills and images from a fraud-focused Telegram channel.

But multiple AI and cybersecurity researchers say fraudsters are using realtime deepfakes to target the systems that are supposed to check that a person appearing on camera is real and not artificially generated. And in some cases those fraudsters are being successful.

"We're seeing some of our partners actually getting those deepfakes," David Maimon, head of fraud insights at cybersecurity company SentiLink and a professor at Georgia State University, told 404 Media. In some cases, the fraudsters generate a new face with ChatGPT, superimpose that image onto a driver's license, then apply to open new bank accounts with that stolen or synthetic identity. Some also perform tax refund scams, where fraudsters submit a victim's tax return in order to get an IRS refund themselves, Maimon said.

"And when they are asked to sort of prove liveliness with the videos, they're able to do that as well," Maimon said, adding that the frequency of these fraud attempts has gone up over the past year. Maimon showed 404 Media a video in which a deepfake successfully passed the identity verification process on CashApp. CashApp acknowledged a request for comment but did not provide a response in time for publication. Maimon also published some of his findings in a blog post on SentiLink's website earlier this month.

"We have seen a sizable uptick of requests to use our product in the last few months alone related to two different things: integration with web conferencing platforms (Zoom and Teams are the two most requested platforms, which we already integrate with) and KYC verification for user onboarding," a spokesperson for Reality Defender, a company whose product is designed for enterprises to detect deepfakes, said in an email. "We are only privy to the incidents that clients have shared with us willingly, but can confirm that entities we work with have had real time deepfake issues in Africa, in Singapore, and in North Korea, among other places."

Format Boy, for his part, continues to post tips online. His Telegram bio says "I'm here to make millionaires."