It’s been a few years since deepfakes first appeared on the scene. Today the technology is being abused in harmful ways, such as spreading misleading information on social media, and it’s also employed for lighter fare, like inserting Nicolas Cage into mediocre films for comedic effect. If you follow security news, deepfakes are nothing new; they first came to light back in 2017, and our recent piece, “Deepfakes a Cybersecurity Threat for 2020,” discussed the issue. But when was the last time you revisited the subject?
The Goalposts Have Moved
In the past, making a credible deepfake required hours of source footage of the target’s face. As a result, only celebrities, politicians, and other prominent individuals were viable targets. Thanks to recent advances in machine learning, a convincing fake can now be created from a single photo and about five seconds of a person’s voice. Most people these days post images and videos of themselves on social media sites like Facebook and Instagram, and that may be all an attacker needs. Does that sound ominous? It should. The goalposts have moved: anyone with a social media profile is at risk of being impersonated over the phone or even on video. Let’s take a look at how these attacks work and what you can do to protect your business and yourself. If you’re being blackmailed with a deepfake, you can contact us.
Calling a Friend
Voice deepfakes are not a new concept. A few years ago, Adobe demonstrated software called VoCo that could duplicate a person’s voice fairly accurately from roughly 20 minutes of their speech. Although it was aimed at audio-editing professionals, the product appears to have been shelved over ethical and security concerns. Other companies have since picked up where Adobe left off: commercial offerings such as Lyrebird and Descript now match or even improve on the concept. The open-source “Real-Time Voice Cloning” project can create convincing voice snippets from mere seconds of a person’s speech.
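To see how low the bar has become, here is a minimal sketch of that project’s three-stage pipeline (speaker encoder, text-to-spectrogram synthesizer, vocoder), loosely based on its demo script. The module paths, pretrained-model locations, and file names are assumptions that may differ between versions; consult the project’s README for the real setup.

```python
# Sketch of the Real-Time Voice Cloning pipeline:
# encoder -> synthesizer -> vocoder.
from pathlib import Path

import soundfile as sf

from encoder import inference as encoder
from synthesizer.inference import Synthesizer
from vocoder import inference as vocoder

# Pretrained model paths are assumptions; see the project's README
# for the actual download locations and file names.
encoder.load_model(Path("encoder/saved_models/pretrained.pt"))
synthesizer = Synthesizer(Path("synthesizer/saved_models/pretrained/pretrained.pt"))
vocoder.load_model(Path("vocoder/saved_models/pretrained/pretrained.pt"))

# 1. Embed the target speaker from a few seconds of reference audio.
wav = encoder.preprocess_wav(Path("reference_clip.wav"))
embedding = encoder.embed_utterance(wav)

# 2. Synthesize a mel spectrogram for arbitrary text in that voice.
texts = ["Hi, it's me. Call me back when you get a chance."]
specs = synthesizer.synthesize_spectrograms(texts, [embedding])

# 3. Turn the spectrogram back into an audible waveform.
generated = vocoder.infer_waveform(specs[0])
sf.write("cloned_voice.wav", generated, synthesizer.sample_rate)
```

The point is not the specific library but the shape of the attack: a short reference clip in, arbitrary speech in that voice out, with no audio-engineering expertise required.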
Unfortunately, such an attack is no longer just a theory. In 2019, a CEO was duped out of $243,000 by a voice deepfake. The CEO believed he was speaking with the chief executive of his company’s German parent firm. What convinced him? He recognized the melody of his boss’s voice and the slight German accent. Having the right voice gave the attacker the credibility he needed to make off with $243,000. With this technique in an attacker’s arsenal, phishing becomes significantly more dangerous. We’ve already discussed how potent vishing attacks can be.