Deepfake Technology

Quantum Valley
4 min read · Apr 14, 2021

Not Everything You See Is Real

Can you believe that Obama just called Trump a dipshit? How about when Tom Cruise played Iron Man or when the Queen delivered her alternative Christmas speech?

Do you think these videos and pictures are real, or are they manipulated?

Artificial Intelligence and Deepfake Technology

Artificial Intelligence and Machine Learning are powerful tools in the cybersecurity world. With them, systems can learn from data, identify patterns, and make decisions with minimal human intervention. But like many other technologies, AI can be put to harmful uses as well. One area that has received massive attention in the last couple of years is the creation of deepfakes. Over the past few years, deepfakes have entered mainstream consciousness, blurring the line between real and fake media. And as the technology improves, it threatens to make that difference fully indiscernible.
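To make the “learn from data, identify patterns” idea concrete, here is a minimal sketch of a toy classifier that learns to separate “real” from “manipulated” examples. The feature vectors below are synthetic stand-ins rather than the output of any real deepfake-detection pipeline, so treat it as an illustration of the learning loop, not a working detector.

```python
# Minimal sketch: a classifier that "learns from data" to flag manipulated media.
# The features are synthetic stand-ins (hypothetical), not real image features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# In practice these vectors would come from an image pipeline
# (e.g. frequency-domain or facial-landmark features).
real_features = rng.normal(loc=0.0, scale=1.0, size=(500, 16))
fake_features = rng.normal(loc=0.7, scale=1.0, size=(500, 16))

X = np.vstack([real_features, fake_features])
y = np.array([0] * 500 + [1] * 500)  # 0 = real, 1 = manipulated

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Fitting the model is the "learning" step: it finds weights that separate the classes.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```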

Deepfakes are potent tools for exploitation and disinformation. They can also influence elections and erode trust.

“This is a dangerous time. Moving forward we need to be more vigilant with what we trust from the internet. It’s a time when we need to rely on trusted news sources,” says Jordan Peele.

There’s growing concern that this new technology could become a danger to our democracy in the wrong hands. However, the most immediate threat presented by deepfakes is not political; it’s pornography.

Many people are now using women’s faces to produce porn videos they never consented to appear in. Deepfakes are harmful even when viewers know they are not real.

In a survey conducted in September 2019, researchers at Deeptrace found that of the deepfake videos they could identify online, 96% were pornographic.

Taking a woman’s face and putting it into this context is part of a long history of using sexual humiliation against women.

This is what makes deepfake porn videos so invasive. The creator takes away the victim’s control over their own face and uses it for something they never wanted.

Anyone’s face can be spliced into frames to create videos like these. How would you feel if you were a victim of this kind of situation?

Noelle Martin, a victim of deepfake videos, was not a celebrity. She was just an ordinary person living a normal life, yet manipulated pictures and videos of her ended up on many different porn sites.

“This is probably one of the most difficult things because fake porn and my name will forever be associated,” she said. “My future children will have to see things. My future husband will have to see things. And that’s what makes me sad.”

Another case involving deepfakes made the news on the 3rd of March 2021, when a Pennsylvania mother was accused of creating deepfake images to anonymously harass members of her daughter’s cheerleading team. Raffaela Spone allegedly bullied three girls from the Doylestown Victory Vipers cheerleading gym over the summer. She allegedly created false images that made it look like the girls were nude, drinking alcohol, and vaping, and sent texts from phone numbers purchased online urging one girl to kill herself.

These videos are genuinely difficult to detect. Over the past few years, several U.S. states have introduced bills to criminalize deepfake pornography and to prohibit the use of deepfakes in the context of an election, and some of these laws have already taken effect.

Regardless of the issues in which deepfakes are involved, we cannot deny that the technology still has positive uses. CereProc, a Scottish company based in Edinburgh, creates digital voices for people who lose theirs to disease. Computer scientist Supasorn Suwajanakorn believes that this technology could offer students a better chance of learning. And if you have ever dreamed of seeing your beloved departed brought back to life, a deepfake tool, MyHeritage’s Deep Nostalgia feature, has made it possible. A bit uncanny, don’t you think?

The transition to this new world is going to be difficult, and there is still a long way to go before we can fully ensure our safety around this technology. On the one hand, it’s incredibly fun. On the other, it’s a tremendous responsibility. So we need to distinguish what is fake from what is real before we click the share button.
