
Deepfakes: A Red Alert to Women

Deepfakes are AI-based techniques for manipulating images and videos of real people.

Photo by Plann on Unsplash

Imagine living an ordinary life, enjoying time with friends and family, when suddenly everything turns into your worst nightmare: you discover a sexually explicit video of yourself online, one you never took part in. In the era of deepfakes, this has become a real possibility for any woman, famous or not.

Deepfakes are AI-based techniques for manipulating images and videos of people. Using machine learning, they can seamlessly superimpose a face from one image or video onto footage from another source, and the results can look entirely authentic.
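The classic deepfake pipeline uses one shared encoder and two person-specific decoders: the encoder learns a common face representation, and swapping means encoding person A's face and decoding it with person B's decoder. The sketch below is hypothetical and untrained (random weights, NumPy only), so it only illustrates the data flow, not a working face swap.

```python
import numpy as np

# Hypothetical sketch of the shared-encoder / two-decoder deepfake
# architecture. Weights are random and untrained: this shows only the
# shapes and the swap mechanics, not an actual face swap.

rng = np.random.default_rng(42)
PIXELS, LATENT = 64 * 64, 128      # flattened 64x64 face, latent code size

def layer(n_in, n_out):
    return rng.normal(0, 0.01, size=(n_in, n_out))

W_enc = layer(PIXELS, LATENT)      # shared encoder, trained on both people
W_dec_a = layer(LATENT, PIXELS)    # decoder specialised to person A
W_dec_b = layer(LATENT, PIXELS)    # decoder specialised to person B

def encode(face):
    return np.tanh(face @ W_enc)   # face image -> shared latent code

def decode(code, W_dec):
    return np.tanh(code @ W_dec)   # latent code -> reconstructed face

face_a = rng.random(PIXELS)        # stand-in for a real photo of person A
# The swap: A's pose/expression, rendered through B's decoder.
swap = decode(encode(face_a), W_dec_b)
print(swap.shape)
```

In a real system both decoders are trained against the same encoder on thousands of frames of each person, which is why the swapped output inherits A's expression but B's facial identity.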

Open Sourced Algorithms:

A recent report by Deeptrace found that the number of deepfake videos on the Internet roughly doubled between 2018 and 2019. One reason for this rapid rise is that the underlying algorithms are open-sourced and readily accessible to anyone with basic programming skills and a reliable internet connection. Popular code repositories are freely available on GitHub, so even non-experts can use them without much difficulty.

In October 2018, the auction house Christie's sold an oil painting produced by a GAN (Generative Adversarial Network) for $432,500, roughly 45 times its highest estimate. The painting was created by a French art collective using a dataset of about 15,000 portraits painted between the 14th and 20th centuries.
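A GAN pits two models against each other: a generator that produces fake samples and a discriminator that tries to tell fake from real. Below is a deliberately tiny, hypothetical sketch of that adversarial loop in plain NumPy, using a one-parameter generator and a logistic-regression discriminator on 1D data (not how a painting- or face-generating GAN is actually built, but the same objective).

```python
import numpy as np

# Toy adversarial training loop (hypothetical, minimal). The "real" data is
# a Gaussian centred at 4.0; the generator is a single learned shift applied
# to noise; the discriminator is logistic regression outputting P(x is real).

rng = np.random.default_rng(0)
REAL_MEAN = 4.0

def generator(z, shift):
    return z + shift                         # shift noise toward the data

def discriminator(x, w, b):
    return 1.0 / (1.0 + np.exp(-(w * x + b)))  # P(x is real)

shift, w, b = 0.0, 0.1, 0.0
lr = 0.05
for step in range(2000):
    z = rng.normal(size=64)
    real = rng.normal(REAL_MEAN, 1.0, size=64)
    fake = generator(z, shift)

    # Discriminator ascent: maximise log D(real) + log(1 - D(fake))
    d_real = discriminator(real, w, b)
    d_fake = discriminator(fake, w, b)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator ascent: maximise log D(fake), i.e. fool the discriminator
    d_fake = discriminator(generator(z, shift), w, b)
    shift += lr * np.mean((1 - d_fake) * w)

print(round(shift, 2))  # the learned shift drifts toward REAL_MEAN
```

Real deepfake GANs replace the one-parameter generator with a deep convolutional network over pixels, but the tug-of-war objective is the same.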

Developed for Good, Used for Cyber Exploitation:

AI techniques like those behind deepfakes have many legitimate applications, including photo editing, image restoration, and 3D modeling. Unfortunately, their dominant application today is creating sexually explicit videos for cyber exploitation.

According to the Deeptrace report, 96% of the deepfake videos on the Internet are pornographic. Not surprisingly, the main victims portrayed in these fake videos are women, whose images and likenesses are used without their consent, often without their knowledge.

Revenge Porn as the Primary Concern:

A primary concern is deepfakes’ potential impact on revenge porn. As seen in the case of Representative Katie Hill, revenge porn can end a political career. With technologies like deepfakes, revenge porn and cyber-smear campaigns take on a whole new dimension and can affect a far broader population of victims. Of the roughly 85,000 deepfakes circulating online, about 90 percent depict non-consensual porn targeting women.

Who can be the victim?

In 2020, Helen, a British writer, woke up to a series of deepfakes on a pornographic site that appeared to show her engaging in violent sexual acts. The violation spread through every part of Helen’s life and left her feeling exposed. She suffered panic attacks and had no idea who had done this to her.

Next came the Indian journalist Rana Ayyub. She had long faced hatred on social media but never paid it much attention. In 2018, however, someone created a deepfake to humiliate her, and it went viral, circulating through influential political circles in India. Like Helen, Rana now self-censors.

It seemed as if deepfakes might disappear when social platforms banned them under public pressure. But continued inaction by legislators has emboldened their creators again. Then the pandemic arrived, giving creators and viewers more time and opportunity to exploit women’s misery online, while enforcement by moderators working from home slowed down.

There are thousands of clips in which the faces of celebrities such as Gal Gadot, Taylor Swift, Scarlett Johansson, Emma Watson, and even seventeen-year-old TikTok star Charli D’Amelio are superimposed onto the bodies of porn performers. Porn deepfakes feature not only celebrities but also ordinary women: ex-wives, ex-girlfriends, high-school crushes.

Deepfakes like these are often not meant to fool viewers. According to media scholar Milena Popova, porn deepfakes are frequently labelled as fakes, with some creators taking pride in them as a kind of fanfiction or media remix.

As with other aspects of revenge porn, the ethical issues of consent and objectification make clear that footage need not be real to do real harm. These videos not only inflict humiliation and trauma on the unsuspecting women whose faces are appropriated; they also exploit the sex workers whose bodies appear in them and whose own faces are digitally erased.

Digital Media and Deepfakes:

Sex workers perform in these scenes for a living; being paid is how they survive. Whether filmed under contract or made DIY-style, porn that is altered and shared non-consensually is abusive, both materially and morally.

Deepfakes can be difficult to fight from a libel or slander standpoint, so a more effective remedy might be to take porn seriously as part of the digital economy and crack down on deepfaking as a copyright violation.

In digital media, deepfakes are a new and influential genre. They exemplify a creative method with enormous capacity for hoax and fantasy-building, but also the peril of disinformation. How the technology is ultimately framed depends entirely on what we expect of it.

How Deepfakes Are Being Harnessed:

To harm women, who are targeted by anonymous creators with no regard for consent, and to exploit porn workers.

A coalition of survivors and advocates, including Helen and Gibi, has launched the campaign #MyImageMyChoice, pushing for legal change around the globe. Publishing survivor stories from around the world, from Germany to Australia, #MyImageMyChoice is striving for a joined-up, human-rights-based remedy to the problem.

More than 45,000 people have signed a petition asking the U.K. government to introduce world-leading intimate-image abuse laws. Campaigners now want these ideas put into practice, not just in the U.K. but around the world.

What do you think?

Written by TEAM WSL

At Whatshelikes, we keep the millennial woman updated with everything happening around her, be it fashion and lifestyle, health, events, or movie reviews, to name a few. We consistently strive to bring a gamut of authentic content from across the country, all at just a click.
