A horrifying new AI app swaps women into porn videos with a click


From the beginning, deepfakes, or AI-generated synthetic media, have been used primarily to create pornographic likenesses of women, who often find the experience psychologically devastating. The Reddit creator who originally popularized the technology face-swapped female celebrities into porn videos. To this day, research firm Sensity AI estimates, between 90% and 95% of all online deepfake videos are nonconsensual pornography, and around 90% of those feature women.

As the technology has advanced, numerous easy-to-use, no-code tools have also emerged, allowing users to "strip" the clothes off images of women's bodies. Many of these services have since been forced offline, but the code still exists in open-source repositories and keeps resurfacing in new forms. The latest such site received more than 6.7 million visits in August, according to researcher Genevieve Oh, who discovered it. It has yet to be taken offline.

There have been other face-swapping apps, such as ZAO or ReFace, which place users into selected scenes from mainstream movies or pop videos. But as the first dedicated porn face-swapping app, Y takes this to a new level. It is "tailored" to create pornographic images of people without their consent, says Adam Dodge, the founder of EndTAB, a nonprofit that educates people about technology-enabled abuse. This makes it easier for the creators to improve the technology for this specific use case and attracts people who might not otherwise have thought about making deepfake porn. "Any time you specialize like that, it creates a new corner of the internet that will draw in new users," Dodge says.

Y is easy to use. Once a person uploads a photo of a face, the site opens a library of porn videos. The vast majority feature women, though a small portion also feature men, mostly in gay porn. A user can then select any video to generate a preview of the face-swapped result within seconds, and pay to download the full version.

The results are far from perfect. Many of the face swaps are obviously fake, with the faces shimmering and distorting as they turn at different angles. But to a casual observer, some are subtle enough to pass, and the trajectory of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of the deepfake doesn't really matter, because the psychological toll on victims can be the same either way. And many members of the public remain unaware that such technology exists, so even low-quality face swaps can still fool people.

To this day, I've never been fully successful in getting any of the images taken down. Forever, that will be out there. No matter what I do.

Noelle Martin, an Australian activist

Y bills itself as a safe and responsible tool for exploring sexual fantasies. The language on the site encourages users to upload their own face. But nothing stops them from uploading other people's faces, and comments on online forums suggest that users have already been doing just that.

The consequences for women and girls targeted by such activity can be devastating. On a psychological level, these videos can feel as violating as revenge porn, that is, real intimate videos filmed or released without consent. "This kind of abuse, where people misrepresent your identity, name, and reputation, and alter it in such violating ways, shatters you to the core," says Noelle Martin, an Australian activist who has been targeted by a deepfake porn campaign.

And the effects can stay with victims for life. The images and videos can be difficult to remove from the internet, and new material can be created at any time. "It affects your interpersonal relations; it affects you with getting jobs. Every single job interview you ever go for, this might be brought up. Potential romantic relationships," Martin says. "To this day, I've never been fully successful in getting any of the images taken down. Forever, that will be out there. No matter what I do."

Sometimes it's even more complicated than revenge porn. Because the content is not real, women can doubt whether they deserve to feel traumatized and whether they should report it, says Dodge. "If somebody is wrestling with whether they're even really a victim, it impairs their ability to recover," he says.

Nonconsensual deepfake porn can also have economic and career impacts. Rana Ayyub, an Indian journalist who became a victim of a deepfake porn campaign, faced such intense online harassment in its aftermath that she had to minimize her online presence, and thus the public profile required to do her work. Helen Mort, a UK-based poet and broadcaster who previously shared her story with MIT Technology Review, said she felt pressure to do the same after discovering that photos of her had been stolen from private social media accounts to create fake nudes.

The UK government-funded Revenge Porn Helpline recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school's attention, says Sophie Mortimer, who manages the service. "It's getting worse, not better," Dodge says. "More women are being targeted this way."

Y's option to create deepfake gay porn, though limited, poses an additional threat to men in countries where homosexuality is criminalized, says Ajder. This is the case in 71 jurisdictions globally, 11 of which punish the offense by death.

Ajder, who has discovered several deepfake porn apps over the past few years, says he has attempted to contact Y's hosting service and force it offline, though he has little hope that this will curb the creation of similar tools. Already, another site has popped up that seems to be attempting the same thing. He thinks banning such content from social media platforms, and perhaps even making its creation or consumption illegal, would prove a more sustainable solution. "That means these websites are treated the same way as dark web material," he says. "Even if it's driven underground, at least it's put out of the eyes of everyday people."

Y did not respond to multiple requests for comment at the press email listed on its site. The registration information associated with the domain is also blocked by the privacy service Restricted for Privacy. On August 17, after MIT Technology Review made a third attempt to reach the creator, the site posted a notice on its homepage saying it is no longer available to new users. As of September 12, the notice was still there.


