AI-assisted fake porn is being used by people on Reddit for self-gratification

Sources: The Next Web, Motherboard (Vice)

 

 

Quote

Like the Adobe tool that can make people say anything, and the Face2Face algorithm that can swap a recorded video with real-time face tracking, this new type of fake porn shows that we're on the verge of living in a world where it's trivially easy to fabricate believable videos of people doing and saying things they never did. Even having sex.

 

So far, deepfakes has posted hardcore porn videos featuring the faces of Scarlett Johansson, Maisie Williams, Taylor Swift, Aubrey Plaza, and Gal Gadot on Reddit. I’ve reached out to the management companies and/or publicists who represent each of these actors informing them of the fake videos, and will update if I hear back.

Fake celebrity porn, in which images are photoshopped to make it look like famous people are posing nude, is a years-old category of porn with an ardent fan base. People commenting and voting in the subreddit where deepfakes posts are big fans of his work. This is the latest advancement in that genre.

 

According to deepfakes—who declined to give their identity to me to avoid public scrutiny—the software is based on multiple open-source libraries, like Keras with TensorFlow backend. To compile the celebrities’ faces, deepfakes said he used Google image search, stock photos, and YouTube videos. Deep learning consists of networks of interconnected nodes that autonomously run computations on input data. In this case, they trained the algorithm on porn videos and Gal Gadot’s face. After enough of this “training,” the nodes arrange themselves to complete a particular task, like convincingly manipulating video on the fly.
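For anyone curious about the mechanics, the setup usually attributed to face-swap tools like this is a shared encoder with one decoder per identity: faces of both people are squeezed through the same bottleneck, and the "swap" is just decoding one person's encoding with the other person's decoder. Here's a deliberately tiny pure-Python sketch of that wiring, with linear layers and hand-derived gradients standing in for real convolutional networks; the layer sizes, "face" vectors, and learning rate are all made up for illustration and have nothing to do with deepfakes' actual code.

```python
import random

# Toy sketch of the deepfake training idea: a shared encoder learns a common
# face representation, while one decoder per identity learns to reconstruct
# that identity. Swapping = encode a face of A, decode with B's decoder.

DIM, CODE = 4, 2          # illustrative "image" and bottleneck sizes
random.seed(0)

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def rand_matrix(rows, cols):
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]

E = rand_matrix(CODE, DIM)            # shared encoder
D = {"A": rand_matrix(DIM, CODE),     # decoder for identity A
     "B": rand_matrix(DIM, CODE)}     # decoder for identity B

faces = {"A": [[1.0, 0.0, 0.0, 1.0], [0.9, 0.1, 0.0, 1.0]],   # stand-in faces
         "B": [[0.0, 1.0, 1.0, 0.0], [0.1, 0.9, 1.0, 0.0]]}

def step(x, who, lr=0.05):
    """One gradient step on ||D_who(E x) - x||^2; returns the sample loss."""
    z = matvec(E, x)                      # encode
    r = matvec(D[who], z)                 # decode with this identity's decoder
    err = [r[i] - x[i] for i in range(DIM)]
    # Backprop for linear layers: dL/dD = 2*err*z^T, dL/dE = 2*(D^T err)*x^T.
    back = [sum(D[who][i][k] * err[i] for i in range(DIM)) for k in range(CODE)]
    for i in range(DIM):
        for k in range(CODE):
            D[who][i][k] -= lr * 2 * err[i] * z[k]
    for k in range(CODE):
        for j in range(DIM):
            E[k][j] -= lr * 2 * back[k] * x[j]
    return sum(e * e for e in err)

def epoch():
    return sum(step(x, who) for who in faces for x in faces[who])

first = epoch()
for _ in range(200):
    last = epoch()
print(first, last)    # reconstruction loss should drop with training

# The "swap": run an A face through the shared encoder, decode as B.
fake = matvec(D["B"], matvec(E, faces["A"][0]))
```

A real implementation would use convolutional autoencoders in Keras/TensorFlow trained on thousands of aligned face crops; the point here is only the shared-encoder/two-decoder structure that makes the swap possible.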

 

Artificial intelligence researcher Alex Champandard told me in an email that a decent, consumer-grade graphics card could process this effect in hours, but a CPU would work just as well, only more slowly, over days.

 

“This is no longer rocket science,” Champandard said.

 

The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016. It isn’t difficult to imagine an amateur programmer running their own algorithm to create a sex tape of someone they want to harass.

In compliance with the community standards, I've decided not to include any images of these fake porn GIFs. This reminds me a lot of that NVIDIA AI (probably powered by their Titan V) that generates pictures of people who don't actually exist.

Here's another example of AI that can generate fake videos using machine learning.

 

This is creepy. It looks like Elon Musk's warnings about unregulated AI are coming true.

 

Google implemented some sort of machine learning in Google Photos to pattern-match faces and objects. If I search for "Luke Lafreniere" or "Linus Sebastian" in my Google Photos app, it can tell which photos have their faces; if I search for "Christmas tree," it will show which ones have it. Despite their notorious privacy invasions, I don't think Google would be nefarious enough to create a fake sex tape of their users, but this becomes problematic if the AI described in the OP is used for revenge porn. Remember how Facebook is trying to combat revenge porn by having users send in their nude photos so it can hash them and prevent them from being sent to other people? Looks like Facebook needs to up its machine learning algorithms further if it's serious about combating revenge porn.
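On the Facebook hashing point: the general technique is perceptual hashing, where near-duplicate images map to nearby hashes, so a re-encoded or slightly edited copy of a known image can still be matched and blocked. Facebook hasn't published its exact algorithm, so here's a toy average-hash sketch of the idea; the 4x4 "images" and the threshold-against-the-mean scheme are illustrative assumptions, not Facebook's method.

```python
# Toy perceptual hashing: hash an image by comparing each pixel to the
# image's mean brightness, then compare hashes by Hamming distance.
# Uniform brightness changes move every pixel and the mean together,
# so a re-encoded copy keeps the same bit pattern.

def average_hash(pixels):
    """Hash a tiny grayscale image (list of rows) to a tuple of bits:
    1 where a pixel is brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits; a small distance means likely the same picture."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[ 10,  20, 200, 210],
            [ 15,  25, 205, 215],
            [ 12,  22, 202, 212],
            [ 18,  28, 208, 218]]

# Re-encoded copy: uniformly brighter, bit pattern unchanged.
brightened = [[p + 30 for p in row] for row in original]

# A clearly different image (bright/dark halves mirrored).
other = [[200, 210,  10,  20],
         [205, 215,  15,  25],
         [202, 212,  12,  22],
         [208, 218,  18,  28]]

h = average_hash(original)
print(hamming(h, average_hash(brightened)))  # 0: same picture, re-encoded
print(hamming(h, average_hash(other)))       # 16: every bit differs
```

A cryptographic hash like SHA-256 would be useless here, since changing a single pixel changes the whole hash; perceptual hashes are designed to survive exactly the kind of small edits a re-uploader would make.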

 

Basically, with this new AI anyone can take a normal video of you from YouTube and create short porn clips. While it's still easy at the moment to tell fake sex tapes from real ones, the mere fact that many people on Reddit are already gratifying themselves with these AI-generated porn clips is concerning. The article also points out that you don't need special hardware to do AI porn; all you need is a decent graphics card and processor (probably an RX 480 or GTX 1060 and a Ryzen 5 1600 or i5-6600).

Quote

“We need to have a very loud and public debate,” he said. “Everyone needs to know just how easy it is to fake images and videos, to the point where we won't be able to distinguish forgeries in a few months from now. Of course, this was possible for a long time but it would have taken a lot of resources and professionals in visual effects to pull this off. Now it can be done by a single programmer with recent computer hardware.”

 

Champandard said researchers can then begin developing technology to detect fake videos and help moderate what’s fake and what isn’t, and internet policy can improve to regulate what happens when these types of forgeries and harassment come up.

 

“In a strange way,” this is a good thing, Champandard said. “We need to put our focus on transforming society to be able to deal with this.”

But the problem is that way too many people are gullible enough to believe everything on the Internet. Someone with a beef or vendetta can use this AI as retaliation, and celebrities are the first victims. There are also studies claiming that porn viewing desensitizes one's genitals and reduces white matter in the brain: https://www.wired.com/2014/06/is-it-really-true-that-watching-porn-will-shrink-your-brain/

 

The first phase of this AI porn maker will be much worse celebrity beefs, followed by fake political attacks and fake sex tape allegations invented to smear your arch nemesis. It will be worse than a leaked burn book, just like in Mean Girls.

 


There is more that meets the eye
I see the soul that is inside

 

 


This is creepy, real creepy.

Gaming Mouse Buying Guide (Technical Terms,Optical vs Laser,Mice Recommendation,Popular Mouse Sensor,Etc)

[LOGITECH G402 REVIEW]

I love Dark Souls lore, Mice and Milk tea  ^_^ Praise The Sun! \[T]/

 

 

 

I can conquer the world with one hand,As long as you hold the other -Unknown

Its better to enjoy your own company than expecting someone to make you happy -Mr Bean

No one is going to be with you forever,One day u'll have to walk alone -Hiromi aoki (avery)

BUT the one who love us never really leave us,You can always find them here -Sirius Black

Don't pity the dead,Pity the living and above all those who live without love -Albus Dumbledore

 

 


Wait, you're telling me I can't believe everything I see on the internet?

[Out-of-date] Want to learn how to make your own custom Windows 10 image?

 

Desktop: AMD R9 3900X | ASUS ROG Strix X570-F | Radeon RX 5700 XT | EVGA GTX 1080 SC | 32GB Trident Z Neo 3600MHz | 1TB 970 EVO | 256GB 840 EVO | 960GB Corsair Force LE | EVGA G2 850W | Phanteks P400S

Laptop: Intel M-5Y10c | Intel HD Graphics | 8GB RAM | 250GB Micron SSD | Asus UX305FA

Server 01: Intel Xeon D 1541 | ASRock Rack D1541D4I-2L2T | 32GB Hynix ECC DDR4 | 4x8TB Western Digital HDDs | 32TB Raw 16TB Usable

Server 02: Intel i7 7700K | Gigabyte Z170N Gaming 5 | 16GB Trident Z 3200MHz


"Remember those social media posts? Well I'm making them into revenge porn!" thought some person out there. On the flip side, you could use it to appear better in the bedroom than you really are ;)


More troubling are the coming audio-faking AIs. Faking audio is already doable with enough money thrown at the right sources, but we're about to enter an era where audio and video go from perspective snapshots without context to "can you prove it's real?" being a genuinely hard question.

 

We're probably a few years from the first really well-done hoax that'll be used as a media hit piece, but it's going to be here soon enough.


1 minute ago, VegetableStu said:

oh great now everywhere I go I think of rule 34 now isn't it

Rule 34 rules the World. Has since 1995.


16 minutes ago, tjcater said:

"Remember those social media posts? Well I'm making them into revenge porn!" thought some person out there. On the flip side, you could use it to appear better in the bedroom than you really are ;)

impossible for me sorry


36 minutes ago, tjcater said:

"Remember those social media posts? Well I'm making them into revenge porn!" thought some person out there. On the flip side, you could use it to appear better in the bedroom than you really are ;)

"Is it too late now to say sorry 'cause I'm making porn with your body." :D

 

1 hour ago, CUDA_Cores said:

I find it disturbing Redditors were getting off to AI photo-shopped porn, but then I remembered it was Reddit...

Aren't people on Reddit the ones who experience blue balls that often? xD


I keep telling everyone: technology is advancing too fast for our own good. We just can't keep up anymore.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


This will be a huge problem in just a few years. Not porn being made of celebrities, but this technology (plus fabricated audio) being used to create fake evidence and "fake news".

Soon, having video "evidence" complete with audio will not actually prove anything, just like a photo doesn't really prove anything today because of Photoshop. Not only will it lead to a bunch of lies being told; denying something will also become incredibly easy, even if they got you on video doing whatever you are denying.

 

The future is not looking bright.

In the meantime I'll have to study these Taylor Swift and Aubrey Plaza videos to validate how real they look. For research purposes of course.


11 minutes ago, LAwLz said:

This will be a huge problem in just a few years. Not porn being made of celebrities, but this technology (plus fabricated audio) being used to create fake evidence and "fake news".

Soon having video "evidence" complete with audio will not actually prove anything, just like a photo doesn't really prove anything today because of photoshop. Not only will it create a bunch of lies being told, denying something will be incredibly easy too, even if they got you on video doing whatever you are denying.

 

The future is not looking bright.

In the meantime I'll have to study these Taylor Swift and Aubrey Plaza videos to validate how real they look. For research purposes of course.

As if it isn't hard enough now just refuting the obvious fake science and health pages on the internet without the complication this brings.


1 hour ago, LAwLz said:

This will be a huge problem in just a few years. Not porn being made of celebrities, but this technology (plus fabricated audio) being used to create fake evidence and "fake news".

Soon having video "evidence" complete with audio will not actually prove anything, just like a photo doesn't really prove anything today because of photoshop. Not only will it create a bunch of lies being told, denying something will be incredibly easy too, even if they got you on video doing whatever you are denying.

 

The future is not looking bright.

In the meantime I'll have to study these Taylor Swift and Aubrey Plaza videos to validate how real they look. For research purposes of course.

In the old days, everyone got their 15 minutes of fame. In the 21st century, everyone gets to star in a porno! I'm not sure that's an improvement.

 

The obvious use is invented blackmail. One just needs to look at major elections around the world to see the timeline for when there'll be high-leverage targets for this. Every intelligence service of any competence will be all over this technology for the first few years, at least until no one believes video anymore. (Frank Herbert and Elon Musk are coming out ahead in the prediction department in this space; "Mentats," here we come.) And with some research, it's really not that hard to stage a situation where a target is in a place that can't be fully denied.

 

We've actually seen a string of really badly done hatchet jobs over the last few years. That's about to change in the very near future; 2019 is the likely major rollout of some of these hits. Look for any anti-EU party leaders who might rise to leadership of their country in the 2018-2020 election period as the most likely first targets. High-profile media personalities will also be targeted, but that could be anyone across the globe. It all depends on who needs leverage.

 

The 21st Century isn't turning out so great so far.


2 minutes ago, Taf the Ghost said:

 

The 21st Century isn't turning out so great so far.

The human race really needs to take a breath and take stock of technology and the effect it is having, not just in the future but in the right now too.


17 minutes ago, Taf the Ghost said:

Look for any anti-EU party leaders that might rise to leadership of their country in the 2018-2020 election period as the most likely first targets.

That's a pretty dangerous thought to have. It sounds like you've already made up your mind who to trust and distrust when faced with what could be fake, but also could be real information and evidence.

It's good to be skeptical, but don't jump to conclusions. That can very easily lead to confirmation bias which in and of itself will encourage that type of bad behavior.


4 hours ago, Jman87 said:

Soooo...

 That's interesting. I've always wondered why I subconsciously avoid photographic evidence of my existence, and now I know. Lol

And here I was thinking I was the only one on the entire planet that hated getting their picture taken...


28 minutes ago, LAwLz said:

That's a pretty dangerous thought to have. It sounds like you've already made up your mind who to trust and distrust when faced with what could be fake, but also could be real information and evidence.

It's good to be skeptical, but don't jump to conclusions. That can very easily lead to confirmation bias which in and of itself will encourage that type of bad behavior.

https://www.usatoday.com/story/news/world/2016/07/01/austrian-court-overturns-presidential-election-result/86589342/

 

Hit Pieces only work well when you control the Media, which means the attacks will come from the Tech-aligned major Leftwing groups that control most of the Intelligence Services. The non-Western attackers will be either the Chinese or South Koreans (who have a far higher Attack Tech base than most realize). Russia is always a mixed bag with this type of approach, as they leverage their physical assets more. 

 

In the first run of these attacks, the works will be expensive and time-intensive. Western intelligence services have been so bad at pulling off most of their InfoOps of late that I wouldn't put it past them to brutally screw up the first few and waste this attack vector that they'll have for 3-5 years. Somewhere in the mid-2020s, you'll be able to do this as a function in common professional editing software.

 

There'll be a point at which the cost is low enough that this becomes an attack vector used against certain chunks of elites in most countries to keep them in line. What the "line" is depends wholly on where they live. I hope everyone is going to enjoy the stories ("Video leaks of Big Star doing nasty things on camera! You won't believe it!") hitting every few months from 2019 onward.


AI is just making it easier; it's already possible with a huge budget.


3 hours ago, Captain Chaos said:

Next up : Apple animoji pron, the chicken does the turd xD

I'll stick to this :P

 

 


My concern is the impact this will have on the justice system, in which an attorney can say their client never said X, and there is reasonable cause for doubt.


1 minute ago, Dingus said:

My concern is the impact this will have on the justice system, in which an attorney can say their client never said X, and there is reasonable cause for doubt.

Which is why I think it's time for laws to be updated to keep pace with the modernization of society. The EU has the GDPR, but I don't think the rest of the world will follow; it will depend on the country.

