
AI is getting better at generating porn. We may not be prepared for the consequences.


A red-haired woman stands on the moon, her face obscured. Her naked body looks like it belongs on a poster you’d find on a hormonal teen’s bedroom wall — that is, until you reach her torso, where three arms sprout from her shoulders.

AI-powered systems like Stable Diffusion, which translate text prompts into images, have been used by brands and artists to create concept images, award-winning (albeit controversial) prints and full-fledged marketing campaigns.

But some users, wanting to explore the dark side of the systems, have tested them for a different kind of use: porn.

AI porn is about as disturbing and imperfect as you’d expect (that redhead on the moon probably wasn’t generated by someone with an extra arm fetish). But as the technology continues to improve, it will raise challenging questions for AI ethicists and sex workers alike.

Pornography created with the latest image-generating systems first made its way to 4chan and Reddit discussion boards earlier this month, after a 4chan member leaked the open-source Stable Diffusion system ahead of its official release. Then, last week, what appears to be one of the first websites dedicated to high-fidelity AI porn generation launched.

The website, called Porn Pen, allows users to customize the appearance of naked AI-generated models – all of them women – using toggleable tags such as “babe,” “lingerie model” and “chubby,” ethnicities (e.g., “Russian” and “Latina”), and backdrops (e.g., “bedroom,” “shower” and wildcards like “moon”). Buttons capture models from the front, back or side and change the look of the generated photo (e.g., “movie photo,” “mirror selfie”). There must be a bug in the mirror selfies, though, because in the feed of user-generated images some mirrors don’t actually reflect a person – but of course, these models aren’t people at all. Porn Pen works like “This Person Does Not Exist,” only it’s NSFW.

On Y Combinator’s Hacker News forum, a user claiming to be the creator describes Porn Pen as an “experiment” using advanced text-to-image models. “I explicitly removed the ability to specify custom text to avoid generating malicious images,” they wrote. “New tags will be added as the prompt-engineering algorithm is further refined.” The creator did not respond to londonbusinessblog.com’s request for comment.

But Porn Pen raises numerous ethical questions, such as the biases baked into image-generating systems and the sources of the data they were trained on. Technical implications aside, one wonders whether new technology for creating tailored porn – assuming it catches on – could harm adult content creators who make a living doing the same.

“I think it’s somewhat inevitable that this would arise when… [OpenAI’s] DALL-E did,” Os Keyes, a Ph.D. candidate at Seattle University, told londonbusinessblog.com via email. “But it’s still depressing how both the options and the defaults replicate a very heteronormative and masculine look.”

Ashley, a sex worker and peer organizer who works on content-moderation cases, believes the content generated by Porn Pen in its current state poses no threat to sex workers.

“There is endless media,” said Ashley, who did not want her last name published for fear of being harassed over her work. “But people distinguish themselves not only by making the best media, but also by being accessible and interesting. It will be a long time before AI can replace that.”

Existing monetized porn sites such as OnlyFans and ManyVids require adult creators to verify their age and identity so the company knows they are consenting adults. AI-generated porn models can’t do this, of course, because they aren’t real.

However, Ashley is concerned that if porn sites crack down on AI porn, it could lead to harsher restrictions on sex workers, who are already subject to increased regulation through legislation such as SESTA/FOSTA. In 2019, Congress introduced the Safe Sex Workers Study Act to examine the effects of this legislation, which makes online sex work more difficult. The study found that “community organizations [had] reported increased homelessness of sex workers” after losing the “economic stability afforded by access to online platforms.”

“SESTA was marketed as anti-child sex trafficking, but it created a new criminal law on prostitution that had nothing to do with age,” Ashley said.

Currently, few laws anywhere in the world address deepfake porn. In the U.S., only Virginia and California have regulations restricting certain uses of faked and deepfaked pornographic media.

Systems such as Stable Diffusion “learn” to generate images from text. Fed billions of photos with annotations indicating their contents — for example, a photo of a dog labeled “Dachshund, wide-angle lens” — the systems learn that specific words and phrases refer to particular art styles, aesthetics, locations and so on.

This works relatively well in practice. A prompt like “a painting of a bird in the style of Van Gogh” will predictably produce a Van Gogh-esque image of a bird. But it gets trickier when prompts are vague, refer to stereotypes, or cover topics the systems are unfamiliar with.
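
To make that prompt-to-image step concrete, the sketch below shows how a text prompt like the one above can be fed to the openly released Stable Diffusion weights. The use of the Hugging Face diffusers library and the specific model ID are assumptions for illustration; the article doesn’t say what tooling Porn Pen or other users actually run.

```python
# Minimal text-to-image sketch with Stable Diffusion via the diffusers library.
# NOTE: the library choice and model ID are illustrative assumptions, not the
# tooling described in the article.
import torch
from diffusers import StableDiffusionPipeline

# Download and load publicly released Stable Diffusion v1.5 weights.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # move the model to a GPU; use "cpu" and float32 otherwise

# A specific style plus a specific subject keeps the output predictable,
# as the paragraph above notes.
prompt = "a painting of a bird in the style of Van Gogh"
image = pipe(prompt, num_inference_steps=50, guidance_scale=7.5).images[0]
image.save("bird_van_gogh.png")
```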

For example, Porn Pen sometimes generates images without a person at all – presumably a failure of the system to understand the prompt. Other times, as mentioned earlier, it generates physically improbable models, usually with extra limbs, nipples in unusual places and twisted flesh.

“By definition, [these systems are] going to represent those whose bodies are accepted and valued in mainstream society,” Keyes said, noting that Porn Pen only has categories for cisnormative people. “I’m not surprised you get a disproportionately high number of women, for example.”

While Stable Diffusion, one of the systems likely underlying Porn Pen, has relatively few “NSFW” images in its training dataset, early experiments by Redditors and 4chan users show it’s quite adept at generating pornographic deepfakes of celebrities (Porn Pen has—perhaps not coincidentally—a “celebrity” option). And because it’s open source, nothing would stop the creator of Porn Pen from fine-tuning the system on additional nude images.

“It’s certainly not great to generate [porn] from an existing person,” Ashley said. “It can be used to harass them.”

Deepfake porn is often made to threaten and harass people. These images are almost always created without the subject’s consent and with malicious intent. In 2019, the research firm Sensity AI found that 96% of deepfake videos online were non-consensual porn.

Mike Cook, an AI researcher who is part of the Knives and Paintbrushes collective, says the dataset may include people who have not consented to using their image for training in this way, including sex workers.

“A lot of [the people in the nudes in the training data] can earn their income from producing pornography or pornography-related content,” Cook said. “Like visual artists, musicians or journalists, the work these people have produced is being used to create systems that also undermine their ability to earn a living in the future.”

In theory, a porn actor could use copyright protections, defamation law and possibly even human rights law to fight the creator of a deepfaked image. But as a piece in MIT Technology Review notes, gathering evidence to support the legal argument can prove to be a formidable challenge.

When more primitive AI tools popularized deepfake porn a few years ago, a Wired investigation found that non-consensual deepfake videos had garnered millions of views on mainstream porn sites like Pornhub. Other deepfake works found a home on sites comparable to Porn Pen – according to Sensity data, the top four deepfake porn websites received more than 134 million views in 2018.

“AI image synthesis is now a widespread and accessible technology, and I don’t think anyone is really prepared for the implications of this ubiquity,” Cook continued. “In my opinion, we have rushed very, very far into the unknown in recent years without considering the impact of this technology.”

Underscoring Cook’s point, one of the most popular AI-generated porn sites expanded late last year through partner agreements, referrals and an API, allowing the service — which hosts hundreds of non-consensual deepfakes — to survive bans on its payment infrastructure. And in 2020, researchers discovered a Telegram bot that generated abusive deepfake images of more than 100,000 women, including underage girls.

“I think in the next decade we will see a lot more people testing the limits of both technology and the limits of society,” Cook said. “We need to take some responsibility for this and work to educate people about the ramifications of what they’re doing.”
