Show HN: Pornpen.ai – AI-Generated Porn (pornpen.ai)
711 points by dreampen on Aug 23, 2022 | 414 comments
Hey HN, I've been working on https://pornpen.ai, a site for generating adult images. Please only visit the site if you are 18+ and willing to look at NSFW images.

This site is an experiment using newer text-to-image models. I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated. New tags will be added once the prompt-engineering algorithm is fine-tuned further. If the servers are overloaded, take a look at the feed and search pages to look through past results.

For comments/suggestions/feedback please visit https://reddit.com/r/pornpen

Enjoy!



I had some fun with StyleGAN a couple of years ago. I probably should have written it up, or something; some others I showed it to thought I should. But I never took it too seriously. I was just bored over a couple of weekends when I had access to some beefy GPUs. And y'know, it's quasi-porno.

I opted for dudes, not gals. Partly personal preference, partly because I happened to get my hands on an enormous dataset of hot shirtless guys taking selfies.

The initial results were rather poor: https://i.imgur.com/npzDcCL.jpg

That's around when I realized how important the training data really is: there was too much variation in pose. So I trained a model that detected the face, nipples, and belly button, and used those landmarks to align the images vertically. Then I briefly skimmed the dataset manually and deleted anything that varied too much from the median, in a rather arbitrary way.
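That cleanup step could be sketched roughly like this. Everything here (the landmark heights, the canonical positions, the deviation threshold) is made up for illustration: fit a per-image scale/shift that maps each image's landmarks onto canonical heights, then drop images that still sit far from the dataset median.

```python
import numpy as np

# Hypothetical sketch: each image is reduced to three landmark heights in
# pixels (face, nipples, belly button). All names and numbers are assumptions.
CANONICAL = np.array([100.0, 300.0, 450.0])  # target heights for the landmarks

def alignment_transform(landmarks):
    """Least-squares fit of y' = a*y + b mapping the landmarks onto CANONICAL."""
    y = np.asarray(landmarks, dtype=float)
    A = np.stack([y, np.ones_like(y)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, CANONICAL, rcond=None)
    return a, b

def filter_outliers(dataset, max_dev=50.0):
    """Align every image's landmarks, then drop images whose aligned
    landmarks deviate too far from the dataset-wide median."""
    aligned = []
    for lm in dataset:
        a, b = alignment_transform(lm)
        aligned.append(a * np.asarray(lm, dtype=float) + b)
    aligned = np.array(aligned)
    median = np.median(aligned, axis=0)
    keep = np.abs(aligned - median).max(axis=1) <= max_dev
    return [lm for lm, k in zip(dataset, keep) if k]
```

Images whose body proportions match the canonical layout align almost perfectly, while odd poses leave a large residual and get culled.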

Results were better: https://i.imgur.com/bS7ERC6.jpg

So I just let it churn away for a week, heating up the basement. Some final results with narrow truncation and style transfer: https://i.imgur.com/rhnWnpK.jpg

One thing I find intriguing is how bland those men really are. Beautiful, sure, but too classically beautiful. In a sense, StyleGAN is a compression algorithm, and highly truncated outputs are something like a median of the inputs. So the above reflects a sort of "average" idealized image, containing the traits most commonly found in the media I trained it on.
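The "narrow truncation" involved is simple enough to sketch: StyleGAN's truncation trick just pulls each sampled latent toward the average latent, so psi near 0 collapses everything onto that "average" face. Variable names here are illustrative:

```python
import numpy as np

# Minimal sketch of the truncation trick: interpolate sampled latents toward
# the mean latent. psi=1 leaves samples untouched; psi=0 collapses every
# sample onto the dataset's "average" face.
def truncate(w, w_avg, psi=0.5):
    """Interpolate latents toward the mean latent w_avg."""
    return w_avg + psi * (w - w_avg)
```

With psi close to 0, every output drifts toward the single most "typical" image the model has learned, which is exactly the blandness described above.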


In your example link here:

https://i.imgur.com/bS7ERC6.jpg

you show one fellow with a smartphone stabbed through the chest, plus a headphone cord fused into his forearm veins.

When you "let it churn away for a week," which of the following is more likely:

1. Smartphone Stab Wound Guys decrease in likelihood

2. More realistic-looking smartphone stabs

In either case it doesn't exactly inspire confidence in self-driving cars.


In a future Tesla shareholders meeting:

“Version 10.2 can detect children crossing the road with 97% accuracy, and collides with them dead on at the same rate. Those well publicised close calls are now a thing of the past!”


To be fair, if a kid crosses a road dangerously in front of human drivers, there are a lot of factors at play that impede the accuracy of those drivers. No system is perfect, and we shouldn't expect one to be.


Absolutely, but we cannot assume that machines have the same “right to make a mistake” as humans. If a machine kills a person, somebody is responsible and will get sued.


The good news: we kill fewer people.

The bad news: when we do, it's unfathomable how anything but a computer could have done it.


Sounds like it might be time for another instalment in the franchise https://en.wikipedia.org/wiki/Tetsuo:_The_Iron_Man


Superb


This would make for a hilarious blog post


[flagged]


Because when presented with a bunch of pictures of hot shirtless dudes, thinking “how can I apply cutting edge technology here?” isn’t most people’s first instinct (regardless of your gender identity or sexual orientation), but it is immediately relatable if you’re a technology nerd. It’s both unexpected and familiar, and that’s the magic formula for humor.


[flagged]


That's crossing into personal attack. Please don't do that here.

https://news.ycombinator.com/newsguidelines.html


How is this a personal attack? Besides that, you already removed the parent comment so nobody can even find this now without clicking my profile.

> I think this just might not be your cup of tea. That's fine. We all have our preferences.

I don't see how I could be any nicer.

> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.

I feel like you're ignoring your own guidelines. Please lead by example.


It's crossing into personal attack to imply that someone doesn't understand "the concept of humor" (or any other common concept). You went way beyond the information in the GP comment to step into that territory, and then you doubled down ("If isn't inherently obvious to you already").

Besides that, you don't need to go on about someone's putative lack of understanding in order to make your substantive points.


Okay, but why aren't you giving my comment the most charitable interpretation and instead assuming the worst? You say to take the most charitable interpretation but don't do so yourself. Your words don't match your actions. You're being a hypocrite.

I fully believe you care and are trying. I also understand there will always be subjective ambiguity and that's fine. In the past we have disagreed and I have later come to acknowledge I was wrong. But if you keep pointing towards rules you don't live by yourself you're undermining the quality of this forum. And I acknowledge mistakes happen and that's fine. But it really feels different lately.

So if you truly care about the quality of HN, do yourself a favor and be more introspective with your moderation. Especially when you are the only person who believes I am being unruly here (as far as I can tell). I'm not saying this to attack you. I'm saying this as someone who has loved this forum for years but is worried about it getting worse.

Cheers.


The site guidelines don't ask people to take the "strongest plausible interpretation" (emphasis added). That's not quite the same thing. I didn't see a plausible interpretation of your comment that wasn't unduly personal—hence my reply.

Interpretation depends on past experiences and associations. Since everyone's are unique, no two people will always share the same interpretations. It gets worse when all we have are little bits of text to understand each other through. For example, if you had spoken those words and I had heard your tone of voice, it's quite possible I would have interpreted you differently.

On the other hand: you can't go strictly by intent, either, because intent is internal to the person whose intent it is. No one else has access to that (we're not mind readers) so the burden is on the commenter to disambiguate it.

https://hn.algolia.com/?dateRange=all&page=0&prefix=false&so...

https://news.ycombinator.com/newsguidelines.html


So my comment got upvotes from multiple people, a random person came by to tell you you were wrong in your judgment (and reaffirmed it when you questioned him), and yet there's no plausible interpretation of my comment that isn't harmful? Dan, you need to take a look in the mirror here. You don't seem interested in asking yourself "is it possible I'm wrong?" Actions speak louder than words. And once again, I think you should lead by example.

I give up. I assume you're just going to ban me anyway.


I did ask myself that! That's why I said "didn't", not "don't", and why I added the bit about tone of voice.

There's zero question of banning you for something like that.


Okay, I admit I'm frustrated but I can tell you're not trying to be harmful. I think I'll just let this one simmer for a bit.

For what it's worth, I really wasn't saying those things about HN out of pettiness or bitterness. Thank you for engaging with me. Cheers.


You're way off base here.


You think so? Having a sense of humor doesn't mean you get every joke. Telling someone they have no sense of humor, so it couldn't possibly be explained to them, is a kind of putdown. Elvis Costello was not being nice when he wrote this:

  If you don't know by now
  Nobody's gonna tell you
  If you don't know by now
  The shock would probably kill you
https://www.youtube.com/watch?v=ptnXM6rsxSY


Yes, I think you're taking a very uncharitable interpretation of the post.

In any case, I'll leave it here.


Scott Adams claims that 30% of people have no sense of humor whatsoever.


Scott Adams has claimed a great variety of things over time. A lot of those claims don't hold up well under scrutiny, so I'm a bit skeptical of this one. Especially since I'm one of those people who think Dilbert stopped being funny a long time ago.


But Scott Adams as himself is hilarious. It's like an AI-bot gone awry except that there's actually a real person tweeting out to the world.


It takes one to know one, as they say.


The community can never have too many blog posts / ML experiment writeups, so do write one about your experience; it would definitely be interesting.


> partly because I happened to get my hands on an enormous dataset of hot shirtless guys taking selfies.

I feel like this is burying the lede. I'm very curious where you sourced this training data.


Wabi-sabi is a thing. A friend noted years ago how the weird teeth or hair colour that kids are super self-conscious about often turns out to be their most beautiful trait later.


Those first images remind me of the "brainlet" memes you often see in high-quality shitposts (something I appreciate more than the average HN reader I think).




Look at those reflections



They’re even more wonderful to stare at when there’s a pair of ’em.

Boobies are beautiful as well: https://en.wikipedia.org/wiki/Booby


I suppose that could be considered "contamination" in the training data.


Inevitable successor to this: a TikTok-style adaptive porn feed that learns exactly what you like and starts generating porn customized to your kinks and preferences.


Porn addiction is already a serious problem for many people. Combine an enhanced version of this model with TikTok style delivery and I’m very fearful of the end result. It’ll be the equivalent of crack in terms of the dopamine rush.


We may experience the first ever dopamine burnout. How much can the brain handle?

Remember the film "Brainstorm" from the 1980's? Where the guy almost dies because he blasts a real-life full brain capture of sex into his own brain in a loop for days?

We'll probably learn if there is such a thing as too much dopamine.


> Remember the film "Brainstorm" from the 1980's? Where the guy almost dies because he blasts a real-life full brain capture of sex into his own brain in a loop for days?

No, I had to go look for it, and it was on YouTube (NSFW). Is this the scene [0] you're talking about? This was likely what inspired the braindance feature in Cyberpunk 2077; it's pretty much the 1980s iteration, with a massive tape on what looks like a futuristic VHS cassette, and a phone for some reason.

Well, it wasn't exactly just dopamine, but there was a woman who pleasured herself to a severe degree [1], with 30-minute orgasms that impacted her entire life; it's become a meme on JRE [2].

Sidenote: this is probably the only film where Christopher Walken isn't like he's portrayed now. Even King of New York had that vibe about him, but here he's kind of... normal?

0: https://youtu.be/cOGAEAJ4xJE?t=2969

1: https://www.zmescience.com/other/feature-post/brain-pain-cen...

2: https://www.youtube.com/watch?v=Oq6KMk8tkNE


Reminds me of an Arthur C Clarke story, Patent Pending: https://en.wikipedia.org/wiki/Patent_Pending_(short_story)


I cringe at the thought of watching it again and dismantling it from the very high place it currently stands on my all-time favorite movies list. Mostly just for the concept, I can't even remember the plot.


Unlike other films from my childhood, it holds up very well, mostly due to the actors. The very, very last moment (what "really" happens at death), however, is awful.


Sorry to say, but TikTok for P*rn has already been made; it works on iOS and Android and probably already has millions of users, with a percentage of them uploading original content. Not giving the name or link here for obvious reasons.


It’s Time to Build


"If you build it, they will come" takes a new meaning


As does "normalization of deviance".


This could get very ugly if the algorithm begins to generate images of underage persons in response to the preferences of those Twitter users who have 'MAP' in their biography, an acronym which stands for 'minor-attracted person.'


That's why we shouldn't be giving those people nice things.


MAP huh? Exactly a TIL I really hope comes in handy for Jeopardy one day.


clap clap clap


That would be a cause for the models on TikTok to unionize because that’s taking their jobs away.


The really smart models will put together an AI to generate images of them using only their own images as training data, and then they can sit back and profit.

Eventually we'll be watching generated personalities on twitch playing video games, and then we'll know the end is nigh. There is no infinity, eventually the snake chokes on its own tail...


>Eventually we'll be watching generated personalities on twitch playing video games

that won't happen because the entire value proposition is in the personal connection to the streamer. (the modern derogatory term is 'parasocial relationship').

just like nobody watches chess computers or starcraft bots play nobody is going to watch bots play games. in fact to stay on the topic of the thread, people are making millions on onlyfans because they realized chatting with their viewers is much more valuable than generic pornography.


> that won't happen because the entire value proposition is in the personal connection to the streamer. (the modern derogatory term is 'parasocial relationship').

That "personal connection" is about as deep as the one I have to my favourite bartender. So I think the negative connotation of the term is quite justified.

I'm pretty sure that once we can add the ability to read and write external state to GPT-3, and make that state personalized per user, we can emulate that as well :)
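That could look something like the toy sketch below, where a per-user memory is kept outside the model and prepended to each prompt. `generate` is a stand-in for a real language-model call; nothing here reflects an actual API.

```python
from collections import defaultdict

# Toy sketch of per-user external state for a chat model. The "personal
# connection" is emulated by replaying each user's history into the prompt.
class PersonalizedChat:
    def __init__(self, generate):
        self.generate = generate         # callable: prompt -> reply (hypothetical)
        self.memory = defaultdict(list)  # external state, keyed by user id

    def chat(self, user, message):
        # Prepend this user's accumulated history to the prompt.
        context = "\n".join(self.memory[user])
        reply = self.generate(f"{context}\n{message}".strip())
        self.memory[user].append(f"user: {message}")
        self.memory[user].append(f"bot: {reply}")
        return reply
```

Each user gets their own growing context, so the model can appear to "remember" them without the weights changing at all.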


> the entire value proposition is in the personal connection to the streamer

Right - that’s where the money is made. The people that pay them have to be found in the broader market, though, and that’s where this sort of thing would shine. A streamer could pump out a stupid amount of content with this, always fresh and new, with little or no ongoing time commitment. That in turn would free up their time to provide personalized content to paying customers.


No joke Instagram influencers already do this to a degree.

I knew a guy who came across a post by an influencer from the cafe he worked at. She never came in. A different girl came in, had breakfast, and took photos of the food and scenery.

It wasn’t a paid post. She just needed content so she outsourced it to someone who generated it for her.


There may be a difference between what we currently know as "bots" and when someone finally links generated images/video, and content through text and speech together into something that does a good approximation of a person.

The fact that it might occasionally do or say crazy things as it's not a real person might even be part of the draw. It's not like real people don't sometimes adopt controversial opinions purely for the attention it brings.


We've already got vtubers playing robot characters, it feels like it's only a matter of time before someone attempts to make a 'true' AI vtuber. Though I suspect we're still a while away from AI being able to complete arbitrary video games.


Those generated images, fixed permanently in the stage of early adulthood could still be produced when the model is well past retirement age.


Yep, that was one of the aspects that came to mind. I wonder if anyone has trained a model on a bunch of Marilyn Monroe images to generate new content of her?

What's the legal precedent for those images? Does it depend on the source data? If that's public domain, does that mean new images are owned by the producer (who is likely to be someone else entirely)? Or is it more like someone drawing a likeness of someone else (I'm not sure about the copyright of that either, but it's probably well defined by this point)?


There has been a series of pop concerts of late featuring holograms, sometimes of deceased members: notably the group TLC, who toured with a hologram of the late Lisa 'Left Eye' Lopes, who had died in a car crash a decade before. Some kind of legal precedent must have been set to permit that. In this case, it looks like whoever holds the rights to the original performances from which the holograms were generated is the owner.

It would be a different matter, as you outlined above, if one created a digital facsimile of a person and used it to perform in ways that the original person did not. Does a person own their own persona?


There's a business model in there, though: the algorithms need well-tagged poses and images, and one of the biggest issues with sex work is that it's used to attack people in other areas of their lives (even in a society perfectly accepting of it, the desire not to be personally associated with it would probably still exist).

If you can retain paying customers, and then pay performers for a supply of source material (which won't, unaltered, ever end up in customer hands) then there's a new business model somewhere in there.


And then sells your preferences to the highest bidder. A bit like the Westworld HBO series.


For the general public you mean. I'm sure Bezos already has a feed of "urinate in pants".

Why the downvote? At $15/h, his $162.7 billion could buy 1.24 million years' worth of bathroom breaks. You want to argue other motivations are at play here?



I'm having some trouble finding it through search, but if I recall, this project was submitted to HN at some point.


Was it this one? https://news.ycombinator.com/item?id=9847283

If so, it existed for a while but shut down years ago.


Yes, I think this is it. It wasn't generated porn, but it would home in on what you liked, similar to TikTok.


hmm yes that will definitely not totally fuck up the dopamine systems of young men and make an already badly addicted generation hopelessly dependent on this shit


We lost that battle a decade ago


Porn "addiction" and all that hand-wringing is just FUD from the moral panic types and the skeezy media "doctors" who prey on them. In fact, non-substance "addictions" are barely recognized at all: "gambling disorder" is presently the only condition in the "Non-substance-related disorders" subsection of the DSM-5.

Addictions mediated by chemicals that directly increase perceptions of incentive salience through mesolimbic dopaminergic activation are real addictions.

Sensory stimuli that come in through the normal perceptual paths do not directly hijack any part of the brain the way addictive chemicals do. Downstream of the stimuli there may be reward learning, but it works through normal channels, not by directly turning up incentive salience. You actually have to like/enjoy something first before reward learning happens.


Is there a DSM-5 condition for outsourcing all your rational faculties to the DSM-5?


I cited the current status quo and then explained the scientific reasoning behind it. You're implying the DSM-5 created my beliefs, but I only cited it here because if I just stated the science alone, I'd be criticized for that instead. I've spent decades reading journal articles on this topic, and there's every reason to believe that substance-mediated addiction is a very different thing from non-substance-mediated "addiction".

The crackpots and abusive groups out there running "porn addiction" treatment camps and related services are about as legit as the "pray the gay away" camps. They're completely unethical and unsupported, but moral panic keeps them flush with cash.


Is it possible your issue is with the overloading of the word "addiction"? If we called it a compulsion instead, would you be OK with that?

'cos I'm just thinking that Skinner boxes look pretty damn addictive from the outside, regardless of the precise mechanism of action.


Yes, that's the primary issue I'm addressing. But even beyond that, there have been many real studies (not anecdotes) showing that masturbation and porn use have no negative effect on sex with partners. In fact, most of the ones I've read show the reverse correlation, if anything: people who masturbate more are better able to enjoy sex with partners. ref: https://www.nature.com/articles/s41443-022-00596-y

Additionally, scientists have studied whether it is possible to become "addicted" to sex and other normal sexual stimuli. It doesn't happen, at least not like with, say, methamphetamine. Imagine how many people would be "addicted" to sex if it did. ref: https://akjournals.com/view/journals/2006/11/2/article-p222....

The real dangers here are cults like "no fap" (i.e., NCOSE, FightTheNewDrug, Reboot) and other unhealthy moral panic groups, whose members have repeatedly threatened to kill adult actors, media producers, and scientific researchers.

People with the strongest opposition to scientific consensus have the lowest levels of objective knowledge, yet they also have the most confidence. If someone claims to "treat" sex addiction or porn addiction, they are part of the problem, providing "non-evidence-based treatment". ref: https://www.sciencedirect.com/science/article/pii/S027273582...

https://link.springer.com/article/10.1007/s10508-020-01884-8 "Claiming Public Health Crisis to Regulate Sexual Outlets: A Critique of the State of Utah’s Declaration on Pornography"

https://ajph.aphapublications.org/doi/full/10.2105/AJPH.2019... "Should Public Health Professionals Consider Pornography a Public Health Crisis?"


Small point: you don't have any evidence that moral panic groups are unhealthy. If you're going to wield that "objective knowledge" sword, you'll need to die by it, too.


You're right. It probably doesn't change them.

There's a good chance there's just a selection bias: the type of person who joins a nofap cult (religious or otherwise) tends to be intrinsically violent (unhealthy to others). Now they just have a cult to encourage them and tell them which researchers to send the death threats to.


> scientists have studied if it is possible to become "addicted" to sex and other normal sexual stimuli. It doesn't happen.

Dude, what planet are you living on? Sex addicts (people who compulsively seek sex to a harmful degree, and wish they could stop) are extremely common, as are e.g. food addicts.

Porn is also hardly a "normal" stimulus (i.e. comparable to stimuli that would have been found in the evolutionary environment). Cf https://en.m.wikipedia.org/wiki/Supernormal_stimulus

Honestly, it seems like you are saying things that are very obviously untrue because you've read some (probably methodologically and epistemically poor) papers. You should significantly reduce your (unwarranted) confidence or you are likely to get burned.


Please don't cross into personal attack. You can make your substantive points without that.

https://news.ycombinator.com/newsguidelines.html


He studies forms of addiction. He's just relaying things he has studied.


I don't think that observation is relevant to any of my objections.


My issue is with the use of the term “addiction”, yes. It’s multiple orders of magnitude smaller in effect size on dopamine receptor down regulation compared to Opioid Use Disorder, for example, and I think conflating them can be a bit harmful.

That said, this battle was lost long long ago, so it’s somewhat of a moot point.


addiction refers to a pattern of behavior not just its scale. like, you can be addicted to eating even though it has a much smaller effect than smack and people still are and end up 600 pounds. i also think addiction around porn is not just addiction it's a fucky form of erotic plasticity that gets people hooked on weird stuff that messes with their ability to have normal, healthy relationships.

like, i watch a shitton of peers have problems with this and then y'all tell me "oh at least it's not opioids". bruh. yes it's not opioids it's a different kind of problem. then there's the people who are out here like "oh you're just a puritan" refusing to admit that there's literally anything potentially dangerous about porn at all.


> oh at least it's not opioids

That's not what I'm saying at all. I'm saying that we shouldn't conflate the two via language because, as you also admit, they're not even on the same planet with regard to their danger and effect on the human organism. They don't operate the same way at all. I said nothing and passed no judgement on the dangers, or lack thereof, of porn.


The point you should be catching, I believe, is that such things exist in reality independent of their existing in the DSM-5


> I cited the current status quo

You certainly did.

> and then I explained the scientific reasoning behind it.

That's charitable.

> there's every reason to believe that substance mediated addition is a very different thing than non-substance mediated "addiction".

And? They are both addictions.


Carry on doing your thing then.


[flagged]


Hey, please edit swipes out of your posts here - you're making some interesting points but we need you to do that within the site guidelines. If you wouldn't mind reviewing them and sticking to them, we'd be grateful: https://news.ycombinator.com/newsguidelines.html.


sorry man i shouldntve said some of that, i got kinda hot under the collar cause the constant "you're a puritan" pisses me off after seing so many friends struggle with it. i'll proofread before posting.


> literally listen to all the girls who end up crying bc their boyfriends can't get it up and they think they ugly, but really it's porn addiction.

… or it’s the well-known trend of decreasing testosterone levels among males in Western countries.


i agree that's part of it and massive problem, and its kinda scary most 20 year olds would have to go on aggressive TRT to get close to people 80 years ago. however getting clean from porn for even a month seems to really help most of them. sometimes longer but def makes a difference and usually with time it gets fixed.

i will say there's a theory that frequent masturbation lowers testosterone, telling your brain "yea i got a harem of chicks im good" so it doesn't need the aggression or drive. could just be broscience but an interesting area to explore.

but yeah both problems deserve attention. tbh i wouldn't be surprised if politicians, media, etc. ignore it on purpose because it's correlated with lower violence or whatever.


Masturbation does not seem to have any long-lasting effects on testosterone levels.

https://www.medicalnewstoday.com/articles/325418.


i agree that those studies show normal masturbation doesn't. i'm saying maybe unhealthy addictive behavior (like 3 times a day) does and that hasn't been studied. i was also totally clear it might be broscience but that it's an interesting study area that bears considering.


Lmao, you can't tell someone to "stfu" on HN. In fact I'm pretty sure it's against the rules.

There's nothing wrong with porn, just as there is nothing wrong with gambling. Some small percentage of the population might end up with an addiction to it, but that doesn't mean that the overall thing should be banned. Otherwise...sex would be banned, because, well, _some_ people get sex addiction.

You say that people are getting into "taboo" stuff that's outside of normal/healthy. Let me take a wild guess: that's anything outside of "missionary between a married man and woman for the purposes of procreation", right?

In reality, it's that newer generations are less prudish than previous ones, the lgbtq+ community especially, and we're now able to enjoy such diversity of sexual identity and expression.

Stop being such a prude, lmao.


Please don't break the site guidelines yourself, regardless of what someone else did. It only makes things worse.

https://news.ycombinator.com/newsguidelines.html


They’re working on changing that, too.


That's a long description of habit forming.


/snark Welcome to the metaverse


Better than Instagram, at least.


no, its gonna have people combining dark pattern skinner box shit from insta and tiktok with the addictive nature of porn. and its gonna be way worse. yall wouldn't believe me if i told you how common straight up porn addiction is in young kids, it's fucking bad already.

edit: @stickfigure can y'all please quit trying to tar me with this brush? it's bullshit. i'm not arguing about masturbation, i'm saying excessive consumption of porn when people have limitless access to weird ass shit that doesn't reflect the real world is causing problems. don't try to make me some puritan or whatever, because literally nothing i said suggests that.


If people wanted to work on this, I genuinely believe that could be huge.


I would happily pay $100 a month for that.


Eventually this and similar technology will be able to generate unlimited porn of all types. Including CP. Should or even could it be made illegal?

To me that kinda feels like if it was made illegal then you are basically acting as the thought police.

Edit: are people just uncomfortable talking about this or do they think downvoting is going to somehow prevent it?


The question of whether machine-generated CP is ethical is a confusing one to me, since the crux of the problem with CP is that it violates the rights of a child. Take away that fact and CP becomes a 'victimless crime', in a sense, and it becomes hard to pinpoint what the actual perpetration is. (Ignoring, for now, the fact that a CP model would have to be trained on real images, which destroys this argument outright.)

Nobody wants to be a pedophile, but if you are unfortunately hardwired that way the impulses are impossible to ignore—similar to any other sexual impulse that 'healthy' adults have.

I think the goal is how to create the least amount of harm for children, and it's theoretically interesting to consider the possibility that machine generated CP could be used to ultimately end the real-life CP black market.

I'm sure this won't ever happen because nobody wants to touch this subject with a 10-foot pole; and our species tends to favor solutions that are idealistic rather than rational in many ways. But the European approach to criminal justice is something like: "humans are not perfect, and we can't fix them, so let's try to engineer solutions which make living together more pleasant". Nobody likes an icky solution like this, but the alternative is a lot of children suffering (although I'm not saying those two options are perfectly substitutable).

On the other hand, it could be a completely stupid idea outright. Just an interesting thing to think about.


>(Ignoring the fact that a CP model has to be trained on real images).

that's not how this tech works.

Did you think actual guinea pigs had to be trained to do household chores for these pictures to be generated? (dailywrong.com/new-course-teaches-guinea-pigs-household-chores/)

I had a lot of fun creating pictures of llamas in space. I'm pretty sure no llamas have ever walked on the moon.


I think it is how the tech works, though. The model knows what a guinea pig looks like and it knows what household chores look like, so it can concatenate the two. I don't think the model knows what children look like unclothed, and I don't think it can infer that from nude human adults. But I could be very wrong there, and these techniques will certainly evolve over time.


I think generating CP with AI would be 10000x easier than most of what I have personally made with dall-e.

creating a llama wearing a space suit and walking on the moon is a lot harder than filling in the very limited amount of skin hidden by a swimsuit.


The question is: does it reduce demand in the real, physical world, or does it act as a demand multiplier?


I think these are the right questions to ask, and these types of things would need to be studied in order to propose a potential solution. But I can't imagine any modern-day institution even beginning to ask these questions without facing public backlash. The sensationalized headlines that would spin out of this would probably stop any research in this area in its tracks.

In that way our species is very primitive. We're trigger happy in our outrage and often unable to focus on a larger shared goal, much to our own demise.


We can study adjacent issues to get a hint of the psychological effects.

Let people who are into bestiality play in their VR worlds and see whether it suppresses demand or increases it, with spillover into the real world. (Find a jurisdiction where the above is tolerated.)

Maybe Thailand can produce some data, given their apparent tolerance for certain behaviors.


I think the research shows that availability of normal porn reduces rape. (Don't have a link at hand, sorry.) So I would guess the effect would probably be the same here.


It's the age-old dilemma: is it a safety valve or a gateway to worse things?


But what exactly are those worse things?


It's not entirely a new issue; see the discussions around "loli erotic japanese manga" etc. Though I agree with you, nobody is touching this tech anytime soon (at least publicly), even though you can easily imagine safe applications, like MRI studies with AI-generated images to study exact brain patterns for potential treatments and so on.


I don’t think you can ignore the fact that the model would have to be trained on child sexual abuse material, though. That’s pretty important.


I currently view that as a technical limitation that will likely be lifted as the technology progresses. Our species has been hard at work on AI image generation for only a decade, and we have thousands of years to go. So I would posit that this barrier is probably a short-term one.


I agree. But what do we do about the models that are trained on CSAM prior to this inevitable breakthrough? Because you and I both know it’s likely to happen.


What about writing a blog post about raping and killing people? We have to draw the line somewhere, and for CP the line is at the first step.


Unfortunately, I don't think folks watching CP would stop at that. Porn is an addiction (or compulsion?), and I don't see how CP makes it any better for folks with pedophilic tendencies.

To think that porn reduces sexual tendencies may be true (in the short to mid term), but eventually, as the addiction intensifies, those hooked have the urge to seek more novel, more aggressive versions of it (in reel or for real).

https://en.wikipedia.org/wiki/Effects_of_pornography#Sexual_...


Otoh, I have read articles worrying that young men aren't going out looking to meet women like they used to, because they are satisfying their desires through porn.


Not at all contradictory. Opioid addicts need higher and higher doses for their fix. In the worst case they too drop out of society, committing crimes such as theft and prostitution to service their addiction.


I agree that people addicted to porn want more porn. The question is whether they also want more real-life sex, or less real-life sex. It seems weird to have simultaneously a moral panic about both options.


Oh no the engine of society is losing fuel to burn through, how awful. How will the necrarchy ever afford their servants if the subjects have dodged the exploitation of their reproductive urge.


If you fear your reproductive urge is being exploited, I think eradicating the exploiters is a better course of action than refraining from having children. There are better reasons not to have children than using it as a way to harm your overlords.


Add reefer into the mix, and you've got a recipe for madness.


I don't know if you're being sarcastic here or not, but I've mixed porn with stimulants and the result is definitely not pretty. I've gone through 16-hour masturbation sessions and have done things I would have never imagined. The thoughts still haunt me. Porn addiction is not a joke.


It is a terrible idea to compare pedophilia to drug use.

One might be a victimless crime.

The other never can be.


perhaps it was just in response to the second part

> To think that porn reduces sexual tendencies may be true (in the short to mid term), but eventually, as the addiction intensifies, those hooked have the urge to seek more novel, more aggressive versions of it (in reel or for real).

and if so, it's a joke about the reefer madness movie and moral panic


[flagged]


Would you mind reviewing the site guidelines at https://news.ycombinator.com/newsguidelines.html? You broke quite a few of them here—especially this one:

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."


Is that any different from the argument about violent videogames back in the day? Or Dungeons & Dragons before that?


I wish people would explain why they're downvoting these ethics-related comments so aggressively, because I'm interested in these conversations and not sure why other users find them irrelevant.


The guidelines specifically ask you not to post like this: https://news.ycombinator.com/newsguidelines.html. It's completely off-topic and, what's worse, repetitive.

It's frustrating not to be able to know the reasons for downvotes, but it's the nature of the beast. Sometimes people think we should require explanations, but you can't change the way people are by demanding they jump through a hoop.

If you see unfairly downvoted comments, give them a corrective upvote (https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...). Other users will hopefully do that too.

A lot of the time, that happens and then posts like this end up as uncollected garbage in the thread, complaining about something that doesn't exist anymore.


I've personally noticed that the corrective upvotes start right after someone posts a "why the downvotes" rant like the GP did.


Yes, I'm sure that's a thing too.


I think it demonstrates why it's so easy for technology to run ahead of ethics. No one wants to hear a public safety announcement when they get the keys to a new car.

OTOH, I think the risk of CSAM from this kind of thing is overblown, because either (1) it wouldn't actually involve real children in any way, or (2) the model would have to be trained on actual CSAM, which there are already laws against.

If no humans are harmed by it at any stage, then how's it any different from the CGI murders all over movies and TV? In the absence of harm, the only reason to consider it a crime is that it's disgusting. But we don't prosecute people for having sick thoughts, just for acting on them. If you take the child and the abuse out of the equation, the only argument left is that seeing CSAM (artificial or not) is a "gateway drug" to committing child abuse. If that's the case, then why shouldn't we prosecute people for watching Dexter, on the grounds that they're on their way to becoming serial killers?


> If no humans are harmed by it at any stage, then how's it any different from the CGI murders all over movies and TV? In the absence of harm, the only reason to consider it a crime is that it's disgusting. But we don't prosecute people for having sick thoughts, just for acting on them.

You say that, but created from scratch drawn material is illegal in some countries. What you've stated is a nice ideal but often doesn't match reality, and I'm on the fence as to whether that's necessarily a bad thing in this case (I'm not sure we really know enough about sexual drive and how it interacts with the subconscious to know whether it's actually benign or has a societal cost).


> You say that, but created from scratch drawn material is illegal in some countries.

Homosexuality is illegal in some countries.


My point was the last sentence of what I quoted doesn't quite match reality. We do prosecute people for sick thoughts in some of these cases, regardless of whether we should.


Well... just because some countries prosecute drawings (or homosexuality... or homosexual drawings) doesn't mean there's any logic behind it. There can be enormous, totally illogical public will whipped up in favor of banning all sorts of things people consider abnormal. We can definitely have that discussion, as to what thoughts or creativity cross the boundary into criminal behavior, because likely at least some of them toe the line. But let's at least acknowledge the gradient of criminality between crimes with victims and those without, and not muddy the waters by pretending that every victimless crime is a stand-in or call to arms for a violent one. That way lies totalitarianism.

Let's call it what it is when we want to jail people for having sick ideas - or for creating disgusting art: it's persecuting bad aesthetic choices. It's Thoughtcrime. Either this actually harms humans (crime) or not. Either it leads to a crime (prosecutable) or doesn't. The rest is in the realm of policing people's minds, or the future.

And while some countries do that, they tend to be places where people who have unique ideas to contribute don't want to live, because that type of control is never limited to one particular type of image, but mainly serves as an excuse for imprisoning political enemies. Those countries end up poorly because they shoot themselves in the foot by driving away anyone who might paint edgy art or say things like the earth revolves around the sun, or that God doesn't exist. I'm not saying a society can't make the choice to prosecute thoughtcrime, I'm just saying it's a bad idea. And it's especially bad to mix it up with prosecuting real crime, because criminalizing thought is the onramp to the corruption that lets neighbors turn on each other, lets everyone bribe the police to arrest their petty enemies, and brings societies down.

CSAM is an actual evil, one of the most evil things on earth. Using that fact to prosecute people for drawings or offensive speech or thought (since there is no limit to what can be found offensive - even this conversation is criminal somewhere!) is really cynical and purely for power, if you think about it.


My main point was to note that regardless of reason, we do seem to punish thoughts in some cases, so the last part of the statement I quoted doesn't quite match with reality.

The second part of my comment was more along the lines of whether exposure to material such as this (material depicting children sexually) affects desire for it, or behavior, for those that already desire it and/or have urges along those lines. I don't know the answer to that, and so am unwilling to make a blanket statement that it should or should not be allowed. There are plenty of things we restrict as a society through our government because we think it's bad for the whole. Sometimes there's little real reason for that, sometimes there is, and thus I'm on the fence until I know more.

My layman's understanding is that some aspects of sexuality are not well defined until puberty and what you're exposed to at that time, and then those are fairly fixed for life. If that's true, there are aspects to consider beyond an individual's desire, and a nuanced informed decision is worthwhile. It's possible the right answer here is not just "freedom of speech and expression" nor "protect the children", but a middle path with the most benefits and least drawbacks.


I'd bet that downvoting is a kneejerk response to the claim "porn is an addiction".


Anything can be an addiction. Hell, I got addicted to weed. At first, for years, it was fine, but then I hit some rough patches when the pandemic started and I was blazed all day, every day. It was a temporary escape where I could still do my job, but beyond that I had zero motivation for months besides getting more weed.

Took months to kick the habit completely.


Is it knee-jerk?

Alcoholism is pretty serious, but you wouldn't say "beer is an addiction". Same with video game addiction and many other things.

I find it a pretty prudish generalization: it's a catch phrase for people who get offended by sex.


Developers like pretending technology is always good and/or inevitable; I think this sort of question is uncomfortable to many folks.


We detached this subthread from https://news.ycombinator.com/item?id=32573719.


The GP comment made largely unsupported fact claims, rather than ethical ones.


Ethics? Please. It's moral panic. The putatively ethical folks having a "harmless" academic discussion about which nice things everybody is or isn't allowed to have, according to them.


Now it's clear to me you're not being sarcastic. It's not about morals; I'm all for you having access to as much porn as you want, as long as it's legal. Whether that's healthy for your mind or not is an entirely different story.


I don't trust you concerned "we live in a society" types any further than I can throw yous.

(This is tongue in cheek. I'm not upset.)


Many countries already ban painted or rendered CP.

The US did that for a few years as well, until the Supreme Court struck it down: https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalit...

So as far as the output goes, there's already some kind of precedent in most countries. Or are you asking about the model that performs the generation?


What? That's about things that _aren't pornography_.

The US convicted someone for written obscenity as recently as 2013: https://www.techdirt.com/2013/05/20/court-finds-fantasy-stor...


The Miller test applies to all pornography, not just CP. If you look around, you can find federal cases about "transportation of obscene matters" for adult stuff, e.g.:

https://www.networkworld.com/article/2276921/porn-producer-s...


I’ll bite. The issue is it will take all of 5 seconds for an enterprising CP producer/admin/consumer to mix AI content into their stash and then claim they’re being persecuted when the feds come knocking.

It’s like lolicon: sure, you can sit there and pretend it’s a 2000-year-old demon warrior… but she is still dressed, posed, and drawn to mimic a small child.

Don’t get me wrong, it’s a conundrum, that’s for sure. I hate the idea of the thought police. I love free speech. But in the end, I believe any material, whether it is “art” or computer generated, should be barred if it contains images that can be construed as a child in a sexual act (ancient art gets a pass?). Lolicon does not stop pedophiles from being pedophiles. Nor would this. It would simply act as a gateway and a scapegoat imho.


There's probably going to be a big controversy when pedos start using it that way and proof is posted on mainstream websites. I just discovered lexica.art and I love all these picture-generator AI tools, but stumbling upon some pictures of a young Emma Watson topless didn't seem ok, and reading the prompt for those images showed that the user didn't go for tags like "young" or "topless"; it's just Stable Diffusion randomly selecting these features, maybe because there's weird content about Emma Watson online on shady sites, maybe something else.

I'm not sure I want all of them to close access to these tools, stop providing open-source capabilities, or block specific keywords like Dall-E does, but it's the easy solution for providers. This is why we can't have nice things: there are so many interesting, funny, and beautiful things that aren't illegal or even controversial but that get blocked as side effects of Dall-E's restrictions.


This would be extremely controversial, but I remember The Economist mentioned porn for pedophiles as part of the solution (the premise being that it's a genetic condition that criminalization doesn't work very well against). AI generation would remove some of the moral issues.


That will definitely happen; I suspect the criminal offense will be adjusted toward making distribution illegal, but possession might be tolerated (in some societies) if an artificial origin can credibly be established.

It's very hard to predict how this will shake out. I personally doubt that looking at extreme/contraband porn is a good strategy for coping with highly illegal desires but obsessive behaviors aren't something that's easy to model reliably. Considering that weaponizing false claims of predatory behavior as a coercive social/political tactic is a big thing at the moment, you probably shouldn't hold your breath hoping for any sort of consensus to emerge in the short term.


Watching CP tends to reinforce paedophilic tendencies. Many folks seem to "common sense assume" that it's a safe outlet and it would be nice if that were true, but it's not.


The problem is that you can find common-sense arguments both ways. It's just civilizational inadequacy that we aren't capable of working toward objective arguments here, because it doesn't seem like a particularly difficult problem. But yet again, politics is the mind-killer.


I hear this repeated verbatim a lot as if it were objectively true, along with the idea that watching porn in general is psychologically unhealthy, but I have yet to see any legitimate peer reviewed studies to corroborate either of these positions.


Unintentional The Exorcist Rule 34?

NSFW, obviously: https://pornpen.ai/view/27ylon4lGNpkfA4iJKJB


Made me think of this (also NSFW, mild though): https://www.oglaf.com/chauncey/


Man, I remember when I first discovered Oglaf. What a weeeeeird comic, loved it.


> also NSFW, mild though

It either is or isn't.


Safety is not described by just a boolean value.


Take a look at this one (also NSFW) https://pornpen.ai/view/CBHr5AlP8AFAZ56Jkp7i


Made me laugh. Showed my wife and she said, stone faced, "isn't this what you would want?"


What's going to be really wild is how the availability of convincing technology like this will seep into expectations that women think men have, and how they modify their own bodies to adapt. This has already happened, even with today's limited production capacity using human models/actors.

What happens when literally any crazy fetish can be explored infinitely?

Based on the images linked here (server has melted down), this stuff is not very convincing - yet. But when it gets there and literally any image can be called up, with endless variation, on demand, that's when things will get strange.

> I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated.

Eventually, this technology will be available to anyone with a few bucks to spend. The software won't be run on a server but on local machines. Those restrictions will not last. It will no doubt cause a crisis of sorts as the lawyers and politicians try to make antiquated concepts of decency and harm apply to hyper-realistic material generated algorithmically and without the direct involvement of any person.


I don't know. I mean, it's been really fashionable to say stuff like this for at least a couple of decades, but in the real world I see things playing out in pretty much the complete opposite. Guys are still "yey, boobies!" no matter the shape and size, most of my male friends would like more serious and traditional relationships, and the only type of women that get rejected are grossly overweight.

The only thing porn is desensitizing us to is... milder kinds of porn, really. Relationships get to live in a completely different mental box. I mean, think about the world around you - do you know of any real life people that got split up because they had regular problem free sex... but it wasn't kinky enough? If anything, I've only ever heard women complain about bland sex. Men are still happy to have it, check that box and judge the relationship satisfaction on the rest of the metrics.


It's almost like many adult males are capable of distinguishing between the pretty picture created for entertainment purposes, and the reality of human relationships. Who could have thought?

A lot of people also behave as if pictures of naked women (or the availability of actual naked women, for viewing and more, provided appropriate payment is made) were a new thing that human males only got access to with the invention of the internet. It's not that new, it's just another facet of the same old thing, and the same would happen: some small number of people would develop an addiction, a few more would be regular consumers, a real lot of people would be occasional consumers, and otherwise not a lot changes.


People who see those mental boxes as sharing the same space also struggle to understand kinky asexual people. It seems like my ace friends commission more porn than allo[0] friends.

[0] https://lgbtqia.fandom.com/wiki/Allosexual


>What's going to be really wild is how the availability of convincing technology like this will seep into expectations that women think men have, and how they modify their own bodies to adapt.

US waist sizes tell us women think men find walruses attractive: https://www.cdc.gov/nchs/products/databriefs/db360.htm

The idea that even a minority of women do _anything_ to attract men is quite wrong when you look at actual behavior rather than daytime television. The converse is also true. I find it astonishing that a country which is 85%+ fat has somehow convinced itself that it has a problem of unrealistic body expectations. We're closer to the world represented in wall-e than that of playboy.


> The idea that even a minority of women do _anything_ to attract men is quite wrong when you look at actual behavior rather than daytime television. The converse is also true. I find it astonishing that a country which is 85%+ fat has somehow convinced itself that it has a problem of unrealistic body expectations. We're closer to the world represented in wall-e than that of playboy.

Best post thus far, well done.


Is it fair to include people in relationships in these statistics then? It’s not uncommon for people to “let themselves go” once they are in a relationship.


I always thought that if something's bothering you, it's much easier to change yourself in order not to be bothered anymore than it is to change the world so that the thing bothering you doesn't exist.

It's easier to be careful with where and when I go somewhere than it is to create a world where I won't get mugged/assaulted. It's easier to be content with the life I live than it is to become rich. And so on...


To be clear, I'm not making any statements about the creator; I imagine this is a reflection of the dataset. But the fact that there are so many categories for east Asian (Chinese, Japanese, Korean, and I'd hazard that the "Asian" tag would probably mostly generate east Asians) says volumes about the sexualizing gaze cast on Asian women. While there's certainly an ethical argument over such an algorithm (both for and against) with regard to sex workers and exploitation, it seems to me that the granularity of choice, and which choices are available to the consumer, may also require additional consideration, lest it reinforce negative stereotypes, introduce new ones, or stimulate harmful attitudes.

I'm not trying to be sanctimonious or overly moralizing, but half the tags I would associate with ethnicity are east Asian. And while I lean towards treating it as a mere curiosity, I think it's worth discussing as having unintended consequences and interesting implications at the very least.


Blonde, brunette, ginger, and white all refer to Europeans, who represent fewer people than the Chinese tag alone, so it doesn't seem like it fetishizes Asians more than whites.

The strange part is why we don't have equally rich categories for Africans. Native Americans not having their own categories makes sense since there are so few of them left today, so Africans are the only large group left out.


I thought about that while writing my comment, but while hair color correlates pretty well with ethnicity, the level of granularity given to Asians still seems somewhat unique. There are, for instance, no distinction between French and Italian, nor British and Scottish categories. Blonde and ginger do have fairly ethnic implications, but not many people would answer with their hair color if asked where their family originates. That, and the fact that these hair colors are not restricted to white people, due to multi-ethnicity being a thing, as well as commercial hair dye, made me drop them in my "half the ethnic labels" comment (though I will concede that the image in most people's heads when hearing any of those terms would be a white person).

This is also coming from a perspective where I'm assuming that the audience for this algorithm (outside of academic consideration) would be westerners where white people tend to be the majority. In that case fetishization generally would only apply to racial minorities, due to the fact that the term seems to imply a sexually gratifying quality to the "uncommon". I add that, speaking as an Asian American, that it's not necessarily all that easy for most people to tell Chinese from Japanese from Koreans, whereas it's basically tautologically given that a blonde can be differentiated from a brunette since the terms are more strictly aesthetic rather than geographic.

On your point about Africans, I would extend it to Latinas and to the fact that "Indian" seems to be a catch-all for any South Asians. Middle Easterners are also not represented (though this doesn't strike me as a thing where racial equality is necessarily desirable). However, this is where I think the dataset is mostly responsible.


I think they took the tags from some uploader website. CJK ethnicities can be inherently identified by the types of sources (smuggled amateur films, commercially made, etc.) as well as by styles when they appear on those sites.


That's my guess too, but my argument is less that this arises from a specific place of malice or intention, and more that this phenomenon comes from an uncritical approach to handling the dataset. There is, for instance, no reason not to group all the CJK data as "Asian" for the sake of parity with the rest of the ethnicities. But in uncritically reflecting the dataset, it introduces downstream effects that could potentially be amplified in later work.


> but the fact that there are so many categories for east Asian... says volumes about the sexualizing gaze cast on Asian women

I think you're reading too much intent into 4 labels. From an American perspective, Japanese was the original--commercial, censored--"Asian" porn, "Chinese" is typically amateur uncensored, etc. I... won't elaborate further here, but there are stylistic differences that have nothing to do with race.


The tags should reflect that if it's really about the niche and not the ethnicity.


You raise some very interesting points. For me this also demonstrates how deeply rooted historical bias and prejudice bleed through to this technology and could become a perpetuating factor. These models will reflect societal values, for better or worse.

But also, what will the next model, and successive models, trained on data from AI-generated images, be like? It seems you could get an amplified or skewed model that is pretty far from reality. At least currently they are all trained on “real” images (granted, some are heavily edited or photoshopped). I can imagine that in the (not too distant) future most images will be generated and not “real”.


Yeah, I wanted to stress that the categorizations seem more to me to be symptomatic than a reflection of OP's decision making. I loathe the label "problematic", but it seemed worth noting a potentially unintended consequence that could introduce some harms.

That said, it might be too early to speculate on models trained on the outputs of other models. This algorithm seems to benefit greatly from the post-DALLE 2 art-gen renaissance, but still has a number of oddities and uncanny outputs (noted in the rest of this thread) that I imagine are nearly unavoidable given how familiar the human form is. But certainly, if such decisions end up amplified in later work, or through widespread usage, then the consequences are more significant than, say, simply reflecting uncritically a skewed data set.


This is one of the most interesting comments here. I notice a lot of biases in image generation AI; it's so subtle it's almost at an ideological level. The style of art it creates, the color and dress of the people it generates-- there are little things that make you wonder "why this instead of something else?"


Keep in mind this AI is biased towards white men's interests which means it heavily fetishizes and degrades Asian women.

Even searching for other races shows predominantly Asian women in the results; it's quite disturbing.


That didn’t take long. :p Stable Diffusion was released just yesterday.

How did you design the backend? What kind of server are you using?


You can design the backend with the big ass and small ass tags.


This is the greatest comment on HN I have ever witnessed.


People doing porn can now truly claim they are doing "science" by providing valuable, insightful data samples :)

It shouldn't be surprising to see quick advances in this domain, as most porn material is freely available, easily scrapable, and carries extremely low risk of copyright litigation.

IMHO, the real game-changer, as with normal art, will be some sort of reliable assistant tool to quickly generate various components; further finishing touches can then be easily done in Photoshop or similar software. Somewhere along the way we will probably also see results in 3D art or animations.


> and extremely low risk of copyright litigation

but for a different reason than now!

sometimes porn fails a copyright infringement claim because it can be argued that the particular piece of work doesn't satisfy Article 1 Section 8 of the US constitution backing the copyright concept "To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries;"

It's still unsettled case law because it's only troll law firms that are challenging on behalf of porn copyright holders, and they never appeal.



You have something against the ladies of Eroticon 6?


Nude bodies and fused body parts… These things are very disturbing to my mind as an adult. It scares me to think children could run into this kind of content online, and what it would do to their psyche. I have to admit some of them made me laugh, but still, I fail to understand why in the 21st century we'd need to waste energy on these types of AI applications. It stops being interesting after the initial wow reaction. I hope this won't spawn a subculture and will instead be relegated to the curiosities of technology.


I can see strange subreddits dedicated to girls with their heads backwards or with three arms. If people can get turned on by the cartoons they grew up watching, I think that shows the brain is incredibly malleable when it comes to sexual desires


https://neuralblender.com uses the same algo but with open prompt. no login required

NSFW Example: https://s3.amazonaws.com/ai.protogenes/art/d917ae82-2236-11e...


They could make use of a face-restoration model post-generation, like Tencent's GFPGAN: https://replicate.com/tencentarc/gfpgan


also seems to be very fast (I waited like 5 seconds)


One positive outcome of this sort of thing is that revenge porn will become less effective as women will be able to plausibly assert that this tech was used to generate the images.


yes, I think with all those existing deep fake tech we are already there.


If you’re open to constructive criticism: why is there no category for pussy? The model also seems biased towards massive breasts as opposed to a more natural size and shape


It's overfit to the training data.


Because Stable Diffusion has difficulty generating pussies.


Well I guess we’ve ticked the box on rule 34 for ML gents


> Well I guess we’ve ticked the box on rule 34 for ML gents

It was only a matter of time, but yeah, I thought the same. The internet is pretty predictable: when deepfake images became a thing, I thought it wouldn't be long before they were used on the Fappening images to make videos, and that ML designed specifically for this would soon follow.

Porn has always driven much of tech, so it's not surprising, either.


It is true, as they say, that the drug economy and the porn industry are at the forefront of adopting new tech.

I never imagined that this could be one of the use cases.

There's definitely a huge threat to the porn industry and its workers; at the same time, people may simply misuse this to damage the reputation of a real woman.

As they say, if it takes minutes to generate an image, how long before it becomes a video of one or two minutes?


ML is good at blending averages together, and here we see the flaw of averages - no real person has "average" proportions. Reminds me of this article about the US Air Force https://www.thestar.com/news/insight/2016/01/16/when-us-air-...
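The article's finding can be reproduced with a toy simulation. The data below is synthetic (independent standard-normal "measurements"); only the criterion mirrors Gilbert Daniels' 1950 study of 4,063 pilots: count a person "average" on a dimension if they fall in the middle 30%, then check how many are average on every dimension at once.

```python
import numpy as np

rng = np.random.default_rng(42)
n_people, n_dims = 4063, 10  # Daniels measured 4,063 pilots on 10 dimensions

# Synthetic, independent body measurements (standard normal per dimension)
data = rng.normal(size=(n_people, n_dims))

# "Approximately average" = within the middle 30% on a given dimension
lo, hi = np.quantile(data, [0.35, 0.65], axis=0)
average_on_each = (data >= lo) & (data <= hi)

# Fraction of people who are average on EVERY dimension simultaneously:
# with independent dimensions this is roughly 0.3**10, about 6 in a million
frac_average_on_all = average_on_each.all(axis=1).mean()
print(frac_average_on_all)
```

Daniels found exactly zero pilots who were average on all ten measurements, which is what this sketch predicts.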


At least someone doing something useful with ML


What are you doing, step-AI?


"I'm gonna decompile your kernel!"


Brilliant.


Dear God, some of these things are horrifying, almost lovecraftian!


this one's gonna give me nightmares.

https://pornpen.ai/view/5fkq15n2JOmv3LoOkoWb

For those who don't want to click: two belly buttons, two nipples on one breast, dead black eyes, and an older head on a younger body.


Imagine making a video game with a poorly trained 3D model generating AI to generate things to chase you. A whole new level of Phasmophobia-style game!


Girl with three hands looks really nice. https://pornpen.ai/view/uYWZG2t7YbYcum8dGo4z


These two might get along well: https://pornpen.ai/view/7Ls0XZkZ1k5buFv1jHrL



Any plans to generate images of men?



that doesn't make sense in early iterations given how demand for this stuff skews


The demand for women must be at least 100:1, even higher if you consider people who are willing to actually pay for nudes. The only reason it's not even more skewed is gay guys.


FWIW roughly 5% of men are attracted to men, so that'd be 19:1. Still, yes, heavily skewed.


I wonder how many women are attracted to, or tolerant of, women or depictions of women. It's rarely discussed and often assumed to be nonexistent, but I think it exists in substantial volume.


> roughly 5% of men are attracted to men

Citation necessary.



Different statistic altogether, but ~17% of Gen Z self-reports as lgbt in Gallup surveys: https://www.washingtonpost.com/dc-md-va/2021/02/24/gen-z-lgb...


This is most likely an increase in "B", less so "LGT", though I expect all to increase given the reduction in stigma.

The vast majority of new cases of "B" are likely cultural, i.e. white women in high school/university: declared, but never expressed.


as a gen z, there's a considerable fraction of mostly girls who identify as bi or nb just bc it's "trendy". tbh it's a problem and lowkey trivializes people who actually are those. i think this is what happens when you emphasize a "lgbt community", young people seeking community will join for no other reason.


Perhaps of the menacing werewolf variety.


I tried it on neuralblender - did not quite work out: https://s3.amazonaws.com/ai.protogenes/art/bbba4208-2378-11e...


how about women with male parts?


not sure what happened hear (NSFW, (horror?) warning): https://s3.amazonaws.com/ai.protogenes/art/367f0f8c-2379-11e...


So you can't specify custom text, but you left the option to specify 3 breasts?

https://cdn.pornpen.ai/146136F0946B4772.jpeg


It also has some weird understanding of arm anatomy...

https://pornpen.ai/view/BRKQgdJG6vabTZk8eXDl



... can't ... unsee ...




Every person's fantasy.


Wow I'll never look at Machamp the same way again


I am DYING at these


It is the future. Have you seen Total Recall? If not, watch it and you'll understand.


same thinking, good call, upvoted


Sorry, sometimes it produces really strange results :P


This is just a quirk. Custom text could generate kiddy porn.


Wouldn’t that require feeding it that type of data?


The algorithms could relate 'barely 18' to a certain kind of body. And 'barely 18' is similar to 'teenager', which is similar to 'child'. Don't underestimate neural networks; nobody knows what goes on inside them. https://xkcd.com/1838/


And child is similar to infant, and infant is similar to fetus, and fetus is similar to sperm, and sperm is similar to cell. Finally a ML way for simulating biology.



Three really is a crowd isn't it.


People need to seriously grapple with the copyright implications. I don't know if (and don't imagine) porn producers are okay with their product, which is supposed to lead to paid memberships (on OnlyFans and the like), being used like this.


Did you look more closely at the results of this generator? The generated images are really not suitable for the same purposes as the content you can find behind those paid memberships.


Some people sell nudes, so yes...


When I said "more closely", I meant it. Click on "search" and then click on individual thumbnails to see full images. I have yet to see one that doesn't have something misshapen in it. Eyes that remind me of Corinthian from Neil Gaiman's "Sandman", a surfeit of arms or breasts, buttocks in place of stomach, breasts that hang like slightly melted plastic, and so on, and so forth.


It's manna from heaven for the small number of people who are erotically stimulated by uncanny valley ఠ_ఠ

Seriously though, it's very good for a tiny homebrew project that hasn't been running long or with huge resources. Just think about where this is gonna be in 3 years. Photorealistic interactive videogames are probably not far off. I read a while back that Second Life is now a hub for virtual sex experiences so I assume this sort of tech will converge, and do so rapidly.


Definitely uncanny valley


Rule 34...

for someone out there, "Uncanny valley porn" is their kink.


I just had to do a google image search for that.

Uh.

Mistakes have been made.


If not yet, there will be some poor adolescents developing one soon.

https://xkcd.com/598/


That explains the popularity of Japanese stuff; it's all the Gen Xers trying to relive the dialup era.


Looks more like "erotica" than "pornography" to me. More of a titillation (leaving something out) and appreciation of form than direct sexual stimulation by magnification of the actual act of sex (in all its various forms).

But, yes, finally! I wonder why it took so long for someone to use these kinds of models to do the obvious. So is the nature of genius.


> But, yes, finally! I wonder why it took so long for someone to use these kinds of models to do the obvious.

Conjecture: most of the people with $15K home rigs and the know-how to do this also have a lot of earning power and professional status to lose by publicizing something like this. I'd bet it's been done hundreds or even thousands of times. Just not published.


So, a few more years and no need for porn stars or the porn industry?


I remember 25 years ago in grad school, when I was getting my PhD and working on Natural Language Processing, a conversation with a fellow grad student who was studying and teaching French, and planning to work as a translator.

She said: "So you guys are trying to put me out of a job?"

I laughed and reassured her, because back then we were still a few years off from even the easiest doesn't-have-to-be-great automated translations being any good. I was confident that while the simpler stuff would get picked off, and the machines might be able to make some first-pass translations (ones that would actually help the working human translators), the demand would always be there for good, accurate, idiomatic translation that's not going to drop in some weird thing in the middle and confuse someone or mess up a negotiation. That kind of automated translation seemed like it would not arrive in our lifetimes.

I still think so. Good translation is AI-complete.

Anyway, so, that seems relevant here.


I’ve been digging through Google Translate (Hebrew to English) results from transcripts of Holocaust survivor testimonies, mostly looking to see if there’s any useful information for my writing project. I started with 17 candidate testimonies, removed 7 without transcripts, and the last ten gave me just a couple of somewhat useful details, nowhere near the wealth I’d hoped for when I started out. Still, I’m glad I didn’t spend a hundred hours trying to follow the original Hebrew or even scan the transcripts in Hebrew.

There were some fairly comical mistranslations in the GT output. My favorite was where GT had the interviewer at one point say to the survivor, “What’s wrong with you?” which I imagined the interviewer saying in a disgusted tone. There was also another point where GT said the interviewer asked the survivor, “How do you say shit?”

On the other hand, the results I’ve seen for Western European languages (German, French, Spanish) have been pretty amazing. Even Polish, while not perfect, came out pretty intelligible from GT.


You should check https://www.deepl.com/translator if you want to be impressed with results for Western European languages (and some others, even Greek!). No Hebrew (yet?) though.


I've been wondering about that. It seems clear to me that, in general, human-made art will survive AI art, partly due to things like status and the need for connection to other people through art. But those things are absolutely not what people want from most sex content. In fact, if you can be confident there's no human behind some porn, it might feel better and less shameful. Add to this that AI will be able to cater to people's preferences far better and provide an even greater variety of content, and yeah, I don't really see how the adult industry survives outside of some niche cases.


It’s all trained on real images, it can’t exist without the industry.


Yeah, what anigbrowl said. I mean there surely will be some form of adult industry to have generative models specifically made for (various kinds of) porn, and those companies might spend some resources creating new data to improve their models, but I doubt most current generic porn will be very valuable a few years from now.


It's not like there's a shortage of existing raw material.

I think as models advance, develop more multi-modal networks, and get better at inference, developing an understanding of anatomy and so on, there won't really be any limits to what is imaginable.


For now..


Workplan:

• extend to 3D model mesh generation

• project into AR/VR

• marry to GPT-3 chatbottery

• ???

• $$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$

It is more than bemusing to see every piece now emerging of a future that was purely speculative only a few years ago.

Ghosts in and of the machines are about to surround us, summoned by a spectrum of agents, motivated by a spectrum of interests from whimsy and compassion to simple money minting to sentiment steering/behavioral control/indoctrination/stochastic terror.

Moar human than human, in only those ways that we think we want.

Put these skins on things increasingly hard to discern from AGI and... well, excuse me, I have affairs to get in order.

I hate to regurgitate the cliché but... buckle up.


This is Stable Diffusion if someone was curious.


It might be worth reminding Australians in the audience that possession of pornographic drawings or cartoons of figures deemed to be "childlike" is considered equivalent to possession of actual child pornography, in this country. As a result I suspect this kind of site and the diffusion model tech is dangerous for Australians to use until there are more deliberate guarantees against the creation of such images. Our laws will lag technology for many years, but the result here is a chilling effect whereby I'm not willing to even explore it much for fear of potential unintended consequences.


I've been waiting for the yiff version of https://thisfursonadoesnotexist.com/ to come out, normies beat us to the punch!


It's not like there's a shortage of artist-drawn/rendered yiff for all tastes to fill the gap.


Yeah, that's very true. And increasingly well curated via tags, too.


Thank you for your hard work.


Lol, there's some good ones on there, and some things that look like the beginning of the crazy scene of a horror movie, where certain orifices are not quite anatomically correct...


Let visitors tag them, and use that to train an ML model to distinguish between the two...


Total recall, more like total nightmares lol!


What if people use it to generate child porn?


The central harm of child pornography is the victimization of children in its production (and the fact that commercial demand for it fuels such victimization).

Arguably, there are also lesser harms from consumption of it, but even so harmless generation of to-order child porn that collapsed the value of the harmfully generated kind by being indistinguishable from it would probably be a net win.

(Of course, with the techniques currently used, that would require a large corpus of the harmfully generated kind to work from; but one could imagine further developments in ML that allow better generalization that would allow that without any actual abuse images.)


The legal status of fictional pornography depicting minors is a gray area in the United States, but generally mere possession is not criminal without a prior criminal record or involvement with real-life child pornography.

https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn... https://en.wikipedia.org/wiki/United_States_v._Handley


It's not gray in Australia. Even accidentally generating a fake CP image could get you into serious trouble. People have been convicted just for possessing line drawings of entirely fictional indecent images of children (not even specifically "porn").


Yes, this is a discussion forum, so while you're welcome to pose ethical questions, you should also feel free to weigh in on them too ;) It's a challenging question.

Is it the mere fact of someone being aroused by children that makes child pornography offensive to society, or is it our belief that the creation of such materials necessary involves some degree of exploitation? If the latter, what about the well-documented risks of exploitation of consenting adult pornographic performers? Does access to pornography increase or decrease adults' tendencies to engage in unsanctioned sexual acts?


>…If the latter, what about the well-documented risks of exploitation of consenting adult pornographic performers?

I’m not sure what you’re trying to get at here but your post kind of reads like “child sexual abuse material is equivalent to porn made by consenting adults.” Posing that as a philosophical “what if” and then moving on isn’t very different from just stating that as an opinion.


For an exceptionally loose definition of "equivalent", yes. And?


Ok cool, I’m glad we cleared that up.

I am not really interested in engaging on this topic, as the sort of “Debate me!”-style hot take is (in my experience) not usually offered in good faith, rather it’s more often some sort of rhetorical dick measuring contest amongst internet strangers.


You seem to be operating further from good faith than them.


For an exceptionally loose definition of "equivalent", good faith and bad faith are equivalent (they're both phrases that use the word "faith"); the same goes for apples and oceans (both mostly water).

The position that "things are equivalent in the case that I get to define all of the words" isn't the dialectical judo throw you think it is, buddy. It does not inspire confidence that a person is being serious.


being able to train a model to produce that shit implies a lot of input


Right now yes, but that will not always be the case. You can already (with some effort) get high quality results for incongruous juxtapositions like 'hamsters playing golf' or the like, and it's not hard to imagine bridging the different semantic gaps.

So we should discuss it as an ethical problem because it's going to become practical and affordable far sooner than most people think. I would bet money it's already been tried and maybe done successfully, but for obvious reasons not advertised.


Unless that input is newly created, why would using existing data (say, held by the police, who collected it from various criminals in the past) be any more harmful than not using it?


Suppose the input is hand-drawn.

(This is legal in some countries, at least.)


Seems preferable to using children to generate child porn?

In any case, since the training data here does not feature children, this particular model would be useless for that.


Taking the comment in good faith as a more general philosophical question, I think it's quite interesting.

From the fact that weaboos tend to claim their favorite lolis are 9,000-year-old dragons or whatever, I'm guessing it's already legally fraught to trade drawings of CP. I'm guessing AI-generated images would fall into the same category, though I'm less clear on where responsibility falls. The person who crafted the input to try to generate CP? The platform the generator is hosted on? Whoever gathered the training data and did the training? The person seeking CP feels like the clear answer, but society is looking to hold Facebook, Twitter, and others responsible for content on their sites, so I'm not as sure.


The practical argument is that whatever dark market ended up trading models that produce CP material would likely end up spiking them with actual CP input data to try to improve "realism".

At that point the moral dimension clarifies: creating an incentive to abuse children in order to build better models for simulating child abuse is the exact same reason possession of CP is illegal in the first place. And since it won't be possible to prove that any particular model without perfect provenance wasn't produced this way, their use will simply be illegal.

This is, realistically, also the problem with pornography-generating models in general, and with all of these models: if there's illegal data in the input set, what does that mean?

I suspect the next hurdle will be license-compliance-enabled training: models that can prove any given output rests on input data with at most a certain type of license and provenance.


It is great that you don't break the faces!

People are getting behind the idea of sabotaging AI to intentionally mess up faces. Whether or not DALL-E 2 was gimped in this way, other people believe it enough to do it in their own AI systems, and I'm glad yours doesn't do that.


So. Where is my skinny ginger freckled forest nymph with puffy nipples, oval areolas, and large labia?



Hrm. I should have written puffy nippled torpedo tits :-)

Anyways. Impressive. But they all look somehow "uneven"; they lack the "softening", like the photoshopping done for magazines.


Now that's what I call an uncanny valley. This could be the methadone of porn addiction.


Or the fentanyl.


This may be the most deeply underrated observation ever.


Realistically, I don't think AI porn will increase the number of addicts. There's functionally infinite free porn out there already. AI porn just trades porn stash disk space for compute power.


Is the backend image-generation API, with raw prompt access, protected only by client-side JavaScript?


Probably. Please don't break it unless you've been given permission.


Where did you get the training images from, and about how many did you have? I'm looking to build an NSFW AI image generator as well, but finding training data is something I've yet to nail down a plan for.


Salvador Dali would love this.


The results...well let's just say they leave something to be desired.


You know how progress goes in this industry. The first nude I ever downloaded was 320x200x16, dithered. All I had to do was stand back 5 feet, squint a li'l...


Wow, someone was asking about this on the last Dall-E thread and the concern was the same stuff you filter out. The pics I got were generally tasteful nudes, but hoo-boy is this a rabbit hole.



We're getting closer to the woman-creating machine in Weird Science. I won't comment on whether that's good or bad.


Sorry for the off-topic, but the number of dead comments in this post is hilariously high. I'd guess it's in the top 5 in HN history?


Holy shit, I almost puked. Most are more gore than porn, with butts in the front, missing limbs, or disfigured faces, lmao.


There goes real-estate in big parts of the San Fernando Valley! Think of the actors, think of the homeowners.


These seem to be nudes, not porn. Also: only women as far as I can tell. Are there plans to expand this?


Back in the day we called it softcore. Today it's not porn unless someone's step-relation's rectum is suffering irreparable damage. Barbarians.


And no category for chipmunks which I’m, umm… perfectly okay with.


I knew this day would eventually come


Lots of uncanny valley happening on that site... some of the generations are disturbing.


The 'perfect body' tag is a little strange. I guess this tag will just reflect what a perfect body is according to the people who tagged the training data. We can use it to get a glimpse into their minds. It will be quite specific if there were only a few candidates.


This is mildly interesting. It's not my thing (I prefer furry over human), but it lends itself to being extended in the future.

What does your roadmap look like, roughly? Would supporting other types of generation be particularly difficult compared to what you're doing now?


Someone had to satisfy the "transporter accident" fetish


yeah, like, I was thinking just the other day.. "there's just not enough porn in the world" haha


scientist/ipredictedthis.wav


I like how some have 3 legs :)


These are comically grotesque


Also; no dudes, dude? Aww.


any reasons why some results show the butt in the belly area?


You misspelled thicc


finally


In all seriousness, if the generated pictures were of better quality and it were capable of generating porn tailored to my fetish, I would be willing to pay $100 per month for the product.

This is the future I am looking for, not the political correctness BS that other AI companies are pushing.


Maybe I'm being too close-minded, but paying $100/mo for porn seems absurd.


That's like a nice coffee a day, everyday kind of value.


For the average person, maybe.

There's definitely a non-insignificant demographic of people out there who'd pay that much in a heartbeat, though.

I remember speaking to an artist who makes a living on commissions, and a huge chunk of that was people wanting extremely specific furry porn.


If you consume a lot of it, you want an endless supply of new stuff. It takes like 200 new pics for me to get off now. I can swipe decks and decks of the stuff.


Does this not create an intimacy problem with real women?


Have never done it with a real woman.


Is this a bit or for real? There's pros and cons to real sex (like anything) but it sounds like you're missing out.


I seriously never have. I am very much your stereotypical nerd.


Is the process still fun or stimulating for you?


Yes, with sufficient content.


Cheaper than a divorce


With any luck it may monopolize certain fetishes or even create some new ones. There were several images that were (I don't know how else to describe this) 180-degree reversed from the waist down. Certainly not something that exists in this world.


Just gotta have Snotty beam 'em back.

(He beamed me three times last night.)


Given some of the more interesting body horror I've randomly encountered so far, that might be enough to put some people off their fetish, at least temporarily. At least for those who don't have that as their fetish.

It's an interesting idea and site, though. If they can keep the surprises to a minimum, I could see it doing well, provided it's not trivial to make your own.


From the few results I tried, it looks like we are far, far away from that reality.


But perhaps it's capable of making vile, disfigured images in a fetish theme. The parent comment said they would pay $100/mo to satisfy their fetish; I'm curious what they would pay to eliminate that fetish. If the fetish is your neighbor's spouse, for example (and using that word loosely), eliminating it might even save your life.


People pay artists $1,000s for hand-drawn porn made to spec (furry porn and the like). It's a dark side business for many broke art students trying to pay their tuition. This is easily a multi-million-dollar venture.

IMO it won't be a billion-dollar venture, because the North American market won't be accommodating; like other porn businesses, it will need to operate overseas and find alternative payment processors.


I am not sure if I would rather have the $100 go toward funding the AI industry or the porn industry, but I guess with this project, the $100 would go to both.


In the future, when machine learning algorithms begin to gain sentience, the AI-generated porn of the future will start training humans, using HLMs (Human Learning Models). Human pornography addicts will slowly have their input streams changed so that they become more and more sensitized to finding computer hardware and software arousing. First it's an artifact here or there, and before you know it, there you are at 3 am with your pants around your ankles, staring at a screen of functional software diagrams.

And people said there'd be flying cars.


You appear to be shadowbanned, all your comments show up as dead.


HN never fails to amuse.


> Server is overloaded, please try again later. Try searching instead.

Hilarious.


Why is an image of a naked person considered “porn”? Never understood that.


I'd guess that the creator of the site would have preferred to generate more spicy images, but the model wasn't trained on that kind of content. So later on the site may give you what you'd expect.


Sexual poses or sexual intention.


Artistic naked shots describes the output better than porn I believe.


I think it's mostly just in the USA..


Because it arouses you?


Not really


does it?


It does for me lol, but I would describe this as well within the "soft-core" category. I mean, they're different from images of, idk, a naked person just doing normal naked-person stuff with no sexual undertones, like showering or whatever. I wouldn't consider that porn, but it's hard to imagine a context where I'd be consensually shown such an image (taken of a person with their consent) in a way that wasn't meant to sexually stimulate me. I guess maybe a medical context? I'm not a doctor, but there's an obvious difference between these naked images and the kind you'd expect someone to send to their doctor because they're worried about a lump or something.


I've had an inexplicable fetish for naked women since I was about 11. My therapist and I have been unable to discover the roots of it. No accounting for kinks, I guess. I never tell anyone IRL, lest women become self conscious about being naked in front of me in everyday casual situations. Plus the shame. My girlfriends usually figure it out, if they're perceptive.

Now for serious. I have always found it unintentionally hilarious how many people appear to believe that porniness resides in the object, rather than in the viewer. Carrying on with categorizing this thing as sexual, and that thing as non-sexual. That only works for legal and regulatory purposes, sillies, and then only barely!


Possibly. Like most stuff at EEVblog (@youtube) is pornographically good )


What do you propose would make more sense to be considered “porn”?


Judeo-Christian sexual mores.


Americans*


Oh it's not unique to this country at all. idk about current laws but video of two people going at it used to be extremely illegal in the UK even if they were having the most conventional sex imaginable.


Is... it really just women?


Sorry, I'd like to add all genders but currently the model only produces quality results for women. Hopefully that will improve going forward.


Maybe start with just male genitalia. I hear some males have the tendency to send this unsolicited to females. It would be easy to build a data set around this. Just put out an open call on social media. Be sure to specify 18+ and log all users submitting.


hahaha. I can see a future of unsolicited non-existent dick pics being sent to people.

For that matter, the catfishing potential for thirsty people makes this an interesting evolution. What a strange and fascinating world we're building for ourselves here.


"The image classifier said this image is 97% likely to be a dick pic, and has automatically muted it. Click here to show it anyway. Click /here/ to keep it hidden but reply with a machine-generated bigger one."



That kind of catfishing is a good example why ethics review boards exist.


Someone has to label the dick pics first, though.


I volunteer as tribute


To label the dicks or to have your dick labeled? :)


Hot dog ⇔ Not hot dog


That's a way to classify whether or not the pics are indeed dick pics, but the problem lies in how these recent diffusion models work. Typically you have two encoders, one for the text prompt and one for the image, and you train so that the prompt embeddings match the image embeddings. You've got to actually have some decent descriptions of the dick (thin, girthy, 7", 8", 4", micropenis, veiny, limp, hard, huge tip, bifurcated, foreskin, black, white, latino, asian, etc.) if you want to generate your desired fake penis with any accuracy.
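For the curious, that dual-encoder matching objective (CLIP-style contrastive training, which Stable Diffusion's text conditioning builds on) can be sketched in a few lines of numpy. The random vectors below are stand-ins for real encoder outputs, not an actual model:

```python
import numpy as np

def clip_style_loss(text_emb, img_emb, temperature=0.07):
    """Symmetric contrastive loss: the i-th prompt embedding should be
    closer to the i-th image embedding than to any other in the batch."""
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    v = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    logits = (t @ v.T) / temperature  # (batch, batch) cosine similarities
    n = len(logits)

    def cross_entropy(l):
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_probs[np.arange(n), np.arange(n)].mean()  # diagonal = true pairs

    # Average the text->image and image->text directions
    return (cross_entropy(logits) + cross_entropy(logits.T)) / 2

rng = np.random.default_rng(0)
text_emb = rng.normal(size=(4, 8))
img_emb = text_emb + 0.1 * rng.normal(size=(4, 8))  # each image "matches" its prompt

matched = clip_style_loss(text_emb, img_emb)
shuffled = clip_style_loss(text_emb, img_emb[::-1].copy())  # break the pairing
```

Training pushes the matched loss toward zero, and the generator is then conditioned on those text embeddings, which is why you need a caption vocabulary that actually describes the anatomy you want.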


Heaven forbid someone make a project that only serves 80% of the market


Do you not know the gender balance? Or are you suggesting that only straight men watch porn? Or only straight men want AI generated porn?

In any case, I think you'd be quite surprised at the diversity of people outside of you.


Heaven forbid someone make a product that only serves 25% of the market


You can try neuralblender.com; it seems to be able to create anything, man or woman, and any kind of fantasy fetish you can imagine.




Porn addiction is already a serious problem for many people. Combine an enhanced version of this model with TikTok style delivery and I’m very fearful of the end result. It’ll be the equivalent of crack in terms of the dopamine rush


Ayo what the fuck


Why do you think HN needs a new UI? Do you happen to use a mobile device?


AI generated porn is some scary shit man


Scary ethically, from a social perspective, psychologically, or because it keeps spitting out lovecraftian horrors?


one chick had 3 boobies and 10 inch fingers....it's a horror story alright


Finally, someone who agrees


korean girl in high school uniform strip


korea girl in school uniform anime style


Tall


This seems like a bot account attempting to get 'reputation' in the view of whatever HN's opaque filters are.


[flagged]


Why?


Maybe because their life goal has been achieved.


facts, im off to heaven


It's scary asf


I'm getting downvoted into hell


Any guy who is not a psychopath deserves a real woman who is going to be nice to him without making unreasonable demands, and vice versa. Instead, we have a huge number of leftover young men whose best option is computer-generated porn and who sometimes lose it and shoot up a place. And a huge number of women who give up on having a family because popular guys always move on to someone hotter. I am not suggesting restricting porn or claiming any absolute religious mores, but we had better fix our culture so that men and women don't find CGI preferable to each other.


I suggest both men and women of that kind work on themselves a bit first before trying for a relationship.

Our society has gotten to a point where it’s become acceptable to not do that, but that’s really where the issues are coming from.

To be fair, that just means we have different issues now.


Why do they have to if their grandparents could generally get someone of their own caliber? Not always glamorous, but beats watching porn alone.


> fix our culture so that men and women don't find CGI preferable to each other.

What's the next suggestion? Going to actual war instead of playing computer games?


Is killing people IRL as desirable as loving them IRL?


I don't think anyone deserves anything just for not being a psychopath.


Nobody deserves not to starve, but we want to organize society so that starving is uncommon.


It's like H.P. Lovecraft's wet dream, yikes.

Good effort but... it just isn't there yet.


You're not making tools; you're making weapons that blow up in our face.

Programmers could make 1000 great AI sites, but it will only take a few bad ones to earn enduring hostility from society to AI (and forums that promulgate bad anti-social uses of AI).

Porn objectifies all women (and men), reducing everyone's ability to have real relationships by setting expectations and creating doubts about your partner's real objectives. That makes life unsatisfying, and dissatisfaction bleeds into all aspects of public participation, from politeness through voting and work collaboration. #MeToo systemic bias and oppositional incels are already driving forces in office and partisan politics.

Porn is not illegal, and some argue it's sex-positive. But assuming you are exporting some social costs, even if you were willing to take responsibility for the damage you cause, you couldn't.

It's not fair to force others to clean up after you. And it's not fair to saddle other AI uses with constraints born of misuse.


Isn't "abusing" technologies and thus accelerating policy change the manifestation of the hacker spirit? Like piracy and DRM?



