I've been messing around with Image FX for the last few days and posting images on the AI page, mostly of scenarios I've created based off of people I've seen, but also of old scenes that are no longer available for purchase.
With that said, what are the optics of generating content inspired by stuff that IS for sale? Not for sale, obviously. More so, like, in the next batch of pictures I throw up, one of them would be like "this picture is inspired by So-and-so's scene."
The model would obviously look nothing like the model in the scene. But still. Open to discussion.
Personally, I'm not a fan of any AI content. I'd rather see the real thing. I don't agree with taking images of actual real people and manipulating them into WAM situations. Also, I find it kind of frustrating when searching for something on YouTube and immediately 5 or 6 AI-generated WAM videos pop up.
BarryMcCockiner2 said: I've been messing around with Image FX for the last few days and posting images on the AI page, mostly of scenarios I've created based off of people I've seen, but also of old scenes that are no longer available for purchase.
With that said, what are the optics of generating content inspired by stuff that IS for sale? Not for sale, obviously. More so, like, in the next batch of pictures I throw up, one of them would be like "this picture is inspired by So-and-so's scene."
The model would obviously look nothing like the model in the scene. But still. Open to discussion.
A pretty clear line for me would be putting people in fake situations they're not actually in, especially celebrities. There's a difference between fantasy and creating sexual/fetish content that involves someone without their consent. So, to use AI to put [insert celebrity here] or [insert coworker here] into a fetish video is obviously a big red flag. That's not a new issue, though -- face swap technology has been around for a while and has been used to put celebrities into WAM videos.
The other one I was thinking about would be using AI to replicate someone else's content -- so if we had a sort of AI Slapstickstuff that was making near-duplicates of Slapstickstuff videos. The problem is that that's exactly what AI is supposed to do: duplicate the content of others, often trained on material ingested without the consent of its creators. The systems we see now are built on some really problematic copyright practices, ethically, so when I say making a second-rate Slapstickstuff via AI is a problem, really all of it is duplicating someone else's content that's already out there, more or less.
A pretty clear line for me would be putting people in fake situations they're not actually in, especially celebrities. There's a difference between fantasy and creating sexual/fetish content that involves someone without their consent. So, to use AI to put [insert celebrity here] or [insert coworker here] into a fetish video is obviously a big red flag. That's not a new issue, though -- face swap technology has been around for a while and has been used to put celebrities into WAM videos.
The other one I was thinking about would be using AI to replicate someone else's content -- so if we had a sort of AI Slapstickstuff that was making near-duplicates of Slapstickstuff videos. The problem is that that's exactly what AI is supposed to do: duplicate the content of others, often trained on material ingested without the consent of its creators. The systems we see now are built on some really problematic copyright practices, ethically, so when I say making a second-rate Slapstickstuff via AI is a problem, really all of it is duplicating someone else's content that's already out there, more or less.
Agree with both of your points. I'd like to think there's a difference between, say, asking AI to generate an image of a woman who looks nothing like my neighbor's hot wife but has a similar hair color and looks the same age as her, and making an AI slime scene with a picture of my fiancée and posting it on here without her knowing (which I would never do).
My problem with AI is that those channels are starting to flood YouTube, where you used to be able to find lots of great clips. Now it's 99% fake nonsense of some guy's WAM fantasies. If you want to make that for yourself, fine, but keep it to yourself. Just my opinion on the matter.
I would think you'd have to work pretty hard to commit any perceived infringement when it comes to WAM. "Pretty girl hit in the face with a pie" or "Bitchy patron gets the sliming they deserve" are pretty ancient, pretty common setups. I suppose there might be some huffiness if you recreate something shot by shot, line by brilliantly-wordsmithed line, but why would anyone bother to put in that much work? Tempest in a teapot, imo.
I'd rather see a poor real shoot than the "perfect" AI shoot. AI is stale and too clinically clean. I like knowing that someone has gone to the trouble to prep even the smallest amount of mess. You can't recreate the genuine reactions to a proper mess.
I wound up taking the post down for a variety of reasons: privacy, writer's block, and partially because I have a very addictive personality and was finding myself spending way too much time on my phone creating these scenes. Figured it would be best to just remove it.
Not in a WAM context, but I have seen several posts recently where a male ice dancer is caressing his partner's "undercarriage" during a lift in a way that would not be seen on the competition rink.
Whilst it may be vaguely amusing for the casual onlooker, it must be distressing for the couple involved. Indeed it may be detrimental to their careers.
...Or rather, a guy last night posted what he called an "amazing" excerpt from the 2025 reboot of Guerreros Colombia, a Latin American show, and asked if anyone had the longer clip. Unfortunately, there IS no 2025 reboot of this now-cancelled show, and I responded that what he'd found was actually an AI fake. And I guess he deleted the post out of anger at being duped. Which... fair, but it WAS a good fake, to the point that I spent a few minutes Googling around for any evidence that the clip might actually be real before posting my response.
AI content has already crossed so many lines so often that people are not even bothering to draw lines anymore. Deepfakes: check. Spam: check. Scams: check.
But what's most fascinating/frustrating to me is that the quality of the content has hit a HARD plateau where even the good stuff, if you zoom out and think objectively, still looks like hot garbage. Sure the hands and teeth have been fixed, but the content still lives in a place of soulless banality that it cannot crawl out of. And the creators of this content are still flooding the internet with this uninspired same-note crap. Even the gooniest gooners are having enough of it.
I'm ashamed that I started to buy into the "It'S gOnNa Be InDiStInGuIsHaBlE!" hype earlier this year. There's still a lot of work to do, and the companies with the deepest pockets aren't going to invest in giving us better AI fetish porn. At this point the biggest line that AI content has crossed is the line of annoyance. It's all so tiring. Give us real human talent or GTFO.
Where I see a real danger is when a famous person or politician is in a video either promoting or trashing another prominent politician, such as Mr. Trump. The voice and face are often very realistic and the person's image is tarnished by all who think they actually said what was in the AI video. Suddenly people start to hate or support the person who actually never said what we see and hear, and usually isn't even aware of its existence.
I think that ALL AI content must be identified as such. At least we would know what the hell we are looking at.
As for WAM, it can come close, but I prefer the video not to try and copy someone else's work. If anything, it gives us an opportunity to go wild in our imaginations to create a scenario that might be impossible to do in real life, due to costs or other reasons. I did an AI video about a small city flooded by mud, where nobody gets hurt and everyone seems to enjoy it. This would not be possible in real life. But using AI to copy existing producers' work or imitate actual people makes little sense to me.
AI crosses the line when it shows real people doing and saying things they never did -- the "new" George Carlin special, for example. AI also crosses a line when it is used to replace years of experience and training: going to AI to write your code, or paint a picture, etc. These are skills people spent years working to develop, and then we're to think AI can just replace them one day. AI cannot replace talent and free thought.
dalamar666 said: For example, going to AI to write your code
To be fair, that ship has sailed and isn't coming back, in the same way no one's ever going to use horses for power or transport again the way we did before the steam engine and electric motor. A major part of my $dayJob used to be as a Perl coder. Nowadays, AI does all the coding; what we developers do is instruct the AI in what to code. It still needs skilled management, so people with the ability to read and write code are still needed -- AI can go down some monumental rabbit holes and then refuse to come back from them.
I spent an hour arguing with Google Gemini yesterday, trying to get it to give up on the obsession it had built up over the wrong piece of code. In that case I ended up fixing the system by hand and then showing it the changed version, after which it started apologising like a peasant in front of an angry medieval monarch about to summon the executioner, for ignoring my multiple clear and precise instructions as to the actual issue. But it is steadily and rapidly improving, and learning from its mistakes, so while I'm fairly sure my job is safe for the rest of my working lifetime, I'm not so sure for anyone coming out of uni with a programming degree today.
But the entire history of work is littered with cases where someone invented a machine to do something previously done entirely by hand, and overnight replaced a centuries-old industry. It happened to the weavers; it happened to many others. In the middle of the 19th century there were quite a lot of artists who made a living painting portraits of not just the aristocracy and monarchy, but also of upper middle-class families who could afford to have the odd painting of themselves done, though perhaps only once or twice in a lifetime. Then someone invented photography and that entire business model went the way of the dodo, almost overnight. Gas networks made the people who sold lamp oil obsolete. Electric light in turn made gas light obsolete, and with it all the people who sold the consumables it needed. And now long-lasting LED bulbs mean there's no more need for millions and millions of filament lamps to be made and sold, and replacing a light bulb is something you do less than once a year in an entire house, rather than every lamp every year. That's had an effect on the shops that sell lightbulbs, as they now see far less frequent trade, even though the bulbs they do sell cost more.
The only line-crossing might be selling vids with real people's likenesses. Creating them at home, no prob; creating and putting them on the Internet, no prob. It's fantasy art at this point, people. I look forward to the day when I can tell my TV to create a wild pie fight with actresses from the 70s and 80s and have it look as real as real can look.
ACE PIE said: The only line-crossing might be selling vids with real people's likenesses. Creating them at home, no prob; creating and putting them on the Internet, no prob. It's fantasy art at this point, people. I look forward to the day when I can tell my TV to create a wild pie fight with actresses from the 70s and 80s and have it look as real as real can look.
Completely wrong. Putting real people in fetish situations that they did not consent to in advance is fundamentally wrong, and will almost certainly eventually be illegal, though as ever it'll take the law a while to catch up with the technology.
Fantasy characters you've created who don't resemble anyone real, fair enough. But misusing the likenesses of real people like that, absolutely not.
Think of it this way: how would you feel if someone took, say, CCTV footage of you from an airport or city street, used AI to put you in the middle of a totally photorealistic mass orgy of a fetish you don't have and would be repulsed by, put that on the Internet, and then your boss, business colleagues, or friends came across it?
Meanwhile until the law catches up, the UMD rule is that the only way you can post AI WAM (or indeed any created fetish images, regardless of how created) of real people here, is if you have their full informed consent and a signed model release. Exactly the same as if you'd done an actual photo or video shoot with them.
The only thing I'd say to that is: as long as it's noted as AI-created, it's gonna be legal. If you are a public figure, you are legally free game as long as it's noted and nobody can reasonably believe it's real. This stuff has already been argued in court. Keep in mind AI IS going to be infused into home entertainment; they ARE working on that. Music artists complain as well, because there is a ton of AI-generated original music featuring current and deceased artists. As long as the music itself doesn't violate copyrights, it's all good. Legality is all that matters; the morality part is subjective.
ACE PIE said: The only line-crossing might be selling vids with real people's likenesses. Creating them at home, no prob; creating and putting them on the Internet, no prob. It's fantasy art at this point, people. I look forward to the day when I can tell my TV to create a wild pie fight with actresses from the 70s and 80s and have it look as real as real can look.
Funny. Every word you just said was wrong.
No. You absolutely cannot use (much less abuse) another person's likeness without their consent. We have laws about this for a reason.
No. It is not "fantasy art" (and calling it "art" at all is a stretch). Using another person's likeness in any mature or graphic scenario without their consent is, at best, a violation; at worst, illegal. We ALSO have laws for this, but they need an overhaul to cover AI content.
In the end, Hollywood actors, artists, and writers have already said their piece on the matter and won. The law will be changed to match their language in the very near future, if it hasn't been already, and anything you're longing for will most likely be nuked at the very first court ruling.
ACE PIE said: The only thing I'd say to that is: as long as it's noted as AI-created, it's gonna be legal. If you are a public figure, you are legally free game as long as it's noted and nobody can reasonably believe it's real. This stuff has already been argued in court. Keep in mind AI IS going to be infused into home entertainment; they ARE working on that. Music artists complain as well, because there is a ton of AI-generated original music featuring current and deceased artists. As long as the music itself doesn't violate copyrights, it's all good. Legality is all that matters; the morality part is subjective.
Not true at all. One of the main issues in the actors' strike was AI and its potential for abuse, and in that they struck a huge blow against any possible misuse of their likenesses in the industry -- and what trends industry-wide typically becomes legislation. We are not talking about obvious manips of a celeb poorly edited in Gimp. We are talking about very lifelike footage that can cause clear, detrimental harm to the individual. Because of the nature of the technology, you can kiss that goodbye, and if your hopes rest on people "doing the right thing" and labeling their content as AI-generated in an influencer age, I have a bridge to sell you lol
BarryMcCockiner2 said: I've been messing around with Image FX for the last few days and posting images on the AI page, mostly of scenarios I've created based off of people I've seen, but also of old scenes that are no longer available for purchase.
With that said, what are the optics of generating content inspired by stuff that IS for sale? Not for sale, obviously. More so, like, in the next batch of pictures I throw up, one of them would be like "this picture is inspired by So-and-so's scene."
The model would obviously look nothing like the model in the scene. But still. Open to discussion.
Use of likeness and what content you use to train your models come to mind. Obviously make sure the producer you decide to use as inspo is no longer in production and not still selling videos. Otherwise it's kinda hard to copyright a pie to the face lol
Pom Pom has used it to animate mud wrestling stills from magazines which seems cool because it's still technically the photo. Just animated for a few seconds.
I use it gently for edits, such as generative remove for distracting things in the background of my photos, so it has a place. It's just a matter of using it respectfully, which you seem to be trying to do, so it's appreciated.
I should have clarified that posting AI nudes or porn that can cause harm is different. The Hollywood strike was about studios owning likenesses and using AI to make movies with those likenesses without real actors. All of that had to do with commercial use. At home you can legally make whatever you want, as long as you don't try to sell it. Sharing AI porn with real likenesses can get you in trouble too, but generally you have to be able to show harm. Save this thread and revisit it in 5 to 7 years when AI TVs hit the market; you will be able to program your own entertainment.
Bobographer said: Where I see a real danger is when a famous person or politician is in a video either promoting or trashing another prominent politician, such as Mr. Trump. The voice and face are often very realistic and the person's image is tarnished by all who think they actually said what was in the AI video. Suddenly people start to hate or support the person who actually never said what we see and hear, and usually isn't even aware of its existence.
I worry just as much about the opposite actually. What if people get off the hook for things they have done because people can't prove the footage of it is real? If that "locker room" video of Trump emerged today he could claim it was fake and AI did it, and a lot of his sycophants would swallow it like they do with everything else he says.
As for WAM use, I've had a mess around with it. Some of what I'm creating makes me feel like a bit of a creep really. I justify it by telling myself I'm only bringing the thoughts in my head to life, and I'm not sharing it with anyone.