You can use subjects such as people in scanned photos, yearbooks, and social media! (Celebs aren't allowed).
Gross. Just gross.
Also, a good illustration of the commodification of people. "Yeah, let me scan a pic of that person I creeped on back in high school and create niche kink scenes of them without their consent!" Sure, we can do that, but touching the face of that precious (and rich and powerful) celebrity, who at least to some limited extent HAS made their money by selling their image to the public for entertainment? No way Jose. But all those regular little people? Sure, make all the fetish content of them without their consent that you want.
How would you feel if some rando were making, like, r*pe fetish stuff with YOUR yearbook picture? Jerking it to the idea of YOU getting violently assaulted against your will? Is that the digital environment that you want to praise? I'm gonna go out on a limb and guess that you're probably not too comfortable with that idea. Even if you want to argue that they TECHNICALLY are allowed to do so, I'm betting the thought of it makes your skin crawl. As it should.
In a thread with a few disappointing posts, the one you're responding to is the worst of the bunch. Truly horrific.
You can use subjects such as people in scanned photos, yearbooks, and social media! (Celebs aren't allowed).
Gross. Just gross.
Also, a good illustration of the commodification of people. "Yeah, let me scan a pic of that person I creeped on back in high school and create niche kink scenes of them without their consent!" Sure, we can do that, but touching the face of that precious (and rich and powerful) celebrity, who at least to some limited extent HAS made their money by selling their image to the public for entertainment? No way Jose. But all those regular little people? Sure, make all the fetish content of them without their consent that you want.
Retweet. Disney has the financial firepower to sue to protect their precious iconography, but the everyday people are left in the cold.
You can use subjects such as people in scanned photos, yearbooks, and social media! (Celebs aren't allowed).
Gross. Just gross.
Also, a good illustration of the commodification of people. "Yeah, let me scan a pic of that person I creeped on back in high school and create niche kink scenes of them without their consent!" Sure, we can do that, but touching the face of that precious (and rich and powerful) celebrity, who at least to some limited extent HAS made their money by selling their image to the public for entertainment? No way Jose. But all those regular little people? Sure, make all the fetish content of them without their consent that you want.
How would you feel if some rando were making, like, r*pe fetish stuff with YOUR yearbook picture? Jerking it to the idea of YOU getting violently assaulted against your will? Is that the digital environment that you want to praise? I'm gonna go out on a limb and guess that you're probably not too comfortable with that idea. Even if you want to argue that they TECHNICALLY are allowed to do so, I'm betting the thought of it makes your skin crawl. As it should.
In a thread with a few disappointing posts, the one you're responding to is the worst of the bunch. Truly horrific.
Lighten up!!! As long as one doesn't put "it out there" and just keeps it to themselves, there's nothing wrong with that. I think it's been common sense for quite a long time, even before AI, not to post unapproved or digitally manipulated content. In essence, it's no different than taking that yearbook and doing whatever one feels like in private. Get off your sanctimonious high horse!!!
Lighten up!!! As long as one doesn't put "it out there" and just keeps it to themselves, there's nothing wrong with that. I think it's been common sense for quite a long time, even before AI, not to post unapproved or digitally manipulated content. In essence, it's no different than taking that yearbook and doing whatever one feels like in private. Get off your sanctimonious high horse!!!
See, this is where you're wrong. Sure, it's been possible to alter images with Photoshop for the last couple of decades, but ethics aside, it took time and effort to do so. Recent AI models make it the touch of a button to create something indistinguishable from a real image. That's a dangerous amount of power, and it's becoming a huge legal and moral issue. I know several schools that have faced blackmail and extortion attempts after images of students were downloaded from their websites and altered. Kids are already creating and sharing deepfakes of other students. What you're saying is that producing the material is fine as long as it's for your private wank. I'd argue, respectfully, that you're wrong on every level.
The new Sora 2 model prevents users from creating videos from uploaded real images and requires users to scan and verify their own face if it is to be used in videos. I can see other platforms doing the same, not just for video but also for image editing, to prevent exactly the usage you seem to be defending.
Breslowlab said: Lighten up!!! As long as one doesn't put "it out there" and just keeps it to themselves, there's nothing wrong with that. I think it's been common sense for quite a long time, even before AI, not to post unapproved or digitally manipulated content. In essence, it's no different than taking that yearbook and doing whatever one feels like in private. Get off your sanctimonious high horse!!!
Which yearbook photos are you referring to? Per the rules of decency, the rules of this website, and the threat of the credit card boogeymen, we need to make sure we are not discussing creating spank-bank material from the yearbook pictures of underage people.
Unfortunately there are quite a few outright nonces posting and, worse, making WAM content online. There are a few channels on YouTube that upload videos of children from TV shows and the like, but the worst I've come across has been this guy:
[[admin] details removed to prevent giving publicity or traffic to the channel in question]
Blatant paedophile running events and filming them to post online later, absolutely 100% a wammer.
Knowing that the person in question isn't actually being pied/gunged etc completely kills it for me. Anyone who can get off on that is very, very lucky.
You can use subjects such as people in scanned photos, yearbooks, and social media! (Celebs aren't allowed).
Gross. Just gross.
Also, a good illustration of the commodification of people. "Yeah, let me scan a pic of that person I creeped on back in high school and create niche kink scenes of them without their consent!" Sure, we can do that, but touching the face of that precious (and rich and powerful) celebrity, who at least to some limited extent HAS made their money by selling their image to the public for entertainment? No way Jose. But all those regular little people? Sure, make all the fetish content of them without their consent that you want.
How would you feel if some rando were making, like, r*pe fetish stuff with YOUR yearbook picture? Jerking it to the idea of YOU getting violently assaulted against your will? Is that the digital environment that you want to praise? I'm gonna go out on a limb and guess that you're probably not too comfortable with that idea. Even if you want to argue that they TECHNICALLY are allowed to do so, I'm betting the thought of it makes your skin crawl. As it should.
In a thread with a few disappointing posts, the one you're responding to is the worst of the bunch. Truly horrific.
What you're describing is no different than someone mentally fantasizing about another real life person in a WAM scenario (or any other scenario) and masturbating to that fantasy. It's not "horrific", it's not even the real person's business, and it's something nearly every human being has done.
diegoshay said: What you're describing is no different than someone mentally fantasizing about another real life person in a WAM scenario (or any other scenario) and masturbating to that fantasy. It's not "horrific", it's not even the real person's business, and it's something nearly every human being has done.
Right. I don't think there's anyone with the WAM fetish who hasn't seen a photo or video of someone attractive and thought, "Man, I would love to see that person get pied/slimed/soaked/thrown in the mud," or whatever your kink is.
diegoshay said: What you're describing is no different than someone mentally fantasizing about another real life person in a WAM scenario (or any other scenario) and masturbating to that fantasy. It's not "horrific", it's not even the real person's business, and it's something nearly every human being has done.
I think the difference you are missing is the fact that someone is fantasizing about a minor. That is the horrific part.
The AI part of it is a different kind of horrific, like watching someone you care about get diagnosed with a disease that has no cure, where you have to sit and watch them wither away and die. Yes, it's overdramatic, but the more AI advances, the lower the appeal of the real stuff will be from a financial standpoint: why pay people for something when you can just make it yourself for free? It's similar to movie-studio executives who think eight hours with an actor and a green screen gives them the right to have AI use those eight hours forever, completely removing the talent and human element from the finished product. Instead of enhancing something, it completely takes over for a human, making yet another industry where humans are obsolete. Yes, I know we are maybe 5 to 10 years away from this, but it is coming.
You can use subjects such as people in scanned photos, yearbooks, and social media! (Celebs aren't allowed).
Gross. Just gross.
Also, a good illustration of the commodification of people. "Yeah, let me scan a pic of that person I creeped on back in high school and create niche kink scenes of them without their consent!" Sure, we can do that, but touching the face of that precious (and rich and powerful) celebrity, who at least to some limited extent HAS made their money by selling their image to the public for entertainment? No way Jose. But all those regular little people? Sure, make all the fetish content of them without their consent that you want.
How would you feel if some rando were making, like, r*pe fetish stuff with YOUR yearbook picture? Jerking it to the idea of YOU getting violently assaulted against your will? Is that the digital environment that you want to praise? I'm gonna go out on a limb and guess that you're probably not too comfortable with that idea. Even if you want to argue that they TECHNICALLY are allowed to do so, I'm betting the thought of it makes your skin crawl. As it should.
In a thread with a few disappointing posts, the one you're responding to is the worst of the bunch. Truly horrific.
What you're describing is no different than someone mentally fantasizing about another real life person in a WAM scenario (or any other scenario) and masturbating to that fantasy. It's not "horrific", it's not even the real person's business, and it's something nearly every human being has done.
If you can't discern how these two situations are different... oh boy.
The AI part of it is a different kind of horrific, like watching someone you care about get diagnosed with a disease that has no cure, where you have to sit and watch them wither away and die. Yes, it's overdramatic, but the more AI advances, the lower the appeal of the real stuff will be from a financial standpoint: why pay people for something when you can just make it yourself for free? It's similar to movie-studio executives who think eight hours with an actor and a green screen gives them the right to have AI use those eight hours forever, completely removing the talent and human element from the finished product. Instead of enhancing something, it completely takes over for a human, making yet another industry where humans are obsolete. Yes, I know we are maybe 5 to 10 years away from this, but it is coming.
If it makes you feel any better, the economics and practicalities of creating real-life WAM were fucked well before AI--and you can thank society and the fetish media production industry for that. Within the 5-10 year timeframe you're predicting, women will be sick enough of creepy male producers' shit that they'll either make their fees for working with male producers too exorbitant to pay or they'll flat-out refuse to work outside the home. The new misogyny-driven purity culture will continue to shame peers who would otherwise pay for content, while financial institutions make getting paid for the work even more difficult. By the time AI becomes enough of a factor, it'll probably be seen as a welcome relief. At least it would be if it were easy and affordable, which it won't be, because business.
So no need to fret about the damage AI may do in the future when humans are doing more than enough right now.
TheSpecialist said: If it makes you feel any better, the economics and practicalities of creating real-life WAM were fucked well before AI--and you can thank society and the fetish media production industry for that. Within the 5-10 year timeframe you're predicting, women will be sick enough of creepy male producers' shit that they'll either make their fees for working with male producers too exorbitant to pay or they'll flat-out refuse to work outside the home. The new misogyny-driven purity culture will continue to shame peers who would otherwise pay for content, while financial institutions make getting paid for the work even more difficult. By the time AI becomes enough of a factor, it'll probably be seen as a welcome relief. At least it would be if it were easy and affordable, which it won't be, because business.
So no need to fret about the damage AI may do in the future when humans are doing more than enough right now.
The things you are talking about are not new. There have been creepy boys out there producing videos for a long time. Models talk to each other and keep a warning list; they also talk about the good producers out there who are easy to work with. I am not sure which creeps drive people away from this fetish faster: some of the people here who say absolutely horrible things to women in DMs (which, by the way, are protected, so women cannot use them to out people for their bad behavior), or the way people demand to see evidence of who someone is, even though maybe 5 or 6 models, out of the hundreds that have come here, have been found out to be fake. One of those models was even a mod here who passed the sniff test from the admins. There are people who met on this site who have had more conversations by phone, Skype, in person, etc., than the owners have with anyone chosen to be a mod. (They do not have any kind of conversation with mods outside of DMs.)
The misogyny-driven purity culture is nothing new either. It has been driving people away from this fetish for as long as I have been here. I have lost track of how many amateur producers have been run out of here by those kinds of people. This site has been used against people in custody cases, divorces, etc.
As for there being no need to fear the damage AI can do: I think Kelly Carlin might have something to say about that. She is fighting to get an AI-generated stand-up special of her dad, George Carlin, pulled off the internet. Her dad's material was used without permission to train an AI, and then a completely AI-generated special was released. The damage is already being done.