I was writing stories and generating images of women pied in their bras and panties. Had a couple of good, long storylines going. Was really getting the pies down to the way I liked them. Then, all of a sudden, I couldn't generate images of women wearing conservative underwear anymore. In fact, I had problems getting images of women in bikinis. I could generate a hundred images of men getting pied in their underwear, no questions asked. When I asked why, the answer was that Google considers men in their underwear comedy and completely OK, but women in their underwear (even in a comedic setting) sexist and inappropriate. Any suggestions on where I'll get the results I'm looking for? I tried Grok; it can make some great images of women in underwear, but the pies suck.
I have lately found Gemini very inconsistent. Two out of three times, it can't produce the image I want. But if I close that tab or browser and come back in a few, no problem. It may be the time of day and how busy it is. When it works, it has the best pie hits of the generators I use.
Of course the problems started after I paid to upgrade.
Maybe that's because Google recently introduced a new model - Nano Banana 2, which seems to replace both the regular NB and NB Pro. In the past, their filters did the strangest things whenever they switched models.
It's most likely a mix of the changes to NB2 in Gemini. The NB2 model's filters are a lot more aggressive than NB Pro's. Add to that, if you are using Gemini for stories, you probably aren't giving the model an explicit prompt, so you're at the whim of whatever Gemini decides to prompt.
As for the actual image models themselves, you're obviously going to be filtered for nudity and other safety content, but by and large they're still working very well.
messg said: It's most likely a mix of the changes to NB2 in Gemini. The NB2 model's filters are a lot more aggressive than NB Pro's. Add to that, if you are using Gemini for stories, you probably aren't giving the model an explicit prompt, so you're at the whim of whatever Gemini decides to prompt.
As for the actual image models themselves, you're obviously going to be filtered for nudity and other safety content, but by and large they're still working very well.
I really love the way you capture all of the details, especially facial expressions, the details of the knickers and bra, and, for me, the ladies wearing proper knickers, regular, everyday underwear.
It seems to work better for me in the morning, eastern US time. Otherwise I get lots of "I can't do this right now" or "I don't understand," followed by "Try again" or "Is there something else I can help you with?"
The new model also gives me portrait-mode pictures, and I prefer square, but I can't figure out how to change the aspect ratio. The help search is useless, referring me to Gemini Apps.
messg said: It's most likely a mix of the changes to NB2 in Gemini. The NB2 model's filters are a lot more aggressive than NB Pro's. Add to that, if you are using Gemini for stories, you probably aren't giving the model an explicit prompt, so you're at the whim of whatever Gemini decides to prompt.
As for the actual image models themselves, you're obviously going to be filtered for nudity and other safety content, but by and large they're still working very well.
Those images are great. You did those on Google? I am doing stories. I generated a hundred great images in a storyline, and then they slammed the door on me. It said it can't do bikinis, yet I'm doing bikinis just fine in another storyline. I'll try doing some different things tonight when I get home. Thanks
As for the actual image models themselves, you're obviously going to be filtered for nudity and other safety content, but by and large they're still working very well.
I think you're possibly still the only one who ever got pictures like these out of the NB models, or at least not using any of the Google tools.
Anyway, personally I don't find the filtering any stricter than it used to be with NB Pro. Can't speak for Gemini, as I've been mostly using Flow lately, but I'd rather say NB2 is sometimes more permissive. I'm not much into messy stuff, and certainly not into bikinis, but rather wetlook, and NB2 allows me to do more sexy poses and does things like see-through clothes and sometimes even partial nudity. Actually, I tested a prompt that works fine with NB2 but gets blocked instantly when I try it with NB Pro or Imagen4. But I guess that may just be Google playing with their filters again; it could be different tomorrow or next week.
As for the actual image models themselves, you're obviously going to be filtered for nudity and other safety content, but by and large they're still working very well.
I think you're possibly still the only one who ever got pictures like these out of the NB models, or at least not using any of the Google tools.
Anyway, personally I don't find the filtering any stricter than it used to be with NB Pro. Can't speak for Gemini, as I've been mostly using Flow lately, but I'd rather say NB2 is sometimes more permissive. I'm not much into messy stuff, and certainly not into bikinis, but rather wetlook, and NB2 allows me to do more sexy poses and does things like see-through clothes and sometimes even partial nudity. Actually, I tested a prompt that works fine with NB2 but gets blocked instantly when I try it with NB Pro or Imagen4. But I guess that may just be Google playing with their filters again; it could be different tomorrow or next week.
I assure you, I only use Flow, Vertex, or the API. There is definitely a higher degree of filtering in Gemini being applied to both NB Pro and NB2. There is also a different set of filters being applied to NB2 on all platforms; whilst Google hasn't commented directly, it's been reported across many platforms, and I've got hundreds of prompts to support that. My feeling is the underlying LLM is also part of the problem. I've had similar issues when captioning images for training between model versions. For what it's worth, in order of filtering: Vertex API / AI Studio API > Flow > Gemini for any particular model, but safety filtering can and will change week to week.
Ultimately it comes down to the prompt: if there is wiggle room for the model to produce something NSFW, and the context of the prompt makes NSFW more probable, you are going to end up with the prompt or the output image filtered. I don't mean that to sound condescending, but unless you are aiming for nudity, both models are very forgiving.
After Google cock-blocked me all weekend, everything is fine today. Started a new thread and I'm cranking the images out, no problem. It's like each thread gets its own AI with its own interpretation of the guidelines. I'll definitely give Vertex a look. Sorry to bother everyone. We now return you to your regularly scheduled WAM.
Ah, yep, almost forgot about the APIs. I tried the "playground" in AI Studio some time ago, when Imagen4 was still the most recent model, but found it to be even more restrictive, so I never went past that and did not get an API key (especially as I would have to pay extra while already having an AI Premium subscription for Flow and Gemini).