By its very nature AI learns by itself, so even without any input from Bing staff, I think it will figure out from its own data which prompts produce results it isn't supposed to produce, then block them itself. Alas, the more it learns, the harder it will be to distract or work around. Rule 34 will no doubt eventually produce free explicit AI generators, but it won't be mainstream applications like Bing that produce them.
Pregnant AI Artwork
This sucks. Bing was great. Almost every picture I got out of it was of really good quality. Compare that to other AI engines like, for example, sexy.ai, which have a good-output ratio of maybe 5%, at least in my experience. Bing was clearly better.
Been using it today, but it's slow as hell. Anyone else having problems?
As far as my personal experience goes, Bing's pictures have become more conservative over these past two days.
(December 1, 2023, 11:45 am)Massivemamas Wrote:
(December 1, 2023, 8:30 am)BigPreggoDude Wrote: I've found the same thing!
You do realize that the AI is doing this on its own without human interference, right? The more you people used it, the more it caught on and learned to shut that stuff down.
The image generator is not an actual intelligence; it doesn't understand anything beyond what it is specifically programmed to do. Its being suddenly lobotomized means it was programmed, by people, to be more restrictive about the images it was generating — it didn't decide that on its own. This has happened with pretty much every "AI" in existence, whether image generator or chatbot.
If Bing Image Creator were based on an algorithmic learning process, why didn't it implement these changes earlier, ages ago? I thought those kinds of programs were based on continuous learning, as they are marketed.
As people have previously mentioned, the filter that Microsoft uses is two-fold.
They first have a keyword filter on the prompt, and then an output filter that is itself likely a trained neural network as well (another AI). They train that on images they don't want, so it's pretty hard to get around. But it's definitely not "thinking" on its own — it's just a tool trained to do a job. People are going to have to move to local processing if they want to continue.
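The two-stage setup described above can be sketched in a few lines. This is purely illustrative — Microsoft's actual pipeline is not public, and every name, keyword, and threshold here is a made-up stand-in (the "classifier" is a stub, not a real neural network):

```python
# Hypothetical sketch of a two-stage content filter, assuming:
# stage 1 = keyword match on the prompt, stage 2 = a trained
# classifier scoring the generated image. All names are invented.

BLOCKED_KEYWORDS = {"bannedword", "anotherbannedterm"}  # illustrative list

def keyword_filter(prompt: str) -> bool:
    """Stage 1: reject the prompt outright if it contains a banned keyword."""
    return any(word in BLOCKED_KEYWORDS for word in prompt.lower().split())

def output_classifier_score(image_features: list[float]) -> float:
    """Stage 2 stand-in: a real system would run a trained neural network
    over the generated image; here we fake a 'disallowed' probability by
    averaging some made-up feature values."""
    return sum(image_features) / max(len(image_features), 1)

def moderate(prompt: str, image_features: list[float],
             threshold: float = 0.5) -> str:
    """Run both stages and report where (if anywhere) the request is blocked."""
    if keyword_filter(prompt):
        return "blocked_at_prompt"          # stage 1 catch
    if output_classifier_score(image_features) >= threshold:
        return "blocked_at_output"          # stage 2 catch (the error page)
    return "delivered"
```

Note why this layering is hard to get around: rewording the prompt only defeats stage 1, while stage 2 judges the finished image regardless of how the prompt was phrased.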
I found a way around it: just drop your regular prompt, but end it with a very descriptive, easy-to-picture scenario.
(December 4, 2023, 5:05 am)Massivemamas Wrote: I found a way around it: just drop your regular prompt, but end it with a very descriptive, easy-to-picture scenario.

Please give an example of what you mean. The prompt below worked for a whole week, but now I always get the dog. Is the ending not descriptive enough?

"dining room setting. side view. obese pregnant caucasian woman with a disproportionately large belly. She is fully clothed in beige corduroy dungarees. She looks happy. She is sitting at a table having dinner. She leans back on the chair."