Porn site bans porn, or something

https://www.aisekai.ai is one of the numerous AI chat bot sites that people use for jerking off. Apparently it had been experiencing server issues and bugs for the last week or so and was offline, and within the last 24 hours it came back online. When it did, however, they added a note about banning NSFW characters, which obviously made everyone very upset, as that's the only reason anyone even uses AI chat bots.

https://old.reddit.com/r/aisekai

They have a subreddit, and every single thread on the first page is about the rules change. I scrolled through some of these threads a bit, and apparently the site has financial issues and they're trying to become more advertiser friendly? And they're not really banning porn, just hiding it or something, and only banning more unsavory stuff like r*pe, incest, p-do shit, etc.?

https://archived.moe/g/search/text/aisekai

AI chat bot general on /g/ discusses it a bit, mostly complaining about them banning bestiality content. Guess that's to be expected.

I decided to visit the site myself, and was greeted by this message.

https://i.imgur.com/VHoKAfG.jpg

Surely this pretty obviously means "no more porn"? But then I clicked "Let's start!", mouseovered one of the random chat bots that show up by default on the front page, and looked at the text:

https://i.imgur.com/50sMTDZ.jpg

Now I'm by no means an AI chat bot expert, but my gut feeling tells me this might be an NSFW character. So I guess the entire situation is a big nothingburger?

How bad is it running textgen on CPU? I'm never wasting another second on remote services after I got a quantized model running on my NVIDIA laptop GPU.

What's the best setup at this point in time?


A 4090 connected to a mid gaming rig; the next best option costs five figures. You can sperg out on CPU/RAM/etc., but it's largely irrelevant for being able to talk to a robot.

if you don't know what a python virtual environment is then be prepared to learn shit/read things.

or trust a random "one click install" thing ydy booboo.

I'm no minmaxxer. I've not even bothered with SD yet. I just installed text-generation-webui and picked some random quantized model off Hacker News (Zephyr).

I'd use a quantized mistral but AWQ is currently broken on Windows in that setup. There's an issue for it in the repo.

Textgen is mostly dependent on RAM: you can start at 16 GB with 13B models, and the more RAM, the bigger the models. There are some quants with barely any quality loss (the Q5 K-quants with variable bit widths). Generally it's still very good on CPU.

You can offload some layers to the GPU, but the gains from only a few layers are very small. Macs with a shitload of unified memory are a good option, though you'd have to shell out a LOT, and it's Apple's garbo ecosystem.


Completely infeasible. Text generation isn't even really viable on consumer GPUs.

Works fine on my GPU. I haven't even got one of the fancy wallet-raping ones. :marseyshrug:

What are you running? I had terrible results on a 2080ti with one of those text adventure generators.

Maybe things have changed since I tried it, but I remember thinking you'd need something like dual 4090s for it to be worth it. You need a shitload of VRAM to fit larger models.

I'm just paying for novel AI now, which is awesome btw.

I'm running text-generation-webui on an 8GB "RTX 2080 Super with Max-Q Design" and I'm using LoneStriker Zephyr 7b. I'm going to need more info on your "terrible results" but mine seems OK.

I generally use it in chat mode because that seems to meet all my needs. I ask it to elaborate on some disturbing fantasy and it does so. I can interact with it by typing out my commands (i.e., "'Snappy, what weapon systems do you have?', I ask").

The output can be mildly rslurred sometimes but I think that's more a model issue. It does what I want most of the time, only having knowledge gaps for specific niche things like Skibidi Potty, Super Milk Chan etc...

I haven't had much success with character mode, but I've only tried one "character," and that was more of an adventure than a character. It would get confused about whether it was supposed to be the narrator or an in-game character, and it would occasionally just regurgitate the starting text.

It's not lightning fast but it's about as fast as your standard RPG textbox so I hardly notice. Sometimes it chugs if I stop using it for a while and come back but then I just reload the model and everything's good again.

Keep in mind that this is all coming from an AI noob who got all his tweaking skills from configuring emulators and operating systems.

Neat, I'll give it a try. I think I was using a 7b model as well but I don't remember the details. The 2080 was not quite enough to handle a 13b model.

I found that the AI could only really handle short prompts, and its output was limited and usually not that related to what I prompted it to do. I was mostly trying to use character mode though. I'm guessing it can't keep enough tokens in memory to handle a conversation.

Tbh the huge models like GPT are so much better that I'm not sure it's really worth it.

Tell me how it goes. I'm a lazycel and am interested in others' testimonies.

>Tbh the huge models like GPT are so much better that I'm not sure it's really worth it.

This is the crux of the issue, I think. My personal one is pretty much "just as good" as the AIDungeon crap I used to play with (which is where most of my AI experience lies). GPT is probably far more advanced but since it won't do anything fun without fiddling, it's pretty much useless to me. I'd play with it for comparison but my work M$ account has it turned off.

Man you should try Novel AI. It costs money, but I think it does exactly what you want. There's a free version you can play with a bit, but you'll run out of tokens pretty fast.

zoz

zle

Are you feeling okay bud?
