
Does anyone here know anything about self-hosting LLMs with Ollama?

I've been putzing around with a self-hosted setup. I have a very basic React.js frontend, a minimal Flask app, and I'm using Ollama to serve Llama 3 8B. I'm running into a problem, though: each query is handled one-shot instead of as a chat.

https://i.rdrama.net/images/17186713758539402.webp

Has anyone else messed around with this stuff?
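In case it helps anyone hitting the same wall: Ollama is stateless, so its `/api/chat` endpoint only "remembers" whatever messages you send it. The usual fix is to keep the conversation history in your Flask app and resend the whole thing on every request. A minimal sketch, assuming Ollama on its default port (11434) and using only the stdlib so there's nothing extra to install:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default port

def build_payload(history, user_message, model="llama3:8b"):
    """Append the new user turn; Ollama only sees what's in `messages`."""
    history.append({"role": "user", "content": user_message})
    return {"model": model, "messages": history, "stream": False}

def chat(history, user_message):
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(history, user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)["message"]  # {"role": "assistant", "content": ...}
    history.append(reply)  # keep the assistant's reply for the next turn
    return reply["content"]
```

The `history` list has to live somewhere between requests (a Flask session, a dict keyed by conversation ID, a database row), since each browser query otherwise starts from scratch, which is exactly the one-shot behavior described above.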


:#marseytwerking:

:marseycoin::marseycoin::marseycoin:

I CUT MY FRICKING BUTTHOLE OPEN SHAVING IT BECAUSE I WANTED A NICE, PRETTY, PRESENTABLE HOLE. I'M NOT TALKING LIKE A LITTLE BABY PAPERCUT BUT LIKE, DEEP, HEMORRHAGING SQUIRTING GASH AND NOW 10 HOURS LATER I AM LAYING IN BED WITH THE MOST UNIMAGINABLE THROBBING PAIN YOU CAN IMAGINE EMANATING FROM MY BOYPUCCI LIPS EVERY TIME I GIVE IT A LIL SQUEEZE BUT I NEED TO TAKE A MEGA SHIT NOW AND THIS IS LITERALLY WORSE THAN CHILDBIRTH HELP ME PLEASE

