
So I have maybe 20 TB of stuff, and I figure while I have something running I'll plug in an old Ryzen and an Nvidia graphics card. I have enough spares except HDDs to do that. I'd use the card for maybe a locally hosted LLM and maybe some other recreational AI stuff. I'd love for one of those to be local and figure out all my files and memes and shit. Just a true organizer.
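To be concrete, something like this is the sort of thing I'm imagining: a minimal sketch, assuming Ollama is running locally on its default port (the model name, the category list, and the `~/unsorted` folder are all placeholders), and it only prints suggestions instead of moving anything.

```python
#!/usr/bin/env python3
"""Sketch of a local "file organizer" pass over a directory.

Assumes Ollama is serving on http://localhost:11434 with some model pulled
("llama3" here is just a placeholder -- use whatever your GPU can handle).
"""
import json
import os
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed local Ollama endpoint
MODEL = "llama3"                                    # placeholder model name
CATEGORIES = ["memes", "documents", "photos", "code", "other"]

def suggest_category(filename: str) -> str:
    prompt = (
        f"Pick the single best category for a file named '{filename}' "
        f"from this list: {', '.join(CATEGORIES)}. Answer with one word only."
    )
    payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        answer = json.loads(resp.read())["response"].strip().lower()
    return answer if answer in CATEGORIES else "other"

if __name__ == "__main__":
    root = os.path.expanduser("~/unsorted")  # hypothetical dump directory
    for name in os.listdir(root):
        if os.path.isfile(os.path.join(root, name)):
            print(f"{name} -> {suggest_category(name)}")
```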

!codecels any suggestions there?

Maybe I could train one on my likes on rdrama and filter out bad pings.
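Rough idea of what that filter could look like: a toy sketch, assuming I'd exported my vote history to a CSV (the filename and columns are made up), using plain scikit-learn since a bag-of-words model is probably enough for "do I want to see this ping".

```python
#!/usr/bin/env python3
"""Toy "ping filter" trained on your own votes."""
import csv

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def load_votes(path="my_rdrama_votes.csv"):  # hypothetical export file
    texts, labels = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            texts.append(row["text"])        # assumed column: comment text
            labels.append(int(row["liked"])) # assumed column: 1 = liked, 0 = not
    return texts, labels

if __name__ == "__main__":
    texts, labels = load_votes()
    model = make_pipeline(TfidfVectorizer(min_df=2), LogisticRegression(max_iter=1000))
    model.fit(texts, labels)
    # Score an incoming ping; anything below ~0.5 gets filtered out.
    ping = "check out this effortpost about zfs raidz2"
    print(model.predict_proba([ping])[0][1])
```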


>any suggestions there?

Install Stable Diffusion so you can jack off to huge titted anime sluts!

>Maybe i could train one on my likes on rdrama and filter out bad pings

You can train it on my pings :marseyembrace:


You could install Proxmox, which has native ZFS support (it's just regular Debian underneath), and then pass the GPU through to a dedicated ML VM. (Nvidia drivers are picky on Linux, but there are lots of video guides for it, I'm sure.)
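Before setting up passthrough, it's worth checking that the GPU (and its HDMI audio function) sit in their own IOMMU group. Here's a quick sketch of that check, assuming IOMMU is enabled in the BIOS and on the kernel command line (intel_iommu=on or amd_iommu=on):

```python
#!/usr/bin/env python3
"""List IOMMU groups and the PCI devices in each, as a pre-passthrough sanity check."""
import glob
import subprocess

def list_iommu_groups():
    groups = {}
    # sysfs layout: /sys/kernel/iommu_groups/<group>/devices/<pci address>
    for dev_path in glob.glob("/sys/kernel/iommu_groups/*/devices/*"):
        parts = dev_path.split("/")
        group, pci_addr = int(parts[4]), parts[6]
        groups.setdefault(group, []).append(pci_addr)
    return groups

def describe(pci_addr):
    # lspci -s prints a one-line description of the device at that address
    try:
        out = subprocess.run(["lspci", "-s", pci_addr], capture_output=True, text=True)
        return out.stdout.strip() or pci_addr
    except FileNotFoundError:
        return pci_addr

if __name__ == "__main__":
    groups = list_iommu_groups()
    if not groups:
        print("No IOMMU groups found -- is IOMMU enabled in BIOS and on the kernel cmdline?")
    for group in sorted(groups):
        print(f"IOMMU group {group}:")
        for addr in groups[group]:
            print(f"  {describe(addr)}")
```

If the card shares a group with other devices you'd either need to pass the whole group through or shuffle PCIe slots, so it's cheap to check first.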
