Since I've now listened to one (1) entire talk on AI, I am the world's foremost leading expert.
As such I will be running a localized llama 13B instance. For the first time in over a decade I've bought a PC that wasn't an HP workstation.
Specs:
H110 motherboard w/ Celeron and 4GB RAM, alongside an Nvidia Quadro 5200 8GB
Do you guys think it'll actually run a quantized llama? Is 500W PSU enough?
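For what it's worth, here's a quick back-of-envelope check (my own rough numbers, not anything official) on whether a 4-bit quantized 13B even fits in 8 GB of VRAM:

```python
# Rough VRAM estimate for a 4-bit quantized 13B model.
# The 0.5 bytes/param and 1.5 GB overhead figures are my own guesses
# (roughly what Q4-style quantization plus KV cache/scratch buffers costs),
# not measured numbers.
params = 13e9
bytes_per_param = 0.5                       # 4-bit weights
weights_gb = params * bytes_per_param / 1e9 # 6.5 GB of weights
overhead_gb = 1.5                           # KV cache + buffers, a guess
total_gb = weights_gb + overhead_gb
print(f"~{total_gb:.1f} GB")                # ~8.0 GB
```

So it's right at the edge of an 8 GB card; you'd likely need to offload some layers to system RAM, and 4GB there is going to hurt. The 500W PSU is the least of your worries.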
good luck bud!
Let me know what you think of llama. I kinda want to do something similar but am worried llama will suck too much in comparison to ChatGPT
It's not a fair comparison at all tbh. ChatGPT is just so incredibly powerful. Llama seems like it would be useful when trained on very specific things, I'll let you know how things go.
There's a project where they're training an instance of it on HN, which seems like a good use case for a small LLM. Just train it on a narrow range of topics and let it regurgitate answers.