Since I've now listened to one (1) entire talk on AI, I am the world's foremost leading expert.
As such I will be running a local LLaMA 13B instance. For the first time in over a decade I've bought a PC that wasn't an HP workstation.
Specs:
H110 motherboard w/ Celeron and 4 GB RAM, alongside an NVIDIA Quadro 5200 8 GB
Do you guys think it'll actually run a quantized LLaMA? Is a 500 W PSU enough?
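For what it's worth, here's a rough back-of-envelope on whether the weights even fit in 8 GB of VRAM. Assumptions (not measured numbers): ~4.5 bits/weight for a llama.cpp-style Q4 quant including scales, and a made-up 1 GB allowance for KV cache and runtime buffers:

```python
# Back-of-envelope: does a ~4-bit quantized 13B model fit in 8 GB VRAM?
PARAMS = 13e9            # LLaMA 13B parameter count
BITS_PER_WEIGHT = 4.5    # assumed: ~Q4-class quant incl. quantization scales
OVERHEAD_GB = 1.0        # assumed: rough allowance for KV cache + buffers

weights_gb = PARAMS * BITS_PER_WEIGHT / 8 / 1e9
total_gb = weights_gb + OVERHEAD_GB
print(f"weights ~= {weights_gb:.1f} GB, total ~= {total_gb:.1f} GB")
```

Under those assumptions the weights alone land around 7.3 GB, so 8 GB is borderline; you'd likely need to offload some layers to system RAM, and 4 GB of system RAM makes that tight too.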
@TouchFluffyTails seems to legitimately have a good grasp of the image-generation concepts. I assume his tips will be useful too.