Since I've now listened to one (1) entire talk on AI, I am the world's foremost leading expert.
As such I will be running a localized llama 13B instance. For the first time in over a decade I've bought a PC that wasn't an HP workstation.
Specs:
H110 motherboard w/ Celeron and 4GB RAM, alongside an Nvidia Quadro 5200 8GB
Do you guys think it'll actually run a quantized Llama? Is a 500W PSU enough?
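Back-of-the-envelope sketch (my numbers, not gospel): assuming a 4-bit-ish quant in the llama.cpp family at roughly 4.5 bits per weight, the weights of a 13B model come out around 6.8 GiB, before the KV cache and compute buffers are added on top.

```python
# Rough VRAM estimate for a quantized model's weights alone.
# bits_per_weight ~4.5 is an assumption (roughly a Q4-class quant);
# KV cache and runtime buffers add more on top of this.
def quant_size_gib(n_params: float, bits_per_weight: float) -> float:
    return n_params * bits_per_weight / 8 / 2**30

print(f"~{quant_size_gib(13e9, 4.5):.1f} GiB for 13B weights alone")
```

So an 8GB card is tight but not obviously hopeless for 13B at 4-bit, with little headroom for context; offloading some layers to system RAM would be slow with only 4GB. The 500W PSU is almost certainly fine for a single workstation GPU plus a Celeron.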