For me, I have to run at a measly 13B, so I've been using mainly Wizard and Airoboros. I'm hoping the new Orca (or Dolphin) models will work well, especially down at 7B. I want to build a highly specialized home AI, and it would be so handy if I could QLoRA a really powerful but small model.
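A quick back-of-the-envelope sketch of why QLoRA on a small model is so appealing: the adapter only trains a tiny fraction of the weights. The shapes below assume a LLaMA-7B-style model (32 layers, hidden size 4096, full-rank q/k/v/o projections); the rank and target modules are illustrative choices, not a prescription.

```python
# Rough parameter-count arithmetic for a QLoRA fine-tune.
# Assumed LLaMA-7B-style shapes; rank/targets are illustrative.
HIDDEN = 4096   # hidden size
LAYERS = 32     # transformer layers
RANK = 16       # a common LoRA rank
TARGETS = 4     # q_proj, k_proj, v_proj, o_proj (each HIDDEN x HIDDEN)

# A LoRA adapter on a (d_out x d_in) weight adds r * (d_in + d_out) params.
per_module = RANK * (HIDDEN + HIDDEN)
trainable = per_module * TARGETS * LAYERS
print(trainable)  # 16777216 -> ~16.8M trainable params, ~0.25% of ~6.7B
```

So even at 7B, the trainable footprint is tiny, which is what makes fine-tuning feasible on a single consumer GPU with the base weights quantized to 4-bit.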
this post was submitted on 04 Jul 2023
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
Being stuck on ROCm, and having an allergy to the reams of garbage Python throws out when it doesn't work... I'm still on the launchpad.
Yeah, ROCm is very limiting :/
The same (the 65B versions), plus Wizard Vicuna Uncensored 30B.