[–] [email protected] 1 points 1 year ago

The Tesla P40 is a good low-budget option: it has 24 GB of VRAM and CUDA cores. I’ve tried running 13B LLMs on a single card and it did well, plus they’re cheap enough that you can afford multiple if you have enough slots.
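
If anyone wants to try the multi-card route, here’s a minimal sketch using Hugging Face transformers with accelerate’s device_map="auto" to spread a 13B model across whatever GPUs are visible. The model name and dtype are just example assumptions on my part, not something tied to the P40 specifically:

```python
# Minimal sketch: load a 13B model sharded across all available GPUs.
# Assumes `pip install torch transformers accelerate` and an example model name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-13b-chat-hf"  # example 13B model, swap in your own

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves VRAM; Pascal cards like the P40 have slow FP16,
                                # so some people load float32 or use quantized weights instead
    device_map="auto",          # accelerate splits the layers across the cards for you
)

prompt = "Why does 24 GB of VRAM matter for 13B models?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")  # first shard usually sits on GPU 0
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Run `nvidia-smi` while it generates and you should see the memory split across the cards instead of piling onto one.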