I think major training should just be done on dedicated servers or in the cloud. That said, it is very helpful to be able to test locally, so if you are planning to use Nvidia-equipped servers, just get any reasonably recent consumer Nvidia card; you can then run locally on some sample data and iterate much more easily.
I second that. Being able to test medium-sized models locally can make debugging much easier.
I have a 3070 with 8 GB of VRAM, which can train e.g. GPT-2 with a batch size of 1 at full precision.
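For reference, something like the following sketch is roughly what I mean by a local smoke test. It uses the Hugging Face `transformers` Trainer; the dataset slice, sequence length, and output directory are just placeholders I picked for illustration, not anything specific from my setup.

```python
# Minimal sketch: fine-tune GPT-2 locally with batch size 1 at full precision
# to check that the training loop runs before moving to a real server.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Tiny sample slice so the run fits in ~8 GB VRAM and finishes quickly
# (dataset choice is a placeholder, use whatever sample data you have).
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

args = TrainingArguments(
    output_dir="gpt2-local-test",
    per_device_train_batch_size=1,  # batch size 1, default fp32 precision
    num_train_epochs=1,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

If it runs out of memory, dropping `max_length` or enabling gradient checkpointing is usually enough for a debugging run like this.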