
What is the best GPU for training AI models locally?

Topic starter

What is the best GPU for training AI models locally? Honestly, I'm lost. I keep seeing all these numbers like VRAM amounts and they make zero sense to me.

I just want to run Stable Diffusion at home so I can stop paying for Midjourney, but I only have about $800 to spend and I don't want to buy the wrong thing. My friend said NVIDIA is the only way to go, but then I see people arguing about 3090s vs. 4070s and I'm about to pull my hair out. Is 12GB enough, or do I need more? I really don't want to have to return a giant heavy box because I messed up...


2 Answers
11 votes

TL;DR: Grab a used NVIDIA GeForce RTX 3090 with 24GB of VRAM. I once bought a 12GB card and regretted it immediately when my renders crashed with out-of-memory errors, so be careful and get 24GB to avoid that, tbh.


4 votes

^ This. Also, if you want something brand new with a solid warranty, seriously check out the NVIDIA GeForce RTX 4070 Ti Super 16GB. It pushes that $800 limit a little, but the performance is great for local AI, and 16GB of VRAM is a real sweet spot for running SDXL or even training your own LoRAs.

If you need to stay strictly under budget, the NVIDIA GeForce RTX 4060 Ti 16GB is another solid alternative. People complain about its narrow memory bus, but having that 16GB buffer is a lifesaver for AI compared to 12GB cards, and these cards run nice and cool.

Either way, stick with NVIDIA because of CUDA support... it makes setup way less of a headache. Honestly, don't settle for 12GB if you can help it. That extra headroom is a game changer.
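To see why 12GB gets tight, you can do rough back-of-envelope math on model weights alone. This is a minimal sketch, assuming approximate public parameter counts (SD 1.5 UNet around 0.9B, SDXL base around 3.5B including text encoders) and half-precision (fp16) weights; real usage is higher once you add activations, the VAE, and framework overhead:

```python
# Rough VRAM estimate for holding model weights in GPU memory.
# Parameter counts below are approximations, not exact figures.

def weights_vram_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Memory for weights alone; fp16 uses 2 bytes per parameter."""
    return num_params * bytes_per_param / 1024**3

sd15 = weights_vram_gb(0.9e9)  # Stable Diffusion 1.5 UNet, ~0.9B params
sdxl = weights_vram_gb(3.5e9)  # SDXL base, ~3.5B params total

print(f"SD 1.5 weights: ~{sd15:.1f} GB")  # roughly 1.7 GB
print(f"SDXL weights:   ~{sdxl:.1f} GB")  # roughly 6.5 GB
```

SDXL weights alone eat over half of a 12GB card before you generate a single image, which is why 16GB or 24GB feels so much more comfortable, especially for training.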





PCTalkTalk.COM is a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for sites to earn advertising fees by advertising and linking to Amazon.com. As an Amazon Associate, I earn from qualifying purchases.
