I've been doing DS for about six years now, and honestly I've always just pushed everything to the cloud because local hardware usually melts when you try to train anything substantial. My situation changed recently, though: I'm going to be traveling through parts of Montana and Wyoming next month for a contract, and the internet out there is going to be hit or miss, so I can't rely on SSHing into my usual AWS instances.
I need something that won't throttle the second I start a local training loop on a decent-sized dataset. I was looking at the Razer Blade 16 with the 4090 because of the VRAM, but I've heard the heat management is kind of a nightmare, and honestly I don't want to spend $4k just for a logo. I've also looked at some Lenovo Legion models, but I'm worried about Linux driver support, since I refuse to do serious dev work on Windows.
My budget caps out at $3,500, and I really need at least 16GB of VRAM; otherwise, what's the point? Is anyone actually doing heavy local training on a laptop these days, or am I just chasing a pipe dream? Are there specific brands that handle thermals better for long compute sessions, or should I just give up and buy a chunky workstation?
Honestly, training locally on a laptop is a bold move, but I get the internet struggle in the sticks. I tried something similar last summer with a thin gaming laptop, and it basically became a space heater that throttled to half speed within ten minutes. If you want 16GB of VRAM, you're looking at the mobile RTX 4090, but be careful: not all mobile 4090s are built the same. Some manufacturers cap the wattage (TGP) to keep the thing from melting, which defeats the whole purpose for deep learning. I'd suggest looking at the Lenovo Legion Pro 7i Gen 8 (RTX 4090, 16GB VRAM) or maybe the ASUS ROG Strix SCAR 17 (RTX 4090, 16GB VRAM). I've personally had way better luck with Lenovo on Linux, though you'll want to check the specific kernel version against the Wi-Fi drivers, because that usually bites me. A few things to keep in mind:
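On the wattage-cap point: you don't have to take the spec sheet's word for it. Once you have a machine in hand, a quick sketch like this (assuming a Linux box with the NVIDIA driver stack installed; the guard just keeps it from erroring on a machine without one) will show the enforced power limit and VRAM, plus the kernel version for the driver-matching question:

```shell
# Kernel version -- useful when checking Wi-Fi / GPU driver compatibility notes
uname -r

# Query the GPU's name, enforced power limit, hardware max, and VRAM.
# These are standard nvidia-smi query properties; guarded so the script
# still runs cleanly on a machine without the NVIDIA stack installed.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,power.limit,power.max_limit,memory.total --format=csv
else
    echo "nvidia-smi not found; NVIDIA driver not installed"
fi
```

If `power.limit` comes back well under 150W on a mobile 4090, you've got one of the capped variants, and sustained training throughput will reflect that.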