Hacker News

I wonder how many people are scrambling to set this up on their startup infra.

6x24GB of VRAM on 6 GPUs linked with NVSwitch is a little pricey, but totally doable.



I got it running using Colab Pro+ (immediately got an A100 with 40GB of VRAM) - the 7B model works with a batch size of 8 and a max seq len of 1024
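As a sanity check (my own back-of-envelope numbers, not from the comment), here's roughly why the 7B checkpoint fits comfortably on a 40GB card in fp16:

```python
# Rough VRAM estimate for a 7B-parameter model in fp16 (illustrative only;
# actual usage also depends on batch size, sequence length, and KV cache).
params = 7e9                 # 7 billion parameters
bytes_per_param = 2          # fp16 = 2 bytes per parameter
weights_gb = params * bytes_per_param / 1e9
print(f"weights alone: ~{weights_gb:.0f} GB of the 40 GB available")
```

At batch size 8 and seq len 1024, activations and the KV cache add a few more GB on top of the ~14 GB of weights, which still leaves plenty of headroom on a 40GB card.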


Sure, but the real value here is the 65B model. Can you get multiple GPUs on Colab?


I can't even get the 13B model to do inference on Colab, even with a very small sequence length.
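Assuming the standard Colab tier hands out a 16GB T4 (an assumption on my part), the fp16 weights of the 13B model alone don't fit, which would explain the failure regardless of sequence length:

```python
# Why 13B inference fails on a 16 GB card: the fp16 weights alone exceed VRAM.
params = 13e9                    # 13 billion parameters
weights_gb = params * 2 / 1e9    # fp16 = 2 bytes per parameter
print(f"13B fp16 weights: ~{weights_gb:.0f} GB vs 16 GB of VRAM")
```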


How pricey would you estimate?


If you want to do it the cheap way by buying used stuff, the most expensive parts are:

- $2000 for a Threadripper 3xx5WX with a socket sWRX8 mainboard

- $5000 for 6x RTX 3090

- $350 for two 1500W PSUs

- $700 for 256GB RAM

You will also need PCIe extenders and perhaps some watercooling, plus a suitable case. The 2-card NVLink bridges are between $100 and $300 each (you may want 3). All in all, I think less than $10k.
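Tallying those line items (taking a $200 midpoint for the NVLink bridges, and leaving out the extenders, cooling, and case) supports the sub-$10k estimate:

```python
# Back-of-envelope tally of the used-parts build above (prices in USD,
# straight from the list; bridge price is my midpoint assumption).
parts = {
    "Threadripper 3xx5WX + sWRX8 board": 2000,
    "6x used RTX 3090": 5000,
    "2x 1500W PSU": 350,
    "256GB RAM": 700,
    "3x NVLink bridge (~$200 each)": 3 * 200,
}
total = sum(parts.values())
print(f"subtotal: ${total}")  # leaves the rest of the $10k for extenders, cooling, case
```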


I'd rather put that in AWS.


AWS is probably among the priciest options.

Privacy costs money.





