Hacker News

Is it just me, or are the 'open source' models increasingly impractical to run on anything other than massive cloud infra, at which point you may as well go with the frontier models from Google, Anthropic, OpenAI, etc.?



It's because their target audience is enterprise customers who want to use their cloud-hosted models, not local AI enthusiasts. Making the model larger is an easy way to scale intelligence.

You still have the advantage of choosing which infrastructure to run it on. Depending on your goals, that might still be an interesting option, although I believe that for most companies, going with SOTA proprietary models is the best choice right now.

Depends on what you mean by impractical, but some of us are trudging along quite well.

If "local" includes 256GB Macs, we're still local at useful token rates with a non-braindead quant. I'd expect a smaller version to come along at some point.
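For a rough sense of what fits in that 256GB: a model's weight footprint is approximately parameter count × bits per weight ÷ 8 bytes (plus KV-cache and runtime overhead on top). A quick back-of-the-envelope sketch (the parameter counts here are illustrative examples, not any specific model from the thread):

```python
def weight_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB: params * (bits / 8) bytes per weight.

    Ignores KV-cache, activations, and runtime overhead, so treat the
    result as a lower bound on required memory.
    """
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A hypothetical 405B-parameter model:
print(weight_gb(405, 4))   # ~202.5 GB at 4-bit: fits on a 256GB Mac
print(weight_gb(405, 8))   # ~405 GB at 8-bit: does not fit
print(weight_gb(405, 16))  # ~810 GB at fp16: cloud-infra territory
```

This is why the quant level matters so much: the same weights that need multi-GPU cloud hardware at fp16 can squeeze onto a single high-memory machine at 4-bit, at some cost in quality.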



