
You're saying it as if privacy were worthless. Also, not many people would take the price of a MacBook and put it strictly towards running a local model.

Instead, if you wanted to get a MacBook anyway, you get to run local models for free on top. Very different story.



The privacy angle is not that interesting to me.

- You can find inference providers with whatever privacy terms you're looking for

- If you're using LLMs with real data (let's say handling Gmail), then Google has your data anyway, so you might as well use the Gemini API

- Even if you're a hardcore roll-your-own-mail-server type, you probably still use a hosted search engine and have gotten comfortable with their privacy terms

Also, on cost: the point is that you can use an API that's many times smarter and faster, for a rounding error in cost compared to your Mac. So why bother with local, except for the cool factor?
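To make the "rounding error" claim concrete, here's a back-of-envelope break-even calculation. All the prices in it are made-up placeholders for illustration, not quoted rates for any real hardware or API:

```python
# Back-of-envelope: how many API tokens you could buy for the price of
# a Mac purchased solely for local inference.
# Both input numbers below are illustrative assumptions, not real prices.

def breakeven_tokens(hardware_cost_usd: float,
                     api_price_per_mtok_usd: float) -> float:
    """Tokens purchasable from an API for the hardware's price."""
    return hardware_cost_usd / api_price_per_mtok_usd * 1_000_000

# Assumed: a $2,000 machine vs. an API charging $1 per million tokens.
tokens = breakeven_tokens(2000, 1.0)
print(f"{tokens:,.0f} tokens")  # 2,000,000,000 tokens
```

Under those assumed numbers, you'd have to run two billion tokens through the API before the dedicated hardware pays for itself, which is the arithmetic behind the argument above.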



