Hey HN, we're a YC startup building an open-source, privacy-first alternative to Perplexity Comet.
No invite system, unlike a bunch of others – you can download it today from our website or GitHub: https://github.com/browseros-ai/BrowserOS
--- Why bother building an alternative? We believe browsers will become the new operating systems, where we offload much of our work to AI agents. But these agents will have access to all your sensitive data – emails, docs, on top of your browser history. Open-source, privacy-first alternatives need to exist.
We're not a search or ad company, so no weird incentives. Your data stays on your machine. You can use local LLMs with Ollama. We also support BYOK (bring your own keys), so no $200/month plans.
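If you're wondering what "use local LLMs with Ollama" looks like in practice, here's a rough sketch against Ollama's standard chat API – the model name and the helper function are placeholders for illustration, not our actual integration code; everything stays on localhost:

    // Illustrative sketch: pointing an assistant at a locally running Ollama
    // server instead of a hosted API. Assumes Ollama is installed and a model
    // has been pulled, e.g. `ollama pull llama3.1`. Not BrowserOS internals.
    interface ChatMessage {
      role: "system" | "user" | "assistant";
      content: string;
    }

    async function askLocalModel(messages: ChatMessage[]): Promise<string> {
      const res = await fetch("http://localhost:11434/api/chat", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "llama3.1", // placeholder; any locally pulled model works
          messages,
          stream: false,     // one JSON response instead of a token stream
        }),
      });
      const data = await res.json();
      return data.message.content; // Ollama's non-streaming chat response
    }

    // Example: the page text never leaves your machine.
    askLocalModel([
      { role: "system", content: "You are a browsing assistant." },
      { role: "user", content: "Summarize this page: ..." },
    ]).then(console.log);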
Another big difference vs Perplexity Comet: our agent runs locally in your browser (not on their server). You can actually watch it click around and do stuff, which is pretty cool! Short demo here: https://bit.ly/browserOS-demo
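To give a feel for what "runs locally in your browser" means, here's a conceptual observe -> decide -> act loop. The Action schema and function names are invented for illustration (they're not BrowserOS internals), and the planning step is injected as a callback so any local model, like the Ollama sketch above, can drive it:

    // Conceptual sketch of a local, in-browser agent loop. All names here are
    // hypothetical; the point is that observing the page, deciding the next
    // step, and acting on it all happen on the user's machine.
    type Action =
      | { kind: "click"; selector: string }
      | { kind: "type"; selector: string; text: string }
      | { kind: "done"; summary: string };

    type Planner = (goal: string, pageText: string) => Promise<Action>;

    // Observe: a cheap text snapshot of what the user currently sees.
    function snapshotPage(): string {
      return document.body.innerText.slice(0, 4000);
    }

    // Act: plain DOM calls, so every click happens visibly in the user's browser.
    function executeAction(action: Action): void {
      if (action.kind === "done") return;
      const el = document.querySelector<HTMLElement>(action.selector);
      if (!el) return;
      if (action.kind === "click") el.click();
      else if (el instanceof HTMLInputElement) el.value = action.text;
    }

    export async function runAgent(goal: string, plan: Planner): Promise<string> {
      for (let step = 0; step < 20; step++) {            // hard cap so the loop terminates
        const action = await plan(goal, snapshotPage()); // decide, using a local model
        if (action.kind === "done") return action.summary;
        executeAction(action);                           // act
      }
      return "Stopped after 20 steps.";
    }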
--- How we built it? We patch Chromium's C++ source code with our changes, so we get the same security as Google Chrome. We also have an auto-updater for security patches and regular updates.
Working with Chromium's 15M lines of C++ has been another fun adventure that I'm writing a blog post about. Cursor/VSCode break at this scale, so we're back to using grep to find things and make changes. Claude Code works surprisingly well too.
Building the binary takes ~3 hours on our M4 Max MacBook.
--- Next? We're just 2 people with a lot of work ahead (Firefox started with 3 hackers, history rhymes!). But we strongly believe that a privacy-first browser with local LLM support is more important than ever – since agents will have access to so much sensitive data.
Looking forward to any and all comments!
Comments URL: https://news.ycombinator.com/item?id=44523409
Points: 51
# Comments: 14