Show HN: gpudeploy.com – "Airbnb" for GPUs

Hi HN,

YC W24 company here. We just pivoted from drone delivery to build gpudeploy.com, a website that routes on-demand traffic for GPU instances to idle compute resources.

The experience is similar to Lambda Labs, which we’ve really enjoyed for training our robotics models, but their GPUs are never available on-demand. We’re also trying to be more no-nonsense (no hidden fees, no H100s behind “contact sales”, etc.).

The tech to make this work is actually kind of nifty; we may do an in-depth HN post on it soon.

Right now, we have H100s, a few RTX 4090s, and a GTX 1080 Ti online. Feel free to try it out!

Also, if you’ve got compute sitting around (a GPU cluster, a crypto mining operation, or just a single GPU), or if you’re an AI company with idle compute (hopefully not in a Stability AI way) and want to see some ROI, it’s simple and flexible to hook it up to our site, and you might get a few researchers using your compute.

Nice rest of the week!


Comments URL: https://news.ycombinator.com/item?id=40260259

Points: 58

# Comments: 22

https://www.gpudeploy.com

Created 1y | May 4, 2024, 23:40:08
