Show HN: Can I run this LLM? (locally)

One of the most frequent questions when running LLMs locally is: "I have xx RAM and yy GPU; can I run the zz model?" I vibe-coded a simple application to help you answer just that.
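A minimal sketch of the kind of check such a tool performs (this is an assumption about the general approach, not the linked app's actual logic): estimate the memory a model needs as parameter count times bytes per weight at a given quantization, plus some overhead for the KV cache and activations, then compare that against available VRAM and RAM. The 20% overhead factor and the can_run helper below are illustrative choices, not measured values.

    # Rough feasibility check: required_gb ~= params * bytes_per_weight * overhead
    def can_run(params_billions: float, quant_bits: int, vram_gb: float, ram_gb: float) -> bool:
        bytes_per_weight = quant_bits / 8                 # e.g. 4-bit quantization -> 0.5 bytes per weight
        weights_gb = params_billions * bytes_per_weight   # billions of params * bytes ~= GB of weights
        required_gb = weights_gb * 1.2                    # assumed ~20% overhead for KV cache / activations
        return required_gb <= vram_gb + ram_gb            # allow offloading layers that do not fit in VRAM

    # Example: a 7B model at 4-bit quantization on a machine with 8 GB VRAM and 16 GB RAM
    print(can_run(params_billions=7, quant_bits=4, vram_gb=8, ram_gb=16))  # True (~4.2 GB needed)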


Comments URL: https://news.ycombinator.com/item?id=43304436

Points: 21

# Comments: 26

https://can-i-run-this-llm-blue.vercel.app/

6mo | Hacker News

Show HN: I built an app to get daily wisdom from Mr. Worldwide

Pitbull is coming to Stockholm. As part of the prep, I built a glassmorphism-style app counting down to the big day.


Comments URL: https://news.ycombinator.com/item?id=43304785

Points: 15

# Comments: 2

https://daale.club/

6mo | Hacker News
