What is currently the best LLM for consumer-grade hardware? Is it Phi-4?

I have a 5060 Ti with 16 GB VRAM. I'm looking for a model that can hold basic conversations; no physics or advanced math required. Ideally something that runs reasonably fast, near real-time.
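For context on the sizing behind the question, here is a rough back-of-the-envelope VRAM estimate (not from the post itself): a 14B-parameter model such as Phi-4 at roughly 4-bit quantization should fit a 16 GB card with room for context. The bit-width and KV-cache allowance below are illustrative assumptions.

```python
# Back-of-the-envelope VRAM estimate for a quantized LLM.
# Numbers are illustrative assumptions, not measured values.

def estimate_vram_gb(params_billion: float,
                     bits_per_weight: float,
                     kv_cache_gb: float = 2.0) -> float:
    """Weight memory plus a flat allowance for KV cache and runtime overhead."""
    weights_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ~= 1 GB
    return weights_gb + kv_cache_gb

# Example: a 14B-parameter model (Phi-4 class) at ~4.5 bits/weight (Q4_K_M-style quant).
print(f"{estimate_vram_gb(14, 4.5):.1f} GB")  # ~9.9 GB -> fits in 16 GB with headroom
```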


Comments URL: https://news.ycombinator.com/item?id=44134896

Points: 23

# Comments: 10


Created 1d ago | 30.05.2025, 12:50:06


