What is currently the best LLM for consumer-grade hardware? Is it Phi-4?

I have a 5060 Ti with 16 GB of VRAM. I'm looking for a model that can hold basic conversations; no physics or advanced math required. Ideally something that runs reasonably fast, near real time. (A minimal local-inference sketch for this kind of setup is below.)
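As a rough illustration of the setup being asked about, here is a minimal sketch of running a quantized chat model on a single 16 GB GPU with llama-cpp-python. The model file name and path are hypothetical; any GGUF quant of a mid-size model (e.g. a 4-bit Phi-4 quant, which fits comfortably in 16 GB) could be substituted.

```python
# Minimal local chat sketch, assuming llama-cpp-python built with CUDA support
# and a 4-bit GGUF quantization downloaded locally (path below is hypothetical).
from llama_cpp import Llama

llm = Llama(
    model_path="models/phi-4-Q4_K_M.gguf",  # hypothetical path to a 4-bit quant
    n_gpu_layers=-1,                         # offload all layers to the GPU
    n_ctx=4096,                              # context window for conversation
)

messages = [
    {"role": "system", "content": "You are a friendly conversational assistant."},
    {"role": "user", "content": "Hi! Recommend a book for a rainy weekend."},
]

out = llm.create_chat_completion(messages=messages, max_tokens=256)
print(out["choices"][0]["message"]["content"])
```

With all layers offloaded to a 16 GB card, a 4-bit quant of a 14B-class model like Phi-4 should generate at conversational speed; exact tokens/second depends on the quantization and context length.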


Comments URL: https://news.ycombinator.com/item?id=44134896

Points: 23

# Comments: 10


Created 22d | 30.05.2025, 12:50:06
