What is currently the best LLM for consumer-grade hardware? Is it phi-4?

I have a 5060 Ti with 16 GB of VRAM. I'm looking for a model that can hold basic conversations; no physics or advanced math required. Ideally something that runs reasonably fast, near real time.
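As a rough sanity check on what fits in 16 GB, weight memory scales with parameter count times bits per weight; the sketch below uses an illustrative 20% overhead margin for KV cache and activations (both the margin and the example parameter counts are assumptions, not measurements):

```python
# Back-of-envelope VRAM estimate for a quantized LLM.
# The 20% overhead for KV cache/activations is an assumed margin.
def vram_gb(params_billion, bits_per_weight, overhead=0.20):
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8-bit ~ 1 GB
    return weight_gb * (1 + overhead)

for name, params in [("phi-4 (14B)", 14), ("8B model", 8)]:
    for bits in (4, 8, 16):
        fits = "fits" if vram_gb(params, bits) <= 16 else "too big"
        print(f"{name} @ {bits}-bit: ~{vram_gb(params, bits):.1f} GB ({fits})")
```

By this estimate, a 14B model like phi-4 fits comfortably at 4-bit quantization (~8.4 GB) but not at 16-bit (~33.6 GB), which is why quantized builds are the usual choice on a 16 GB card.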

Comments URL: https://news.ycombinator.com/item?id=44134896

Points: 23

# Comments: 10

Created 8d ago | May 30, 2025, 12:50:06

