There are many options and opinions out there: what is currently the recommended approach for running an LLM locally (e.g., on my 3090 with 24 GB of VRAM)? Are the options ‘idiot-proof’ yet?
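A practical first question behind this is which models fit on a 24 GB card at all. A rough back-of-envelope (my own sketch, not from the thread; `weight_vram_gb` and the model sizes are illustrative) for weight memory at different quantization levels:

```python
# Back-of-envelope VRAM estimate for model weights alone.
# Assumption (not from the post): weights dominate memory use;
# real usage also needs room for KV cache, activations, and runtime overhead.

def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate decimal GB needed to hold the weights at a given quantization."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

VRAM_GB = 24  # e.g., an RTX 3090

for params in (7, 13, 34, 70):
    for bits in (16, 8, 4):
        need = weight_vram_gb(params, bits)
        verdict = "fits" if need <= VRAM_GB else "too big"
        print(f"{params}B @ {bits}-bit: ~{need:.1f} GB -> {verdict}")
```

By this estimate, a 4-bit 70B model (~35 GB of weights) overflows 24 GB, while 7B-34B models at 4-bit leave headroom for the KV cache; that is why quantized mid-size models are the usual fit for this class of GPU.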
Comments URL: https://news.ycombinator.com/item?id=39893142
Points: 49
# Comments: 18
Created: Apr 1, 2024, 1:30:09 PM