We built any-llm because we needed a lightweight router for LLM providers with minimal overhead. Switching between models is just a string change: update "openai/gpt-4" to "anthropic/claude-3" and you're done.
It uses official provider SDKs when available, which helps because each provider maintains its own compatibility updates. There's no proxy or gateway service to run either, so getting started is straightforward: just pip install and import.
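Here's a rough sketch of what that string-change switch looks like in practice. It assumes an OpenAI-style completion() entry point and response object, as the post implies; the exact import and field names may differ from any-llm's actual API.

```python
# Minimal sketch, assuming any-llm exposes an OpenAI-style completion()
# helper that routes on a "provider/model" string (not verified against
# the library's real signatures).
from any_llm import completion  # assumed entry point

messages = [{"role": "user", "content": "Say hello in one sentence."}]

# Route the request to OpenAI...
response = completion(model="openai/gpt-4", messages=messages)

# ...or switch providers by changing only the model string.
response = completion(model="anthropic/claude-3", messages=messages)

# Assuming an OpenAI-compatible response shape.
print(response.choices[0].message.content)
```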
Currently supports 20+ providers including OpenAI, Anthropic, Google, Mistral, and AWS Bedrock. Would love to hear what you think!
Comments URL: https://news.ycombinator.com/item?id=44650567
Points: 56
# Comments: 43