Anthropic announced on Monday a new family of AI models, collectively called the Claude 3 model family. As is commonly done, the company released three different sizes of models, each with a varying balance of intelligence, speed, and cost.
The largest of the new models, called “Opus,” outperforms both OpenAI’s and Google’s most advanced models, GPT-4 and Gemini Ultra, respectively, on benchmarks measuring undergraduate-level expert knowledge (MMLU), graduate-level expert reasoning (GPQA), and basic mathematics (GSM8K), Anthropic says.
The middle child of the family, Claude 3 “Sonnet,” is twice as fast as Anthropic’s previous best model, Claude 2.1, while also being more intelligent. Anthropic says Sonnet excels at tasks demanding rapid responses, such as knowledge retrieval and sales automation.
The smallest model, called “Haiku,” beats other comparably sized models in performance, speed and cost, the company says. It can read a dense research paper of roughly 7,500 words with charts and graphs in less than three seconds.
All three models can process visual imagery, which enables them to understand uploaded documents, analyze web interfaces, and generate image catalog metadata. Anthropic says that for many of its enterprise customers, up to half of their knowledge bases consist of documents in image formats such as PDFs, flowcharts, or slides.
The Opus and Sonnet models are available today, while the Haiku model will be available soon.