News
Cohere Releases Its Latest Enterprise-Grade LLM, Command R+
There's a new model in Cohere's family of enterprise-grade Command R large language models: Command R+, described by the company as its "most powerful" scalable LLM yet.
Command R+ is now available via Microsoft's Azure AI platform, with availability on other cloud platforms, including Oracle Cloud Infrastructure, planned for the coming weeks.
"Command R+ joins our R-series of LLMs focused on balancing high efficiency with strong accuracy, enabling businesses to move beyond proof-of-concept, and into production with AI," Cohere said in a blog post last week announcing the launch.
The "R" lineup of Cohere LLMs comprises models that are especially tuned for modern businesses. For example, they have multilanguage capabilities; extra protections against hallucinations; the ability to automate business workflows; and heightened RAG (retrieval augmented generation) capabilities, allowing them to respond to queries using context from business data.
In addition, they all have 128,000-token context windows, and their tokenizer compresses non-English text, helping to minimize the cost of use.
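As a rough illustration of how that tokenizer efficiency could be checked, here is a minimal sketch using Cohere's Python SDK tokenize endpoint; the API key placeholder, the sample sentences, and the "command-r-plus" model identifier are assumptions, and exact call signatures may differ by SDK version.

```python
import cohere

# Assumes the Cohere Python SDK and a valid API key.
co = cohere.Client(api_key="YOUR_API_KEY")

# Compare token counts for an English prompt and a non-English equivalent.
english = "Summarize the quarterly sales report for the leadership team."
japanese = "リーダーシップチーム向けに四半期売上報告書を要約してください。"

for text in (english, japanese):
    result = co.tokenize(text=text, model="command-r-plus")
    print(len(result.tokens), text)  # fewer tokens generally means lower cost
```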
The new R+ model features several enhancements over existing models. Per Cohere, R+ "improves response accuracy and provides in-line citations that mitigate hallucinations."
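To make the citation behavior concrete, here is a minimal sketch of a grounded (RAG-style) request through Cohere's Python SDK chat endpoint; the document snippets and API key placeholder are hypothetical, and response field names may vary by SDK version.

```python
import cohere

co = cohere.Client(api_key="YOUR_API_KEY")

# Hypothetical business documents supplied as grounding context.
docs = [
    {"title": "Refund policy",
     "snippet": "Items damaged in transit may be returned within 30 days for a full refund."},
    {"title": "Shipping FAQ",
     "snippet": "Standard shipping takes 3-5 business days within the continental US."},
]

response = co.chat(
    model="command-r-plus",
    message="What is our policy for items damaged in transit?",
    documents=docs,
)

print(response.text)       # answer grounded in the supplied documents
print(response.citations)  # spans of the answer linked back to those documents
```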
It also supports "Multi-Step Tool Use which allows the model to combine multiple tools over multiple steps to accomplish difficult tasks."
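Below is a rough sketch of what a single turn of tool use looks like through the Cohere Python SDK; the tool definition (query_sales_db) is hypothetical, and a full multi-step flow would loop, feeding each tool's results back to the model until it produces a final answer.

```python
import cohere

co = cohere.Client(api_key="YOUR_API_KEY")

# Hypothetical tool the model can choose to call.
tools = [
    {
        "name": "query_sales_db",
        "description": "Return total sales for a given date.",
        "parameter_definitions": {
            "date": {"description": "Date in YYYY-MM-DD format",
                     "type": "str",
                     "required": True}
        },
    }
]

response = co.chat(
    model="command-r-plus",
    message="How did sales on 2024-04-01 compare to the day before?",
    tools=tools,
)

# The model plans which tools to call and with what arguments; in a multi-step
# flow, each call's results are passed back to the model, which decides whether
# further steps are needed before giving a final answer.
for call in response.tool_calls or []:
    print(call.name, call.parameters)
```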
Besides English, R+ "excels" at the following languages:
- French
- Spanish
- Italian
- German
- Portuguese
- Japanese
- Korean
- Arabic
- Chinese
"Command R+ outperforms similar models in the scalable market category," said Cohere, "and is competitive with significantly more expensive models on key business-critical capabilities." By "similar models," Cohere specifically means GPT-4 Turbo and Mistral Large, both of which are available on Azure AI.
Like the other R series models, R+ aims for enterprise-grade privacy and security. "We don't access customers' data unless they want us to," said Cohere. "We offer private LLM deployments and the option to opt out of data sharing."