News
IBM watsonx Expands with Addition of Mistral LLM
Big Blue's developer library of enterprise-grade LLMs has two new open source additions.
The company on Thursday announced that models from France-based Mistral and Japan-based Elyza are now available on its watsonx AI platform. Their additions are indicative of IBM's goal "to expand capabilities to help clients innovate with IBM's own foundation models and those from a range of open-source providers," according to the announcement.
The Elyza model in question is ELYZA-japanese-Llama-2-7b, based on Meta's Llama 2.
Meanwhile, Mistral, a startup with significant backing from Nvidia and Microsoft, is bringing its Mixtral-8x7B model to watsonx. This version, according to IBM, is "optimized," supporting as much as 50 percent higher throughput than the un-optimized model and reducing latency by as much as 75 percent.
"This is achieved through a process called quantization," explained IBM, "which reduces model size and memory requirements for LLMs and, in turn, can speed up processing to help lower costs and energy consumption."
Mixtral-8x7B is particularly tuned for organizations looking for an efficient and lightweight model. Per IBM:
Mixtral-8x7B was built using a combination of sparse modeling -- an innovative technique that finds and uses only the most essential parts of data to create more efficient models -- and the Mixture-of-Experts technique, which combines different models ("experts") that specialize in and solve different parts of a problem. The Mixtral-8x7B model is widely known for its ability to rapidly process and analyze vast amounts of data to provide context-relevant insights.
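The routing idea behind Mixture-of-Experts can be sketched in a few lines: a gating function scores every expert for a given input, but only the top-scoring few actually run, so most of the model's parameters sit idle on any one token. (A minimal illustration with made-up linear "experts," not Mixtral's implementation.)

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(token, experts, gate_weights, k=2):
    # Score all experts, but evaluate only the top-k (the "sparse" part).
    scores = softmax(gate_weights @ token)
    top = np.argsort(scores)[-k:]
    mix = scores[top] / scores[top].sum()   # renormalize the chosen gates
    return sum(w * experts[i](token) for i, w in zip(top, mix))

dim, n_experts = 4, 8
# Hypothetical experts: simple linear maps standing in for feed-forward blocks.
experts = [(lambda W: (lambda x: W @ x))(rng.standard_normal((dim, dim)))
           for _ in range(n_experts)]
gate_weights = rng.standard_normal((n_experts, dim))
out = moe_forward(rng.standard_normal(dim), experts, gate_weights, k=2)
```

In Mixtral's case the pattern is 8 experts with 2 active per token, which is why a model with roughly 47B total parameters has the per-token compute cost of a much smaller one.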
Its flexibility gives users more options in terms of use cases, said IBM senior vice president Kareem Yusuf. "By offering Mixtral-8x7B and other models on watsonx, we're not only giving them optionality in how they deploy AI -- we're empowering a robust ecosystem of AI builders and business leaders with tools and technologies to drive innovation across diverse industries and domains."
IBM intends to add more models to watsonx in the coming months.