The Week in AI: IBM Granite LLMs Integrated into Salesforce Einstein, Mistral Codestral Launches, More

This edition of our weekly roundup of AI products and services includes a new chip design from Arm, the integration of IBM's Granite LLMs into Salesforce's Einstein 1, an AI upgrade of the Opera browser, Mistral's new Codestral release, OpenAI for Nonprofits, and more!

British chip designer Arm Holdings launched a new high-performance, low-power core design last week. The new Cortex-X925 can be used to build CPUs for handsets and personal computers, the company said. The core is implemented inside a DynamIQ cluster and connected to the DynamIQ Shared Unit-120 (DSU-120), which acts as a full interconnect with L3 cache and snoop control. The same configuration can also be used in systems that mix core types, with the Cortex-X925 serving as the high-performance core, the company said. The design is rolling out alongside a GPU blueprint called the Immortalis-G925, which, according to Arm, speeds up rendering by reducing the need to perform a type of computing operation known as sorting. Both the Cortex-X925 and the Immortalis-G925 can be manufactured using a three-nanometer process.

Opera announced plans to enhance its flagship browser, Opera One, and its gaming browser, Opera GX, with new on-device AI capabilities. The move builds on the company's step in April 2024, when Opera became the first browser to support on-device AI by integrating access to local large language models (LLMs) directly within the application. That on-device AI support is set to transition from the experimental early-access phase to a fully integrated feature, the company said. The enhancement will be available across all major operating systems, including Windows, macOS, and Linux, allowing both existing and new hardware to benefit from the technology. Since April, Opera has expanded its support to include more than 2,000 local LLM variants from more than 60 model families.

Artificial intelligence startup Mistral AI launched Codestral, a large language model (LLM) designed specifically for software development. Proficient in more than 80 programming languages, Codestral aids developers by automating coding tasks in high-level languages such as Python and facilitating low-level syntax programming for hardware interaction, the company said. The model supports a range of coding activities, from explaining code snippets to generating new code from natural language instructions. It has 22 billion parameters, allowing it to excel at tasks such as autocomplete and code modification, and it enhances bug testing by automatically scanning code for flaws. In internal tests, Codestral outperformed other open-source LLMs, particularly in Python programming and SQL tasks, thanks to a large context window that can process up to 32,000 tokens, the company said. Available under an open-source license for research and testing, Codestral can also be accessed via Mistral's cloud-based API for commercial use, facilitating the development of programming-automation plugins for code-editing applications.
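As a rough illustration of the API access described above, the sketch below builds and sends a "explain this code" request to a Mistral chat-completions-style endpoint. The endpoint URL, model name (`codestral-latest`), and response shape are assumptions based on Mistral's public API conventions, not details from this announcement; consult the current API documentation before relying on them.

```python
import json
import urllib.request

# Assumed endpoint and model name; verify against Mistral's API docs.
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_codestral_request(snippet: str, model: str = "codestral-latest") -> dict:
    """Assemble the JSON payload for an 'explain this code' prompt."""
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": f"Explain what this code does:\n\n{snippet}",
            }
        ],
    }

def explain_code(snippet: str, api_key: str) -> str:
    """Send the request and return the model's reply (requires a valid key)."""
    payload = build_codestral_request(snippet)
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Assumed OpenAI-style response layout.
    return body["choices"][0]["message"]["content"]
```

The same payload structure would serve autocomplete or code-generation prompts; only the message content changes.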

IBM's Granite series of large language models (LLMs) will soon be integrated into Salesforce's Einstein 1 Platform, the company's advanced AI platform, the two companies announced last week. Einstein 1 is designed to integrate Customer Relationship Management (CRM) applications, external system data via the data cloud, and AI models through a unified metadata framework. The integration aims to elevate the capabilities of AI solutions, including Einstein Copilot, and enable customers to leverage Salesforce's LLMs in conjunction with IBM's platform and the Einstein 1 Studio, which offers a suite of low-code AI development tools. Key aspects of the partnership include bidirectional data integration, enhanced flexibility in deploying various LLMs, and prebuilt actions and prompts tailored for CRM solutions.

Mentimeter, maker of a leading online presentation-building platform for audience engagement, unveiled AI Menti Builder, a new tool that uses generative AI to accelerate the creation of draft interactive presentations, which the company calls "Mentis," from simple prompts. The AI analyzes the user's intent and crafts a purpose-built presentation following Mentimeter's knowledge base of best practices for facilitating meetings and classes. Within seconds, the company said, users are equipped with a ready-to-edit Menti for whatever type of session they intend to host, whether a workshop, quiz, seminar, poll, or retrospective, along with the corresponding topic.

OpenAI announced OpenAI for Nonprofits, a new initiative aimed at enhancing the accessibility of the company's generative AI tools for nonprofit organizations. The company seeks to help nonprofits overcome operational challenges, limited funding, staffing shortages, and other impediments to productivity. The initiative will offer discounted rates for ChatGPT Team and Enterprise, the company said. With OpenAI for Nonprofits, nonprofit organizations will have access to ChatGPT Team at the discounted rate of $20 per month per user. Larger nonprofits ready for large-scale deployment can contact the company's sales team to access a 50% discount on ChatGPT Enterprise. These offerings provide access to the company's latest tools and services, including GPT-4o, advanced tools and custom GPTs, a dedicated collaborative workspace, admin tools for team management, and robust privacy and security standards, the company said.

Incorta, maker of a leading operational lakehouse, announced the launch of Operational GenAI, an enterprise GenAI solution. Operational GenAI is private and secure, built with leading, state-of-the-art technologies, including Retrieval Augmented Generation (RAG) from Vectara and model serving and fine-tuning from aiXplain, the company said. The platform offers a fully automated, out-of-the-box solution for GenAI applications within a secure and customizable environment that maintains strict data-privacy and corporate-governance standards. By integrating top-tier, enterprise-grade, customized open-source models fine-tuned by Incorta, the solution helps customers navigate issues related to accuracy, safety, and governance with confidence, enabling enterprises to leverage cutting-edge AI while ensuring data integrity and security, the company said.
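For readers unfamiliar with the RAG pattern Incorta mentions, the sketch below shows its core idea: retrieve the documents most relevant to a query, then prepend them to the prompt sent to an LLM so answers are grounded in private data. The bag-of-words retriever here is a toy stand-in; a production system such as Vectara's uses dense vector search over an indexed corpus.

```python
from collections import Counter

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the query.

    A deliberately simple scorer for illustration only: counts overlapping
    lowercase tokens between query and document.
    """
    q = Counter(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: sum((q & Counter(d.lower().split())).values()),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str], k: int = 2) -> str:
    """Compose a grounded prompt: retrieved context first, then the question."""
    context = "\n".join(retrieve(query, docs, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The assembled prompt would then be passed to whatever model the platform serves; because the context travels inside the prompt, the model never needs direct access to the underlying data store.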