News

The Week in AI: Cloudera AI Assistants, ARC Reactor and More

This edition of our weekly roundup of AI products and services includes Cloudera's three new AI Assistants, Covideo's new AI-assisted tool for auto dealers, ARC Solutions' ARC Reactor GenAI release, Murf AI's launch of Murf Speech Gen 2, and more!

OpenAI announced the launch of GPT-4o mini, a slimmed-down, more affordable version of its flagship multimodal GPT-4o model. The "mini" version, which replaces the GPT-3.5 model, is aimed at developers and intended to "significantly expand the range of applications built with AI by making intelligence much more affordable," the company said in a statement. Free and paid users of ChatGPT, including those on the Teams plan, have access to GPT-4o mini starting today, and the company plans to roll it out to enterprise customers next week. Our reporting on this release can be found here.

Cloudera, a leading provider of data management and analytics solutions, announced the launch of three new AI-powered assistants designed to accelerate the creation of data, analytics, and AI business applications. These assistants were developed to broaden employee access to advanced tools and streamline complex processes, the company said.

  • The new SQL AI Assistant addresses common challenges associated with writing SQL queries, including data discovery and query optimization. Users describe their needs in plain language, and the assistant employs techniques such as retrieval-augmented generation (RAG) and prompt engineering to find relevant data, optimize queries, and explain them in an easy-to-understand manner.
  • The AI Chatbot in Cloudera Data Visualization lets users engage directly with enterprise data to surface contextualized business insights. Unlike traditional BI dashboards that offer limited interactivity, this chatbot was designed to deliver deeper, more actionable insights by allowing users to ask questions in plain language. The AI tool then accesses and analyzes the underlying data to provide the most relevant and accurate output.
  • Cloudera’s third new AI assistant, the Copilot for Cloudera Machine Learning, was designed to overcome common challenges in deploying AI and ML models. Powered by pre-trained large language models (LLMs), the Copilot is seamlessly integrated with more than 130 Hugging Face models and datasets, supporting the entire lifecycle of AI application development.

Text-to-speech platform provider Murf AI launched Murf Speech Gen 2, a speech model that combines human-like realism with advanced customization capabilities, the company said. The new model was designed to allow users to transform ideas into reality, solidifying Murf AI's position as a tech powerhouse committed to ethical AI voiceover technology. Murf Speech Gen 2 enhances Murf AI’s portfolio of more than 120 voices capable of creating studio-quality voiceovers in more than 20 languages. Features include the "Say it My Way" option, which lets users record or upload their own rendition of a text that the AI then replicates using the chosen AI avatar. Advanced word-level emphasis and variability features provide fine-grained control and multiple voiceover versions for precise output. The model also offers upgraded broadcast-quality audio and enhanced pronunciation accuracy, benchmarked at more than 99% for general American English.

Indianapolis-based software company Covideo launched "Covideo AI Assist," a new AI tool designed to enable auto dealerships to generate vehicle-specific video scripts, emails, and text messages efficiently. The solution uses ChatGPT technology to integrate with a dealer's inventory, creating precise, personalized scripts for any vehicle, streamlining the process and enhancing sales communications. Covideo pioneered personalized video email and is now a leading provider of video messaging tools for dealership sales and service departments, enabling the recording, sending, and tracking of personalized videos across multiple channels.

ARC Solutions, which specializes in efficient AI and secure Web3 products, announced the public availability of ARC Reactor GenAI ("Reactor"). Launched in early June, this AI platform was engineered to deliver faster and more accurate results while significantly reducing energy consumption, the company said. The public release includes enhancements to voice interaction, image processing capabilities, and overall user experience. Ahead of its public launch, more than 10,000 early access users experienced Reactor's average response times of under six seconds, the company said, with each response consuming approximately half a watt. In three industry benchmarks, ARC Reactor outperformed leading AI models in raw scores while achieving unprecedented training energy efficiency. The rapid ontological classification provided by Reactor enables powerful applications such as custom search engines, content recommendation systems, and personalized AI assistants without the extensive computing and energy demands or data privacy risks of other AI models, the company said.

Liminal Data launched Liminal Omni-1, an industry-agnostic, AI-powered platform designed to simplify the complex data science behind impact insights discovery. Developed for the more than 90% of mid-market enterprises that lack in-house data science expertise, the platform replaces error-prone spreadsheets, significantly reduces or eliminates consulting time, and cuts analysis time from days to hours, making high-value insights easier and more cost-effective to access, the company said. The platform can aggregate diverse data sources, such as physical assets and portfolio companies, into a unified portfolio view using a manager-reporter model, providing a comprehensive insights landscape.

SwarmOne, the world’s first "swarm computing" company, launched a software-as-a-service training platform designed to let data centers enter the AI training market without significant capital investment. So-called swarm computing runs on new instance-less software stacks that spread AI training tasks seamlessly across a swarm of GPUs, increasing the total supply of AI compute power without huge capital investments. Data centers can run the new training platform on existing infrastructure with little or no adaptation, the company said, and it is easy to install and manage. The innovation opens access to a market expected to grow from USD 2.39 billion in 2023 to USD 17.04 billion by 2032, the company said.
