Mistral Joins Amazon Bedrock's Roster of AI Foundation Models

UPDATE, 4/2: The Mistral Large model is also now available on Bedrock.

Startup Mistral is extending access to its two open source large language models to developers using Amazon's prodigious cloud platform.

The models, Mistral 7B and Mixtral 8X7B, will be available on the Amazon Bedrock developer platform "soon," according to an AWS blog post last Friday, though no specific release date was given.

Both models are available under the Apache 2.0 license, and are designed to be lightweight, fast and customizable.

Mistral 7B is a "dense Transformer, fast-deployed and easily customisable," according to the company's site. The smaller of the two, Mistral 7B supports English only (plus code generation) and has an 8,000-token context window.

Mixtral 8X7B supports English, French, German, Spanish and Italian and has a 32,000-token context window. It's a "sparse Mixture-of-Experts model with stronger capabilities than Mistral 7B" and "uses 12B active parameters out of 45B total."

"With these two Mistral AI models," said AWS, "you will have the flexibility to choose the optimal, high-performing LLM for your use case to build and scale generative AI applications using Amazon Bedrock."

Amazon Bedrock is a managed, serverless cloud service that gives developers building generative AI apps access to foundation models from major AI players, including Anthropic, Meta, Cohere, Stability AI, AI21 Labs and Amazon itself.
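In practice, developers call Bedrock-hosted models through the AWS SDK's Bedrock runtime client. The sketch below shows what invoking a Mistral model might look like with Python's boto3; the model ID, request fields and response shape are assumptions based on Mistral's instruction format and should be checked against the Bedrock model catalog for your region.

```python
import json

# Assumed Bedrock model ID for Mistral 7B Instruct; verify the exact ID
# and version in the Bedrock console for your region.
MODEL_ID = "mistral.mistral-7b-instruct-v0:2"

def build_request(prompt: str, max_tokens: int = 256) -> str:
    """Wrap a prompt in Mistral's [INST] instruction format and
    serialize the request body as JSON (assumed schema)."""
    return json.dumps({
        "prompt": f"<s>[INST] {prompt} [/INST]",
        "max_tokens": max_tokens,
        "temperature": 0.5,
    })

def invoke(prompt: str) -> str:
    """Send the request through the Bedrock runtime.
    Requires AWS credentials and model access enabled in Bedrock."""
    import boto3  # deferred import so build_request works without the SDK
    client = boto3.client("bedrock-runtime", region_name="us-west-2")
    response = client.invoke_model(modelId=MODEL_ID,
                                   body=build_request(prompt))
    # Assumed response shape for Mistral text completions on Bedrock.
    payload = json.loads(response["body"].read())
    return payload["outputs"][0]["text"]
```

Because Bedrock is serverless, there is no infrastructure to provision: the same `invoke_model` call pattern works across providers, with only the model ID and the provider-specific request body changing.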

Based in France, Mistral is valued at an estimated $2 billion, thanks in part to a recent cash infusion from AI chip giant Nvidia.

About the Author

Gladys Rama (@GladysRama3) is the editorial director of Converge360.