Amazon Investing Millions to Train 'Olympus' LLM
Amazon is reportedly making a significant investment in training a large language model (LLM) it hopes could rival top models from OpenAI and Alphabet. Media reports last week indicated that Amazon is spending millions to train a high-powered LLM codenamed "Olympus."
Amazon has its own family of generative AI models under the "Titan" brand, but those models have yet to generate the buzz, or achieve the adoption, of OpenAI's various GPT versions.
Amazon cloud rival Microsoft, which holds a minority stake in OpenAI, has benefited greatly from the latter's success. If Amazon's reported Olympus effort comes to fruition, it could disrupt Microsoft's dominance of the generative AI space.
In an article Monday, Reuters, citing two anonymous sources familiar with Amazon's plans, said Olympus has 2 trillion parameters, nearly double the estimated count for GPT-4. While a higher parameter count doesn't necessarily make an LLM more accurate, it can contribute to the nuance, complexity, and relevance of its output.
A separate report by The Information, which revealed the "Olympus" moniker, says Amazon plans to sell the resulting technology to corporate enterprises, as well as to use it in its online retail business, its Amazon Web Services cloud, and its Alexa natural-language assistant.
So far, Amazon has not confirmed these reports, though it could announce the project publicly in December, per The Information's source. That timing would coincide with Amazon's annual cloud conference, AWS re:Invent, which this year runs Nov. 27 through Dec. 1.