

Amazon arrives fashionably late to the generative AI arms race, but with a smart solution


By Mark Sullivan

Four of the five Big Tech companies have now planted their generative AI flags in the ground.

Amazon quietly announced Thursday that it would offer its AWS cloud customers the chance to leverage some of the most popular new generative AI models, including those developed by Anthropic (LLM for conversations and questions), AI21 Labs (LLM for translation), and Stability AI (for image generation).

Amazon also adds access to some of its own models, collectively called “Titan,” which are designed to translate and summarize text inputs.

The suite of foundational models is branded as “Amazon Bedrock,” and AWS gives customers access to the models via an application programming interface (API).
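For developers, the pitch comes down to swapping models by changing an identifier in an API call. Below is a minimal sketch of what invoking a Bedrock-hosted model could look like using the boto3 bedrock-runtime client that AWS later shipped; the model ID, request body, and response fields shown here are assumptions based on the Titan text format and were not detailed in Amazon’s announcement.

```python
import json

import boto3  # AWS SDK for Python

# Bedrock runtime client; the region is an assumption for illustration.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body in the Titan text-generation format (assumed shape).
body = json.dumps({
    "inputText": "Summarize this support ticket in one sentence: ...",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.2},
})

# Invoke the model by ID; swapping vendors means changing modelId,
# e.g. to an Anthropic or AI21 Labs identifier.
response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # hypothetical Titan text model ID
    contentType="application/json",
    accept="application/json",
    body=body,
)

# The response body is a streaming blob containing JSON.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```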

But, as the branding suggests, the foundational models are just the base layer. AWS will help its corporate customers build additional layers onto these models so that the models can draw on aspects of the customer’s corporate braintrust. And Amazon adds that it will provide a secure means for customers to share their corporate data with AI models in the cloud without fear of data leaks.

“Amazon is essentially giving options to developers, including their own LLM (large language model, Titan), and providing the API hooks and tools to make it easy and seamless,” says Creative Strategies CEO and principal analyst Ben Bajarin. “The tools Amazon is providing also recognize the need for companies (AWS customers) to also bring their own data, or smaller LLMs, to the equation.”

Salesforce took a similar approach when it announced in early March that it would let developers on its platform choose from large language models developed by third parties, including OpenAI, Cohere, and Anthropic. Salesforce, too, says it will help customers make their proprietary data available to the LLM to ground the model and make it more useful.

Amazon’s approach to generative AI is different from those of its cloud rivals. Rather than offering a completely homegrown solution (Google), or leveraging an investment in a talented partner (Microsoft, with OpenAI), Amazon is offering its AWS customers a menu of generative AI models developed by smaller LLM developers, as well as a homegrown LLM.

AWS CTO Werner Vogels gives some insights into how Amazon thinks about generative AI models in this clip published on Twitter Thursday.

Amazon’s announcement came well after major AI announcements from Microsoft and Google. But its open approach to model selection and focus on proprietary data may have been worth the wait, especially for its legacy customers.



ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.

