Amazon is the latest player in the generative AI game, but with a twist

AWS re:Invent 2022 logo sign
(Image credit: Future / Mike Moore)

With OpenAI having thrown down the gauntlet to big tech with ChatGPT, Amazon has announced its own generative AI project to meet the challenge, pointing to its two-decade history of machine learning and artificial intelligence development.

The company has reminded us several times during the new AI era that its own e-commerce model, warehouse robots, and logistical operations are based on ML, but it seems its cloud division is finally ready to commit to generative AI.

With the launch of Bedrock, Amazon will support third-party companies and startups in developing their own generative AI apps using pre-trained models.

Amazon Bedrock generative AI

Democratizing is a word that gets thrown around a lot in 2023, and that’s what Amazon plans to do for machine learning. But what exactly is the tech giant planning to do with Bedrock?

According to the firm, the announcement addresses three customer needs: access to high-performing foundation models (FMs), seamless and affordable integration into applications, and customizability.

Bedrock makes FMs available via an API from companies including AI21 Labs, Anthropic, Stability AI, and Amazon itself, spanning models that support multiple languages, text-to-image generation, and conversation.
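Amazon hasn't detailed the preview API publicly, but based on the boto3 Bedrock runtime client that later became available, a call might look roughly like this sketch (the model ID, payload schema, and the `build_claude_request` helper are assumptions for illustration, not confirmed preview details):

```python
import json

def build_claude_request(prompt, max_tokens=256):
    # Anthropic-style completion payload; the exact schema comes from each
    # model provider's documentation and may differ per FM.
    return {
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    }

payload = json.dumps(build_claude_request("Summarise AWS re:Invent in one sentence."))

# The actual invocation needs AWS credentials plus Bedrock access
# (limited preview at the time of writing):
# import boto3
# runtime = boto3.client("bedrock-runtime")
# response = runtime.invoke_model(modelId="anthropic.claude-v1", body=payload)
```

The appeal of this design is that swapping providers is mostly a matter of changing the model ID and request body, rather than standing up new infrastructure.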

Bedrock is currently in limited preview with hand-picked companies, and AWS plans to make it available to more customers this year.

It has also shared a use case for FMs in its CodeWhisperer tool, which is now generally available and free for individual developers following a preview last year. It’s designed to save time by generating code suggestions, much like GitHub’s GPT-powered Copilot, while helping developers avoid the vulnerabilities and inefficiencies typically associated with copying code from the web.

While Bedrock remains in preview, Amazon has also announced the general availability of EC2 Trn1n and EC2 Inf2 instances, giving potential customers something to evaluate before committing to AWS’s cloud infrastructure. They use the AWS Trainium and AWS Inferentia2 chips, with headline figures promising huge savings on training costs whilst delivering high performance.

While significant time likely stands between Amazon’s announcements and any real democratization of AI and ML, the support for startups is clearly there as we enter the era of AI.

Craig Hale

With several years’ experience freelancing in tech and automotive circles, Craig’s specific interests lie in technology that is designed to better our lives, including AI and ML, productivity aids, and smart fitness. He is also passionate about cars and the decarbonisation of personal transportation. As an avid bargain-hunter, you can be sure that any deal Craig finds is top value!