AWS’ Bedrock offers foundation models to build generative AI apps

The new AWS cloud service, Amazon Bedrock, is designed to let enterprises select foundation models for building their own generative AI applications for targeted use cases and commercial use.

Amazon Web Services (AWS) has released a new service, dubbed Amazon Bedrock, that provides multiple foundation models designed to allow companies to customize and create their own generative AI applications — including programs for general commercial use.

Amazon Bedrock provides users with foundation models from AI21 Labs, Anthropic, Stability AI, and Amazon, accessible via an API. The service, announced Thursday and now in private preview, comes just a day after Databricks announced its own open-source-based large language model (LLM), Dolly 2.0, and has a similar strategy: to help enterprises circumvent constraints of closed-loop models (like ChatGPT) that stop them from making their own customized generative AI applications.

Closed-loop, trained foundation models bar enterprises from creating any form of generative AI that competes with the original.

This constraint has accelerated research into open-source models and other alternative approaches to generative AI that can be targeted for commercial use, as enterprises demand customizable models for specific use cases.

Amazon’s foundation models for AI

The Amazon foundation models available through the new service include Amazon’s Titan FMs, which consist of two new LLMs — AI models trained on vast amounts of text to generate human-like responses — that were previewed on Thursday and are expected to become generally available in the coming months.

The first Titan foundation model, according to the company, is a generative LLM for tasks such as summarization, text generation, classification, open-ended Q&A, and information extraction.
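To give a sense of how an application might call such a model once the service is available, here is a minimal sketch using the AWS SDK for Python. It assumes the boto3 “bedrock-runtime” client and an illustrative Titan model ID; since Bedrock was in private preview at the time of the announcement, the actual model IDs and request/response format may differ.

    # Hypothetical sketch: summarizing text with a Titan model via Amazon Bedrock.
    # Assumes the boto3 "bedrock-runtime" client and an illustrative model ID;
    # the real model IDs and request schema may differ from this example.
    import json
    import boto3

    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    prompt = (
        "Summarize the following meeting notes in two sentences:\n"
        "We reviewed the Q3 pipeline, agreed to move the launch to May, "
        "and assigned owners for the security review."
    )

    response = bedrock.invoke_model(
        modelId="amazon.titan-text-express-v1",  # illustrative model ID
        body=json.dumps({
            "inputText": prompt,
            "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.2},
        }),
    )

    result = json.loads(response["body"].read())
    print(result["results"][0]["outputText"])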

The second is an LLM that translates text inputs (words, phrases or possibly large units of text) into numerical representations (known as embeddings) that contain the semantic meaning of the text, the company said.

“While this LLM will not generate text, it is useful for applications like personalization and search because by comparing embeddings the model will produce more relevant and contextual responses than word matching,” Swami Sivasubramanian, vice president of data and machine learning at AWS, wrote in a blog post.
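To illustrate why comparing embeddings can outperform word matching, here is a small, self-contained sketch in plain Python and NumPy (independent of the Bedrock API, with toy vectors standing in for what an embedding model would return): two texts that share no keywords can still rank as relevant if their embedding vectors point in similar directions.

    # Minimal illustration of embedding comparison (not the Bedrock API):
    # texts with no words in common can still be semantically similar,
    # which is why cosine similarity over embeddings beats word matching.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine of the angle between two embedding vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Toy embeddings standing in for real model output.
    query     = np.array([0.9, 0.1, 0.3])   # "How do I reset my password?"
    doc_match = np.array([0.8, 0.2, 0.35])  # "Steps to recover account credentials"
    doc_other = np.array([0.1, 0.9, 0.2])   # "Quarterly revenue report"

    print(cosine_similarity(query, doc_match))  # high score -> relevant
    print(cosine_similarity(query, doc_other))  # low score  -> not relevant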

These foundation models have been tuned to detect and remove harmful content in the enterprise data provided for customization, AWS said, adding that the models can also filter harmful content from their outputs.

Amazon Bedrock, according to Sivasubramanian, lets customers find the right model for their use case, customize it with their own data sets, and then integrate and deploy it into their applications using AWS tools.

“Customers simply point Bedrock at a few labelled examples in Amazon S3, and the service can fine-tune the model for a particular task without having to annotate large volumes of data (as few as 20 examples is enough),” Sivasubramanian said.
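For context, here is a hedged sketch of what “a few labelled examples in Amazon S3” might look like in practice. The prompt/completion JSON Lines layout and the bucket name are assumptions for illustration only; AWS had not published the expected format at the time of the preview.

    # Hypothetical sketch of preparing a handful of labelled examples for
    # fine-tuning and uploading them to S3. The prompt/completion JSON Lines
    # layout and the bucket name are illustrative assumptions; the format
    # Bedrock actually expects may differ.
    import json
    import boto3

    examples = [
        {"prompt": "Classify the ticket: 'My card was charged twice.'", "completion": "billing"},
        {"prompt": "Classify the ticket: 'The app crashes on login.'", "completion": "technical"},
        # ... roughly 20 examples is enough, per AWS
    ]

    with open("train.jsonl", "w") as f:
        for ex in examples:
            f.write(json.dumps(ex) + "\n")

    # Upload to the bucket that Bedrock is pointed at (bucket name is made up).
    boto3.client("s3").upload_file("train.jsonl", "my-bedrock-tuning-data", "train.jsonl")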

Training AI models without using enterprise data

“None of the customer’s data is used to train the underlying models, and since all data is encrypted and does not leave a customer’s Virtual Private Cloud (VPC), customers can trust that their data will remain private and confidential,” Sivasubramanian added.

In addition to the Titan foundation models, Bedrock includes the Jurassic-2 family of multilingual LLMs from AI21 Labs, which follow natural language instructions to generate text in Spanish, French, German, Portuguese, Italian, and Dutch.

Anthropic’s LLM, Claude, also included in Bedrock, can perform a wide variety of conversational and text processing tasks and is based on Anthropic’s extensive research into training honest and responsible AI systems, AWS said.

In addition, Bedrock includes a text-to-image foundation model in the form of Stability AI’s text-to-image suite.

Other updates to AWS’ arsenal of generative AI tools include making Amazon CodeWhisperer generally available for Python, Java, JavaScript, TypeScript, and C#.

CodeWhisperer, an AI-based code generator, competes with Microsoft-owned GitHub’s Copilot, which likewise uses AI to assist developers.
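To give a sense of the workflow, assistants like CodeWhisperer typically complete code from a natural-language comment typed in the editor. The snippet below is a mock-up of that pattern, not captured output from the tool.

    # Illustrative only: the developer writes the comment, and an assistant
    # such as CodeWhisperer proposes an implementation like the function below.
    # This is a mock-up of the workflow, not actual CodeWhisperer output.

    # function to upload a local file to an S3 bucket and return the object URL
    import boto3

    def upload_file_to_s3(local_path: str, bucket: str, key: str) -> str:
        s3 = boto3.client("s3")
        s3.upload_file(local_path, bucket, key)
        return f"https://{bucket}.s3.amazonaws.com/{key}"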

AWS also has announced the general availability of Inf2 cloud instances powered by AWS Inferentia2. These instances, according to the company, are “optimized specifically for large-scale generative AI applications with models containing hundreds of billions of parameters.”

About TechX Corp.

TechX Corporation was named “AWS Partner of the Year” in Vietnam for 2021 – 2022.

TechX Corporation is a young startup founded in 2020 by a team of well-established technology experts with years of experience at multinational enterprises and VN30 corporations, with the mission of supporting Vietnamese companies in their digital transformation journey. TechX’s team of cloud experts possesses comprehensive insight into the Vietnamese market, especially in major industries such as banking and finance, technology, and e-commerce.
Having become an AWS Advanced Consulting Partner less than a year after its establishment, TechX has been leveraging advanced AWS cloud services and technology to provide tailored cloud transformation solutions to its customers. TechX Corp. is currently proud to be a cloud consulting partner to top banks and financial institutions in Vietnam, such as Maritime Bank (MSB), Vietnam International Bank (VIB), VietinBank, and FE Credit, as well as many other companies across different industries.