Seamless Integration of Generative AI with Existing Business Systems
Generative AI has revolutionized technology by generating content and solving complex problems. To take full advantage of this potential, seamless integration with existing business systems and efficient access to data are crucial. Amazon Bedrock Agents provides the integration capabilities to connect generative AI models with the wealth of information and workflows already in place within an organization, enabling the creation of efficient and impactful generative AI applications.
Overview of the Solution
The following diagram illustrates the architecture of the solution.
The system workflow includes the following steps:
- The user interacts with the generative AI application, which connects to Amazon Bedrock Agents.
- The application uses Amazon Bedrock Knowledge Bases to answer the user questions. These knowledge bases are created with Amazon Simple Storage Service (Amazon S3) as the data source and Amazon Titan (or another model of your choice) as the embedding model.
- Amazon Bedrock Agents uses action groups to integrate with different systems.
- The action groups call different AWS Lambda functions within a private subnet of a virtual private cloud (VPC).
- The agent uses a tree-of-thought (ToT) prompt to execute different actions from the action groups.
- A Lambda function fetches the classification of the device from Amazon DynamoDB. The function invokes DynamoDB using a gateway endpoint.
- A Lambda function checks whether quality documents exist in Amazon S3. The function invokes Amazon S3 using interface endpoints. (A sketch of this check and the preceding DynamoDB lookup follows this list.)
- A Lambda function calls the Appian REST API using a NAT gateway in a public subnet.
- The Appian API key is stored in AWS Secrets Manager. (A sketch of retrieving the key and calling the API follows this list.)
- A Lambda function uses AWS Identity and Access Management (IAM) permissions to make an SDK call to Amazon Simple Email Service (Amazon SES). Amazon SES sends an email over SMTP to the verified email addresses provided by the user. (A sketch of the SES call follows this list.)
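The following is a minimal sketch of the Lambda logic behind the DynamoDB classification lookup and the Amazon S3 document check. The table name, key schema, bucket name, and object key pattern are illustrative assumptions; in the actual solution, these values come from the CloudFormation stack.

```python
import os

import boto3
from botocore.exceptions import ClientError

# Hypothetical resource names -- the actual table, bucket, and key names
# are created by the solution's CloudFormation stack.
DEVICE_TABLE = os.environ.get("DEVICE_TABLE", "DeviceClassification")
QUALITY_BUCKET = os.environ.get("QUALITY_BUCKET", "quality-documents-bucket")

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")


def get_device_classification(device_type: str):
    """Look up the classification for a device type in DynamoDB.

    The call is routed through the VPC gateway endpoint for DynamoDB,
    so the function needs no public internet access for this step.
    """
    table = dynamodb.Table(DEVICE_TABLE)
    response = table.get_item(Key={"device_type": device_type})
    item = response.get("Item")
    return item.get("classification") if item else None


def quality_document_exists(device_name: str) -> bool:
    """Check whether a quality document for the device exists in Amazon S3.

    The object key pattern below is an assumption for illustration.
    """
    try:
        s3.head_object(Bucket=QUALITY_BUCKET, Key=f"quality-docs/{device_name}.pdf")
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "404":
            return False
        raise
```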
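The Appian integration can be sketched as follows, assuming the API key is stored as a JSON secret in AWS Secrets Manager. The secret name, endpoint URL, authentication header, and request payload are assumptions for illustration and depend on how your Appian environment is configured.

```python
import json

import boto3
import requests  # packaged with the function or provided through a Lambda layer

secrets_client = boto3.client("secretsmanager")

# Hypothetical secret name and Appian web API endpoint -- replace with the
# values used in your environment.
APPIAN_SECRET_NAME = "appian/api-key"
APPIAN_CASES_URL = "https://example.appiancloud.com/suite/webapi/cases"


def create_appian_case(device_name: str) -> dict:
    """Create a review case in Appian using an API key from Secrets Manager.

    The Lambda function runs in a private subnet; outbound HTTPS traffic to
    Appian is routed through the NAT gateway in the public subnet.
    """
    secret = secrets_client.get_secret_value(SecretId=APPIAN_SECRET_NAME)
    api_key = json.loads(secret["SecretString"])["api_key"]

    response = requests.post(
        APPIAN_CASES_URL,
        # Header name and payload shape are assumptions for this sketch.
        headers={"Appian-API-Key": api_key, "Content-Type": "application/json"},
        json={"deviceName": device_name, "caseType": "quality-review"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```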
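The email notification step can be sketched as shown below, assuming the two verified addresses from the prerequisites. The sender, recipient, and message content are placeholders; the function's execution role needs permission to call ses:SendEmail.

```python
import boto3

ses = boto3.client("ses")

# Placeholder addresses -- both must be verified identities in Amazon SES.
SENDER = "sender@example.com"
RECIPIENT = "recipient@example.com"


def send_review_email(device_name: str) -> str:
    """Send a review-request email through Amazon SES and return the message ID."""
    response = ses.send_email(
        Source=SENDER,
        Destination={"ToAddresses": [RECIPIENT]},
        Message={
            "Subject": {"Data": f"Review required for device {device_name}"},
            "Body": {
                "Text": {
                    "Data": (
                        f"No quality document was found for {device_name}. "
                        "Please review and upload the required documentation."
                    )
                }
            },
        },
    )
    return response["MessageId"]
```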
Prerequisites
You will need the following prerequisites before you can build the solution:
- A valid AWS account.
- Access to Anthropic’s Claude 3 Sonnet or the model you intend to use (for more information, see Access Amazon Bedrock foundation models). For this post, we use Anthropic’s Claude 3 Sonnet, and all instructions pertain to that model. If you want to use another FM, update the prompts accordingly.
- An IAM role in the account that has sufficient permissions to create the necessary resources.
- AWS CloudTrail logging enabled for operational and risk auditing. For more details, see Creating a trail for your AWS account.
- AWS Budgets policy notifications enabled to protect you from unwanted billing. For more details, see Enable Budget policy.
- Two email addresses for sending and receiving emails. These addresses must not already be verified identities in Amazon SES; otherwise, the AWS CloudFormation template will fail to deploy.
Business Workflow
The following workflow shows the fictitious business process.
The workflow consists of the following steps:
- The user asks the generative AI assistant to determine if a device needs review.
- If a device type is provided, the assistant checks if it’s a Type 3 device.
- If it’s a Type 3 device, the assistant asks the user for the device name.
- The assistant checks if a document exists with the provided name.
- If the document exists, the assistant creates a case in Appian to start a review.
- If the document doesn’t exist, the assistant sends an email for review.
Best Practices
Consider the following best practices for building efficient and well-architected generative AI applications:
Clean Up
To avoid incurring future charges, delete the resources you created. To clean up the AWS environment, complete the following steps (a scripted sketch of these steps follows the list):
- Empty the contents of the S3 buckets you created as part of the CloudFormation stack.
- Delete the agent from Amazon Bedrock.
- Delete the CloudFormation stack you created.
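The following sketch scripts the cleanup steps with the AWS SDK for Python (Boto3). The bucket names, agent ID, and stack name are placeholders; use the values from your deployment.

```python
import boto3

# Placeholder identifiers -- substitute the bucket names, agent ID, and stack
# name created by your deployment.
BUCKETS = ["knowledge-base-bucket", "quality-documents-bucket"]
AGENT_ID = "YOUR_AGENT_ID"
STACK_NAME = "bedrock-agents-integration-stack"

s3 = boto3.resource("s3")
bedrock_agent = boto3.client("bedrock-agent")
cloudformation = boto3.client("cloudformation")

# 1. Empty the S3 buckets so CloudFormation can delete them.
for bucket_name in BUCKETS:
    s3.Bucket(bucket_name).objects.all().delete()

# 2. Delete the agent from Amazon Bedrock.
#    skipResourceInUseCheck=True forces deletion even if aliases still exist.
bedrock_agent.delete_agent(agentId=AGENT_ID, skipResourceInUseCheck=True)

# 3. Delete the CloudFormation stack.
cloudformation.delete_stack(StackName=STACK_NAME)
```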
Conclusion
Integrating generative AI with existing systems is crucial to unlocking its transformative potential. By using tools like Amazon Bedrock Agents, organizations can seamlessly connect generative AI to core data and workflows, enabling automation, content generation, and problem-solving while maintaining connectivity. The strategies and techniques showcased in this post demonstrate how generative AI can be orchestrated to drive maximum value across a wide range of use cases, from extracting intelligence from regulatory submissions to providing prescriptive guidance to industry. As generative AI continues to evolve, the ability to integrate it with existing infrastructure will be paramount to realizing its true business impact.
About the Authors
Sujatha Dantuluri is a seasoned Senior Solutions Architect in the US federal civilian team at AWS, with over two decades of experience supporting commercial and federal government clients. Her expertise lies in architecting mission-critical solutions and working closely with customers to ensure their success. Sujatha is an accomplished public speaker, frequently sharing her insights and knowledge at industry events and conferences.
Arianna Burgman is a Solutions Architect at AWS based in NYC, supporting state and local government agencies. She is a data and AI enthusiast with experience collaborating with organizations to architect technical solutions that further their missions for continuous innovation and positive, lasting impact.
Annie Cimack is an Associate Solutions Architect based in Arlington, VA, supporting public sector customers across the federal government as well as higher education. Her area of focus is data analytics, and she works closely with customers of all sizes to support projects ranging from storage to intelligent document processing.
Sunil Bemarkar is a Sr. Partner Solutions Architect at AWS based out of San Francisco with over 20 years of experience in the information technology field. He works with various independent software vendors and AWS partners specialized in cloud management tools and DevOps segments to develop joint solutions and accelerate cloud adoption on AWS.
Marcelo Silva is a Principal Product Manager at Amazon Web Services, leading strategy and growth for Amazon Bedrock Knowledge Bases and Amazon Lex.
FAQs
Q: What is Amazon Bedrock Agents?
A: Amazon Bedrock Agents is a fully managed capability of Amazon Bedrock that lets generative AI applications orchestrate multistep tasks by using foundation models (FMs) from leading AI companies to reason, call APIs through action groups, and retrieve information from knowledge bases.
Q: How does Amazon Bedrock Agents integrate with existing business systems?
A: Amazon Bedrock Agents uses action groups to integrate with different systems, including AWS Lambda functions, Amazon DynamoDB, Amazon S3, and Appian.
Q: What are the prerequisites for building a solution with Amazon Bedrock Agents?
A: You will need a valid AWS account, access to Anthropic’s Claude 3 Sonnet or another foundation model, an IAM role with sufficient permissions, and AWS CloudTrail and Budgets policy notifications enabled.
Q: What are the best practices for building efficient and well-architected generative AI applications?
A: Use Amazon Bedrock Agents and action groups to connect generative AI with your existing systems and data, such as AWS Lambda functions, Amazon DynamoDB, Amazon S3, and Appian, and delete the resources you create when you no longer need them to avoid incurring future charges.