Experimenting With S3 Bucket Operations Using Agents for Amazon Bedrock+Slack
Hello! This is Guo from KINTO Technologies’ Generative AI Utilization Project.
How does your company manage AWS resources? Several options are available, including Terraform, the AWS CLI, or manual operations through the AWS Console. This time, I harnessed the capabilities of generative AI to develop a system that allows you to manage AWS resources using natural language commands directly in Slack. The backend utilizes Agents for Amazon Bedrock (hereafter referred to as "Bedrock") to handle resource management seamlessly.
System overview
The overall structure is shown in the diagram below.
How it works
Users enter natural language commands in Slack, and the backend, powered by Bedrock, processes these commands to create or delete S3 buckets.
Steps to create the system
The system can be built in three steps:
- Create an Agent on Bedrock
- Set up AWS Chatbot
- Configure Slack
Below, I’ll explain each step in detail. By following these steps, you’ll be able to build the same system yourself, so please give it a try!
1. Create an Agent on Bedrock
- Open the Bedrock Management Console and click “Agents” in the left menu.
- Click “Create Agent.”
- Enter a name and click “Create.”
- You will be taken to the Agent Builder screen. Select a model, such as Claude 3 Sonnet (you can choose any model you prefer), and click “Save and Exit” in the top-right corner.
- Click “Prepare” on the right side.
- A message saying "Successfully prepared" will appear.
- Add an Action Group: click "Add" in the upper-right corner of the Action Groups section.
- Configure the Action Group:
・Enter an Action Group Name.
・Set the Action Group Type to "Define using function details".
・For the Action Group Invocation ("Specify how to define the Lambda function"), select the recommended option "Quickly create a new Lambda function (recommended)".
- Add and configure the Action Group Functions:
・Create the following Action Group Functions: “delete-ai-agent-gu-function" and "create-ai-agent-gu-function".
・Fill in each function’s “Description (optional)” field as follows: "delete S3 bucket posted bucket name" and "create S3 bucket posted bucket name".
・Add a parameter as follows: set the Name to bucket_name, the Description to "S3 bucket name", the Type to String, and Required to True. (A boto3 sketch of an equivalent programmatic definition follows below.)
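If you prefer to script this configuration rather than click through the console, the same action group and function details can also be defined with the bedrock-agent API. The sketch below is only illustrative and assumes the Lambda function already exists: the agent ID, action group name, and Lambda ARN are placeholders, whereas in this article's console flow the Lambda is created for you.

import boto3

# Illustrative sketch: defining the same action group and functions programmatically.
# The agent ID, action group name, and Lambda ARN are placeholders.
bedrock_agent = boto3.client("bedrock-agent", region_name="ap-northeast-1")

bedrock_agent.create_agent_action_group(
    agentId="YOUR_AGENT_ID",          # placeholder
    agentVersion="DRAFT",
    actionGroupName="s3-bucket-operations",   # placeholder
    actionGroupExecutor={
        "lambda": "arn:aws:lambda:ap-northeast-1:123456789012:function:your-function"  # placeholder
    },
    functionSchema={
        "functions": [
            {
                "name": "create-ai-agent-gu-function",
                "description": "create S3 bucket posted bucket name",
                "parameters": {
                    "bucket_name": {"description": "S3 bucket name", "type": "string", "required": True}
                },
            },
            {
                "name": "delete-ai-agent-gu-function",
                "description": "delete S3 bucket posted bucket name",
                "parameters": {
                    "bucket_name": {"description": "S3 bucket name", "type": "string", "required": True}
                },
            },
        ]
    },
)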
- Add Instructions for the Agent:
・Open the Agent edit screen and enter the following in the “Agent Instructions” field:
You are an agent working with an S3 bucket. Use the appropriate functions to create or delete S3 buckets based on user requests.
Task 1: If a user submits a request such as "Please create an S3 bucket named test-gu," trigger the Lambda function named create-ai-agent-gu-function.
Task 2: If the user requests something like “Please delete an S3 bucket named test-gu,” execute the Lambda function named delete-ai-agent-gu-function.
- Create the Lambda function: open the Lambda console. Since the option to quickly create a new Lambda function was selected, a dummy Lambda function has already been created automatically.
- Add the code for creating and deleting S3 buckets to dummy_lambda.py:
import json
import boto3

AWS_REGION = "ap-northeast-1"
s3Client = boto3.client("s3", region_name=AWS_REGION)
# Buckets outside us-east-1 need an explicit location constraint.
location = {"LocationConstraint": AWS_REGION}

def lambda_handler(event, context):
    # Fields passed in by the Bedrock agent invocation.
    agent = event["agent"]
    actionGroup = event["actionGroup"]
    function = event["function"]
    parameters = event.get("parameters", [])

    # Execute your business logic here. For more information,
    # refer to: https://docs.aws.amazon.com/bedrock/latest/userguide/agents-lambda.html
    # Pull the bucket_name parameter that the agent extracted from the prompt.
    bucket_name = next(item for item in parameters if item["name"] == "bucket_name")["value"]

    if function == 'delete-ai-agent-gu-function':
        bucket_instance = s3Client.delete_bucket(Bucket=bucket_name)
        responseBody = {
            "TEXT": {
                "body": f"Bucket Deleted: {str(bucket_instance)}"
            }
        }
    elif function == 'create-ai-agent-gu-function':
        bucket_instance = s3Client.create_bucket(Bucket=bucket_name, CreateBucketConfiguration=location)
        responseBody = {
            "TEXT": {
                "body": f"Bucket Created: {str(bucket_instance)}"
            }
        }

    # Wrap the result in the response format that Agents for Amazon Bedrock expects.
    action_response = {
        "actionGroup": actionGroup,
        "function": function,
        "functionResponse": {
            "responseBody": responseBody
        },
    }
    function_response = {"response": action_response, "messageVersion": event["messageVersion"]}
    print(f"Response: {function_response}")
    return function_response
The handler extracts the function value from the event dictionary and routes the request to the corresponding predefined action group function, either create-ai-agent-gu-function or delete-ai-agent-gu-function.
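Before wiring up Slack, you can exercise the handler on its own with a test event shaped like the payload Agents for Bedrock sends to action group Lambdas. The event below is a hand-written sketch that contains only the fields the handler above reads; the agent, session, and action group values are placeholders.

# Rough test event for the Lambda console or a local test script.
# Only the fields the handler reads matter; the rest are placeholders.
test_event = {
    "messageVersion": "1.0",
    "agent": {
        "name": "s3-agent",      # placeholder
        "id": "AGENT_ID",        # placeholder
        "alias": "ALIAS_ID",     # placeholder
        "version": "DRAFT"
    },
    "sessionId": "test-session-1",
    "inputText": "Please create an S3 bucket named test-gu",
    "actionGroup": "s3-bucket-operations",       # placeholder
    "function": "create-ai-agent-gu-function",
    "parameters": [
        {"name": "bucket_name", "type": "string", "value": "test-gu"}
    ]
}

# Example: calling the handler directly from a test script.
# print(lambda_handler(test_event, None))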
- Grant the Lambda function permissions to manage S3 buckets by attaching the required policies to its execution role, as sketched below.
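One way to do this is an inline policy scoped to just the two S3 actions the handler calls. The snippet below is a minimal sketch: the execution role name and policy name are placeholders (use the role that Lambda auto-created for your function), and attaching an equivalent policy from the IAM console works just as well.

import json
import boto3

iam = boto3.client("iam")

# Least-privilege inline policy for the handler: bucket create/delete only.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:CreateBucket", "s3:DeleteBucket"],
            "Resource": "arn:aws:s3:::*"
        }
    ]
}

iam.put_role_policy(
    RoleName="dummy-lambda-execution-role",        # placeholder: your function's execution role
    PolicyName="allow-s3-bucket-create-delete",    # placeholder
    PolicyDocument=json.dumps(policy_document),
)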
- Click Deploy (Ctrl+Shift+U) on the left side.
- Return to the Agent page and click “Create Alias” at the top.
- Enter an “Alias Name” and click “Create Alias.” The alias will be created. This completes the agent setup. (A quick way to smoke-test the agent before configuring Slack is sketched below.)
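Before moving on to AWS Chatbot, you can confirm the agent responds as expected by invoking it directly through the bedrock-agent-runtime API. This is a minimal sketch, assuming placeholder values for the agent ID and alias ID (copy the real ones from the agent and alias pages).

import uuid
import boto3

# Minimal smoke test for the agent; agent and alias IDs are placeholders.
runtime = boto3.client("bedrock-agent-runtime", region_name="ap-northeast-1")

response = runtime.invoke_agent(
    agentId="YOUR_AGENT_ID",        # placeholder
    agentAliasId="YOUR_ALIAS_ID",   # placeholder
    sessionId=str(uuid.uuid4()),
    inputText="Please create an S3 bucket named test-gu",
)

# The completion is returned as an event stream; concatenate the text chunks.
completion = ""
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        completion += chunk["bytes"].decode("utf-8")
print(completion)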
2. Set up AWS Chatbot
- Open the Chatbot configuration page in the AWS Console. Click "Set up a new client," select “Slack” as the chat client, and click "Configure."
- Authorize AWS Chatbot to access your Slack workspace.
- Return to the Chatbot configuration page and click “Set up new channel.”
- Enter the Configuration Name and Channel ID.
- For permissions, follow these steps:
・Specify the IAM Role Name.
・Add AmazonBedrockFullAccess to the Channel Guardrail Policies. (For production environments, adhere to the principle of least privilege.)
- Click Save in the bottom-right corner.
- Click the link for the added configuration (in this case, ktc-gu-test).
- Click the link for the channel role.
- In the permissions policy section, click “Add permissions” and select “Attach policies.”
- Search for "AmazonBedrockFullAccess," select it from the list, and click "Add permissions."
This completes the AWS Chatbot setup.
3. Configure Slack
Finally, let’s configure Slack.
- Send the following message in the Slack channel:
@aws connector add {Connector Name} {Bedrock agent's Agent ARN} {Bedrock agent's Alias ID}
Once the connection is successful, a confirmation message will appear. This completes the Slack setup. Now, let's test the system.
System Verification: Enter an S3 operation command in Slack.
Enter an operation command in the following form: @aws ask {Connector Name} {Prompt}
For example: @aws ask {Connector Name} Please create an S3 bucket named test-gu
S3 bucket creation and deletion should now work successfully!
Summary
In this demonstration, we showed how AWS's generative AI agent service, Agents for Amazon Bedrock, can be used to create and delete S3 buckets solely through natural language input. The same approach makes it easy to carry out a wide range of operations with nothing more than natural language commands. Thank you, and see you next time!