Automate AWS EBS Snapshots and Notify via Email Using a Python Boto3 Script
Introduction
EBS Snapshots are a point-in-time copy of your data, and can be used to enable disaster recovery, migrate data across regions and accounts, and improve backup compliance. You can create and manage your EBS Snapshots through the AWS Management Console, the AWS CLI, or the AWS SDKs.
In this blog we are going to automate snapshot creation using Boto3, the Python SDK for AWS. It allows you to create, update, and delete AWS resources directly from your Python scripts.
Prerequisites
- An AWS account
- An IAM user with:
  - AWS Management Console access to verify that your EC2 instances are launched, listed, and terminated.
  - The IAM permissions required to perform IAM, EC2, and CloudWatch activities. IAM policy creation and AWS Application Programming Interface (API) permissions are outside this article's scope. Always adhere to the principle of least privilege when authorizing accounts to perform actions.
- Administrative access to an EC2 instance.
- Install the AWS CLI using the official AWS documentation here
- Install Python and Boto3
- Configure the AWS CLI using the official documentation here
Launch an AWS EC2 Instance with Tags Using a Python Script
- Python code in one module gains access to the code in another module by the process of importing it. The import statement combines two operations: it searches for the named module, then binds the result of that search to a name in the local scope.
import boto3
- We will invoke the client for EC2
client = boto3.client('ec2')
- To launch EC2 instances we use the "run_instances()" method. This method launches AWS EC2 instances based on the parameters we supply.
response = client.run_instances(<arguments>)
- Go to this link, where you will find the full list of arguments. You can pass these arguments to launch your EC2 instances according to your requirements. The documentation also mentions each parameter's data type.
Note: Arguments marked "REQUIRED" in the documentation are mandatory; if you don't specify them, the code block to launch EC2 will not execute successfully.
Example: "MinCount", "MaxCount".
- The code below launches two EC2 instances based on your input, one tagged Env=Production and one tagged Env=UAT.
resp1 = client.run_instances(
    ImageId='<Image ID>',
    InstanceType='t2.micro',
    MinCount=1,
    MaxCount=1,
    KeyName='<Your Key Name>',
    TagSpecifications=[
        {
            'ResourceType': 'instance',
            'Tags': [{'Key': 'Env', 'Value': 'Production'}],
        },
    ],
)
resp2 = client.run_instances(
    ImageId='<Image ID>',
    InstanceType='t2.micro',
    MinCount=1,
    MaxCount=1,
    KeyName='<Your Key Name>',
    TagSpecifications=[
        {
            'ResourceType': 'instance',
            'Tags': [{'Key': 'Env', 'Value': 'UAT'}],
        },
    ],
)
- Once the above methods run, they will launch the EC2 instances, and information about each launched instance is captured in the variables "resp1" and "resp2". The information is returned as a dictionary.
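To get a feel for that dictionary, here is a trimmed, hypothetical example of its shape (the instance ID and most other values are placeholders, and real responses contain many more fields), along with how you might pull the launched instance IDs out of it:

```python
# Trimmed, hypothetical run_instances() response for illustration only;
# a real response contains many more fields.
sample_response = {
    'Instances': [
        {
            'InstanceId': 'i-0abc123def4567890',  # placeholder ID
            'InstanceType': 't2.micro',
            'State': {'Code': 0, 'Name': 'pending'},
            'Tags': [{'Key': 'Env', 'Value': 'Production'}],
        }
    ]
}

# Extract the launched instance IDs from the response dictionary.
instance_ids = [i['InstanceId'] for i in sample_response['Instances']]
print(instance_ids)
```

The same list comprehension works on the real "resp1" and "resp2" dictionaries returned by "run_instances()".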
To view the entire GitHub code, please click here
Python Script to Automate EBS Snapshots
- As in the previous section, we start by importing Boto3.
import boto3
- We will invoke the service resource for EC2.
Note: Make sure to explore the EC2 service resources here.
ec2 = boto3.resource('ec2')
- Create a variable to store the tag filters matching the tags we attached to our EC2 instances above. Note that separate filters are combined with AND, so two separate 'tag:Env' filters would match nothing; listing both values under a single filter matches instances whose Env tag is either 'Production' or 'UAT'.
tagfilters = [
    {
        'Name': 'tag:Env',
        'Values': ['Production', 'UAT']
    }
]
- Create an empty list in which we will save the snapshot IDs.
snapshot_list=[]
- Now we will write a for loop that traverses the EC2 resources using an iterator, fetches the instances matching our tag filters, and captures each one in the "instance" variable.
for instance in ec2.instances.filter(Filters=tagfilters):
- Now we will use a nested for loop that fetches the EBS volumes attached to each of these instances.
for instance in ec2.instances.filter(Filters=tagfilters):
    for volume in instance.volumes.all():
- We will now create a snapshot of each EBS volume using the EC2 resource method "create_snapshot()" and store it in the variable "snapshot". Check out the documentation for this method here.
for instance in ec2.instances.filter(Filters=tagfilters):
    for volume in instance.volumes.all():
        snapshot = volume.create_snapshot(Description='Snapshot created via script')
- Now let's append the snapshot IDs to the empty "snapshot_list" we created earlier.
for instance in ec2.instances.filter(Filters=tagfilters):
    for volume in instance.volumes.all():
        snapshot = volume.create_snapshot(Description='Snapshot created via script')
        snapshot_list.append(snapshot.snapshot_id)
- Our code is now ready to take snapshot backups. To send a notification email on snapshot creation, we first need to create an SNS topic.
Create an SNS topic
- Open the Amazon SNS console, and then choose Topics from the navigation pane.
- Choose Create topic.
- For Name, enter a name for your topic.
- For Display name, enter a display name for your topic.
- Choose Create topic.
- On the Subscriptions tab, choose Create subscription.
- For Protocol, choose Email.
- For Endpoint, enter the email address where you want to receive the notifications.
- Choose Create subscription.
- A subscription confirmation email is sent to the address you entered. Choose Confirm subscription in the email.
- When you click "Confirm subscription", you will see a confirmation that your subscription is active.
- Note the ARN of the SNS topic you created. You will use this topic ARN in your Python script.
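If you prefer to create the topic and subscription from code instead of the console, here is a minimal sketch. The helper name "create_email_topic" is my own, not part of the original script; it wraps the real SNS API calls "create_topic" and "subscribe", and takes the client as a parameter so you can pass in any Boto3 SNS client.

```python
def create_email_topic(sns_client, topic_name, email_address):
    """Create an SNS topic and subscribe an email endpoint to it.

    sns_client is expected to be a Boto3 SNS client, e.g.
    boto3.client('sns'). Returns the topic ARN. The email owner
    still has to click "Confirm subscription" before messages
    are delivered.
    """
    # create_topic is idempotent: if a topic with this name already
    # exists, it returns the existing topic's ARN.
    topic_arn = sns_client.create_topic(Name=topic_name)['TopicArn']
    sns_client.subscribe(
        TopicArn=topic_arn,
        Protocol='email',
        Endpoint=email_address,
    )
    return topic_arn

# Usage (hypothetical topic name and address):
# topic_arn = create_email_topic(boto3.client('sns'), 'ebs-snapshots', 'you@example.com')
```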
Add the SNS Topic to the Python Script to Send Emails After Successful Snapshot Creation
- First, we need to invoke the SNS client using the code below.
sns_client = boto3.client('sns')
- We will use the "publish()" method of the SNS client.
Checkout method documentation here
Once you run the final code, you will be notified of the snapshot IDs via email.
sns_client.publish(
    TopicArn='<SNS Topic ARN>',
    Subject='EBS Snapshots',
    Message=str(snapshot_list)
)
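Passing str(snapshot_list) as the message works, but the email body is then a raw Python list. As an optional readability tweak, a small helper (my own naming, not part of the original script) can build a friendlier body:

```python
def format_snapshot_message(snapshot_ids):
    """Build a human-readable email body from a list of snapshot IDs.

    Optional readability tweak; passing str(snapshot_list) to
    publish() works fine as well.
    """
    if not snapshot_ids:
        return 'No EBS snapshots were created.'
    lines = ['The following EBS snapshots were created:']
    lines += ['  - ' + snap_id for snap_id in snapshot_ids]
    return '\n'.join(lines)

# Hypothetical snapshot IDs for illustration:
print(format_snapshot_message(['snap-0123456789abcdef0', 'snap-0fedcba9876543210']))
```

You would then pass Message=format_snapshot_message(snapshot_list) to publish().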
To view the entire GitHub code, please click here
Conclusion
Using this automation, you can run the script weekly or once every three days, as per your organization's requirements. It can help enable disaster recovery, migrate data across regions and accounts, and improve backup compliance.
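One simple way to run it on such a schedule is a cron entry on a machine whose AWS CLI is already configured. The script path below is hypothetical; adjust it to wherever you saved the script:

```
# m h dom mon dow  command
0 2 * * 0    /usr/bin/python3 /opt/scripts/ebs_snapshot.py   # weekly, Sunday 02:00
0 2 */3 * *  /usr/bin/python3 /opt/scripts/ebs_snapshot.py   # every 3 days at 02:00
```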
Stay Tuned For My Next Blogs
So, did you find my content helpful? If you did, or if you like my other content, feel free to buy me a coffee. Thanks.