AWS Lambda Python Script to Clean Up Old EBS Snapshots

The Lambda function below deletes EBS snapshots older than 'x' days in every region of the AWS account. (Note: although the script works with EBS, it is the snapshots, not the volumes themselves, that are deleted.)
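Before deploying, the function's execution role must allow the EC2 API calls it makes. A minimal policy sketch (the action list is inferred from the code below; `sts:GetCallerIdentity` needs no explicit permission, and the standard CloudWatch Logs permissions for Lambda are assumed to be attached separately):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:DescribeRegions",
        "ec2:DescribeSnapshots",
        "ec2:DeleteSnapshot"
      ],
      "Resource": "*"
    }
  ]
}
```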


import time
from datetime import datetime

import boto3

# Delete snapshots older than this many days
NO_OF_DAYS = 30
DATE_FORMAT = "%Y-%m-%d"

def lambda_handler(event, context):
    try:
        owner_id = boto3.client('sts').get_caller_identity()['Account']
        print('Account: %s' % owner_id)
    except Exception as e:
        print('Cannot connect to AWS: %s' % e)
        return

    today = datetime.strptime(time.strftime(DATE_FORMAT), DATE_FORMAT)
    aws_regions = boto3.client('ec2').describe_regions()['Regions']

    for region in aws_regions:
        # A region-scoped client is required; a single global client
        # would only ever scan the default region
        client = boto3.client('ec2', region_name=region['RegionName'])
        try:
            response = client.describe_snapshots(OwnerIds=[owner_id])
        except Exception as e:
            print('Error::%s' % e)
            continue

        for snap in response['Snapshots']:
            try:
                created = datetime.strptime(
                    snap['StartTime'].strftime(DATE_FORMAT), DATE_FORMAT)
                age = (today - created).days
                if age >= NO_OF_DAYS:
                    client.delete_snapshot(SnapshotId=snap['SnapshotId'])
                    print('%s Deleted (%d days old)' % (snap['SnapshotId'], age))
                else:
                    print('%s Not Deleted (%d days old)' % (snap['SnapshotId'], age))
            except Exception as e:
                print('Error::%s' % e)

#lambda_handler(None, None)
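The age check at the heart of the function can be factored into a small pure helper, which makes the retention logic easy to unit test without touching AWS. A sketch (`should_delete` is a hypothetical helper name, not from the original script):

```python
from datetime import datetime, timezone

NO_OF_DAYS = 30

def should_delete(start_time, today, retention_days=NO_OF_DAYS):
    """Return True when a snapshot's StartTime is at least retention_days old.

    boto3 returns StartTime as a timezone-aware datetime, so the
    comparison is done on calendar dates only.
    """
    age_days = (today.date() - start_time.date()).days
    return age_days >= retention_days

# A snapshot created 45 days before "today" is eligible for deletion
today = datetime(2020, 3, 1, tzinfo=timezone.utc)
old = datetime(2020, 1, 16, tzinfo=timezone.utc)
print(should_delete(old, today))  # True: 45 days >= 30
```

In the Lambda handler, the `if age >= NO_OF_DAYS` branch would simply become `if should_delete(snap['StartTime'], today):`.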

Hope you enjoyed the post.

Cheers

Ramasankar Molleti

LinkedIn

 

