This is a simple example of how to delete Elasticsearch indices older than 'x' days using an AWS Lambda function.
import boto3
from requests_aws4auth import AWS4Auth
from elasticsearch import Elasticsearch, RequestsHttpConnection
import curator
host = 'XXXXXXXXXXXXXXXX.us-east-1.es.amazonaws.com' # Provide the elasticsearch endpoint
region = 'us-east-1' # Provide the region
service = 'es'
credentials = boto3.Session().get_credentials()
awsauth = AWS4Auth(credentials.access_key, credentials.secret_key, region, service, session_token=credentials.token)
# Lambda execution starts here.
def lambda_handler(event, context):
    # Build the Elasticsearch client.
    es = Elasticsearch(
        hosts=[{'host': host, 'port': 443}],
        http_auth=awsauth,
        use_ssl=True,
        verify_certs=True,
        connection_class=RequestsHttpConnection
    )
    index_list = curator.IndexList(es)
    # Filter to the indices matching the pattern yyyy-mm-dd* whose creation_date is older than x days.
    # Source: https://curator.readthedocs.io/en/latest/examples.html
    index_list.filter_by_age(source='creation_date', direction='older', timestring='%Y-%m-%d', unit='days', unit_count=7)
    print("Found %s indices to delete" % len(index_list.indices))
    if index_list.indices:
        curator.DeleteIndices(index_list).do_action()
        print('Indices deleted successfully')
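Curator compares each index's creation_date setting (stored as epoch milliseconds in Elasticsearch) against the cutoff. As a rough illustration of that check (my own sketch, not curator's actual code), the "older than 7 days" test is equivalent to:

```python
from datetime import datetime, timedelta, timezone

def is_older_than(creation_epoch_ms, days, now=None):
    """Return True if an index creation timestamp (epoch milliseconds,
    as Elasticsearch stores creation_date) is more than `days` days old."""
    now = now or datetime.now(timezone.utc)
    created = datetime.fromtimestamp(creation_epoch_ms / 1000.0, tz=timezone.utc)
    return now - created > timedelta(days=days)

# An index created 10 days ago falls past the 7-day cutoff; one created
# 3 days ago does not.
now = datetime.now(timezone.utc)
print(is_older_than(int((now - timedelta(days=10)).timestamp() * 1000), 7))  # True
print(is_older_than(int((now - timedelta(days=3)).timestamp() * 1000), 7))   # False
```

This is only to make the cutoff logic concrete; in the Lambda itself, curator does this for every index returned by the cluster.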
This example needs the requests_aws4auth, elasticsearch, and elasticsearch-curator modules installed. You can build these modules on a Linux machine; I used an EC2 instance running Amazon Linux.
# Install Dependencies
yum -y install python-pip zip
pip install virtualenv
# Create the virtual environment
mkdir -p /var/es-cleanup && cd /var/es-cleanup
virtualenv /var/es-cleanup
cd /var/es-cleanup && source bin/activate
pip install requests_aws4auth -t .
pip install elasticsearch -t .
pip install elasticsearch-curator -t .
# Copy the code to current directory and set the file permission to execute mode
chmod 754 es-cleanup.py
# Package the lambda
zip -r /var/es-cleanup.zip *
# Send the package to S3 Bucket
aws s3 cp /var/es-cleanup.zip s3://BUCKET_NAME/
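From there the function can be created with the AWS CLI. A hedged sketch, not part of the original post: the role ARN, function name, and timeout below are illustrative, and the execution role must allow CloudWatch Logs plus HTTP access to the Elasticsearch domain. Note that a module named with a hyphen (es-cleanup.py) may not be importable; renaming the script to es_cleanup.py is the safe choice.

```shell
# Create the Lambda function from the uploaded package
# (adjust the runtime to whatever Python version you built the package for).
aws lambda create-function \
  --function-name es-cleanup \
  --runtime python2.7 \
  --role arn:aws:iam::ACCOUNT_ID:role/lambda-es-cleanup-role \
  --handler es_cleanup.lambda_handler \
  --timeout 300 \
  --code S3Bucket=BUCKET_NAME,S3Key=es-cleanup.zip
```

A scheduled CloudWatch Events rule (for example, rate(1 day)) can then invoke the function so the cleanup runs automatically.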
Hope you enjoyed the post.
Cheers
Ramasankar Molleti