Building an Event-Driven Image Resizer Using AWS S3, SQS and Lambda

A Comprehensive Tutorial to Automatically Resize Images with AWS

Md Shamim
Towards AWS


This article will demonstrate how we can create an event-driven architecture using several AWS services like AWS S3, SQS, Lambda and CloudWatch.

Overview:

To illustrate the event-driven architecture, we’ll build an image processing pipeline for a social media platform that will enable us to automate the process of resizing images for our app using Amazon S3, SQS, and Lambda.

Following is the GitHub repository containing relevant material and code discussed in this article:

Following is the step-by-step guide to build this project:

Step 1: Create Two S3 Buckets

Create two S3 buckets and name them media-app-initial-image and media-app-resized-image (a boto3 sketch of the same setup follows the list below).

  • Create three new folders inside the media-app-initial-image bucket and name them cover, post, and profile (lowercase, so the keys match the prefixes used in the Lambda code later).
`media-app-initial-image` — Bucket Folder Structure
  • Keep the media-app-resized-image bucket as it is (initially empty).
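
If you prefer setting this up from code rather than the console, here is a minimal boto3 sketch, assuming the ap-northeast-1 region used later in this article (bucket names must be globally unique, so adjust them if these are taken):

import boto3

s3 = boto3.client('s3', region_name='ap-northeast-1')

# Source bucket for original uploads and destination bucket for resized images
for bucket in ('media-app-initial-image', 'media-app-resized-image'):
    s3.create_bucket(
        Bucket=bucket,
        CreateBucketConfiguration={'LocationConstraint': 'ap-northeast-1'}
    )

# "Folders" in S3 are just zero-byte objects whose keys end with a slash
for folder in ('cover/', 'post/', 'profile/'):
    s3.put_object(Bucket='media-app-initial-image', Key=folder)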

Step 2: Create an SQS queue

Create an SQS queue of the Standard type and name it media-app-queue. For now, keep all other settings at their defaults; we will change some of them later where required.
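
The same queue can be created with boto3; a minimal sketch (Standard is the default queue type, so no extra attributes are needed):

import boto3

sqs = boto3.client('sqs', region_name='ap-northeast-1')

# Standard queue with default settings
response = sqs.create_queue(QueueName='media-app-queue')
print(response['QueueUrl'])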

Step 3: SQS queue — Access Policy

We will edit the access policy of the SQS queue to allow the S3 bucket (media-app-initial-image) to publish events to the queue.

Add the following policy:

{
    "Version": "2012-10-17",
    "Id": "Policy1679966379166",
    "Statement": [
        {
            "Sid": "Stmt1679966377447",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "sqs:*",
            "Resource": "arn:aws:sqs:ap-northeast-1:391178969547:media-app-queue",
            "Condition": {
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:s3:::media-app-initial-image"
                }
            }
        }
    ]
}

After adding the access policy, the ‘Access policy’ section will look like this:

Access policy (SQS)
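
If you manage the queue from code, the same policy can be applied with set_queue_attributes. Here is a minimal sketch, assuming the queue URL, account, and region shown in the policy above; note that it narrows the Action to sqs:SendMessage, which is all S3 needs to deliver events (the console policy above uses sqs:*):

import json
import boto3

sqs = boto3.client('sqs', region_name='ap-northeast-1')

queue_url = 'https://sqs.ap-northeast-1.amazonaws.com/391178969547/media-app-queue'

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "sqs:SendMessage",  # SendMessage is sufficient for S3 event delivery
        "Resource": "arn:aws:sqs:ap-northeast-1:391178969547:media-app-queue",
        "Condition": {
            "ArnEquals": {"aws:SourceArn": "arn:aws:s3:::media-app-initial-image"}
        }
    }]
}

sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={'Policy': json.dumps(policy)})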

Step 4: S3 Bucket (media-app-initial-image)

Create an Event notification that will publish an event to the SQS queue if any image is uploaded to the corresponding S3 Bucket.

1. Move to the “Properties” section:

2. Search for “Event notifications”:

3. Create an event notification:

4. Event types:

Select All object create events as the event type.

5. Destination:

Set the SQS queue (media-app-queue) as the destination where the S3 events will be published.
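
The same event notification can also be configured programmatically; a minimal boto3 sketch, assuming the queue ARN from the access policy above:

import boto3

s3 = boto3.client('s3', region_name='ap-northeast-1')

s3.put_bucket_notification_configuration(
    Bucket='media-app-initial-image',
    NotificationConfiguration={
        'QueueConfigurations': [{
            'QueueArn': 'arn:aws:sqs:ap-northeast-1:391178969547:media-app-queue',
            # "All object create events" in the console maps to s3:ObjectCreated:*
            'Events': ['s3:ObjectCreated:*'],
        }]
    }
)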

Step 5: IAM Policies

We need to create three distinct IAM policies to give the Lambda function (which we will create in a later step) the necessary permissions. These policies will allow the function to:

  • Write logs to CloudWatch
  • Have full access to the S3 buckets created earlier
  • Have read-only access to the SQS queue

By creating these policies, we can ensure that the lambda function has the appropriate level of access to these AWS resources.

Policy 1: WriteCloudWatchLogs
Policy 2: MediaAppBucketAccessPolicy
Policy 3: MediaAppQueueAccessPolicy
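
The exact policy documents are not reproduced here, but the following boto3 sketch shows one plausible way to create them. The statements are assumptions matching the access levels described above (CloudWatch logging, full access to the two buckets, and the receive/delete/get-attributes permissions an SQS-triggered Lambda requires), not the article's exact JSON:

import json
import boto3

iam = boto3.client('iam')

policies = {
    'WriteCloudWatchLogs': {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
            "Resource": "*"
        }]
    },
    'MediaAppBucketAccessPolicy': {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::media-app-initial-image",
                "arn:aws:s3:::media-app-initial-image/*",
                "arn:aws:s3:::media-app-resized-image",
                "arn:aws:s3:::media-app-resized-image/*"
            ]
        }]
    },
    'MediaAppQueueAccessPolicy': {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes"],
            "Resource": "arn:aws:sqs:ap-northeast-1:391178969547:media-app-queue"
        }]
    }
}

# Create each customer-managed policy
for name, document in policies.items():
    iam.create_policy(PolicyName=name, PolicyDocument=json.dumps(document))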

Step 6: Lambda Execution Role

After creating the IAM policies, we have to create a Lambda execution role and attach those policies to it.

1. Create an IAM role with the Lambda use case

Lambda Execution Role

2. Attach the policies

Attach Policies

3. Create the role and name it “ImageResizerLambdaRole”
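
For reference, here is a hedged boto3 sketch of the same steps: create the role with the standard Lambda trust policy and attach the three policies from Step 5 (the account ID below is a placeholder):

import json
import boto3

iam = boto3.client('iam')
account_id = '123456789012'  # placeholder: replace with your own account ID

# Trust policy that lets the Lambda service assume this role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='ImageResizerLambdaRole',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)

# Attach the three policies created in Step 5
for policy_name in ('WriteCloudWatchLogs', 'MediaAppBucketAccessPolicy', 'MediaAppQueueAccessPolicy'):
    iam.attach_role_policy(
        RoleName='ImageResizerLambdaRole',
        PolicyArn=f'arn:aws:iam::{account_id}:policy/{policy_name}'
    )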

Step 7: Configure Lambda

Create a Lambda function from scratch with the following configuration:

Function name: imageResizer
Runtime: Python 3.9
Architecture: x86_64
Execution Role: Attach the ImageResizerLambdaRole role
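
The same function can be created with boto3; a minimal sketch, assuming the handler code is zipped as function.zip, the handler is named lambda_function.lambda_handler, and the role ARN is adjusted to your account:

import boto3

lambda_client = boto3.client('lambda', region_name='ap-northeast-1')

# Read the zipped deployment package from disk
with open('function.zip', 'rb') as f:
    zipped_code = f.read()

lambda_client.create_function(
    FunctionName='imageResizer',
    Runtime='python3.9',
    Architectures=['x86_64'],
    Role='arn:aws:iam::123456789012:role/ImageResizerLambdaRole',  # placeholder account ID
    Handler='lambda_function.lambda_handler',
    Code={'ZipFile': zipped_code},
    Timeout=30,  # assumption: allow more than the 3-second default for image processing
)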

1. Deploy Code

Deploy the following code to the imageResizer Lambda function:

import json
from io import BytesIO

import boto3
from PIL import Image

client = boto3.client('s3')
destination_bucket = "media-app-resized-image"

# Keys of the bare folder objects; they carry no image data and must be skipped
exclude_keys = {'cover/', 'post/', 'profile/'}

# Target dimensions (width, height) for each image type
image_sizes = {
    'cover': (820, 360),
    'profile': (170, 170),
    'post': (1080, 1080)
}


def resizer(img, key):
    # The folder prefix (cover/post/profile) determines the target size
    image_type = key.split("/")[0]
    if image_type in image_sizes:
        resized_image = img.resize(image_sizes[image_type])
        # Save the resized image to an in-memory buffer, keeping the original format
        temp_buffer = BytesIO()
        resized_image.save(temp_buffer, format=img.format)
        resized_bytes = temp_buffer.getvalue()
        # Upload the resized image to the destination bucket under the same key
        client.put_object(Body=resized_bytes, Bucket=destination_bucket, Key=key)


def download_image(bucket_name, key):
    # Fetch the original image bytes from the source bucket
    response = client.get_object(Bucket=bucket_name, Key=key)
    return response['Body'].read()


def lambda_handler(event, context):
    print(event)
    try:
        # Each SQS record wraps an S3 event notification in its body
        for record in event['Records']:
            s3_event = json.loads(record['body'])

            # S3 sends a test event when the notification is first configured
            if 'Event' in s3_event and s3_event['Event'] == 's3:TestEvent':
                print("Test Event")
            else:
                for item in s3_event['Records']:
                    source_bucket = item['s3']['bucket']['name']
                    key = item['s3']['object']['key']
                    print(key)

                    # Skip the folder placeholder objects; resize real uploads only
                    if key not in exclude_keys:
                        image_content = download_image(source_bucket, key)
                        with Image.open(BytesIO(image_content)) as img:
                            resizer(img, key)

    except Exception as exception:
        print(exception)

2. Add Trigger

Now, add a trigger so that SQS invokes the Lambda function whenever a new message (an uploaded image) arrives.

Add Trigger
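
The console trigger corresponds to an event source mapping, which can also be created with boto3:

import boto3

lambda_client = boto3.client('lambda', region_name='ap-northeast-1')

# Poll the SQS queue and invoke imageResizer with batches of messages
lambda_client.create_event_source_mapping(
    EventSourceArn='arn:aws:sqs:ap-northeast-1:391178969547:media-app-queue',
    FunctionName='imageResizer',
    BatchSize=10,  # assumption: the default batch size for SQS triggers
)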

3. Add Lambda Layer

The Lambda function code uses the Pillow module, which is not included in the default Python runtime. To run the function without errors, we have to add a Lambda layer containing the zipped Pillow package.

To add a Lambda layer to our function, we can either create a custom layer ourselves or use one that someone has already published. There is an open-source GitHub repository that provides a collection of ARNs for common Python packages published as AWS Lambda (λ) layers.

Check this out: https://github.com/keithrozario/Klayers

To add the Pillow module for the Python 3.9 runtime in ap-northeast-1, we can use the following ARN:

arn:aws:lambda:ap-northeast-1:770693421928:layer:Klayers-p39-pillow:1

Adding Lambda Layer
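
Attaching the layer can also be done from code; a minimal sketch using update_function_configuration (note that Layers replaces the function's full list of layers):

import boto3

lambda_client = boto3.client('lambda', region_name='ap-northeast-1')

# Attach the public Klayers Pillow layer to the function
lambda_client.update_function_configuration(
    FunctionName='imageResizer',
    Layers=['arn:aws:lambda:ap-northeast-1:770693421928:layer:Klayers-p39-pillow:1'],
)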

Step 8: Test

Finally, let's upload some images to the media-app-initial-image bucket and check whether they are resized and saved to the media-app-resized-image bucket.
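
As a quick check from code, here is a small sketch that uploads a hypothetical local file into one of the folders and then lists the resized bucket (the file name is a placeholder):

import boto3

s3 = boto3.client('s3', region_name='ap-northeast-1')

# Upload a test image into the profile/ folder of the source bucket
s3.upload_file('my-photo.jpg', 'media-app-initial-image', 'profile/my-photo.jpg')

# After the pipeline runs, the resized copy should appear under the same key
response = s3.list_objects_v2(Bucket='media-app-resized-image', Prefix='profile/')
for obj in response.get('Contents', []):
    print(obj['Key'], obj['Size'])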

If anything unexpected happens, we can check the CloudWatch logs to find out why.
To see the CloudWatch logs → Move to CloudWatch → Log groups → /aws/lambda/imageResizer

With that, we have successfully created an image processing pipeline on AWS using S3, SQS, Lambda, and CloudWatch.

If you found this article helpful, please hit the Follow 👉 and Clap 👏 buttons to help me write more articles like this.
Thank You 🖤

🔔 Follow Me: Medium | LinkedIn | GitHub | Twitter


