Automation of an order processing system with AWS Lambda, Amazon S3 and DynamoDB
In this project, based on a real-world scenario, I acted as the Cloud Specialist in three roles: Cloud Solutions Architect (designing the solution), Cloud Developer (writing the code for the Lambda function), and Cloud Engineer/DevOps Engineer (maintaining the solution in production). The goal was to build a serverless system with AWS Lambda that fetches CSV files from S3, processes the data, and stores the results in a DynamoDB table.
The company receives daily orders as CSV files, and my goal is to automate the order processing system: any new order uploaded to Amazon S3 is automatically written to a DynamoDB table. The system must be efficient, scalable, and serverless to handle the volume of orders.
I set up an Amazon S3 bucket and a new table in DynamoDB, and used AWS Cloud9 (an IDE similar to VS Code) to develop the Python code for the AWS Lambda function, which is triggered by an S3 event whenever a new CSV file is uploaded. The Lambda function reads the CSV file from the S3 bucket, parses the data, transforms it into a format suitable for DynamoDB, and uses the AWS SDK to write the processed records to the DynamoDB table.
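The handler described above can be sketched roughly as follows. This is a minimal sketch, not the project's exact code: the table name "Orders", the partition key order_id, and the CSV column names are assumptions made for illustration.

```python
import csv
import io


def parse_orders(csv_text):
    """Parse the CSV body into a list of DynamoDB-ready items.

    Column names (order_id, customer, quantity) are assumed for this sketch.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    items = []
    for row in reader:
        items.append({
            "order_id": row["order_id"],       # assumed partition key
            "customer": row["customer"],
            "quantity": int(row["quantity"]),  # DynamoDB stores numbers natively
        })
    return items


def lambda_handler(event, context):
    # boto3 is imported here so the parsing logic above stays testable
    # outside AWS; the library is bundled in the Lambda Python runtime.
    import boto3

    # The S3 event trigger passes the bucket and object key of the new file.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Read the uploaded CSV from S3.
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

    # Write each parsed order to the DynamoDB table in a batch.
    table = boto3.resource("dynamodb").Table("Orders")  # assumed table name
    items = parse_orders(body)
    with table.batch_writer() as writer:
        for item in items:
            writer.put_item(Item=item)

    return {"processed": len(items)}
```

Keeping the CSV parsing in its own function makes it easy to unit-test locally in Cloud9 before packaging and deploying.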
Below are a few screenshots:
Created a virtual environment in AWS Cloud9 and wrote the Python script and requirements.txt for the AWS Lambda function.
Installed the dependencies from requirements.txt and packaged them together with the Python script into lambda.zip, so that the Lambda function code and all of its libraries are in a single archive. AWS Lambda runs this zip file whenever a new CSV file of orders is uploaded.
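For a function like this, requirements.txt can be very small. A sketch (boto3 is already bundled in the Lambda Python runtime, but listing it keeps the Cloud9 virtual environment and Lambda in sync):

```text
boto3
```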
Created the AWS Lambda function in the AWS console.
Uploaded the zip file with an AWS CLI command from Cloud9 to publish the code to the Lambda function on AWS.
Then created a trigger in the Lambda function.
The Lambda function is now triggered any time a new file is uploaded to the S3 bucket. However, the execution role created for the Lambda function does not have permission to read from the S3 bucket or write records to the DynamoDB table, so we need to grant this role access to both.
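These permissions can be granted by attaching a policy like the one below to the Lambda execution role. This is a sketch: the bucket name and table name in the ARNs are placeholders for the project's real resources.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-orders-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["dynamodb:PutItem", "dynamodb:BatchWriteItem"],
      "Resource": "arn:aws:dynamodb:*:*:table/Orders"
    }
  ]
}
```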
The first order uploaded to the S3 bucket, as written to the DynamoDB table, is shown below:
The second order uploaded to the S3 bucket, as written to the DynamoDB table, is shown below:
The third order uploaded to the S3 bucket, as written to the DynamoDB table, is shown below:
The result is a highly scalable serverless architecture that can accept any number of orders, from any number of retailers, and write them to the DynamoDB table. Amazing!