Cloud Resume Challenge — My Experience

Rakshita Kaulgud
3 min read · Feb 27, 2021

Last Fall I started working on the Cloud Resume Challenge, and two attempts later, I finally ended up finishing it! I was familiar with AWS in a superficial sense when I started, and while parts of the challenge were familiar to me, some of it, especially using SAM to deploy the Lambda function, proved to be quite tricky.

The official set of instructions can be found here; how I approached solving this challenge is documented below:

Step 1: AWS Cloud Practitioner Certification

This was a fairly simple exam to pass, and I got this certification in August 2020. Auditing the free courses on Coursera and going through the AWS whitepapers and FAQs should do the trick.

Steps 2–6: Setting up a static site hosted on AWS S3

I have detailed the steps, and the sources I referred to, for setting this up in this blog.

Step 7: Visitor counter that displays the number of visitors to the site

This part is fairly simple, at least on the frontend side. All I had to do was create a div on the web page to display the visitor count, and write the JavaScript that updates it whenever someone accesses the page.

Step 8: Database

As suggested, I used AWS DynamoDB to create a table that holds the visitor count for the website.
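
For illustration, here is a rough boto3 sketch of what I mean by that, using placeholder names (a visitor-count table keyed on id, holding a single item with a visits attribute). The real table can just as easily be defined in the SAM template or created from the console:

```python
import boto3

# Placeholder names for illustration: a "visitor-count" table whose only item
# holds the running count.
dynamodb = boto3.resource("dynamodb")

table = dynamodb.create_table(
    TableName="visitor-count",
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Seed the single item that will hold the count.
table.put_item(Item={"id": "site", "visits": 0})
```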

Step 9: API to access the visitor count database

This step involved setting up an API Gateway endpoint that invokes a Lambda function, which in turn updates the count in DynamoDB and returns the updated count to the caller.

The JavaScript function in the frontend accessed this serverless code by calling the API, and setting all of this up was handled through Infrastructure as Code (IaC) with SAM.

Step 10: Python API code

I wrote the Lambda function in Python using the boto3 library and deployed it as part of a serverless stack using SAM. The function was exposed through an API Gateway endpoint that was set up as part of the same stack.
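
To give an idea of what such a handler looks like, here is a minimal sketch; the table, key, and attribute names are placeholders rather than my exact code. It atomically increments the count with an ADD update expression and returns the new value as JSON, with a CORS header so the S3-hosted frontend can call it:

```python
import json
import os

import boto3

# Placeholder table name; in a SAM stack this would typically come from an
# environment variable set in the template.
TABLE_NAME = os.environ.get("TABLE_NAME", "visitor-count")
table = boto3.resource("dynamodb").Table(TABLE_NAME)


def lambda_handler(event, context):
    # Atomically increment the counter and read back the new value.
    response = table.update_item(
        Key={"id": "site"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visits"])

    return {
        "statusCode": 200,
        # CORS header so the site hosted on S3 can call this API.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```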

Step 11: Unit tests

Having been a Java/C# developer, I had hardly written any unit tests in Python. The resources shared by Forrest Brazeal in the challenge and this blog on testing serverless services proved to be a great help for me.
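
To give a flavour of what these tests look like, here is a minimal pytest sketch. It assumes the handler module from the previous snippet is importable as app (a name I made up for illustration) and mocks out the DynamoDB table, so no real AWS call is made:

```python
import json
import os
from unittest.mock import MagicMock

# Make sure boto3 can build its client at import time, even with no AWS config.
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")

import app  # the Lambda handler module from the previous sketch


def test_handler_increments_and_returns_count(monkeypatch):
    # Swap the real DynamoDB table for a mock that returns a fixed count.
    fake_table = MagicMock()
    fake_table.update_item.return_value = {"Attributes": {"visits": 42}}
    monkeypatch.setattr(app, "table", fake_table)

    result = app.lambda_handler({}, None)

    assert result["statusCode"] == 200
    assert json.loads(result["body"])["count"] == 42
    fake_table.update_item.assert_called_once()
```

Running pytest in the backend repository then becomes the step the CI/CD pipeline executes before deploying the stack.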

Steps 12, 13: Separate repositories for frontend and backend code, with CI/CD setup

I chose GitHub as the version control system to store my files and used GitHub Actions to set up a CI/CD pipeline for the frontend part of the site, so that changes pushed to the repository are reflected on the website automatically, without me having to manually upload the changed files to the S3 bucket hosting my website. Similarly, I set up a CI/CD pipeline using GitHub Actions for my backend repository, so that changes checked into the API code trigger the unit tests to ensure the changes didn't break anything; once the tests pass, the SAM application is automatically built and deployed to AWS.

This was by far the trickiest part of the challenge, and being new to Infrastructure as Code, I had to put in quite a bit of effort to get it right. This blog by Chris Nagy helped me a lot in figuring out how to write the SAM template YAML file that configures my serverless stack.

Step 14: This blog is it!

I’m glad I took this challenge up since it gave me an opportunity to get hands-on with a lot of AWS services and technologies I only knew theoretically. I would recommend this challenge to anyone getting started with AWS and the cloud.
Here’s the personal site developed as a part of the challenge.
