
Optimizing DevOps Workflows with GitHub Actions and Docker


In this third installment of my DevOps series, I’ll explore the integration of GitHub Actions and Jest testing for automating workflows.

As I am currently in job search mode, I decided to dedicate a few days to teaching myself DevOps while documenting the process in a blog. In the first part, I successfully set up a React project within a Docker container, which was quite simple. My intention for Part 2 was to configure AWS to enable GitHub Actions to build and deploy the project whenever I pushed updates. In this segment, my goal is to expand on the GitHub Actions setup.

However, things didn't unfold as planned. If you recall from Part 2, I encountered issues with the Service failing to create and subsequently disappearing. I suspected this was due to the Task failing to retrieve the container from ECR, as the action responsible for this hadn’t been built yet.

While troubleshooting Part 2, I managed to get the action to push the container image to ECR, which I highly recommend doing first if you're following along: until the image actually exists in ECR, the Task can't pull it and the Service will keep failing.

Before I get ahead of myself, I had referenced a tutorial in Part 2 that guided me in establishing the correct role for OpenID Connect functionality.

Following the tutorial closely, I ended up with a GitHub Action that could authenticate using OpenID Connect. However, it failed when trying to print a variable that wasn't configured in AWS (the tutorial’s author had set his up externally, which I hadn't prioritized researching).

I also discovered Amazon's official action for deploying a container and maintaining the Service and Task. I could access a similar action in the GitHub marketplace by navigating to the "Actions" tab in my repo. Although this action was better formatted, it similarly required secrets that I lacked due to opting for OpenID Connect, which is generally considered a best practice. I find it odd that the official action deviates from Amazon's recommended protocols, but I digress.

Combining The Actions

After scrutinizing my partially functioning file and the two official action versions, I sought a way to merge my authentication with the action code.

I ultimately concluded that starting with the official action necessitated adding the following permissions right beneath the environment variables:

permissions:
  id-token: write # Required for requesting the JWT
  contents: read  # Required for actions/checkout

Next, I needed to replace the block in the "configure AWS credentials" step with:

with:
  role-to-assume: arn:aws:iam::actual_number_from_aws:role/Github
  role-session-name: github_session
  aws-region: ${{ env.AWS_REGION }}
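Put together, the credentials step ends up looking something like this (a sketch: the action version tag is an assumption, and the role ARN is the same placeholder as above):

- name: Configure AWS credentials
  uses: aws-actions/configure-aws-credentials@v2
  with:
    role-to-assume: arn:aws:iam::actual_number_from_aws:role/Github
    role-session-name: github_session
    aws-region: ${{ env.AWS_REGION }}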

When I tried this, it didn't work, owing to mismatches between the task definition file I had created for the ECS_TASK_DEFINITION environment variable and the CONTAINER_NAME variable (more details to follow). What did work was incrementally incorporating elements from the official AWS action into my partially working file. This piecemeal approach let me debug each component as it broke, ultimately leading to much the same solution described above.

Task Definition File

As previously mentioned, I encountered several challenges with the task definition file. The documentation lacks clarity regarding its purpose and usage. After some investigation (including consulting ChatGPT), I learned that having this file in my GitHub repository allows the step "Fill in the new image ID in the Amazon ECS Task Definition" to create a new revision of the task with the updated container image tag. Following this, the step "Deploy Amazon ECS Task Definition" updates the service to refer to this new revision.
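For reference, those two steps are wired together roughly like this in the official AWS workflow (the step names and inputs follow the AWS template; the render step's output feeds the deploy step):

- name: Fill in the new image ID in the Amazon ECS task definition
  id: task-def
  uses: aws-actions/amazon-ecs-render-task-definition@v1
  with:
    task-definition: ${{ env.ECS_TASK_DEFINITION }}
    container-name: ${{ env.CONTAINER_NAME }}
    image: ${{ steps.build-image.outputs.image }}

- name: Deploy Amazon ECS task definition
  uses: aws-actions/amazon-ecs-deploy-task-definition@v1
  with:
    task-definition: ${{ steps.task-def.outputs.task-definition }}
    service: ${{ env.ECS_SERVICE }}
    cluster: ${{ env.ECS_CLUSTER }}
    wait-for-service-stability: true

The image input comes from an earlier build-and-push step (steps.build-image in the AWS template), which is why getting the push working first matters so much.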

However, correct setup of the task definition is crucial. The GitHub Actions documentation for deploying to Amazon Elastic Container Service suggests utilizing the following command:

aws ecs register-task-definition --generate-cli-skeleton

This command merely prints an empty JSON skeleton to the console, which left me skeptical about its utility. Instead, I adapted the JSON from my own existing definition, removing the unnecessary properties (the GitHub Actions log points these out) and stripping the tag from the container image. Additionally, I adjusted the container name in both the YAML and JSON files to stay consistent with my changes from Part 2.
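For a concrete picture, a trimmed-down Fargate task definition ends up looking something like this (every name and value here is a placeholder; note the image is tagless, since the render step fills the tag in at deploy time):

{
  "family": "my-task-family",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::actual_number_from_aws:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "my-container",
      "image": "actual_number_from_aws.dkr.ecr.us-east-1.amazonaws.com/reponame",
      "essential": true,
      "portMappings": [{ "containerPort": 80, "protocol": "tcp" }]
    }
  ]
}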

Although it was frustrating, this experience provided valuable insights into the workings of these files. There may be more efficient methods to manage this, but my motivation for writing this is to share the fragmented pieces necessary for creating this pipeline.

Once everything is configured correctly, updating the task definition in my repository is automatically reflected in the running ECS service. I do wonder, though, whether it would suffice to point the task definition's image at the latest tag and skip keeping the task definition JSON (and the actions that update it) in the repository at all. Perhaps the new revision is what prompts the service to notice that the container has changed.

Regardless, I now have AWS configured to automatically deploy changes pushed to my repository via GitHub Actions within five minutes. The next step is ensuring this deployment occurs only if all tests pass.

Running the Tests Automatically

I should have verified that the tests still ran in the project after saving from CodeSandbox to GitHub. Oops. While I won't provide a tutorial on fixing that, I did manage to resolve the issue. If you wish to use my code for your own experiments, consider forking my repository and removing the contents of the .github/workflows and .aws directories, as sketched below.
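If you do fork it, the cleanup might look something like this (the repository URL is a placeholder for your own fork):

git clone https://github.com/your-username/your-fork.git
cd your-fork
# Drop the pipeline config so the fork doesn't try to deploy to my AWS setup
git rm -r .github/workflows .aws
git commit -m "Remove the original CI/CD configuration"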

Try #1: Running the Tests in the Docker Container

Since my Dockerfile was already executing npm commands, my initial thought was simply to add to it.
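What I first tried looked something like this; the base image and surrounding steps here are illustrative rather than my exact file, but the key detail is RUN npm test sitting before the COPY:

FROM node:16-alpine
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
# Run the tests before copying the rest of the source (this turns out to be the mistake)
RUN npm test
COPY . .
RUN npm run build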

My rationale was that if we could perform npm install, we could also run npm test, and it would be wise to verify the tests passed before copying files.

This attempt failed. It was clear I needed to reevaluate my understanding, but the question was: how?

After some research, I discovered I could add the following line to my Dockerfile to inspect the contents of the /app directory (which observant readers will note was designated as the WORKDIR on line 3 of the Dockerfile):

RUN pwd && ls /app

Moreover, when building locally, the docker build command must be run with the cache disabled and plain progress output, or the output of that RUN step won't be shown:

docker build -t gitusername/reponame . --progress=plain --no-cache

This led to the realization that at the moment I was attempting to run my tests, only the package.json, package-lock.json, and node_modules were present. I was trying to execute the tests before copying all relevant files. Once I grasped this, I moved the RUN npm test command below the COPY command, and it succeeded.
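With the same sketch as above, the working order looks like this:

FROM node:16-alpine
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
COPY . .
# Now the source and test files actually exist in /app
RUN npm test
RUN npm run build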

However, I then recognized that I had been assuming the GitHub action checked out the repository before constructing the Docker container, leading to some confusion about where one process ended and the other began.

Once I clarified this, I reverted the Dockerfile change (though I documented the experience here, as it may be useful for others wanting to see the contents of the working directory during container build).

Try #2: Running the Tests in the GitHub Action

As is often the case, I discovered several resources that were relevant to my objectives, though not entirely applicable:

  • Build and Test React app with GitHub Actions

    This article walks through setting up GitHub Actions for building and testing React applications.

  • Using scripts to test your code on a runner - GitHub Docs

    This documentation illustrates an example workflow showcasing key CI features of GitHub Actions.

By synthesizing insights from these resources, I devised two new steps to incorporate into my existing workflow YAML file, situated between the code checkout and AWS credentials setup steps:

- name: Setup node
  uses: actions/setup-node@v3
  with:
    node-version: 16.13.x
    cache: npm

- name: Install/Test
  run: |
    npm ci
    npm test

Once I had this functioning, the deployment was gated the way I wanted: workflow steps run sequentially, so if npm test fails, the job stops before the AWS steps ever run. I also realized, though, that we were now effectively executing npm install (in this case, npm ci) twice during the build process, which may be less efficient than running npm test within the Docker build after the initial npm install. I'm still not certain which approach is optimal.

Summary/Conclusion

This journey has been enlightening, and I've gained a wealth of knowledge throughout the process. It took me about a week to work through everything, including writing this documentation. Naturally, as you repeat such tasks, the process becomes quicker. However, once established, it streamlines operations for the entire team, automating tasks without relying on a single individual with the necessary expertise.

A big thank you to all the DevOps professionals who managed these processes behind the scenes, allowing me to learn about them now!
