How to Deploy to S3 With GitHub Actions
Let’s go on a journey of automatically building Hugo artefacts and deploying them to AWS S3 with GitHub Actions. First, I will briefly outline the process, then focus on how to achieve it with GitHub Actions.
Hugo generates the `public/` and `resources/` directories during the build process. The contents of these directories then need to be copied to the AWS S3 bucket. That’s it!
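For reference, this is what the build looks like locally, assuming the `hugo` CLI is installed (the `--minify` flag is optional):

```shell
hugo --minify   # writes the generated site into public/
ls public       # e.g. index.html, posts/, sitemap.xml
```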
Deployment with GitHub Actions
First of all, we need to set up an IAM role in AWS for use in the GitHub Actions pipeline. This step is well documented in the official AWS tutorials. Afterwards, we need to store the AWS credentials as GitHub secrets and expose them as environment variables:
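One quick way to create these secrets, assuming you have the `gh` CLI installed and authenticated for the repository, is from the terminal (the values below are placeholders, not real credentials):

```shell
# Store the AWS credentials as repository secrets (placeholder values)
gh secret set AWS_ACCESS_KEY_ID --body "<your-access-key-id>"
gh secret set AWS_SECRET_ACCESS_KEY --body "<your-secret-access-key>"
gh secret set AWS_DEFAULT_REGION --body "<your-region>"
```

Alternatively, the same secrets can be added through the repository’s Settings → Secrets and variables → Actions page.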
```yaml
env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}
```
GitHub provides a built-in action called `actions/checkout@v4` to check out the repository into the build context, including submodules.
```yaml
- uses: actions/checkout@v4
  with:
    submodules: true
```
Now, we need to ensure that the `hugo` CLI is installed, and afterwards execute the `hugo` command to build the artefacts. However, I prefer not to install external CLI tools manually. Fortunately, the GitHub Actions Marketplace offers an action for this purpose.
```yaml
- name: Build via Hugo
  uses: lowply/build-hugo@v0.123.8
```
This action does two things: it installs the `hugo` CLI and executes the build command. All that is left now is to copy the artefacts to the S3 bucket. A naive approach is to execute the following command:
```shell
aws s3 cp <source_dir> <target_dir_on_bucket> --recursive
```
However, this approach copies the contents of `source_dir` to the bucket without considering the existing contents. For example, if I have three blog posts and delete one of them, this command will not remove it from the bucket, because there is no cleanup step.
To avoid this problem, I use the `sync` operation provided by the AWS S3 CLI.
```shell
aws s3 sync <source_dir> <target_dir_on_bucket> --delete
```
The sync operation compares the contents of `target_dir` with `source_dir`, copying new and changed files and folders; with the `--delete` flag, it also removes files and folders that no longer exist in the source (note that `sync` does not delete anything by default). Yay!!
Complete Source Code
```yaml
name: CI/CD

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]
  workflow_dispatch:

env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          submodules: true
      - name: Build via Hugo
        uses: lowply/build-hugo@v0.123.8
      - name: Upload artifacts to S3 bucket
        run: aws s3 sync ./public s3://blog.zulfiqarjunejo.com --delete
```
Feedback
Feedback is appreciated; let’s connect on GitHub!