Deploying a Hugo Site to AWS S3 via Codeship

As you may have guessed from the footer of this site, I’ve started using Hugo as my static website engine. If you’re interested in why I’m using a static site generator rather than a classic database-driven blog engine, there are a few articles on the subject.

The reason I chose Hugo is almost purely that it’s written in Go, the main language I use both at work and for personal projects. This gives me scope to make my own templates or edit those of others, as well as to contribute back to the project if there’s anything I feel can be improved.

The great thing about deployment with statically generated sites is that there’s no convoluted setup; once the files are generated, you just throw them up onto a server or CDN. Simple. However, I wanted to be able to edit pages on the move, or create a new post directly from GitHub, and deploy these changes without needing access to a laptop. Enter continuous deployment. As a bonus, keeping the site in a public GitHub repo means it’s completely open.

Codeship is a company that provides continuous integration and deployment as a service. With it, you can run tests, build projects, and then run deployment steps for specific branches (or branches with a given prefix). In this instance, it allowed me to upload Hugo-generated files to AWS S3 once a change had been made to the master branch.

A basic Codeship box won’t have Hugo installed, but it’s simple to get set up when defining the CI commands:

wget https://github.com/spf13/hugo/releases/download/v0.16/hugo_0.16_linux-64bit.tgz
tar -zxvf hugo_0.16_linux-64bit.tgz
./hugo version

This grabs version 0.16 of Hugo, unzips it, and outputs the version (just for debugging purposes, and so I know what I’m dealing with on the box). Once this is in place, I can set Codeship up to actually build my site. You’ll notice from my repo that there is no public directory. That’s because it’s git-ignored, ensuring only the configuration, theme, static files and content are kept in the repo. To create the public folder that in turn gets uploaded to S3, the following command is run during the testing phase:
./hugo
This does all the magic of actually generating the site. Nice! After this the public directory can be uploaded to S3. I’m not going to go through the process of setting up your website on S3 as there is a very good walkthrough on the AWS docs site.
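Codeship’s built-in S3 deployment can handle the upload for you, but the equivalent step can be sketched with the AWS CLI (the bucket name below is a placeholder; substitute your own website bucket):

```shell
# sync the generated public/ directory to the S3 bucket backing the site;
# --delete removes remote files that no longer exist locally
aws s3 sync public/ s3://example.com --delete
```

Either way, the key point is that only the contents of public/ end up on S3, never the source files.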

If you’re looking for a cheap way to host a site you can easily edit yourself, I’d definitely recommend this method. The only cost outside of the AWS free usage tier is hosting the DNS (for me, a mere $0.50), and Route53 is a pleasure in both its simplicity and its functionality. I’ll no doubt be posting more about Hugo as and when I sink my teeth into it a bit more!