Building a continuous delivery pipeline for database migrations with GitLab and AWS
After testing tools to automate database migrations, it is time to integrate the chosen one with my GitLab repository and build a continuous delivery pipeline for AWS.
Project stack
You can check my previous story comparing Flyway and Liquibase here, but *spoiler alert* this implementation uses the following stack:
- Flyway to manage database migrations.
- MySQL database.
- Flyway Docker image.
- GitLab CI/CD.
- Amazon Web Services (AWS).
- Check the repository here.
The process
- GitLab CI builds a Flyway Docker image and pushes it to Amazon Elastic Container Registry (ECR).
- GitLab CI triggers a lambda that runs an Amazon Elastic Container Service (ECS) task with the Flyway Docker image from ECR.
- The Flyway command “migrate” is executed and the database schema is updated.
The image below illustrates the process.
GitLab CI
The demo project has a db-migrations/scripts folder where the migration scripts are placed. Every time a change is pushed to this folder in the GitLab repository, the pipeline runs, builds a Flyway Docker image containing the scripts and pushes it to the Amazon Elastic Container Registry (ECR).
Additionally, GitLab CI triggers a lambda that starts an Amazon Elastic Container Service (ECS) task, which runs the image that was just built.
The image details are in the project's Dockerfile.
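A minimal sketch of what this Dockerfile can look like, assuming the official flyway/flyway base image (the tag below is a placeholder) and the db-migrations/scripts folder mentioned above:

```dockerfile
# Base image: the official Flyway image (pin the tag that matches your Flyway version)
FROM flyway/flyway:6

# Copy the versioned migration scripts into the folder Flyway scans by default
COPY db-migrations/scripts /flyway/sql
```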
The .gitlab-ci.yml file drives GitLab’s actions; check the stages “build-docker-image” and “execute-migrations”.
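A hedged sketch of such a pipeline is shown below. It assumes a Docker-in-Docker runner, AWS credentials exposed as CI/CD variables, and placeholder values for the region, ECR repository and lambda name, none of which are taken from the actual project:

```yaml
stages:
  - build-docker-image
  - execute-migrations

# Placeholder values: replace with your own region and ECR repository URI
variables:
  AWS_DEFAULT_REGION: us-east-1
  ECR_IMAGE: 123456789012.dkr.ecr.us-east-1.amazonaws.com/flyway-migrations:latest

build-docker-image:
  stage: build-docker-image
  image: docker:20.10
  services:
    - docker:20.10-dind
  before_script:
    # The sketch assumes a recent AWS CLI can be installed on the runner image
    - apk add --no-cache aws-cli
  script:
    # Log in to ECR, build the Flyway image with the migration scripts and push it
    - aws ecr get-login-password | docker login --username AWS --password-stdin "${ECR_IMAGE%%/*}"
    - docker build -t "$ECR_IMAGE" db-migrations/
    - docker push "$ECR_IMAGE"
  only:
    changes:
      - db-migrations/scripts/**/*

execute-migrations:
  stage: execute-migrations
  image:
    name: amazon/aws-cli
    entrypoint: [""]
  script:
    # Trigger the lambda that starts the ECS task running "flyway migrate"
    - aws lambda invoke --function-name run-flyway-migrations response.json
    - cat response.json
  only:
    changes:
      - db-migrations/scripts/**/*
```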
Lambda
The lambda is not strictly required: the same result could be achieved by calling the AWS CLI directly from the pipeline.
However, a lambda offers more flexibility to the process. It is possible to get the execution results, send emails, feed a database table with information to collect statistics and everything else your heart desires.
It also keeps the infrastructure controlled through code and under version control.
The lambda was written in Node.js 12.x, reusing code I wrote for another test. The task is started by the call “ecs.runTask(params)”.
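A minimal sketch of such a lambda, assuming the Node.js 12.x runtime with the bundled AWS SDK v2; the cluster, task definition and network values are placeholders, not the project's actual configuration:

```javascript
// Sketch of a lambda that starts the Flyway ECS task (all names and IDs are placeholders)
const AWS = require('aws-sdk');
const ecs = new AWS.ECS();

exports.handler = async (event) => {
  const params = {
    cluster: 'db-migrations-cluster',     // placeholder cluster name
    taskDefinition: 'flyway-migrations',  // placeholder task definition family
    launchType: 'FARGATE',
    count: 1,
    networkConfiguration: {
      awsvpcConfiguration: {
        subnets: ['subnet-00000000'],        // placeholder subnet
        securityGroups: ['sg-00000000'],     // placeholder security group
        assignPublicIp: 'DISABLED',
      },
    },
  };

  // Start the ECS task; it pulls the Flyway image from ECR and runs "migrate"
  const result = await ecs.runTask(params).promise();
  console.log(JSON.stringify(result.tasks, null, 2));
  return result;
};
```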
Elastic Container Service Task
The ECS task gets the Flyway Docker image and runs it. The command “migrate” will be executed, the scripts will be applied and the database schema will be updated.
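For illustration only, the container definition of that task could look like the sketch below; the image URI, database URL, role and secret ARN are placeholders, and the database password is injected from AWS Secrets Manager rather than hard-coded:

```json
{
  "family": "flyway-migrations",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "flyway",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/flyway-migrations:latest",
      "command": ["migrate"],
      "environment": [
        { "name": "FLYWAY_URL", "value": "jdbc:mysql://my-db-host:3306/my_schema" },
        { "name": "FLYWAY_USER", "value": "flyway" }
      ],
      "secrets": [
        { "name": "FLYWAY_PASSWORD", "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789012:secret:flyway-db-password" }
      ],
      "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
          "awslogs-group": "/ecs/flyway-migrations",
          "awslogs-region": "us-east-1",
          "awslogs-stream-prefix": "flyway"
        }
      }
    }
  ]
}
```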
In case of errors, I have decided to keep the fix manual for now, but it would be possible to automate other commands such as “validate” and “repair”.
I have created a CloudWatch alarm to notify me by email in case of errors. In a future iteration, I intend to handle execution errors through the lambda.
Conclusions
It can be painful to manage database migrations manually, especially with multiple environments such as development, staging and production.
However, a migration tool like Flyway integrated into a continuous delivery pipeline avoids manual execution and therefore mitigates human error. Furthermore, it relieves the burden and boredom of the activity.
This article was written in partnership with Elson Neto.