Master in Computer Engineering – Mobile Computing (2025/2026)
Date: 7th April 2026
Based on the previous lab's source code, in this worksheet we will create a CI/CD pipeline to build and push Docker images to the GitHub Container Registry (GHCR) using GitHub Actions. In the last part, we will improve service resilience by adding a retry-and-fail mechanism to the database connection and customizing the PostgreSQL service's health check.
Fork the previous lab repository. Your source code should include:
- Frontend custom-built image
- API custom-built image
- Nginx Proxy Manager
- MinIO
- PostgreSQL
Create a GitHub Workflow (.github/workflows/ci.yml) that automatically builds your frontend and api images on every push to the main branch.
Because developers use different computers (e.g., Apple Silicon M-chips vs. Intel/AMD), you must build images for both linux/amd64 and linux/arm64 architectures.
Tips:
- Use `actions/checkout@v4` to check out the repository code to the runner.
- Use `docker/login-action@v3` to log into GHCR.
- Use `docker/setup-qemu-action@v3` to prepare for multi-architecture builds.
- Use `docker/setup-buildx-action@v3` to set up the advanced builder.
- Use `docker/build-push-action@v5` to build and push each image (set `platforms: linux/amd64,linux/arm64`).
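Putting the tips together, a minimal workflow could look like the sketch below. The image names, tags, and Dockerfile context paths (`./api`, `./frontend`) are assumptions — adjust them to your repository layout:

```yaml
# .github/workflows/ci.yml — minimal sketch, not a definitive pipeline
name: ci

on:
  push:
    branches: ["main"]

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Enable multi-architecture (amd64 + arm64) builds
      - uses: docker/setup-qemu-action@v3
      - uses: docker/setup-buildx-action@v3

      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GHCR_PAT }}

      - name: Build and push API image
        uses: docker/build-push-action@v5
        with:
          context: ./api                  # assumed path to the API Dockerfile
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ghcr.io/${{ github.repository_owner }}/api:latest

      - name: Build and push frontend image
        uses: docker/build-push-action@v5
        with:
          context: ./frontend             # assumed path to the frontend Dockerfile
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ghcr.io/${{ github.repository_owner }}/frontend:latest
```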
To allow GitHub Actions to push images to your account, you must provide the right permissions.
- Generate a GitHub Personal Access Token (Classic) with the `write:packages` scope.
- Add this token as a Repository Secret (e.g., `GHCR_PAT`) and use it in your `docker/login-action` step.
- Important: Once your pipeline successfully pushes the images, go to your GitHub Profile -> Packages, and change the visibility of both your newly built packages from Private to Public.
Now that your images are built in the cloud, update your local docker-compose.yml to pull the images directly from the GitHub Container Registry, instead of building them locally.
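For example, the API service entry would swap its local `build:` instruction for an `image:` reference. The owner and image names below are placeholders — use the names your workflow actually pushed:

```yaml
services:
  api:
    # Pull the prebuilt multi-arch image from GHCR instead of building locally
    image: ghcr.io/<your-username>/api:latest
    # build: ./api   <- remove or comment out the local build instruction
```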
The API may still struggle to connect to the database depending on the boot sequence. To mitigate this issue, we must fix both the API and Database services.
On the API side:
- Add a database connection retry mechanism in your Node.js code.
- After 5 failed retries, stop the container completely by triggering an error return code (`process.exit(1)`).
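A minimal sketch of such a retry loop, assuming a Promise-returning connect function (e.g. from the `pg` library); the function name, retry count, and delay are illustrative:

```javascript
// Sketch: retry a database connection, then exit with a non-zero code.
// `connectFn` is any async function that resolves when the DB is reachable.
async function connectWithRetry(connectFn, retries = 5, delayMs = 3000) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await connectFn();
    } catch (err) {
      console.error(`DB connection attempt ${attempt}/${retries} failed: ${err.message}`);
      if (attempt === retries) {
        // All retries exhausted: stop the container with an error return code
        // so Docker restart policies (or orchestrators) can take over.
        process.exit(1);
      }
      // Wait before the next attempt
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

At startup you would call it with your real connection, e.g. `await connectWithRetry(() => pool.connect())`.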
On the Database side:
- Edit `docker-compose.yml` so the database service has a health check that confirms PostgreSQL is actually ready to accept connections:
```yaml
healthcheck:
  test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
  interval: 5s
  timeout: 5s
  retries: 5
```

- Update the API: modify the API's `depends_on` block in `docker-compose.yml` to wait for the database's new health condition:
```yaml
depends_on:
  postgres-db:
    condition: service_healthy
```