Making a CI/CD pipeline for my To-Do app using GitHub Actions


Why have a CI/CD pipeline?

My To-Do app consists of a React front-end built with create-react-app and a back-end built with the Python framework FastAPI. The project uses PostgreSQL as its database and RabbitMQ as a message broker. For a more detailed post on the technologies used to build the project, see https://garethbreezecode.com/2023/03/how-i-built-my-to-do-app-an-overview/

Continuous Integration (CI) ensures that every change to your code is automatically tested and integrated, reducing the chances of breaking your application.

Continuous Deployment/Delivery (CD) takes it a step further by automating the release of your application to production. Automating these processes saves time and removes the human error that comes with doing them manually.

In this blog post I am going to explain how I made a CI/CD pipeline for my To-Do app. I used GitHub Actions to set it up, as it is what the instructor used in a CI/CD tutorial I followed.

GitHub Actions uses a simple .yaml configuration file to define and automate these processes, making it easy to set up and manage CI/CD pipelines.

To set this up, I created a YAML file and placed it in the .github/workflows directory of the repository.
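As a minimal sketch of what goes in that directory (the file name, step names and echo command here are just illustrative, not my real pipeline):

```yaml
# .github/workflows/build-deploy.yml -- illustrative skeleton only
name: Build and Deploy Code

# Events that trigger the workflow
on: [push, pull_request]

jobs:
  build:
    # Virtual machine image the job runs on
    runs-on: ubuntu-latest
    steps:
      # Check out the repository onto the runner
      - uses: actions/checkout@v3
      # Each further step runs a shell command or a published action
      - run: echo "Hello from CI"
```

GitHub picks up any workflow file in .github/workflows automatically; no extra registration is needed.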

CI pipeline

The CI pipeline builds and tests the project on a virtual machine, catching code changes that would break the project before they reach production.

It automates building and testing for the front-end, the back-end and any services the app relies on, e.g. the database and message broker.

Here is my CI pipeline:

name: Build and Deploy Code

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    environment:
      name: testing
    env:
      DATABASE_HOSTNAME: ${{secrets.DATABASE_HOSTNAME}}
      DATABASE_PORT: ${{secrets.DATABASE_PORT}}
      DATABASE_PASSWORD: ${{secrets.DATABASE_PASSWORD}}
      DATABASE_NAME: ${{secrets.DATABASE_NAME}}
      DATABASE_USERNAME: ${{secrets.DATABASE_USERNAME}}
      SECRET_KEY: ${{secrets.SECRET_KEY}}
      ALGORITHM: ${{secrets.ALGORITHM}}
      ACCESS_TOKEN_EXPIRE_MINUTES: ${{secrets.ACCESS_TOKEN_EXPIRE_MINUTES}}
      REFRESH_SECRET_KEY: ${{secrets.REFRESH_SECRET_KEY}}
      REFRESH_TOKEN_EXPIRE_MINUTES: ${{secrets.REFRESH_TOKEN_EXPIRE_MINUTES}}
      BROKER_PROTOCOL: ${{secrets.BROKER_PROTOCOL}}
      BROKER_USERNAME: ${{secrets.BROKER_USERNAME}}
      BROKER_PASSWORD: ${{secrets.BROKER_PASSWORD}}
      BROKER_HOST: ${{secrets.BROKER_HOST}}
      BROKER_PORT: ${{secrets.BROKER_PORT}}
      BROKER_VHOST: ${{secrets.BROKER_VHOST}}
      PATH_BACKEND_DIR: ${{secrets.PATH_BACKEND_DIR}}
      EMAIL_SENDER: ${{secrets.EMAIL_SENDER}}
      EMAIL_APP_PASSWORD: ${{secrets.EMAIL_APP_PASSWORD}}

    services:
      postgres:
        image: postgres
        env:
          POSTGRES_PASSWORD: ${{secrets.DATABASE_PASSWORD}}
          POSTGRES_DB: ${{secrets.DATABASE_NAME}}_test
          POSTGRES_USER: ${{secrets.DATABASE_USERNAME}}
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
      rabbitmq:
        image: rabbitmq
        ports:
          - 5672
        env:
          RABBITMQ_DEFAULT_USER: ${{secrets.BROKER_USERNAME}}
          RABBITMQ_DEFAULT_PASS: ${{secrets.BROKER_PASSWORD}}
          RABBITMQ_DEFAULT_VHOST: ${{secrets.BROKER_VHOST}}
    steps:
      - name: Pulling git repo
        uses: actions/checkout@v3
      - name: Install Python version 3.8
        uses: actions/setup-python@v4
        with:
          python-version: "3.8"
      - name: Update pip
        run: python -m pip install --upgrade pip
      - name: Install all dependencies
        run: pip install -r requirements.txt
        working-directory: ./backend
      - name: Run Automated Tests
        run: pytest
        working-directory: ./backend
      - name: Login to Docker Hub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKER_HUB_USERNAME }}
          password: ${{ secrets.DOCKER_HUB_ACCESS_TOKEN }}
      - name: Set up Docker Buildx
        id: buildx
        uses: docker/setup-buildx-action@v2
      - name: Build and push backend container
        uses: docker/build-push-action@v3
        with:
          context: ./backend
          builder: ${{ steps.buildx.outputs.name }}
          push: true
          tags: ${{ secrets.DOCKER_HUB_USERNAME }}/todo-backend:latest
          cache-from: type=registry,ref=${{ secrets.DOCKER_HUB_USERNAME }}/todo-backend:buildcache
          cache-to: type=registry,ref=${{ secrets.DOCKER_HUB_USERNAME }}/todo-backend:buildcache,mode=max
      - name: Set Up Node
        uses: actions/setup-node@v3
        with:
          node-version: 16
      - name: Install node package dependencies
        run: npm install
        working-directory: ./frontend
      - name: Execute test cases
        run: npm run test
        working-directory: ./frontend
      - name: Build and push frontend container
        uses: docker/build-push-action@v3
        with:
          context: ./frontend
          builder: ${{ steps.buildx.outputs.name }}
          push: true
          tags: ${{ secrets.DOCKER_HUB_USERNAME }}/todo-frontend:latest
          cache-from: type=registry,ref=${{ secrets.DOCKER_HUB_USERNAME }}/todo-frontend:buildcache
          cache-to: type=registry,ref=${{ secrets.DOCKER_HUB_USERNAME }}/todo-frontend:buildcache,mode=max
      - name: Create test build
        run: npm run build
        working-directory: ./frontend


This is a summary of what the CI pipeline does:

First of all, the pipeline is triggered on push and pull_request events.

Secrets are injected for environment variables, including database and broker credentials.

Services for PostgreSQL and RabbitMQ are set up for testing.

The back-end is tested using Python and pytest, with dependencies installed first.

The front-end is tested using Node.js and npm before a production build is created.

Docker images for the back-end and front-end are built and pushed to Docker Hub.

This setup ensures automated testing for both the front-end and back-end, and publishes up-to-date images to Docker Hub.
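One caveat with triggering on both push and pull_request: the Docker Hub login needs repository secrets, which GitHub does not expose to pull requests opened from forks, so those runs would fail at the push steps. A common refinement, which I have not applied above and sketch here only as one possible way to split it, is to run tests on every event but only push images on a push to the main branch:

```yaml
      # Only push images when the change has actually landed on main;
      # pull requests still run the build and test steps above.
      - name: Build and push backend container
        if: github.event_name == 'push' && github.ref == 'refs/heads/main'
        uses: docker/build-push-action@v3
        with:
          context: ./backend
          push: true
          tags: ${{ secrets.DOCKER_HUB_USERNAME }}/todo-backend:latest
```

The `if:` expression uses the standard `github` context, so no extra configuration is needed for it to work.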

CD pipeline

The CD (Continuous Deployment) pipeline automates deploying the application to the production environment on a remote Ubuntu server, where it can be accessed over the internet.

I have set this CD pipeline to deploy to a remote server that I bought and configured on DigitalOcean.

Here is my CD pipeline (this deploy job lives in the same workflow file, under the same jobs: key as build):

  deploy:
    runs-on: ubuntu-20.04
    needs: [ build ]
    environment:
      name: production
    env:
      DATABASE_HOSTNAME: ${{secrets.DATABASE_HOSTNAME}}
      DATABASE_PORT: ${{secrets.DATABASE_PORT}}
      DATABASE_PASSWORD: ${{secrets.DATABASE_PASSWORD}}
      DATABASE_NAME: ${{secrets.DATABASE_NAME}}
      DATABASE_USERNAME: ${{secrets.DATABASE_USERNAME}}
      SECRET_KEY: ${{secrets.SECRET_KEY}}
      ALGORITHM: ${{secrets.ALGORITHM}}
      ACCESS_TOKEN_EXPIRE_MINUTES: ${{secrets.ACCESS_TOKEN_EXPIRE_MINUTES}}
      REFRESH_SECRET_KEY: ${{secrets.REFRESH_SECRET_KEY}}
      REFRESH_TOKEN_EXPIRE_MINUTES: ${{secrets.REFRESH_TOKEN_EXPIRE_MINUTES}}
      BROKER_PROTOCOL: ${{secrets.BROKER_PROTOCOL}}
      BROKER_USERNAME: ${{secrets.BROKER_USERNAME}}
      BROKER_PASSWORD: ${{secrets.BROKER_PASSWORD}}
      BROKER_HOST: ${{secrets.BROKER_HOST}}
      BROKER_PORT: ${{secrets.BROKER_PORT}}
      BROKER_VHOST: ${{secrets.BROKER_VHOST}}
      PATH_BACKEND_DIR: ${{secrets.PATH_BACKEND_DIR}}
      EMAIL_SENDER: ${{secrets.EMAIL_SENDER}}
      EMAIL_APP_PASSWORD: ${{secrets.EMAIL_APP_PASSWORD}}
    steps:
      - name: Pulling git repo
        uses: actions/checkout@v4
      - name: deploy to ubuntu server via ssh
        uses: appleboy/ssh-action@v1.0.0
        with:
          host: ${{ secrets.HOST }}
          username: ${{ secrets.USERNAME }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            exec > /home/$USER/deployment_logs.log 2>&1
            echo "Deployment started at $(date)"
            export DATABASE_USERNAME=${{ secrets.DATABASE_USERNAME }}
            export DATABASE_HOSTNAME=${{ secrets.DATABASE_HOSTNAME }}
            export DATABASE_PORT=${{ secrets.DATABASE_PORT }}
            export DATABASE_PASSWORD=${{ secrets.DATABASE_PASSWORD }}
            export DATABASE_NAME=${{ secrets.DATABASE_NAME }}
            cd todo-app/src
            git pull
            cd backend
            activate () {
              . "$PWD/venv/bin/activate"
            }
            activate
            pip3 install -r requirements.txt
            alembic upgrade head
            cd ..
            cd frontend
            npm install
            npm run build  # rebuild the production bundle before copying it
            echo ${{ secrets.PRODUCTION_PASSWORD }} | sudo -S cp -r ./build/* /var/www/garethbreeze-todo-app.xyz/html
            echo "Restarting api service"
            echo ${{ secrets.PRODUCTION_PASSWORD }} | sudo -S systemctl restart backend_api.service
            echo "Restarting celery"
            echo ${{ secrets.PRODUCTION_PASSWORD }} | sudo -S systemctl restart celery.service
            echo "Deployment finished"

This is a summary of what the CD pipeline does:

The CD pipeline runs only after the build (CI) job has completed successfully.

Uses environment secrets for configuration (e.g., database credentials, broker settings, SSH access).

Connects to the production server using SSH, with a private key for authentication.

Pulls the latest changes from the Git repository on the server.

Updates the application's back-end and front-end. It does this as follows:

Activates the Python virtual environment and installs back-end dependencies.

Runs database migrations with Alembic.

Installs front-end dependencies and deploys the built front-end files to the web server directory.

Restarts the FastAPI and Celery back-end services to apply the changes.

As the pipeline runs, it logs the deployment output to a file on the server for tracking.

This pipeline ensures that the latest application changes are deployed seamlessly while keeping the production environment up to date and operational.
