News Summary App: Automated CI/CD with GitHub Actions, ArgoCD & GKE
Project Overview
This project fetches news exclusively from the United States using the NewsAPI.org service, applies a basic mock summarization (using headlines to generate fake summaries), and displays them through a modern React interface. All data is saved in a MySQL database and the entire system is deployed on Google Kubernetes Engine (GKE) using a fully automated CI/CD pipeline powered by GitHub Actions and ArgoCD.
This is a real-world example of a modern GitOps pipeline where DevOps meets full-stack deployment.
Project Goals
- Demonstrate how to build and deploy a full-stack web app using a real CI/CD pipeline
- Provide developers with a local development environment and DevOps engineers with a cloud-native deployment setup
- Enable scheduled tasks via GitHub Actions (e.g., pull news every 12 hours)
Technologies Used
Backend: Node.js + Express.js
Frontend: React + Vite + Tailwind CSS
Database: MySQL (deployed as a StatefulSet on GKE)
Containers: Docker, Docker Compose
CI/CD: GitHub Actions
GitOps: ArgoCD
Cloud Infrastructure: Google Kubernetes Engine (GKE)
News Source: NewsAPI.org
System Requirements
Ensure the following tools are installed on your system:
Node.js – Backend development runtime: https://nodejs.org/
Docker – Container platform: https://www.docker.com/products/docker-desktop
Docker Compose – Multi-container orchestration (comes with Docker Desktop)
DBeaver – GUI for managing MySQL databases: https://dbeaver.io/
Git – Version control system: https://git-scm.com/downloads
gcloud CLI – Google Cloud command-line interface: https://cloud.google.com/sdk/docs/install
kubectl – Interact with Kubernetes clusters: https://kubernetes.io/docs/tasks/tools/
VS Code – Code editor: https://code.visualstudio.com/
Required Accounts
GitHub – To host code and run GitHub Actions workflows
Google Cloud Platform – For GKE, Artifact Registry, IAM, and Kubernetes resources
NewsAPI.org – For fetching U.S. news (requires a free API key)
Project Structure
news-summary-gke/
├── frontend/            – React-based news viewer
├── src/                 – Node.js backend (Express.js)
├── mysql/init.sql       – MySQL database + table creation script
├── .github/workflows/   – CI/CD + cron-job definitions
├── manifests/           – Kubernetes manifests for GKE
├── news-app-local-dev/  – Local dev with Docker Compose
├── scripts/             – GKE & ArgoCD install scripts
├── Dockerfile           – Production backend image
└── README.md
Highlights:
frontend/: Filters, pagination, and live news display
src/: Connects to NewsAPI, processes news, stores in DB
manifests/: Defines all Kubernetes objects
.github/workflows/: CI/CD pipeline and scheduled jobs
news-app-local-dev/: Local testing with Docker Compose
How It Works
- Every 12 hours, a scheduled GitHub Actions job (fetch-news.yml) is triggered
- It fetches U.S. news using the NewsAPI.org API
- A mock summary is generated using the article title
- The news items are saved to a MySQL database
- The React frontend queries the backend API (/news) and lists the news summaries
The "Fetch Latest News" button in the dashboard does not trigger a new fetch; it simply refreshes from the database.
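For reference, the schedule that drives this job lives in .github/workflows/fetch-news.yml. A minimal sketch of the trigger section (the exact cron expression and job steps in the repo may differ):

on:
  schedule:
    - cron: "0 */12 * * *"    # every 12 hours
  workflow_dispatch: {}        # also allows manual runs from the Actions tab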
1. Clone the Project & Run Locally
Before deploying to the cloud, you should test the app locally using Docker Compose.
Step 1: Clone the Repository
git clone https://github.com/hakanbayraktar/news-summary-gke.git
cd news-summary-gke/news-app/news-app-local-dev
Step 2: Create or Edit .env File
Inside the news-app-local-dev/ directory, create or edit a .env file:
mv .env.example .env
.env file:
ENV=local
PORT=3000
DB_HOST=mysql
DB_USER=root
DB_PASS=my-secret-pw
DB_NAME=newsdb
VITE_API_URL=http://app:3000
NEWS_API_KEY=your_news_api_key_here
You can get a free NEWS_API_KEY from: https://newsapi.org
Step 3: Run with Docker Compose
docker-compose up -d
This launches:
- Frontend – React app on http://localhost:5173
- Backend – Express API on http://localhost:3000/news
- MySQL DB – Local instance on localhost:3306
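Under the hood, the Compose file in news-app-local-dev/ wires these three services together. A simplified sketch under assumed build contexts and service names (the file in the repo may differ):

services:
  mysql:
    image: mysql:8
    container_name: local-mysql            # name used by the docker exec command later in this section
    environment:
      MYSQL_ROOT_PASSWORD: ${DB_PASS}
      MYSQL_DATABASE: ${DB_NAME}
    volumes:
      - ../mysql/init.sql:/docker-entrypoint-initdb.d/init.sql   # assumed path to the init script
    ports:
      - "3306:3306"
  app:
    build: ..                               # assumed: backend Dockerfile at the repo root
    env_file: .env
    ports:
      - "3000:3000"
    depends_on:
      - mysql
  frontend:
    build: ../frontend                      # assumed frontend build context
    env_file: .env
    ports:
      - "5173:5173"
    depends_on:
      - app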
Local Testing Checklist
- News items are visible in the React dashboard
- Filtering, pagination, and search work as expected
- "Fetch Latest News" displays the most recent records
- Data is actually inserted into the news table in MySQL
You can manually check the DB:
docker exec -it local-mysql mysql -u root -p
# Enter password: my-secret-pw
USE newsdb;
SELECT * FROM news LIMIT 5;
Once local testing is confirmed, you're ready to push the code and start setting up the cloud infrastructure.
After development, you can free up space by removing unused Docker images, containers, volumes, and networks from your local machine:
docker compose down -v
docker system prune -a --volumes -f
Warning:
This command will permanently delete all stopped containers, unused images, volumes, and networks.
It is highly recommended to use this only in your local development environment.
Never use it in a production environment.
2. GitHub Repository & Personal Access Token (PAT) Setup
To automate deployment and CI/CD processes, we need:
- A GitHub repository that ArgoCD can watch
- A Personal Access Token (PAT) so GitHub Actions can push changes (like updated Kubernetes manifests) back to the repository
Step 1: Create a New GitHub Repository
Go to https://github.com/ and create a new repo:
- Name: news-summary-gke
- Description: A news summarization app with GKE + ArgoCD + GitHub Actions
- Visibility: Public or Private (your choice)
- Don't add a README, .gitignore, or license (they already exist in the project)
Step 2: Generate a GitHub Personal Access Token (PAT)
You need a PAT with permission to push code from GitHub Actions back to your repo.
How to generate:
1. Go to Settings → Developer settings → Personal access tokens → Tokens (classic)
2. Click Generate new token (classic)
3. Select scopes: repo and workflow
4. Name it something like news-summary-deploy-token
5. Choose expiration (90 days or no expiration)
6. Click Generate token and copy the value (you'll only see it once)
Step 3: Add the Token to GitHub Secrets
- Go to your repository → Settings > Secrets and variables > Actions > Secrets
- Click New repository secret
- Name: GH_PAT
- Value: paste your token
This token is used in the ci-cd.yml workflow file to push updated manifest files back to the repo:
git push https://x-access-token:${{ secrets.GH_PAT }}@github.com/${{ github.repository }}.git
If your repository is private, this token will also be needed for ArgoCD to access it.
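In a workflow, that push is typically wrapped in a step that sets a Git identity and commits the changed manifests first. A rough sketch of such a step (step name and commit message are illustrative, not copied from ci-cd.yml):

- name: Commit and push updated manifests
  run: |
    git config user.name "github-actions"
    git config user.email "actions@users.noreply.github.com"
    git add manifests/
    git commit -m "Update image tags" || echo "No manifest changes to commit"
    git push https://x-access-token:${{ secrets.GH_PAT }}@github.com/${{ github.repository }}.git HEAD:main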
Step 4: Workflow Permission Setting
GitHub repo → Settings → Actions → General → Workflow permissions → Read and write permissions → Save
3. Google Cloud Project & Service Account Setup
To deploy your app to Google Kubernetes Engine (GKE), you'll first need to:
- Create a new Google Cloud project
- Enable required APIs
- Create a service account with permissions for GKE and Artifact Registry
- Generate a JSON key for GitHub Actions
Step 1: Create a New GCP Project
Go to https://console.cloud.google.com/. From the top navigation, open the project selector → click "NEW PROJECT".
Set a project name (e.g., news-summary-project) and click "CREATE".
Make sure you've switched to the new project (top bar).
Step 2: Enable Required GCP APIs
Go to: APIs & Services → Enable APIs and Services
Search and enable the following:
- Kubernetes Engine API
- Artifact Registry API
These APIs are needed to provision GKE clusters and to store Docker images securely.
Step 3: Create a Service Account
A service account is required for GitHub Actions and ArgoCD to authenticate and manage GCP resources.
Create the Service Account:
- Go to IAM & Admin → Service Accounts
- Click "+ CREATE SERVICE ACCOUNT"
- Name: gke-deployer
- Description: For GKE & Artifact Registry access
Assign Roles
While creating the service account, assign the following roles:
- Kubernetes Engine Admin
- Artifact Registry Administrator
- Service Account User
- Service Usage Admin
These roles allow GitHub Actions to deploy workloads, push Docker images, and manage Kubernetes clusters.
Step 4: Create and Download a Key
- After the service account is created, open it
- Go to the βKeysβ tab
- Click "Add Key → Create new key"
- Choose JSON format
Download the file and save it as:
scripts/gcp-key.json
Do not commit this file to GitHub! It should already be listed in .gitignore.
Step 5: Add the Key to GitHub Secrets
In your GitHub repo:
- Go to Settings → Secrets and variables → Actions → Secrets
- Add a new secret:
- Name: GCP_CREDENTIALS
- Value: paste the contents of gcp-key.json as a single-line JSON string
4. Create Artifact Registry (Docker Image Storage)
Docker images built by GitHub Actions need to be stored in a secure, region-based container registry. For that, weβll use Google Artifact Registry.
Step 1: Go to Artifact Registry Console
- Visit: https://console.cloud.google.com/artifacts
- In the left sidebar, click "Repositories"
- Click "+ CREATE REPOSITORY"
Repository Settings
Fill out the form:
- Name: news-app
- Format: Docker
- Mode: Standard
- Location type: Region
- Region: us-central1
- Encryption: Google-managed
- Immutable image tags: Disabled
- Cleanup policy: Dry run (optional)
- Vulnerability scanning: Enabled
Then click Create.
Result
Once created, your Docker image path will look like this:
us-central1-docker.pkg.dev/YOUR_PROJECT_ID/news-app/
This path will be used in the GitHub Actions workflow and in your Kubernetes manifests:
image: us-central1-docker.pkg.dev/${{ vars.GCP_PROJECT_ID }}/news-app/news-backend:${{ env.IMAGE_TAG }}
You've now prepared your container registry. Next, we'll set up the Kubernetes cluster where these images will run.
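For context, pushing images to this registry from GitHub Actions involves authenticating with the service account key and pointing Docker at the registry host. A hedged sketch of such steps (action versions, step names, and the tag scheme are assumptions, not copied from ci-cd.yml):

- name: Authenticate to Google Cloud
  uses: google-github-actions/auth@v2
  with:
    credentials_json: '${{ secrets.GCP_CREDENTIALS }}'
- name: Configure Docker for Artifact Registry
  run: gcloud auth configure-docker us-central1-docker.pkg.dev --quiet
- name: Build and push the backend image
  run: |
    IMAGE="us-central1-docker.pkg.dev/${{ vars.GCP_PROJECT_ID }}/news-app/news-backend:${{ github.sha }}"
    docker build -t "$IMAGE" .
    docker push "$IMAGE"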
5. Create GKE Cluster Using Automation Script
We'll now create a Kubernetes cluster on Google Kubernetes Engine (GKE) using a simple Bash script. This script automates the entire setup, including:
- Authenticating with your service account
- Enabling necessary APIs
- Creating a Kubernetes cluster
- Retrieving kubectl credentials
Step 1: Configure gke-setup.sh Script
Open the file at:
scripts/gke-setup.sh
Update the following values at the top of the script:
PROJECT_ID="your-project-id"
REGION="us-central1"
CLUSTER_NAME="devops-cluster"
Step 2: Run the Script
In your terminal:
cd news-summary-gke/scripts
bash gke-setup.sh
What the Script Does
- Authenticates using gcp-key.json
- Sets the active project
- Enables GKE and Artifact Registry APIs
- Creates a cluster using:
  - Machine type: e2-small
  - Disk: 50 GB, standard
  - Nodes: 3
- Fetches credentials for kubectl
Step 3: Validate Your Cluster
Run the following to confirm everything is working:
gcloud container clusters list
kubectl get nodes
kubectl get pods -A
If you see at least 3 nodes in a Ready state, your GKE cluster is live!
You can also visually confirm the cluster in the GCP Console.
You now have a production-grade Kubernetes cluster ready for your application.
6. ArgoCD Installation and GitOps Setup
ArgoCD is a declarative, GitOps-based continuous delivery tool for Kubernetes. In this project, ArgoCD will automatically detect changes in your GitHub repo and sync your Kubernetes manifests with the GKE cluster.
What This Step Does
- Installs ArgoCD on the GKE cluster
- Exposes the ArgoCD UI via LoadBalancer
- Retrieves admin login credentials
- Connects ArgoCD to your GitHub repository
Step 1: Run the ArgoCD Installer Script
In the root of your project, run:
bash scripts/argocd-install.sh
This script installs ArgoCD into a dedicated Kubernetes namespace (argocd) and patches the argocd-server service to be exposed via LoadBalancer.
Step 2: Access the ArgoCD UI
Once the script finishes, it will output a public IP address.
Open this IP in your browser and log in with the admin username and password to access the ArgoCD UI.
Default credentials:
- Username: admin
- Password: Run the command below to get it:
kubectl -n argocd get secret argocd-initial-admin-secret -o jsonpath="{.data.password}" | base64 -d
Step 3: ArgoCD Application Setup
ArgoCD will track your Kubernetes manifests from GitHub. The configuration is defined in: manifests/argocd-app.yaml
Key fields to review:
- repoURL: https://github.com/YOUR_USERNAME/news-summary-gke
- path: manifests
Important: You must replace YOUR_USERNAME in repoURL with your actual GitHub username or repo path.
Otherwise, ArgoCD will not be able to fetch your manifests.
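For orientation, an ArgoCD Application that tracks a manifests folder generally looks like the sketch below; the application name, target revision, and sync options in the repo's argocd-app.yaml may differ:

apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: news-summary-app        # assumed name
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/YOUR_USERNAME/news-summary-gke
    targetRevision: main
    path: manifests
  destination:
    server: https://kubernetes.default.svc
    namespace: default
  syncPolicy:
    automated:
      prune: true
      selfHeal: true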
Note for Private GitHub Repos
If your GitHub repository is private, ArgoCD needs authentication.
- You can reuse your GH_PAT token (created earlier)
- Add it as a Git repository credential inside the ArgoCD UI
7. GitHub Secrets and Variables Setup
To make your GitHub Actions workflows work seamlessly with GCP, MySQL, and ArgoCD, you'll need to define:
- Secrets – Encrypted credentials (API keys, DB passwords, service account JSON)
- Variables – Public config values (project ID, region, cluster name)
These are stored in:
GitHub Repository → Settings → Secrets and variables → Actions
Required GitHub Secrets
In your GitHub repository, go to Settings → Secrets and variables → Actions → Secrets → New repository secret and add the following:
- GCP_CREDENTIALS – Paste the contents of gcp-key.json as a single-line string
- DB_USER – Example: root
- DB_PASS – Example: my-secret-pw
- DB_NAME – Example: newsdb
- NEWS_API_KEY – Your NewsAPI key from https://newsapi.org
- DB_HOST_EXTERNAL – MySQL NodePort IP + port
- GH_PAT – Your Personal Access Token (used by GitHub Actions to push changes back)
Required GitHub Variables
In your GitHub repository, go to Settings → Secrets and variables → Actions → Variables → New repository variable and add the following:
- GCP_PROJECT_ID – Your GCP project ID
- GKE_CLUSTER_NAME – Example: devops-cluster
- GKE_REGION – Example: us-central1
- VITE_NEWS_API_URL – External IP of the backend service (the news-service LoadBalancer), e.g., http://34.173.62.102
How They're Used
These secrets and variables are referenced inside your GitHub Actions workflow files:
credentials_json: '${{ secrets.GCP_CREDENTIALS }}'
project_id: ${{ vars.GCP_PROJECT_ID }}
8. Pushing the Code to Your GitHub Repository
Before pushing your code to trigger the CI/CD pipeline, make sure to update the Docker image tags manually in the following deployment files:
- manifests/deployment-backend.yaml
- manifests/deployment-frontend.yaml
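The value to change is the container image reference inside each Deployment's pod template. A sketch of the relevant fragment of deployment-backend.yaml (container name and tag here are placeholders):

template:
  spec:
    containers:
      - name: news-backend
        image: us-central1-docker.pkg.dev/YOUR_PROJECT_ID/news-app/news-backend:v1   # update this tag before pushing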
Clean Up Any Existing .git History
If you cloned the project and it already contains a .git directory from another repository, remove it first:
rm -rf .git
Initialize Git and Connect to Your Own Repository
These commands will initialize a fresh Git repository and link it to your own GitHub repo:
git init
git add .
git commit -m "first commit"
git branch -M main
git remote add origin https://github.com/yourusername/news-summary-gke-test.git
git push -u origin main
9. Setting Up Infrastructure via CI/CD Workflow
Once the code is pushed to GitHub and all required Secrets and Variables are configured, you should manually trigger the .github/workflows/setup-infra.yml file once to set up the Kubernetes infrastructure.
This process deploys the following components onto GKE:
- MySQL (StatefulSet + PersistentVolume)
- Backend API (Node.js + Express)
- Frontend UI (React + Vite)
Check the Services
kubectl get svc
Access URLs
- Frontend UI – http://34.28.153.207
- Backend API – http://34.173.62.102/news
- MySQL DB – 34.118.238.93:30306
(These are example values from this deployment; your external IPs will differ.)
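These external IPs come from Kubernetes Services of type LoadBalancer defined under manifests/. A sketch of what the backend Service can look like (the selector and port mapping are assumptions based on the news-service reference earlier):

apiVersion: v1
kind: Service
metadata:
  name: news-service
spec:
  type: LoadBalancer
  selector:
    app: news-backend
  ports:
    - port: 80          # external port behind the LoadBalancer IP
      targetPort: 3000  # the Express API port from the .env file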
Check the Pods
kubectl get pods
MySQL Validation
kubectl exec -it mysql-0 -- sh
mysql -u root -p'YourPassword'
use newsdb;
select * from news;
API and Frontend Testing
- Backend API: http://34.173.62.102/news
- Frontend UI: http://34.28.153.207
Ensure Frontend Can Fetch News
You must update the following GitHub configuration value:
- Variable: VITE_NEWS_API_URL=http://34.173.193.11 (set this to your own backend service's external IP)
You can trigger the GitHub Actions CI/CD workflow either manually or via a code push.
GitHub Actions CI/CD Pipeline (Auto-Deploy with ArgoCD)
This project uses a fully automated CI/CD pipeline, combining GitHub Actions for builds and ArgoCD for GitOps-based deployments into GKE.
Workflow Responsibilities
- GitHub Actions – Builds images, updates manifests, pushes changes
- ArgoCD – Continuously watches for Git changes and deploys to GKE
CI/CD Pipeline Flow
.github/workflows/ci-cd.yml
- Triggered on every push to the main branch
- Builds Docker images for backend and frontend
- Pushes them to Google Artifact Registry
- Updates image tags in deployment-backend.yaml and deployment-frontend.yaml
- Commits updated manifests back to the repository
- ArgoCD detects changes and deploys automatically to GKE
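In a pipeline like this, the tag update is often just a small sed over the two manifest files before they are committed back. A hedged sketch (the real ci-cd.yml may derive the tag differently or use another tool):

- name: Update image tags in manifests
  env:
    IMAGE_TAG: ${{ github.sha }}
  run: |
    sed -i "s|news-backend:.*|news-backend:${IMAGE_TAG}|" manifests/deployment-backend.yaml
    sed -i "s|news-frontend:.*|news-frontend:${IMAGE_TAG}|" manifests/deployment-frontend.yaml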
.github/workflows/fetch-news.yml
- Runs every 12 hours (or manually)
- Fetches news from newsapi.org
- Generates mock summaries from headlines
- Saves everything to MySQL
.github/workflows/setup-infra.yml
Manually triggered once via the GitHub Actions UI
What It Does:
- Authenticates with Google Cloud
- Gets GKE cluster credentials
- Creates Kubernetes Secret with DB and API credentials
- Deploys:
  - ConfigMap (initial SQL script)
  - MySQL StatefulSet with persistent volume
  - Job to enable remote MySQL root access
Run this workflow once after setting all secrets and variables.
ArgoCD will take over deployments afterward.
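The "Creates Kubernetes Secret" step in a workflow like this usually shells out to kubectl with the GitHub Secrets injected as literals. A minimal sketch (the secret name, keys, and step name are assumptions, not taken from setup-infra.yml):

- name: Create application secret
  run: |
    kubectl create secret generic news-app-secret \
      --from-literal=DB_USER='${{ secrets.DB_USER }}' \
      --from-literal=DB_PASS='${{ secrets.DB_PASS }}' \
      --from-literal=DB_NAME='${{ secrets.DB_NAME }}' \
      --from-literal=NEWS_API_KEY='${{ secrets.NEWS_API_KEY }}' \
      --dry-run=client -o yaml | kubectl apply -f -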
Conclusion
In this project, we successfully built a fully automated CI/CD pipeline that combines the power of GitHub Actions, ArgoCD, and GKE. The system ensures that every code push triggers a complete deployment workflow, from building Docker images to syncing Kubernetes manifests.
Key outcomes:
- GitOps workflow using ArgoCD
- Secure image storage in Google Artifact Registry
- Dynamic updates with GitHub Actions workflows
- Production-ready structure with monitoring and scalability potential
This setup significantly reduces manual intervention and provides a robust, scalable, and reproducible deployment strategy for modern cloud-native applications.
All code files and configuration can be found in the following GitHub repository: https://github.com/hakanbayraktar/news-summary-gke
