
How to Set Up Argo Workflows for CI/CD Pipelines in Kubernetes

Step-by-step guide to setting up Argo Workflows for Kubernetes-native CI/CD. Covers installation, workflow templates, artifact management, CI pipeline examples, and integration with ArgoCD.

DevOpsBoys · Mar 26, 2026 · 7 min read

Jenkins is a VM-era tool. GitHub Actions runs on someone else's infrastructure. If you want CI/CD that's Kubernetes-native, scalable, and fully under your control, Argo Workflows is the answer.

Argo Workflows is a CNCF graduated project that runs CI/CD pipelines as Kubernetes pods. Each step in your pipeline is a container. You get Kubernetes-native retries, parallelism, DAG-based execution, artifact passing, and full observability — all defined in YAML.

This guide takes you from zero to a working CI/CD pipeline.

Prerequisites

  • Kubernetes cluster (1.26+)
  • kubectl configured
  • Helm 3.x installed
  • A container registry (DockerHub, GHCR, or ECR)

Step 1: Install Argo Workflows

bash
# Create namespace
kubectl create namespace argo
 
# Install with Helm
helm repo add argo https://argoproj.github.io/argo-helm
helm repo update
 
helm install argo-workflows argo/argo-workflows \
  --namespace argo \
  --set server.extraArgs="{--auth-mode=server}" \
  --set controller.workflowNamespaces="{argo,default,production}"

Verify the installation:

bash
kubectl get pods -n argo
NAME                                                  READY   STATUS    RESTARTS   AGE
argo-workflows-server-6b4d7f8c9-x2k4m               1/1     Running   0          2m
argo-workflows-workflow-controller-5f9d8b7c6-9j3kl   1/1     Running   0          2m

Step 2: Install the Argo CLI

bash
# Linux
curl -sLO https://github.com/argoproj/argo-workflows/releases/latest/download/argo-linux-amd64.gz
gunzip argo-linux-amd64.gz
chmod +x argo-linux-amd64
sudo mv argo-linux-amd64 /usr/local/bin/argo
 
# Verify
argo version
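The release page also ships macOS builds; on Apple silicon the flow is the same with a different artifact name:

```shell
# macOS (Apple silicon) -- same steps, darwin-arm64 artifact
curl -sLO https://github.com/argoproj/argo-workflows/releases/latest/download/argo-darwin-arm64.gz
gunzip argo-darwin-arm64.gz
chmod +x argo-darwin-arm64
sudo mv argo-darwin-arm64 /usr/local/bin/argo
```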

Step 3: Access the Argo UI

bash
kubectl -n argo port-forward svc/argo-workflows-server 2746:2746

Open https://localhost:2746 in your browser. The server serves a self-signed certificate by default, so you'll need to accept the browser warning (or terminate TLS at an ingress for production use).

Step 4: Configure RBAC

Argo Workflows needs permissions to create pods and manage workflows:

yaml
# argo-rbac.yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: argo-workflow
  namespace: argo
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: argo-workflow-role
rules:
  - apiGroups: [""]
    resources: ["pods", "pods/log"]
    verbs: ["get", "list", "watch", "create", "delete"]
  - apiGroups: [""]
    resources: ["secrets"]
    verbs: ["get"]
  - apiGroups: ["argoproj.io"]
    resources: ["workflows", "workflowtemplates", "cronworkflows"]
    verbs: ["get", "list", "watch", "create", "update", "delete"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: argo-workflow-binding
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: argo-workflow-role
subjects:
  - kind: ServiceAccount
    name: argo-workflow
    namespace: argo
bash
kubectl apply -f argo-rbac.yaml
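Before running any workflows, it's worth a quick sanity check that the ServiceAccount can actually do what the pipeline needs. A minimal check using kubectl impersonation, assuming the manifest above has been applied:

```shell
# Impersonate the ServiceAccount and confirm the verbs the pipeline relies on
SA="system:serviceaccount:argo:argo-workflow"

kubectl auth can-i create pods -n argo --as="$SA"
kubectl auth can-i get secrets -n argo --as="$SA"
kubectl auth can-i create workflows.argoproj.io -n argo --as="$SA"
```

Each command should print "yes"; a "no" means the ClusterRoleBinding didn't take effect.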

Step 5: Your First Workflow

Let's start with a simple workflow to understand the basics:

yaml
# hello-workflow.yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-
  namespace: argo
spec:
  entrypoint: say-hello
  serviceAccountName: argo-workflow
  templates:
    - name: say-hello
      container:
        image: busybox:1.36
        command: [echo]
        args: ["Hello from Argo Workflows!"]
bash
argo submit hello-workflow.yaml --watch
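Once submitted, the argo CLI can inspect the run without you copying the generated name — @latest resolves to the most recently started workflow in the namespace:

```shell
# List recent workflows in the namespace
argo list -n argo

# Status, step tree, and timings for the newest run
argo get @latest -n argo

# Container logs for every step of the newest run
argo logs @latest -n argo
```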

Step 6: Build a Real CI Pipeline

Now let's build a proper CI pipeline with multiple stages:

yaml
# ci-pipeline.yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: ci-pipeline-
  namespace: argo
spec:
  entrypoint: ci-pipeline
  serviceAccountName: argo-workflow
  arguments:
    parameters:
      - name: repo-url
        value: "https://github.com/your-org/your-app.git"
      - name: branch
        value: "main"
      - name: image-name
        value: "ghcr.io/your-org/your-app"
      - name: image-tag
        value: "latest"
 
  volumeClaimTemplates:
    - metadata:
        name: workspace
      spec:
        accessModes: ["ReadWriteOnce"]
        resources:
          requests:
            storage: 2Gi
 
  templates:
    - name: ci-pipeline
      dag:
        tasks:
          - name: clone
            template: git-clone
            arguments:
              parameters:
                - name: repo-url
                  value: "{{workflow.parameters.repo-url}}"
                - name: branch
                  value: "{{workflow.parameters.branch}}"
 
          - name: install
            template: npm-install
            dependencies: [clone]
 
          - name: lint
            template: npm-lint
            dependencies: [install]
 
          - name: test
            template: npm-test
            dependencies: [install]
 
          - name: build
            template: npm-build
            dependencies: [lint, test]
 
          - name: docker-build-push
            template: docker-build
            dependencies: [build]
            arguments:
              parameters:
                - name: image-name
                  value: "{{workflow.parameters.image-name}}"
                - name: image-tag
                  value: "{{workflow.parameters.image-tag}}"
 
    - name: git-clone
      inputs:
        parameters:
          - name: repo-url
          - name: branch
      container:
        image: alpine/git:2.43.0
        command: [sh, -c]
        args:
          - |
            git clone --branch {{inputs.parameters.branch}} --depth 1 \
              {{inputs.parameters.repo-url}} /workspace/source
            cd /workspace/source
            echo "Cloned $(git rev-parse --short HEAD)"
        volumeMounts:
          - name: workspace
            mountPath: /workspace
 
    - name: npm-install
      container:
        image: node:20-alpine
        command: [sh, -c]
        args:
          - |
            cd /workspace/source
            npm ci --prefer-offline
        volumeMounts:
          - name: workspace
            mountPath: /workspace
 
    - name: npm-lint
      container:
        image: node:20-alpine
        command: [sh, -c]
        args:
          - |
            cd /workspace/source
            npm run lint
        volumeMounts:
          - name: workspace
            mountPath: /workspace
 
    - name: npm-test
      container:
        image: node:20-alpine
        command: [sh, -c]
        args:
          - |
            cd /workspace/source
            npm test -- --coverage
        volumeMounts:
          - name: workspace
            mountPath: /workspace
 
    - name: npm-build
      container:
        image: node:20-alpine
        command: [sh, -c]
        args:
          - |
            cd /workspace/source
            npm run build
        volumeMounts:
          - name: workspace
            mountPath: /workspace
 
    - name: docker-build
      inputs:
        parameters:
          - name: image-name
          - name: image-tag
      container:
        image: gcr.io/kaniko-project/executor:latest
        args:
          - "--dockerfile=/workspace/source/Dockerfile"
          - "--context=/workspace/source"
          - "--destination={{inputs.parameters.image-name}}:{{inputs.parameters.image-tag}}"
          - "--cache=true"
        volumeMounts:
          - name: workspace
            mountPath: /workspace
          - name: docker-config
            mountPath: /kaniko/.docker/
      volumes:
        - name: docker-config
          secret:
            secretName: docker-registry-credentials

The DAG execution looks like:

clone → install ─┬→ lint ─┬→ build → docker-build-push
                 └→ test ─┘

Lint and test run in parallel after install. Build only starts when both pass.
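Two practical notes before submitting: the docker-build step mounts a Secret named docker-registry-credentials that the manifests above never create, and a commit-SHA tag is usually more useful than latest. A sketch of both, assuming you run it from a checkout of the repo and have a valid ~/.docker/config.json:

```shell
# One-time setup: kaniko reads registry credentials from
# /kaniko/.docker/config.json, so store your Docker config
# under that exact filename in the mounted Secret.
kubectl create secret generic docker-registry-credentials \
  --from-file=config.json="$HOME/.docker/config.json" \
  --namespace argo

# Tag images with the short commit SHA so every build is traceable
IMAGE_TAG="sha-$(git rev-parse --short HEAD)"

argo submit ci-pipeline.yaml -n argo \
  -p image-tag="$IMAGE_TAG" \
  --watch
```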

Step 7: Create Reusable Workflow Templates

Don't copy-paste workflows. Create templates:

yaml
# ci-template.yaml
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: ci-node-app
  namespace: argo
spec:
  arguments:
    parameters:
      - name: repo-url
      - name: branch
        value: "main"
      - name: node-version
        value: "20"
      - name: image-name
      - name: image-tag
 
  entrypoint: pipeline
  serviceAccountName: argo-workflow
 
  volumeClaimTemplates:
    - metadata:
        name: workspace
      spec:
        accessModes: ["ReadWriteOnce"]
        resources:
          requests:
            storage: 2Gi
 
  templates:
    - name: pipeline
      dag:
        tasks:
          - name: clone
            template: git-clone
          - name: install
            template: run-npm
            dependencies: [clone]
            arguments:
              parameters:
                - name: command
                  value: "ci --prefer-offline"
          - name: lint
            template: run-npm
            dependencies: [install]
            arguments:
              parameters:
                - name: command
                  value: "run lint"
          - name: test
            template: run-npm
            dependencies: [install]
            arguments:
              parameters:
                - name: command
                  value: "test"
          - name: build
            template: run-npm
            dependencies: [lint, test]
            arguments:
              parameters:
                - name: command
                  value: "run build"
 
    - name: git-clone
      container:
        image: alpine/git:2.43.0
        command: [sh, -c]
        args:
          - |
            git clone --branch {{workflow.parameters.branch}} --depth 1 \
              {{workflow.parameters.repo-url}} /workspace/source
        volumeMounts:
          - name: workspace
            mountPath: /workspace
 
    - name: run-npm
      inputs:
        parameters:
          - name: command
      container:
        image: "node:{{workflow.parameters.node-version}}-alpine"
        command: [sh, -c]
        args: ["cd /workspace/source && npm {{inputs.parameters.command}}"]
        volumeMounts:
          - name: workspace
            mountPath: /workspace

Use the template:

yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: my-app-ci-
spec:
  workflowTemplateRef:
    name: ci-node-app
  arguments:
    parameters:
      - name: repo-url
        value: "https://github.com/your-org/my-app.git"
      - name: branch
        value: "feature/new-api"
      - name: image-name
        value: "ghcr.io/your-org/my-app"
      - name: image-tag
        value: "pr-42"

Step 8: Trigger Workflows from GitHub Webhooks

Install Argo Events to trigger workflows from Git push/PR events:

bash
helm install argo-events argo/argo-events \
  --namespace argo

Create an event source for GitHub webhooks:

yaml
apiVersion: argoproj.io/v1alpha1
kind: EventSource
metadata:
  name: github-webhook
  namespace: argo
spec:
  github:
    app-repo:
      repositories:
        - owner: your-org
          names: [your-app]
      webhook:
        endpoint: /push
        port: "12000"
        method: POST
      events:
        - push
        - pull_request
      apiToken:
        name: github-token
        key: token
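The EventSource above reads its credentials from a Secret named github-token, which the manifests so far don't create. A minimal sketch, assuming a personal access token in the GITHUB_TOKEN environment variable (it needs permission to manage webhooks on the repository):

```shell
# Store the GitHub token under the Secret name/key the EventSource expects
kubectl create secret generic github-token \
  --from-literal=token="$GITHUB_TOKEN" \
  --namespace argo
```

GitHub also has to be able to reach the webhook endpoint on port 12000, so expose the EventSource's service through an Ingress or LoadBalancer before expecting deliveries.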

Create a sensor that triggers the CI workflow:

yaml
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: ci-trigger
  namespace: argo
spec:
  dependencies:
    - name: github-push
      eventSourceName: github-webhook
      eventName: app-repo
  triggers:
    - template:
        name: ci-workflow
        argoWorkflow:
          operation: submit
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: ci-
              spec:
                workflowTemplateRef:
                  name: ci-node-app
                arguments:
                  parameters:
                    - name: repo-url
                    - name: branch
          parameters:
            - src:
                dependencyName: github-push
                dataKey: body.repository.clone_url
              dest: spec.arguments.parameters.0.value
            - src:
                dependencyName: github-push
                dataKey: body.ref
              dest: spec.arguments.parameters.1.value

Now every push to your repo automatically triggers the CI pipeline in your Kubernetes cluster. One caveat: body.ref in the push payload is the full ref (e.g. refs/heads/main), not the bare branch name, so either strip the refs/heads/ prefix in the sensor parameter or make the clone step tolerant of full refs.

Step 9: Connect to ArgoCD for Full GitOps

The real power comes from combining Argo Workflows (CI) with ArgoCD (CD):

Push code → GitHub webhook → Argo Workflows runs CI →
Builds image → Updates image tag in GitOps repo →
ArgoCD detects change → Deploys to cluster

Add an image update step to your workflow:

yaml
- name: update-gitops-repo
  container:
    image: alpine/git:2.43.0
    command: [sh, -c]
    args:
      - |
        # Embed the token from the env var below so the clone and push are authenticated
        git clone "https://argo-ci:${GIT_TOKEN}@github.com/your-org/gitops-repo.git" /tmp/gitops
        cd /tmp/gitops
 
        # Update image tag in kustomization
        sed -i "s|newTag:.*|newTag: {{workflow.parameters.image-tag}}|" \
          apps/my-app/kustomization.yaml
 
        git config user.email "ci@your-org.com"
        git config user.name "Argo CI"
        git add .
        git commit -m "ci: update my-app image to {{workflow.parameters.image-tag}}"
        git push
    env:
      - name: GIT_TOKEN
        valueFrom:
          secretKeyRef:
            name: github-token
            key: token

Monitoring Your Pipelines

Argo Workflows exposes Prometheus metrics:

bash
kubectl -n argo port-forward svc/argo-workflows-workflow-controller 9090:9090

Key metrics:

  • argo_workflows_count — total workflows by status
  • argo_workflows_pods_count — active workflow pods
  • workflow_condition — workflow success/failure rates
  • argo_workflows_queue_depth — pending workflow queue
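With the port-forward running, you can confirm metrics are flowing before wiring up a Prometheus scrape job:

```shell
# Fetch the controller's metrics endpoint and filter for workflow counters
curl -s http://localhost:9090/metrics | grep '^argo_workflows'
```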

Why Argo Workflows Over Jenkins/GitHub Actions

| Feature | Jenkins | GitHub Actions | Argo Workflows |
|---|---|---|---|
| Runs in your cluster | Plugin needed | No | Native |
| Kubernetes-native | No | No | Yes |
| DAG execution | Limited | No | Yes |
| Artifact management | Plugin | Basic | Native |
| Auto-scaling | Manual | Managed | Pod-based |
| Cost control | Always running | Per-minute billing | Scale to zero |
| Observability | Plugins | Limited | Prometheus + UI |

Next Steps

Once you have Argo Workflows running:

  1. Set up artifact storage with S3/MinIO for build caches
  2. Add retry policies for flaky steps
  3. Implement cron workflows for scheduled builds
  4. Set up resource quotas to prevent runaway pipelines

For deeper Kubernetes and CI/CD training, KodeKloud has excellent hands-on courses that include Argo ecosystem tools. And if you need an affordable cluster for running your Argo setup, DigitalOcean Kubernetes is great for getting started.


Argo Workflows gives you Jenkins-level power with Kubernetes-native simplicity. Once you try it, you won't go back.
