Run Your First Security Pipeline


This tutorial will help you run your first security pipeline. Before that, let's cover some terminology.

🔓 A vulnerability is a weakness in a system or software that can be exploited by attackers to compromise security.

Let's get started. Prerequisites before running a pipeline:

How to run your first pipeline: Link

How to create a provider: Link

Hope you learned how to create a pipeline and run it! Let's go to the next step. Our task and pipeline catalog supports many scanning tools, such as Trivy, SonarQube, Snyk, and many more. To run a security pipeline, add these tasks to your pipeline and run it. How simple is that? I will show how it is done in the next step, with a small illustration after the scan-type definitions below.

SCA (Software Composition Analysis): Identifies and manages open-source and third-party components in software to detect vulnerabilities and ensure compliance.

SAST (Static Application Security Testing): Analyzes source code without executing it to identify vulnerabilities and weaknesses in the early stages of development.

DAST (Dynamic Application Security Testing): Tests running applications for vulnerabilities by simulating attacks from outside the application.

IAC (Infrastructure as Code) Scanning: Reviews infrastructure code to identify security vulnerabilities and compliance issues before deployment.
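In Ozone, these scanners run as catalog tasks, so you normally just drop the task into your pipeline. Purely as an illustration of what such a task boils down to (this is not Ozone's actual task implementation), here is a minimal Python sketch that shells out to the Trivy CLI for an image scan; it assumes Trivy is installed locally and uses nginx:latest as a stand-in image name:

```python
import json
import subprocess

# Stand-in image name for illustration only.
IMAGE = "nginx:latest"

# Run a Trivy image scan and collect its JSON report from stdout.
result = subprocess.run(
    ["trivy", "image", "--format", "json", "--severity", "CRITICAL,HIGH", IMAGE],
    capture_output=True,
    text=True,
    check=False,
)

report = json.loads(result.stdout)

# Print the findings: Trivy groups vulnerabilities per scanned target.
for target in report.get("Results", []):
    for vuln in target.get("Vulnerabilities", []) or []:
        print(vuln["VulnerabilityID"], vuln["Severity"], vuln["PkgName"])
```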

In the security dashboard, Scans show all the vulnerabilities, and Policies (at the Global, Cluster, Environment, and Microservice levels) let you set up predefined rules. Don't worry, we have your back; let me show you an example.

For example, a scan reports vulnerabilities by severity (critical, high, medium, and low). Suppose we set the policy for critical vulnerabilities to Block: the next time we run the pipeline, if the scanning tools find any critical vulnerability, the pipeline fails. If we instead set all severities to Allow, the pipeline will not fail even when the scanning tools find vulnerabilities, because the policy permits them.
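To make the Block/Allow behaviour concrete, here is a small Python sketch of how such a severity gate can be evaluated. This is only an illustration of the idea, not Ozone's internal policy engine; the policy mapping and scan counts are made-up examples:

```python
# Hypothetical severity gate: "block" fails the run when that severity is found,
# "allow" lets it pass. Ozone evaluates its own policies internally.
policy = {"critical": "block", "high": "block", "medium": "allow", "low": "allow"}

# Example scan summary: number of findings per severity (made-up numbers).
scan_summary = {"critical": 2, "high": 5, "medium": 11, "low": 40}

blocked = [
    severity
    for severity, count in scan_summary.items()
    if count > 0 and policy.get(severity) == "block"
]

if blocked:
    raise SystemExit(f"Pipeline failed: blocked severities found: {', '.join(blocked)}")
print("Pipeline passed: no blocked severities found.")
```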

Now we have covered all the security terminology. Good work so far! Let's move on to the next step.

As you can see in the pipeline templates, there is a template called Build and Deploy with DevSecOps. If you open it, you can see all the categories, from SAST, SCA, and DAST to IaC scanning, at various stages. How cool is this! You don't even have to create a pipeline; we have covered most of the use cases in our catalog. If you still want to create your own pipeline, these scanning tools are available as tasks that you can use.

Run the above pipeline. That's it, you're done ✅ great job 👏

Now click on the Build and Deploy with DevSecOps run. Check out the Logs, the Results for SBOMs, and the Security section for vulnerabilities.

SBOM, or Software Bill of Materials, is a detailed inventory listing of all components used in a software product.

SPDX and CycloneDX are SBOM formats. SPDX is a standard for documenting and exchanging software package metadata, while CycloneDX is a lightweight SBOM specification designed for easy integration into modern CI/CD pipelines.

LOGS: In the logs, you can check the vulnerability list, as shown below. You can also check the SonarQube scan output there.

RESULTS: In the results, you can check the SBOM reports.
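If you download an SBOM report and want to inspect it outside the UI, a CycloneDX JSON document can be read with a few lines of Python. This is a generic sketch based on the public CycloneDX format, not an Ozone API; sbom.json is a placeholder filename:

```python
import json

# Read a CycloneDX-format SBOM exported from the pipeline results.
# "sbom.json" is a placeholder path for illustration.
with open("sbom.json") as f:
    sbom = json.load(f)

# CycloneDX lists every discovered package under "components".
for component in sbom.get("components", []):
    print(component.get("name"), component.get("version"), component.get("purl"))
```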

SECURITY: In the security section, you can see all the information about each vulnerability, such as vulnerability ID, severity, package, and affected microservices.

You can click a vulnerability's link to see how to fix it, as shown below.

Wow 🤩, you made it! You not only learned how to run a pipeline with security integrations but also the core concepts of DevSecOps.