Jenkins CI/CD for a Java Web App: Deploy to ECS with DevSecOps
In the landscape of modern cloud-native development, accelerating release cycles is a primary goal. However, this velocity cannot come at the expense of security. Integrating security practices directly into the automated pipeline—a philosophy known as DevSecOps—is no longer a "nice-to-have" but a fundamental requirement. For teams running Java applications on AWS, leveraging a Jenkins CI/CD DevSecOps pipeline to deploy to the Elastic Container Service (ECS) represents a powerful, scalable, and secure solution.
This comprehensive guide details the entire process, from source code to a running, secure container in the cloud. We will construct a declarative Jenkins pipeline that automatically builds, tests, and packages a Java application. More importantly, we'll embed critical security scanning—SAST, SCA, and container vulnerability scanning—before promoting the artifact to AWS ECS. This article provides a production-ready template for building security into your deployment lifecycle.
Prerequisites: Laying the Foundation
Before you begin, ensure you have the following components set up and accessible. This guide assumes you have administrator-level access to these systems.
- Jenkins Server: A stable Jenkins controller with necessary plugins installed.
- AWS Account: An active AWS account with permissions for IAM, ECR, and ECS. The AWS CLI v2 should be installed and configured on the Jenkins agent.
- Git Repository: A Java web application (e.g., Spring Boot) hosted in a Git-based SCM like GitHub or GitLab.
- Docker: Docker must be installed and runnable by the jenkins user on the agent node.
- Security Tools (External):
  - A running SonarQube server for Static Application Security Testing (SAST).
  - (Optional) An account with a vulnerability scanning tool like Trivy or Snyk. We will use Trivy, which can be installed on the agent. (A quick way to verify these tools on the agent follows this list.)
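Before wiring up the pipeline, it is worth confirming that the agent actually has the required tooling on its PATH and can reach AWS. A minimal sanity check, assuming a Linux agent and either an instance role or pre-configured credentials:

# Run these on the Jenkins agent as the jenkins user
aws --version                  # expect aws-cli/2.x
aws sts get-caller-identity    # confirms the instance role / credentials work
docker info > /dev/null && echo "Docker OK"
java -version
mvn -version
trivy --version                # only if Trivy is already installed locally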
Essential Jenkins Plugins
Navigate to Manage Jenkins > Plugins > Available and install the following:
- Pipeline (and its dependencies)
- Pipeline: AWS Steps (provides the withAWS wrapper)
- Docker Pipeline (provides docker.build() and docker.withRegistry())
- SonarQube Scanner for Jenkins
- OWASP Dependency-Check Plugin
Configuring AWS Credentials
For production-grade security, do not use static AWS keys in your Jenkins configuration. The best practice is to attach an IAM Role to the EC2 instance running your Jenkins agent. This role needs permissions for ECR (push/pull) and ECS (update services, register task definitions).
If you must use keys, add them to Jenkins under Manage Jenkins > Credentials with the ID aws-credentials and use the withAWS step.
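If you go the IAM-role route, the sketch below attaches a minimal inline policy to the agent's instance role. The role name jenkins-agent-role and the broad "Resource": "*" scoping are assumptions for illustration; tighten both for production. iam:PassRole is included because registering a task definition passes the task's execution role.

# Hypothetical role name; replace with your agent's actual instance role
cat > jenkins-ecs-deploy-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": ["ecr:GetAuthorizationToken"], "Resource": "*" },
    { "Effect": "Allow", "Action": [
        "ecr:BatchCheckLayerAvailability", "ecr:PutImage", "ecr:InitiateLayerUpload",
        "ecr:UploadLayerPart", "ecr:CompleteLayerUpload",
        "ecr:BatchGetImage", "ecr:GetDownloadUrlForLayer"
      ], "Resource": "*" },
    { "Effect": "Allow", "Action": [
        "ecs:DescribeTaskDefinition", "ecs:RegisterTaskDefinition",
        "ecs:UpdateService", "ecs:DescribeServices"
      ], "Resource": "*" },
    { "Effect": "Allow", "Action": ["iam:PassRole"], "Resource": "*" }
  ]
}
EOF

aws iam put-role-policy \
  --role-name jenkins-agent-role \
  --policy-name jenkins-ecs-deploy \
  --policy-document file://jenkins-ecs-deploy-policy.json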
Architecting the DevSecOps Pipeline
Our pipeline will be defined in a Jenkinsfile, embracing the "Pipeline as Code" model. This ensures the pipeline is version-controlled, repeatable, and auditable. The stages will be as follows:
- Checkout: Pull the source code from the Git repository.
- Security - SAST: Perform Static Application Security Testing (SAST) on the Java source code using SonarQube to find code-level vulnerabilities.
- Security - SCA: Perform Software Composition Analysis (SCA) using OWASP Dependency-Check to find vulnerabilities in third-party libraries (pom.xml dependencies).
- Build & Test: Compile the Java application, run all unit tests, and package it into a .jar file using Maven.
- Build & Push Image: Build a Docker image containing the Java app, tag it, and push it to a private AWS ECR repository (the repository itself must already exist; a one-off creation command follows this list).
- Security - Container Scan: Scan the newly built Docker image in ECR for OS-level vulnerabilities using a tool like Trivy.
- Deploy to ECS: Register a new task definition with the new image and update the ECS service to trigger a rolling deployment.
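One piece of groundwork the pipeline assumes is that the private ECR repository already exists. Creating it is a one-off step; the repository name and region below simply match the environment variables used later in the Jenkinsfile:

# One-time setup, run from any machine with ECR permissions
aws ecr create-repository \
  --repository-name my-java-web-app \
  --image-scanning-configuration scanOnPush=true \
  --region us-east-1

Enabling scanOnPush also gives you ECR's built-in basic scanning as an extra safety net alongside the Trivy stage we add later.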
Step 1: The Jenkinsfile - Initial Setup & Security Stages
In the root of your Java project, create a file named Jenkinsfile. We'll start by defining the agent, environment variables, and our initial security stages.
pipeline {
    // It's better to use a label for a specific agent: agent { label 'docker-java-agent' }
    agent any

    environment {
        // Define common variables
        AWS_REGION       = "us-east-1"
        AWS_ACCOUNT_ID   = "123456789012" // Change this
        ECR_REPO_NAME    = "my-java-web-app"
        ECS_CLUSTER_NAME = "production-cluster"
        ECS_SERVICE_NAME = "java-web-app-service"
        ECS_TASK_FAMILY  = "java-web-app-task"
    }

    tools {
        maven 'Maven 3.8' // Name from Global Tool Configuration
        jdk 'JDK 11'      // Name from Global Tool Configuration
    }

    stages {
        stage('1. Checkout SCM') {
            steps {
                git branch: 'main', url: 'https://github.com/your-org/your-java-app.git'
            }
        }

        stage('2. Security: SAST (SonarQube)') {
            steps {
                script {
                    // Assumes SonarQube server is configured in "Manage Jenkins"
                    def scannerHome = tool 'SonarQubeScanner'
                    withSonarQubeEnv('MySonarQubeServer') {
                        sh "${scannerHome}/bin/sonar-scanner -Dsonar.projectKey=my-java-app -Dsonar.sources=src/main/"
                    }
                }
            }
        }

        stage('3. Security: SCA (OWASP)') {
            steps {
                // Fails the build if any vulnerability with CVSS score >= 8 is found
                dependencyCheck additionalArguments: '--failOnCVSS 8 --format HTML', odcInstallation: 'OWASP'
            }
            post {
                always {
                    publishHTML target: [
                        allowMissing: true,
                        alwaysLinkToLastBuild: true,
                        keepAll: true,
                        reportDir: '.',
                        reportFiles: 'dependency-check-report.html',
                        reportName: 'OWASP SCA Report'
                    ]
                }
            }
        }
    }
}
These "shift-left" stages are critical. We check for vulnerabilities in our *own code* (SAST) and our *dependencies* (SCA) before we even waste time compiling or building an image. This provides the fastest possible feedback loop to developers.
Step 2: Build, Containerize, and Push to ECR
Next, we add the stages to build our application with Maven and then use Docker to package it. We'll leverage the Docker Pipeline plugin to interact with our Dockerfile and push to AWS ECR.
Example Dockerfile
Your Java project should include a Dockerfile. A simple one for a Spring Boot app might be:
# Use a secure, minimal base image
FROM amazoncorretto:11-alpine-jdk

ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} /opt/app/app.jar
WORKDIR /opt/app

# Expose the port the app runs on
EXPOSE 8080

# Run the application
ENTRYPOINT ["java", "-jar", "app.jar"]
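Before relying on the pipeline, it's worth confirming the Dockerfile builds and runs locally. A quick check; the local image tag is arbitrary, and the health-check URL assumes Spring Boot Actuator is enabled:

# Build the jar, then the image, then run it locally
mvn -B clean package
docker build -t my-java-web-app:local --build-arg JAR_FILE=target/*.jar .
docker run --rm -p 8080:8080 my-java-web-app:local
# In another terminal:
curl http://localhost:8080/actuator/health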
Jenkinsfile: Build and Push Stages
We'll add the following stages to our Jenkinsfile. This part uses docker.withRegistry() with the Amazon ECR plugin's ecr: credential helper to handle ECR authentication (the aws-credentials ID refers to the Jenkins credential configured earlier).
// ... previous stages ...

stage('4. Build & Test (Maven)') {
    steps {
        // -B runs in non-interactive (batch) mode
        // clean install runs tests and packages to target/
        sh "mvn -B clean install"
    }
}

stage('5. Build & Push Docker Image') {
    steps {
        script {
            // Define the ECR registry and the fully qualified image name
            def ecrRegistry = "${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com"
            def ecrImageUrl = "${ecrRegistry}/${ECR_REPO_NAME}:${env.BUILD_NUMBER}"

            // Build the Docker image with its full ECR name so the push targets ECR
            def dockerImage = docker.build(ecrImageUrl, "--build-arg JAR_FILE=target/*.jar .")

            // Log in to ECR and push
            // 'aws-credentials' is the Jenkins Credential ID used by the ecr: helper
            // If using an instance IAM role, this can be configured differently
            docker.withRegistry("https://${ecrRegistry}", "ecr:${AWS_REGION}:aws-credentials") {
                echo "Pushing image: ${ecrImageUrl}"
                dockerImage.push()
                // We can also tag 'latest'
                dockerImage.push('latest')
            }

            // Store the image URL for the scan and deployment stages
            env.ECR_IMAGE_URL = ecrImageUrl
        }
    }
}
// ... stages 6 and 7 follow in the next sections ...
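After this stage runs, you can confirm the image actually landed in ECR and check its tags. One way, using the names assumed earlier in this guide:

# Show the three most recently pushed images in the repository
aws ecr describe-images \
  --repository-name my-java-web-app \
  --query 'sort_by(imageDetails,&imagePushedAt)[-3:].[imageTags,imagePushedAt]' \
  --output table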
Step 3: Container Scanning and Deployment to ECS
This is the final phase of our Jenkins CI/CD DevSecOps pipeline. We will scan the image we just pushed and, if it's clean, deploy it to ECS.
Security: Container Scanning (Trivy)
It's crucial to scan the final artifact. Our code might be clean (SAST) and our libraries might be clean (SCA), but the underlying base image (e.g., amazoncorretto:11-alpine-jdk) could have OS-level vulnerabilities (like issues in openssl or the system libc). Trivy is an excellent open-source scanner for this.
This stage assumes trivy is installed on the Jenkins agent.
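If Trivy isn't available yet, one common way to install it on a Linux agent is the project's install script. The pinned version below is just an example; pick whatever release you want to standardize on.

# Install a specific Trivy release into /usr/local/bin (needs curl and sudo)
curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh \
  | sudo sh -s -- -b /usr/local/bin v0.50.1
trivy --version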
// ... previous stages ...

stage('6. Security: Scan Container (Trivy)') {
    steps {
        script {
            if (env.ECR_IMAGE_URL == null) {
                error "ECR_IMAGE_URL is not set. Cannot scan."
            }
            // Scan the image we just pushed to ECR
            // --exit-code 1: Fails the build if high/critical vulnerabilities are found
            // --ignore-unfixed: Only report on vulnerabilities that have a fix
            // We must log in to ECR for Trivy to pull the private image
            sh "aws ecr get-login-password --region ${AWS_REGION} | docker login --username AWS --password-stdin ${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com"
            sh "trivy image --exit-code 1 --severity HIGH,CRITICAL --ignore-unfixed ${env.ECR_IMAGE_URL}"
        }
    }
}
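A useful refinement is to also write the scan results to a file so findings can be archived in Jenkins even when the gating scan fails the build. A sketch, intended to run as extra sh steps in the same stage (ECR_IMAGE_URL is the variable set in stage 5):

# Save a full JSON report for archiving; this invocation does not fail the build
trivy image --format json --output trivy-report.json "$ECR_IMAGE_URL"
# Human-readable summary of HIGH/CRITICAL findings
trivy image --severity HIGH,CRITICAL --ignore-unfixed "$ECR_IMAGE_URL"

In the Jenkinsfile you could then pick up trivy-report.json with archiveArtifacts in a post block so reviewers can inspect the findings from the build page.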
Deploy to AWS ECS
The final step. We won't create a new task definition from scratch. Instead, we'll use an existing task definition as a template. The aws ecs describe-task-definition command gives us the full JSON. We can then modify just the image field and register a new version.
This approach is more robust than maintaining a separate task-definition.json file, as it respects all current settings (memory, CPU, logging, etc.) from the *running* version.
// ... previous stages ...

stage('7. Deploy to ECS') {
    steps {
        script {
            echo "Starting deployment of ${env.ECR_IMAGE_URL} to ECS..."
            // Use the withAWS wrapper for all AWS commands
            withAWS(region: AWS_REGION) {
                // 1. Get the current task definition JSON and save it to a file
                sh "aws ecs describe-task-definition --task-definition ${ECS_TASK_FAMILY} --query taskDefinition --output json > taskdef.json"

                // 2. Use 'jq' to create the new task definition:
                //    swap in the new image and strip the read-only fields that
                //    register-task-definition does not accept
                sh """
                    jq '.containerDefinitions[0].image = "${env.ECR_IMAGE_URL}"
                        | del(.taskDefinitionArn, .revision, .status, .requiresAttributes, .compatibilities, .registeredAt, .registeredBy)' taskdef.json > new-taskdef.json
                """

                // 3. Register the new task definition revision and capture its ARN
                def newTaskDefArn = sh(
                    script: "aws ecs register-task-definition --cli-input-json file://new-taskdef.json --query taskDefinition.taskDefinitionArn --output text",
                    returnStdout: true
                ).trim()
                echo "Registered new task definition: ${newTaskDefArn}"

                // 4. Update the service to use the new task definition
                //    This triggers a rolling deployment
                sh "aws ecs update-service --cluster ${ECS_CLUSTER_NAME} --service ${ECS_SERVICE_NAME} --task-definition ${newTaskDefArn}"
                echo "Deployment to ECS service ${ECS_SERVICE_NAME} initiated."
            }
        }
    }
}
} // End of stages
} // End of pipeline
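Note that update-service only starts the rolling deployment; it does not wait for it to finish. To verify the rollout (or to make the pipeline block until it completes, by adding these as final sh steps in the deploy stage), you can poll the service:

# Block until the service reaches a steady state (the waiter times out after ~10 minutes)
aws ecs wait services-stable \
  --cluster production-cluster \
  --services java-web-app-service

# Inspect deployment status and task counts
aws ecs describe-services \
  --cluster production-cluster \
  --services java-web-app-service \
  --query 'services[0].deployments[].[status,taskDefinition,runningCount,desiredCount]' \
  --output table

Adding the wait makes the build fail if the new tasks never become healthy, which is usually what you want from a deployment stage.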
Frequently Asked Questions
What is the difference between SAST, SCA, and Container Scanning?
- SAST (Static Application Security Testing): Scans your source code for security flaws (e.g., SQL injection, hardcoded secrets). Example: SonarQube.
- SCA (Software Composition Analysis): Scans your dependencies (e.g., the .jar files declared in pom.xml) for known vulnerabilities (CVEs). Example: OWASP Dependency-Check.
- Container Scanning: Scans the final artifact (the Docker image) for vulnerabilities in the base OS and system libraries. Example: Trivy.

A full DevSecOps pipeline needs all three.
This pipeline seems complex. How can I simplify it?
You can start simple. A good first step is to automate just the checkout, build, and push stages (1, 4, and 5). Once that is stable, add the security stages one by one. Start with SCA, as it often provides the most high-value findings for minimal effort. Then add SAST, and finally container scanning and automated deployment.
How should I manage database passwords or API keys?
Never put them in the Jenkinsfile, Dockerfile, or source code. Store them in AWS Secrets Manager or AWS Systems Manager (SSM) Parameter Store. Then, in your ECS Task Definition, you can securely inject them into the container as environment variables. This pipeline does not need to know the secrets; it only needs to deploy the container that *knows how to get* the secrets.
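As a concrete sketch, the flow looks like this; the parameter name and ARN are illustrative, and the task's execution role needs permission to read the parameter (ssm:GetParameters):

# 1. Store the secret once, outside the pipeline (SSM Parameter Store, SecureString)
aws ssm put-parameter \
  --name /my-java-web-app/db-password \
  --type SecureString \
  --value 'change-me'

# 2. Reference it in the task definition's container definition via "secrets"
#    (fragment of the JSON ECS expects):
#
#    "secrets": [
#      {
#        "name": "DB_PASSWORD",
#        "valueFrom": "arn:aws:ssm:us-east-1:123456789012:parameter/my-java-web-app/db-password"
#      }
#    ]

ECS resolves the value at task start and injects it as the DB_PASSWORD environment variable, so the secret never appears in the Jenkinsfile or the image.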
Conclusion
Building a secure CI/CD pipeline is a foundational pillar of a mature DevOps practice. By following this guide, you have created a robust, automated, and security-conscious process. This pipeline "shifts security left," providing developers with rapid feedback on code quality, dependency vulnerabilities, and image security *before* issues reach production.
This end-to-end Jenkins CI/CD DevSecOps pipeline not only deploys your Java application to AWS ECS reliably but also enforces security at every step. This automation frees your engineers to focus on building features, confident that the path to production is both fast and secure.
