How to deploy to EC2 using a Jenkins pipeline script? - jenkins-pipeline

I've already linked the job up to a GitHub webhook.
Now all that's left is to build the project and deploy it to EC2.
Is the Build stage the right place to create and test the files that will be uploaded to EC2? If so, what would that code look like?
I also wonder how to use dir in the Build stage, and how to write the code for the Deploy stage.
Below is the code I have written so far:
pipeline {
    agent any
    environment {
        SLACK_CHANNEL = '#'
    }
    stages {
        stage('Start') {
            steps {
                // note: the color must be hex digits ('#FFFF00' with zeros, not '#FFFFOO')
                slackSend(channel: SLACK_CHANNEL, color: '#FFFF00', message: "STARTED: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]' (${env.BUILD_URL})")
            }
        }
        stage('Checkout') {
            steps {
                git branch: 'feature/signin.up',
                    credentialsId: 'github_access_token',
                    url: 'https://github.com/###/westudy.git'
            }
        }
        stage('Build') {
            steps {
                // empty for now; this is what the question asks about
            }
        }
        stage('Deploy') {
            steps {
                // empty for now; this is what the question asks about
            }
        }
    }
}
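
No answer was posted here, but a common pattern is to compile and test inside the Build stage (using dir to scope commands to a subdirectory) and then push the result to EC2 over SSH in the Deploy stage. Below is a minimal sketch, not code from the question: the app directory, the ssh-ec2-key credential ID, the ubuntu user, the host placeholder, the remote path, and the restart script are all assumptions, and the sshagent step requires the SSH Agent plugin.

stage('Build') {
    steps {
        dir('app') {                 // assumed subdirectory containing the project
            sh 'npm install'         // assumed Node project; substitute your real build commands
            sh 'npm test'
        }
    }
}
stage('Deploy') {
    steps {
        // assumes an SSH private-key credential with ID 'ssh-ec2-key' for the instance
        sshagent(credentials: ['ssh-ec2-key']) {
            sh 'scp -o StrictHostKeyChecking=no -r app ubuntu@<EC2_PUBLIC_IP>:/home/ubuntu/westudy'
            sh 'ssh -o StrictHostKeyChecking=no ubuntu@<EC2_PUBLIC_IP> "cd /home/ubuntu/westudy && ./restart.sh"'
        }
    }
}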

Related

Copy a file to a different directory during Jenkins build on Windows

I am trying to do something that seems pretty basic but is giving me a lot of trouble: I am trying to copy the jar file that is built into a different directory, but the build keeps failing on that step. Here is my Jenkinsfile; the "Copy" step at the bottom is causing the failure.
pipeline {
    agent any
    stages {
        stage("Clean Up") {
            steps {
                deleteDir()
            }
        }
        stage("Clone Repo") {
            steps {
                bat "git clone https://github.com/CBoton/spring-boot-hello-world.git"
            }
        }
        stage("Build") {
            steps {
                dir("spring-boot-hello-world") {
                    bat "mvn clean install"
                }
            }
        }
        stage("Test") {
            steps {
                dir("spring-boot-hello-world") {
                    bat "mvn test"
                }
            }
        }
        stage("Package") {
            steps {
                dir("spring-boot-hello-world") {
                    bat "mvn package"
                }
            }
            post {
                success {
                    dir("spring-boot-hello-world") {
                        archiveArtifacts 'target/*.jar'
                    }
                }
            }
        }
        stage("Copy") {
            steps {
                dir("spring-boot-hello-world") {
                    bat ("copy target/*.jar C:\\Users\\Curti\\Downloads")
                }
            }
        }
    }
}
As I have it now, it produces the following error:
C:\ProgramData\Jenkins\.jenkins\workspace\spring\spring-boot-hello-world>copy target/*.jar C:\Users\Curti\Downloads
The syntax of the command is incorrect.
I have tried many variations for the path but can't get it to work. This is starting to drive me crazy because it seems like it would be so simple. Any help would be greatly appreciated.
@ycr, I was trying to do it without plugins, but ultimately the File Operations plugin worked, thank you. My new copy step looks like this:
stage("Copy"){
steps {
fileOperations([fileCopyOperation(
excludes: '',
flattenFiles: false,
includes: '**/*.jar',
targetLocation: "C:\\output"
)])
}
}
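
For reference, the original failure is most likely cmd.exe choking on the forward slashes in target/*.jar, since it treats / as a switch prefix. If you ever want to do it without the plugin, escaping the path with backslashes should also work; an untested sketch of the same step:

stage("Copy") {
    steps {
        dir("spring-boot-hello-world") {
            // backslashes instead of forward slashes, so cmd.exe parses the path correctly
            bat 'copy target\\*.jar C:\\Users\\Curti\\Downloads'
        }
    }
}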

Pipeline created by Job-DSL fails to run maven

I am trying to set up Jenkins. I am using the Blue Ocean Docker image. I am trying to create a Jenkins pipeline using a Job-DSL. When I create the pipeline myself and run it, it works. But when I run the pipeline created by the Job-DSL, it fails because of Maven.
I looked on the internet, but I couldn't find a solution that fits my case.
Here is the Jenkinsfile:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}
And this is the Job-DSL:
job('PROJ-unit-tests') {
    scm {
        git('git://github.com/Jouda-Hidri/Transistics.git') { node ->
            node / gitConfigName('DSL User')
            node / gitConfigEmail('hxxxa@gmail.com')
        }
    }
    triggers {
        scm('*/15 * * * *')
    }
    steps {
        maven('-e clean test')
    }
}
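
No fix was posted in this thread. One likely cause, though: job(...) creates a freestyle job, so maven('-e clean test') runs against whatever Maven installation Jenkins has configured and never executes the Jenkinsfile (or its maven:3-alpine container). Below is a sketch of a pipelineJob that checks out the repository and runs its Jenkinsfile instead; the branch and script path are assumptions:

pipelineJob('PROJ-unit-tests') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('git://github.com/Jouda-Hidri/Transistics.git')
                    }
                    branch('*/master')    // assumed branch
                }
            }
            scriptPath('Jenkinsfile')     // assumed location within the repo
        }
    }
}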

Execute all stages and steps even if one stage or step fails

I'm setting up the Jenkins pipeline shown below. The build gets aborted as soon as the first stage fails, but I want all of the stages and steps listed under stages to execute.
pipeline {
    agent none
    stages {
        stage("build and test the project") {
            agent {
                docker "coolhub/vault:jenkins"
            }
            stages {
                stage("build") {
                    steps {
                        sh 'echo "build.sh"'
                    }
                }
                stage("test") {
                    steps {
                        sh 'echo "test.sh"'
                    }
                }
            }
        }
    }
}
I'd like all of the stages and steps listed under stages to execute first. After everything has run, the job should finally be aborted, showing which stages and steps failed.
Yeah, well, there's currently no way to do that apart from try/catch blocks in a script block.
More here: Ignore failure in pipeline build step.
stage('someStage') {
    steps {
        script {
            try {
                build job: 'system-check-flow'
            } catch (err) {
                echo err.toString()   // echo expects a String, so convert the exception
            }
        }
        echo currentBuild.result
    }
}
In hakamairi's answer, the stage is not marked as failed. It is now possible to fail a stage, continue the execution of the pipeline and choose the result of the build:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh 'exit 0'
            }
        }
        stage('2') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    sh "exit 1"
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
In the example above, all stages will execute, the pipeline will be successful, but stage 2 will show as failed.
As you might have guessed, you can freely choose the buildResult and stageResult, in case you want it to be unstable or anything else. You can even fail the build and continue the execution of the pipeline.
Just make sure your Jenkins is up to date, since this is a fairly new feature.
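
For example, if you would rather have the build end up unstable than failed, a small variant of the stage above (not from the original answer) would be:

stage('2') {
    steps {
        // both results can be SUCCESS, UNSTABLE, FAILURE, etc.
        catchError(buildResult: 'UNSTABLE', stageResult: 'UNSTABLE') {
            sh "exit 1"
        }
    }
}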

Running a script after a successful build in Jenkins

I am trying to run a bash script after a successful build in Jenkins.
stages {
    stage("test") {
        steps {
            ...
        }
        post {
            success {
                steps {
                    sh "./myscript"
                }
            }
        }
    }
}
I am getting an error saying that method "steps" does not exist. How can I run a script after a successful build?
You need to remove the "steps" inside the "success" block and call the script directly inside the "success" block.
According to the docs (which are quite confusing), "success" is itself a container for steps, so there is no need to add another nested "steps":
https://jenkins.io/doc/book/pipeline/syntax/#post
stages {
    stage("test") {
        steps {
            ...
        }
        post {
            success {
                sh "./myscript"
            }
        }
    }
}
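
If the goal is to run the script after the whole build succeeds rather than after a single stage, the same post block can also sit at the pipeline level; a sketch along the same lines as the answer above:

pipeline {
    agent any
    stages {
        stage("test") {
            steps {
                echo 'test steps go here'   // placeholder for the real test steps
            }
        }
    }
    post {
        success {
            sh "./myscript"                  // runs only when every stage succeeded
        }
    }
}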

Referencing a variable in a declarative Jenkins pipeline

I am using the Groovy below to call a bat command, but no matter how I reference LOCAL_WORKSPACE within the bat command, it is never evaluated.
What am I missing?
Error
nuget restore $env.LOCAL_WORKSPACE
"Input file does not exist: $env.LOCAL_WORKSPACE"
Script
pipeline {
agent any
stages {
stage('Clone repo') {
steps {
deleteDir()
git branch: 'myBranch', changelog: false, credentialsId: 'myCreds', poll: false, url: 'http://myRepoURL'
}
}
stage ("Set any variables") {
steps{
script{
LOCAL_BUILD_PATH = "$env.WORKSPACE"
}
}
}
stage('Build It, yes we can') {
parallel {
stage("Build one") {
steps {
echo LOCAL_BUILD_PATH
bat 'nuget restore %LOCAL_WORKSPACE%'
}
}
}
}
}
}
You cannot set variables to share data between stages; basically, each script has its own namespace.
What you can do is use an environment directive, as described in the pipeline syntax docs. Those values are globally available, but they are constants, so you cannot change them in any stage.
You can compute the values, though. For example, I use an sh step to get the current number of commits on master like this:
pipeline {
    agent any
    environment {
        COMMITS_ON_MASTER = sh(script: "git rev-list HEAD --count", returnStdout: true).trim()
    }
    stages {
        stage("Print commits") {
            steps {
                echo "There are ${env.COMMITS_ON_MASTER} commits on master"
            }
        }
    }
}
You can also use environment variables to store values and access them across stages. For example, if you define LOCAL_ENVR as a Jenkins parameter, you can manipulate the variable from your stages:
stage('Stage1') {
    steps {
        script {
            env.LOCAL_ENVR = '2'
        }
    }
}
stage('Stage2') {
    steps {
        echo "${env.LOCAL_ENVR}"
    }
}
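
Coming back to the error in the question: bat 'nuget restore %LOCAL_WORKSPACE%' is a single-quoted Groovy string, so Groovy never interpolates it, and %LOCAL_WORKSPACE% is only expanded by cmd.exe if a Windows environment variable with that exact name exists (none is defined in the script). A double-quoted string that interpolates the environment variable should behave as expected; a sketch using the built-in WORKSPACE variable:

stage("Build one") {
    steps {
        // double quotes let Groovy substitute the value before cmd.exe sees the command
        bat "nuget restore ${env.WORKSPACE}"
    }
}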
