I'm running a Jenkins job on a GitHub project (Project A). As part of this job I want to check out a different GitHub project (Project B) using a shell script command.
If you really want to use a shell command for that, you could go for
$ git clone https://<YOUR_REPOSITORY_URL>
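In a pipeline job, that shell command would typically live inside an sh step. A minimal sketch, assuming the same placeholder URL and a hypothetical project-b target directory:

```groovy
stage('Checkout Project B') {
    steps {
        // Clone Project B into its own subdirectory so it does not
        // collide with Project A's checkout in the workspace root.
        sh 'git clone https://<YOUR_REPOSITORY_URL> project-b'
    }
}
```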
However, if you are using a Jenkins pipeline job, you might consider using the following in your Jenkinsfile:
stage('Checkout') {
    git branch: '<BRANCH_NAME>', credentialsId: '<JENKINS_CREDENTIAL_ID>', url: 'git@<URL_OF_REPOSITORY>'
}
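If Project B has to live alongside Project A in the same workspace, the git step can be scoped with dir(). A sketch using the same placeholders and a hypothetical project-b subdirectory:

```groovy
stage('Checkout Project B') {
    // dir() runs the enclosed steps in a subdirectory, so Project B's
    // checkout does not overwrite Project A's files.
    dir('project-b') {
        git branch: '<BRANCH_NAME>', credentialsId: '<JENKINS_CREDENTIAL_ID>', url: 'git@<URL_OF_REPOSITORY>'
    }
}
```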
I decided to port all my Jenkins jobs over to Jenkins Pipeline. I did a simple test with the following Jenkinsfile in the UI:
pipeline {
    agent any
    stages {
        stage('Clone Repo') {
            steps {
                git changelog: false, credentialsId: 'xxxxxxxxxx', url: 'https://github.com/xxxxx/xxx.git'
            }
        }
    }
}
This works fine. I created a repo in GitHub and checked in this Jenkinsfile. I changed the Jenkins job to "Pipeline script from SCM"; it finds the Jenkinsfile but falls over with the error message below. I know I've missed something basic, but after reading all the documentation I couldn't work it out. Any help is appreciated.
Here's the Jenkins job. There's a Jenkinsfile in the ndh_poc directory.
In the Script Path field, specify the location (and name) of your Jenkinsfile. This path is relative to the root of the repository that Jenkins checks out/clones, so it should match the repository's file structure.
In your case, if your Jenkinsfile is named Jenkinsfile and lives in the directory ndh_poc at the root of your repository, then the Script Path should be ndh_poc/Jenkinsfile
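As an illustration, the expected layout (directory name taken from the question) would look like this:

```
<repository root>
└── ndh_poc
    └── Jenkinsfile    <- Script Path: ndh_poc/Jenkinsfile
```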
I am completely new to both sbt and Jenkins. I am trying to construct a build plan using Jenkins pipelines. The following commands run just fine within a shell script: sbt compile and sbt package.
However, sbt dist does not work; I get the error "not a valid command".
What puzzles me is that I can run the command from a terminal just fine.
For what it's worth, here are the contents of the Jenkinsfile (I know sbt dist should be in Build, but I am still experimenting):
pipeline {
    agent any
    stages {
        stage('Build') {
            parallel {
                stage('Build') {
                    steps {
                        sh 'echo "Compiling... "'
                        sh 'sbt compile'
                    }
                }
                stage('Deploy') {
                    steps {
                        sh 'echo "Deploying... "'
                        sh 'echo "packaging.. "'
                        sh 'sbt dist'
                    }
                }
            }
        }
    }
}
Jenkins 2.89.1, sbt tried both: 0.13 and 1.0.x
I see now from the comments that you are working on a Play application, so it is clear where the dist task comes from.
The first thing to make sure of when running sbt tasks in Jenkins is that sbt has loaded your project's build.sbt. From your shell script it is not actually visible whether you are in the root project folder or not.
Also, Jenkins has better integration with sbt than just running it through a shell script: add the sbt plugin to Jenkins first. Look at the attached screenshot, which shows part of one of my Jenkins projects:
There is a special kind of build step in Jenkins for sbt. In your project configuration you should see it under the "Build step" dropdown:
I believe you should start from there. If you still need to run sbt through a shell script, make sure you also cd /path/to/your/project before running sbt commands.
I accepted @Alexander Arendar's answer, as it is the correct answer. I just want to elaborate on how I got it right, so it can be useful for others.
The answer, as @Alexander mentioned, is: if you still need to run through a shell script, make sure you also cd /path/to/your/project before running sbt commands. I was actually doing that. But what I was completely oblivious to is that the cd into the directory and running sbt dist must be within the same Jenkins step. I was doing them in two separate steps, and so ended up running sbt dist in the original directory.
One worthy note: the cd into the project was needed only for sbt dist, since it comes with Play. Standard sbt commands (e.g. sbt compile) were running fine outside the Play project directory.
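The fix can be sketched as a Jenkinsfile fragment; the project path is a placeholder, and dir() is an equivalent alternative to the cd:

```groovy
stage('Dist') {
    steps {
        // Each sh step starts a fresh shell, so the cd and the sbt
        // invocation must share a single step...
        sh 'cd /path/to/your/project && sbt dist'
        // ...or, equivalently, scope the step with dir():
        // dir('/path/to/your/project') { sh 'sbt dist' }
    }
}
```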
I haven't been able to find any info about this, so I hope you guys can help me on this one.
I have a Maven project hosted in Bitbucket with a Bitbucket webhook pointing to someurl/bitbucket-hook/. This hook triggers the build of my project, which is defined by a pipeline with this structure:
node {
    stage 'Checkout'
    git url: 'https:...'
    def mvnHome = tool 'M3'
    // Various stages here
    ...
    stage 'Release'
    sh "${mvnHome}/bin/mvn -B clean install release:prepare release:perform release:clean"
}
The problem is that the Maven release plugin pushes changes to Bitbucket, and this triggers the Jenkins script again, creating an infinite loop of builds. Is there a way to prevent this?
I've tried setting a quiet period in Jenkins, with no success.
From my perspective you should have separate jobs for build and release, and the release job should be triggered manually. Anyway, if there is some reason to have them in the same job, you can check the message of the last commit:
node {
    git 'https...'
    sh 'git log -1 > GIT_LOG'
    git_log = readFile 'GIT_LOG'
    if (git_log.contains('[maven-release-plugin]')) {
        currentBuild.result = 'ABORTED'
        return
    }
    ... // continue with release or whatever
}
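A variant of the same check that avoids the temporary GIT_LOG file, using the sh step's returnStdout option:

```groovy
node {
    git 'https...'
    // returnStdout captures the command's output directly,
    // instead of round-tripping it through a file
    def gitLog = sh(script: 'git log -1', returnStdout: true)
    if (gitLog.contains('[maven-release-plugin]')) {
        currentBuild.result = 'ABORTED'
        return
    }
    // ... continue with release or whatever
}
```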
The approach from the article A New Way to Do Continuous Delivery with Maven and Jenkins Pipeline solves the infinite loop:
Use the Maven release plugin to prepare a release with pushChanges=false (we are not going to push the release commits back to master) and preparationGoals=initialize (we don't care if the tag is bad as we will only push tags that are good)
sh "${mvnHome}/bin/mvn -DreleaseVersion=${version} -DdevelopmentVersion=${pom.version} -DpushChanges=false -DlocalCheckout=true -DpreparationGoals=initialize release:prepare release:perform -B"
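Because pushChanges=false keeps the release commits and tag local, the pipeline pushes them back only after the build is known good. A hedged sketch of that final step; the branch and tag names are placeholders:

```groovy
// Push the release commits and the tag back to origin
// only once release:perform has succeeded.
sh 'git push origin <BRANCH_NAME> <RELEASE_TAG>'
```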
Another solution is to change the git hook (post-receive) and add a conditional curl, similar to this script:
#!/bin/bash
git_log=$(git log --branches -1)
if ! [[ $git_log =~ .*maven-release-plugin.* ]] ;
then
curl "http://buildserver:8080/git/notifyCommit?url=ssh://git@server/projects/name.git";
fi
I am completely new to Gradle. I want to check out a remote repository using a Gradle script.
Is it possible to run a shell command inside a Gradle task to clone a remote repository into a datetime-stamped directory?
Yes, you can use a Gradle Exec-type task to execute any arbitrary command in the OS shell. See the documentation and examples here.
You didn't say what type of repo you're using, but there is a Gradle git plugin to do git operations, including checkout.
Here's an example of how we do it:
project.tasks.create(
        name: "checkOutCurrent", group: "Server", type: Exec,
        description: "Checks out the current commit on the remote server.") {
    workingDir project.rootDir
    commandLine 'git', 'checkout', this.commit
}
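The question also asked for a datetime stamp on the clone. A sketch of an Exec task along the same lines; the repository URL is a placeholder and the task name is hypothetical:

```groovy
// Hypothetical task: clone a remote repository into a
// directory named with the current date and time.
task cloneWithTimestamp(type: Exec) {
    def stamp = new Date().format('yyyyMMdd-HHmmss')
    workingDir project.rootDir
    commandLine 'git', 'clone', 'https://<YOUR_REPOSITORY_URL>', "clone-${stamp}"
}
```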
This question is very specific to Bluemix DevOps.
I have a Java backend application that has a sizeable JavaScript front end, so I created a Grunt task to do the needed work: uglify, minify, CDN-ify, etc. My current setup is to have the Bluemix build just run mvn -B package, with the Grunt task run beforehand as a script on my local machine:
#!/bin/bash
grunt build
git add --all
git commit
git push origin master
But that precludes any edits using the online editor, so I'd like to have both tasks run by the pipeline. I see these options:
Run both tasks in one build block triggered by git push as separate tasks
Run them in one build script triggered by git push
Run 2 pipeline steps, the first triggered by git push, the second by the completion of the first
something else
I haven't tried any of these yet (shame on me); I just wanted to ask whether someone has done this before. (If yes: great; if not, I will post my findings later on.)
Solved it. This is what I tried:
Modify the script in the build stage and prefix it with npm install (npm or mvn, depending on which build type I selected, wasn't found)
Add two jobs to one build stage, one Grunt and one Maven (the deploy task would not find the war file)
Use two pipeline stages (see picture below): hooray, that worked.
None of the build steps required setting a directory, which is a little trap, since mvn sets target as the default directory, so remove that. The script for the Bower/Grunt stage is:
#!/bin/bash
npm install
grunt build
the script for the maven task:
#!/bin/bash
mvn -B package
Works like a charm (just be careful not to add npm modules you don't actually need, as they slow the build down quite a bit).