I am trying to set up a Jenkins pipeline that triggers Gradle builds for multiple environments. My requirement is that running gradlew clean build should produce artifacts whose names indicate the environment the pipeline was run for, e.g. my-application-dev.jar.
The environment would be selected by the user when the build is triggered.
What is the best way to achieve this? Does Gradle allow such a property to be configured via the command line, or do I need to define a task in my build.gradle and define properties within that task whose values I pass from the command line?
There are basically two ways.
The first one is to pass these naming-relevant pieces of information to the gradlew process, e.g. via -D or -P properties.
Then you need the Gradle build to be aware of these parameters and craft the artifact names it produces accordingly.
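For example, a minimal build.gradle sketch of this first approach could look like the following; the property name env, the artifact name and the use of archiveFileName (Gradle 5+) are assumptions to adapt to your project:

```groovy
// Hypothetical sketch -- the property name "env" and the artifact name are assumptions.
def targetEnv = project.findProperty('env') ?: 'dev'   // picked up from -Penv=<value>

jar {
    // produces e.g. build/libs/my-application-dev.jar when run with -Penv=dev
    archiveFileName = "my-application-${targetEnv}.jar"
}
```

The Jenkins pipeline would then run something like ./gradlew clean build -Penv=dev, passing the value the user selected.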
The second one is arguably better and more contained. You simply rename the artifacts produced by the gradlew command after it completes, in the Jenkinsfile. This works well if the pipeline decides what to do with these artifacts (e.g. publish to a repository) as opposed to the gradle script doing it (in which case you would most likely be better off using the first method).
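For the second approach, a rough declarative Jenkinsfile sketch could look like the snippet below; the choice parameter ENV, the Gradle output path and the artifact names are assumptions, not something your build already defines:

```groovy
// Hypothetical sketch -- parameter name, paths and artifact names are assumptions.
pipeline {
    agent any
    parameters {
        // the user picks the environment when triggering the build
        choice(name: 'ENV', choices: ['dev', 'qa', 'prod'], description: 'Target environment')
    }
    stages {
        stage('Build') {
            steps {
                sh './gradlew clean build'
                // rename the artifact once the Gradle build has finished
                sh "mv build/libs/my-application.jar build/libs/my-application-${params.ENV}.jar"
            }
        }
    }
}
```

The pipeline (not the Gradle script) then decides what to do with the renamed artifact, e.g. archive or publish it.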
I have created a pipeline project backed by a Bitbucket Git repository. It checks out the Jenkinsfile using https://Username#bitbucket.org/myproject.git
No problems there.
I have created a multibranch pipeline project for the same Bitbucket Git repository. I used the same credentials as the single pipeline project, but now it is NOT checking out the Jenkinsfile (or any other files).
It keeps complaining about: "could not read Password for 'https://Username#bitbucket.org': terminal prompts disabled"
I tried the SSH variant, but that does not work at all; it causes problems even for the single-branch pipeline project, so that might be a completely different problem.
Is there a difference between a single-branch pipeline and a multibranch pipeline that I need to know about with regard to user credentials?
If more information is needed, please let me know.
I am trying to wrap my head around Travis CI scripts, and I am trying to figure out whether what I want is even possible:
Repo #1:
- my app, to be built via Travis CI
Repo #2:
- contains Selenium/Nightwatch tests that should run once Repo #1 has finished building
So this is the workflow I am aiming for:
a PR in Repo #1 gets merged
Travis CI builds Repo #1, completes successfully, and deploys
Travis CI tells Repo #2 to start building
Repo #2 installs the repo, which triggers BrowserStack to begin running the Selenium tests
Is this at all possible? I've been researching this for a few days and couldn't find a way to trigger a separate repo to build.
Any help is appreciated.
Thanks!
Possible Duplicate of: Triggering builds of dependent projects in Travis CI
Nevertheless, pasting the answer here: Yes, it is possible to trigger another Travis job after a first one succeeds. You can use the trigger-travis.sh script that is part of the plume-lib library.
The script's documentation tells how to use it -- set an environment variable and add three lines to your .travis.yml file.
You can set up Repo #1 and Repo #2 as separate jobs and use the above approach to trigger the downstream job after deployment has completed successfully in job 1.
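As a rough illustration, the addition to Repo #1's .travis.yml looks something like the snippet below; the owner/repo placeholders are made up, TRAVIS_ACCESS_TOKEN is the environment variable you define in the Travis settings, and the exact invocation should be taken from the script's own README:

```yaml
# Sketch only -- assumes trigger-travis.sh is available in the build (e.g. checked in or downloaded).
after_success:
  - |
    if [[ ($TRAVIS_BRANCH == master) && ($TRAVIS_PULL_REQUEST == false) ]]; then
      # MYGITHUBID/MY-DOWNSTREAM-REPO are placeholders for the owner and name of Repo #2
      sh trigger-travis.sh MYGITHUBID MY-DOWNSTREAM-REPO $TRAVIS_ACCESS_TOKEN
    fi
```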
I am working on a continuous integration project to automatically build and deploy an ETL workflow and an Autosys JIL file to a target environment.
We are using Perforce (P4) as the source code repository and Nexus as the artefact repository. Both the ETL and Autosys applications are hosted on a Linux server.
- Developers export the workflow as XML using the Informatica Repository Manager and check it in to the source repository in Perforce.
- Developers extract the JIL file for the Autosys job and check it in to the source repository in Perforce.
Requirement:
As part of the CI process, when developers check in their code to the source repository, a build should be triggered that creates artefacts from the checked-in code and copies them to the artefact repository.
The deployment process should be triggered automatically when it finds new artefacts and should deploy them to the target environment.
I would highly appreciate it if someone could help me understand:
the build and deployment steps
the requirement for a manifest file
Regarding the build/deployment steps, it's nothing more than:
open the build configuration -> build steps
create a new step, e.g. in the following way:
Runner type: command line
Step name: that_one_from_autosys
Working directory: %system.autosys.home%
Command executable: run_autosys_.bat
Please check this article; I fully support the author.
You should have JIL templates and environment contexts (plus other variables).
Then you need a script that generates the JIL files for each environment from the templates and the environment context.
Upload the generated JIL files to the artifact repository with a proper version number.
Deploy the JIL files with a script that drives the Autosys CLI commands; for instance, you need to stop running jobs, load BOXes before JOBs, etc.
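A tiny Groovy sketch of the template-rendering step, purely as an illustration (the template path, the ${...} placeholder convention and the environment values are all made up):

```groovy
// Hypothetical sketch: render one JIL file per environment from a shared template.
// Template path, placeholder syntax and the envContexts values are assumptions.
def template = new File('templates/my_box.jil.tpl').text

def envContexts = [
    dev: [JOB_PREFIX: 'DEV', RUN_MACHINE: 'devhost01', OWNER: 'etl_dev'],
    qa : [JOB_PREFIX: 'QA',  RUN_MACHINE: 'qahost01',  OWNER: 'etl_qa'],
]

envContexts.each { envName, vars ->
    def rendered = template
    vars.each { key, value ->
        // placeholders in the template look like ${JOB_PREFIX}, ${RUN_MACHINE}, ...
        rendered = rendered.replace('${' + key + '}', value)
    }
    def outDir = new File("build/jil/${envName}")
    outDir.mkdirs()
    new File(outDir, 'my_box.jil').text = rendered
}
```

The generated files would then be versioned into Nexus, and the deployment script on the target host would load them with the Autosys CLI (e.g. jil < my_box.jil) after stopping the affected jobs.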
For now, I have a batch file with commands that update the projects using SVN and call Maven 'clean install'. How do I create a job in Jenkins for similar actions?
Should I write it into an Ant file (sorry if that's a stupid idea; I've only just heard about it and don't know exactly what it is or what I can do with it), or is there another way?
Thanks
As arghtype suggested, you need to use Jenkins's own Source Code Management by configuring SVN as the SCM source and supplying credentials as part of the Maven build job.
If you have to use your own local working copy, you are organizing it wrong: you will lose all the benefits of having Jenkins manage SVN changes, and in the end this organization will give you more unsolvable problems in the future. Think about the advice people are giving here and come up with a reason why you need a local workspace outside of Jenkins's management on a Jenkins build machine. My only guess is: your Jenkins and development machine are the same. That again is not how it should be organized. Jenkins is a CI server, not a personal build "automator".
Regardless, here is how to do it if you still want to go that route.
What you think you want
Create a new Freestyle job
Under Build Steps, click Add build step
Select Execute Windows batch command
Write your batch command in there. Your working directory will be Jenkins's $WORKSPACE, so adjust your paths accordingly to where you want it to run.
But with the above configuration, you might as well have put the batch file under the Windows scheduler... you are not really using Jenkins at all.
What you should do instead
Create a new maven2/3 build job
Under Source Code Management, select Subversion
Under Repository URL enter the remote SVN repo (i.e. http://your.svnsever.com/path/to/project)
Under Build, enter your Root POM location (this will be relative to the location of your SVN checkout, so if your POM is at http://your.svnserver.com/path/to/project/maven/pom.xml, then enter maven/pom.xml).
Under Goals and options, enter clean install
Click Save
The Source Code Management section will take care of setting up a local workspace and checking out the repository into that workspace. By default, every time a new build is triggered, it will run svn update on that workspace for you.
The Maven build step will take care of running Maven; note, however, that by default it uses the ~/.m2/repository location. If your local Maven repository needs to be different, change this under Jenkins's global configuration.
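For completeness, roughly the same flow can also be written as pipeline-as-code; this is only a sketch, and the repository URL, the credentials ID and the Maven tool name are placeholders you would have to adjust:

```groovy
// Hypothetical Jenkinsfile sketch -- URL, credentialsId and the Maven tool name are placeholders.
pipeline {
    agent any
    tools {
        maven 'Maven3'   // name of a Maven installation configured under Jenkins global tools
    }
    stages {
        stage('Checkout') {
            steps {
                checkout([$class: 'SubversionSCM',
                          locations: [[remote: 'http://your.svnserver.com/path/to/project',
                                       credentialsId: 'svn-credentials']]])
            }
        }
        stage('Build') {
            steps {
                bat 'mvn clean install'   // use sh instead of bat on a Linux agent
            }
        }
    }
}
```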
Create a new job.
Under Source Code Management, choose Subversion and specify your repository and credentials.
Add a new build step, Maven build, and specify your Maven goals ('clean install').
Jenkins is a CI (continuous integration) server. It can be used to run scheduled builds of Ant- or Maven-based projects. It can also start building projects on some triggering event, such as a commit to an SCM (Git, SVN, Mercurial, ...) connected to it. You really should read its documentation to get a better understanding; it has nice tutorials.