I have Jenkins installed on a Windows server and I wish to copy the published Visual Studio files to multiple Windows hosts [load-balanced environment].
What is the most recommended way to copy files from Jenkins [hosted on Windows] to multiple Windows hosts running in an LB farm? Is there a direct plugin for this?
Are there any direct ways of copying the build files to the destination servers, apart from multiple Jenkins build steps?
If you use pipelines you can split a task into multiple tasks which can then be executed in parallel. You could do something like this:
pipeline {
    agent none
    stages {
        stage('build') {
            parallel {
                stage('build-1') {
                    agent {
                        label "windows"
                    }
                    steps {
                        bat "build-1.bat"
                    }
                }
                stage('build-2') {
                    agent {
                        label "windows"
                    }
                    steps {
                        bat "build-2.bat"
                    }
                }
                stage('build-3') {
                    agent {
                        label "windows"
                    }
                    steps {
                        bat "build-3.bat"
                    }
                }
            }
        }
    }
}
The files would be copied from your SCM (Subversion, Git, etc) and the build scripts would be launched to perform whatever build actions you need to accomplish.
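If the goal is not parallel builds but pushing the published Visual Studio output to several load-balanced hosts, the same parallel pattern can drive one copy per host. Here is a hedged sketch in scripted pipeline; the host names, the wwwroot$ shares, the MyApp target folder and the local publish folder are all placeholders for your environment:
// Placeholder deploy targets: replace with your load-balanced hosts and shares.
def targets = ['\\\\LBHOST1\\wwwroot$', '\\\\LBHOST2\\wwwroot$']

node('windows') {
    // assumes the published Visual Studio output already sits in .\publish
    def copySteps = [:]
    targets.each { share ->
        copySteps['copy to ' + share] = {
            // robocopy exit codes 0-7 indicate success, so only >= 8 is an error
            def rc = bat(returnStatus: true,
                         script: "robocopy publish \"${share}\\MyApp\" /MIR /R:2 /W:5")
            if (rc >= 8) {
                error "Copy to ${share} failed (robocopy exit code ${rc})"
            }
        }
    }
    parallel copySteps
}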
I am looking for help with our Jenkins Pipeline setup. I had a Jenkins pipeline job working just fine, where the Groovy script was checked out from a Perforce stream (in stage "Declarative: Checkout SCM") and then run. The script itself performs, at its core, a p4 sync and a p4 reconcile.
pipeline {
    agent {
        node {
            customWorkspace "workspaces/MY_WORKSPACE"
        }
    }
    stages {
        stage('Sync') {
            steps {
                script {
                    p4sync(
                        charset: 'none',
                        credential: '1',
                        format: "jenkins-${NODE_NAME}-MY_WORKSPACE",
                        populate: syncOnly(force: false, have: true, modtime: false, parallel: [enable: false, minbytes: '1024', minfiles: '1', threads: '4'], pin: '', quiet: true, revert: true),
                        source: streamSource('//depot/STREAM')
                    )
                }
            }
        }
        stage('Reconcile') {
            steps {
                script {
                    withCredentials([usernamePassword(credentialsId: '1', passwordVariable: 'SVC_USER_PW', usernameVariable: 'SVC_USER_NAME')]) {
                        bat label: 'P4 reconcile', script:
                            """
                            p4 -c "%P4_CLIENT%" -p "%P4_PORT%" -u ${SVC_USER_NAME} -P ${SVC_USER_PW} -s reconcile -e -a -d -f "//depot/STREAM/some/folder/location/*.file"
                            """
                    }
                }
            }
        }
    }
}
Due to an exterior requirement, we decided to move all our pipeline script files to a separate depot on the same Perforce server and changed the pipeline script checkout accordingly.
Now, the pipeline script checkout step ("Declarative: Checkout SCM") will create a new workspace called jenkins-NODE_NAME-buildsystems (for the pipeline script depot //buildsystems) which will use the same local workspace root directory D:\some\path\workspaces\MY_WORKSPACE on the build node as the actual workspace jenkins-NODE_NAME-MY_WORKSPACE, created and synced in the first pipeline step by p4sync. This means that Perforce creates two workspaces with the same local workspace root directory (which can cause all sorts of problems in itself). In addition, in the pipeline script, the P4 environment variable P4_CLIENT points to the wrong workspace jenkins-NODE_NAME-buildsystems (so the reconcile won't work), which should only have been used by the pipeline script checkout, not by the pipeline itself.
Which brings me to my question. How can I separate the workspaces of the pipeline script checkout and of the p4sync in the pipeline script? In the pipeline I can specify a customWorkspace, but not in the Jenkins configuration for the pipeline script checkout, and the latter weirdly seems to follow that customWorkspace statement, maybe because jenkins-NODE_NAME-MY_WORKSPACE had already been opened by Perforce on the node...?
Any hints are much appreciated.
Thanks,
Stefan
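One direction that might be worth trying (an untested sketch, not a verified answer; it assumes the P4 plugin's p4sync uses the enclosing dir() step as its client root): let the Jenkinsfile checkout keep its own default workspace by dropping customWorkspace from the agent, and point the build sync at the old root explicitly, so the two Perforce clients no longer share a directory:
pipeline {
    agent any   // placeholder: the script checkout stays in the node's default Jenkins workspace
    stages {
        stage('Sync') {
            steps {
                // sync the build stream into its own root, separate from the
                // workspace used by the "Declarative: Checkout SCM" client
                dir('D:/some/path/workspaces/MY_WORKSPACE') {
                    p4sync(
                        charset: 'none',
                        credential: '1',
                        format: "jenkins-${NODE_NAME}-MY_WORKSPACE",
                        source: streamSource('//depot/STREAM')
                    )
                }
            }
        }
    }
}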
Good afternoon, friends from Stack!
I am running SonarQube in my pipeline on a Jenkins instance. I have an issue; I'm following the documentation, but I am kind of new to this.
https://docs.sonarqube.org/display/SCAN/Analyzing+with+SonarQube+Scanner+for+Jenkins
But I have a Windows slave, and every time I configure it according to the documentation I get an error...
[Sonar-Pipeline] Running batch script
C:\Program Files (x86)\Jenkins\workspace\Sonar-Pipeline>C:\Program Files (x86)\Jenkins\tools\hudson.plugins.sonar.SonarRunnerInstallation\SONAR_RUNNER\bin\sonar-scanner
'C:\Program' is not recognized as an internal or external command.
I believe the problem is the space in the path, and Jenkins is trying to execute only 'C:\Program', as the output above shows. Does anybody know about this?
This is my pipeline...
node {
    stage('SonarQube analysis') {
        // requires SonarQube Scanner 2.8+
        def scannerHome = tool 'SONAR_RUNNER';
        withSonarQubeEnv('SonarQube') {
            bat "${scannerHome}/bin/sonar-scanner"
        }
    }
}
So this is what I am trying to execute according to the documentation. The only change required, since I am using Windows only, is to switch from sh to bat for the scannerHome execution, because this is a pipeline and not a regular job. And I do have all the files in place.
Please use the code below to run the sonar-scanner on Windows:
node {
    stage('SonarQube analysis') {
        // requires SonarQube Scanner 2.8+
        def scannerHome = tool 'SONAR_RUNNER';
        withSonarQubeEnv('SonarQube') {
            bat "\"${scannerHome}\\bin\\sonar-scanner.bat\""
        }
    }
}
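The underlying issue is the space in "Program Files (x86)": without quotes, cmd.exe cuts the command off at 'C:\Program', and wrapping the whole path in escaped quotes fixes it. The same pattern applies to any tool path used from a bat step; a small illustrative sketch (the MSBUILD_HOME tool name and MySolution.sln are hypothetical placeholders):
def msbuildHome = tool 'MSBUILD_HOME'   // hypothetical tool installation name
// quote the full executable path because "Program Files" contains spaces
bat "\"${msbuildHome}\\MSBuild.exe\" MySolution.sln /t:Build"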
I'm generating a Jenkins pipeline using a script. This pipeline is then loaded and executed. This works fine on any Linux node with sh steps, etc.
I have now encountered an issue executing this on a Windows node with the powershell and bat steps:
The checkout scm works fine, but any powershell or bat steps hang indefinitely. Copying the data from the generated file and replaying everything in a single Jenkinsfile works as expected.
An example of what is running (excluding the library):
"Main" Jenkinsfile:
def pod_label = "my_linux_node"
node(pod_label) {
    stage("Checkout") {
        checkout scm
    }
    stage("Pipeline generation") {
        // generate pipeline and save it in "gen_pipeline.groovy"
    }
    stage("Run Pipeline") {
        pipeline = load "gen_pipeline.groovy"
        pipeline.run_pipeline()
    }
}
Script1:
def run_pipeline() {
    node('my_win_node') {
        checkout scm
        bat "echo foobar"
    }
}

return this
In Jenkins ver. 2.121.3, I am trying to delete a file from a pipeline, and it gives a "script not permitted" error message.
Is there an alternate way to delete the file in Jenkins without using an OS command?
Scripts not permitted to use method java.io.File delete. Administrators can decide whether to approve or reject this signature.
[Pipeline] End of Pipeline
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use method java.io.File delete
Pipeline code
stage('Delete test.zip file') {
    if (fileExists('test.zip')) {
        new File('test.zip').delete()
    } else {
        println "test.zip file not found"
    }
}
There are several alternative ways:
By means of a Jenkins shared library, you can wrap this code up in a function or class:
#!/usr/bin/groovy
package org.utils

class PipelineUtils {
    static def deleteFile(String name) { new File(name).delete() }
}
In your pipeline script, you then import the library:
@Library('your-jenkins-library')_
import static org.utils.PipelineUtils.*
deleteFile('test.zip')
As @Sean has suggested, approve the script via "Manage Jenkins > In-process Script Approval".
There is the File Operations Plugin:
fileOperations([fileDeleteOperation(excludes: '', includes: 'test.zip')])
There is the Workspace Cleanup Plugin, but you need to find suitable exclude patterns, otherwise this will clean all files:
def new_exclude_patterns = [[pattern: ".git/**", type: 'EXCLUDE']]
cleanWs deleteDirs: false, skipWhenFailed: false, patterns: new_exclude_patterns
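If only the single file needs to go, an include pattern may be simpler than excluding everything else (assuming the plugin's include semantics, i.e. only matching files are removed):
// delete only test.zip and leave the rest of the workspace untouched
cleanWs deleteDirs: false, patterns: [[pattern: 'test.zip', type: 'INCLUDE']]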
If you are running the pipeline on a Linux slave (or a Windows slave with sh in the PATH), you may use the call below to avoid interactive prompts.
sh(""" rm -rf "$directory" """)
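On a Windows agent without sh, the equivalent bat call would look something like this (assuming test.zip sits in the current workspace directory):
// cmd.exe built-in: delete the file quietly if it exists
bat 'if exist "test.zip" del /f /q "test.zip"'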
Navigate to /scriptApproval/ (Manage Jenkins > In-process Script Approval) and approve the script.
Another way, available since Java 1.7/Groovy ?.?, is:
Files.delete(Paths.get(FQFN))
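Keep in mind that, like new File(), the java.nio calls run on the Jenkins controller's JVM rather than in the agent's workspace, and they may also require script approval. A small usage sketch:
import java.nio.file.Files
import java.nio.file.Paths

// note: a relative path here resolves on the controller, not in the build agent's workspace
def path = Paths.get('test.zip')
if (Files.exists(path)) {
    Files.delete(path)
}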
In the newer versions of the yarn package manager, almost all the commands have a --no-color option.
I'm running yarn under a continuous integration server (Jenkins) and the color escape characters pollute the output. I'd like to put something in the .yarnrc file to prevent the output of these escape characters, but I'd also like to leave colors on when the developers run yarn in a terminal.
How can I globally configure the --no-color option?
You can set the environment variable FORCE_COLOR to 0 to disable any colored output (this option comes from chalk which is used by yarn to output color).
pipeline {
    agent {
        docker {
            image 'node:10'
        }
    }
    environment {
        FORCE_COLOR = '0'
    }
    stages {
        stage('run yarn') {
            steps {
                sh 'yarn'
            }
        }
    }
}
For yarn 1.22, this worked:
export FORCE_COLOR=false
in scripts.
Note: FORCE_COLOR=0 has no effect
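To scope this to a single invocation instead of the whole pipeline, withEnv works as well (a sketch; it assumes yarn is on the agent's PATH):
// disable yarn's colored output only for this one step
withEnv(['FORCE_COLOR=false']) {
    sh 'yarn install'
}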