Jenkinsfile | Upload documents from Jenkins workspace to Confluence - jenkins-pipeline

I need to upload documents from a Jenkins workspace to Confluence via a Jenkinsfile.
I followed a link and started writing the basic code below, but I'm sure it will not work as-is. Can anyone add to it, comment, or suggest a few links?
void Publish_Doc_Confluence() {
    withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: 'iam_user_jenkins']]) {
        publishConfluence attachArchivedArtifacts: true, pageName: '', replaceAttachments: true, siteName: '', spaceName: ''
    }
}
I am also trying a curl command to upload the file, but in vain.
Command:
stage('Publish to Confluence') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'confluence', usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD')]) {
            sh '''
                curl -D- -u $USERNAME:$PASSWORD -X PUT -H "X-Atlassian-Token: nocheck" -F "file=@code/pydoc/*.html" -F "minorEdit=false" 'https://alm-tuigroup.atlassian.net/wiki/rest/api/content/504955238/child/attachment'
            '''
        }
    }
}
And where exactly in Confluence do I find details like the following?
pageName
siteName
spaceName
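On the curl route, one pitfall is that curl's `-F` option does not expand shell globs, so `file=@code/pydoc/*.html` will not attach each HTML file; every file needs its own request. A minimal sketch, assuming the page ID from the question and USERNAME/PASSWORD bound by withCredentials; `upload_attachments` is an illustrative helper, not a plugin step:

```shell
# Sketch under assumptions: upload_attachments is a hypothetical helper;
# USERNAME/PASSWORD are expected to come from withCredentials.
upload_attachments() {
    page_id=$1; shift
    # one request per file, since -F does not expand globs itself
    for f in "$@"; do
        curl -sS -u "$USERNAME:$PASSWORD" -X PUT \
             -H "X-Atlassian-Token: nocheck" \
             -F "file=@$f" -F "minorEdit=false" \
             "https://alm-tuigroup.atlassian.net/wiki/rest/api/content/$page_id/child/attachment"
    done
}

# called from the sh step as e.g.:
# upload_attachments 504955238 code/pydoc/*.html
```

As for the publishConfluence parameters: as I understand the Confluence Publisher plugin, siteName refers to a Confluence site configured under Manage Jenkins → Configure System, spaceName is the space key, and pageName is the title of the target page.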

Related

report folder does not exist error with htmlpublisher

I am trying to write a Jenkins pipeline script for one of my Playwright tests. Below is the simple code I have so far.
pipeline {
    agent any
    stages {
        stage('Run Playwright Test') {
            steps {
                runTest()
            }
        }
        stage('Publish Report') {
            steps {
                script {
                    sh 'ls -lrta'
                    //print REPORT_FILES
                }
                publishHTML([
                    allowMissing: false,
                    alwaysLinkToLastBuild: true,
                    keepAll: true,
                    //reportDir: '.',
                    reportDir: "./TestReport2",
                    reportFiles: 'index.html',
                    reportName: "Html Reports",
                    reportTitles: 'Report title'
                ])
            }
        }
    }
}
def runTest() {
    node('MYNODE') {
        docker.image('image details').inside('--user root') {
            git branch: 'mybranchName', credentialsId: 'ID', url: 'url'
            catchError() {
                sh """
                    cd WebTests
                    npm install
                    npx playwright test --project=CHROME_TEST --grep @hello
                """
            }
            sh "cp -R WebTests/TestReport TestReport2"
            sh 'cd TestReport2; ls -lrta'
        }
    }
}
When I use the above code, the test executes successfully; however, I see an error when trying to publish the report.
Below is the error:
Specified HTML directory '/bld/workspace//TestReport2' does not exist.
Observation: when I put an ls -ltr after the runTest call, I could not see the TestReport2 folder even though it was copied successfully.
Another thing I tried: when I moved the publishHTML code inside runTest(), it worked fine and I can see the generated reports. Something happens to the TestReport2 folder once the runTest() block completes.
Does anyone see the root cause? Any suggestion will be appreciated.
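The symptom is consistent with a workspace mismatch: runTest() allocates its own workspace on node('MYNODE'), while publishHTML runs in the agent any workspace, where TestReport2 never existed. A hedged sketch of one way around it, using stash/unstash to carry the report between the two workspaces (the stash name is illustrative):

```groovy
def runTest() {
    node('MYNODE') {
        // ... run the tests and copy the report as before ...
        sh "cp -R WebTests/TestReport TestReport2"
        // preserve the report beyond this node's workspace
        stash name: 'test-report', includes: 'TestReport2/**'
    }
}

// later, in the Publish Report stage's steps:
// unstash 'test-report'   // restores TestReport2 into the current workspace
// publishHTML([reportDir: 'TestReport2', reportFiles: 'index.html', reportName: 'Html Reports'])
```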

Run a set of linux commands using Jenkinsfile in Jenkins

I had to create a Jenkins job to automate certain tasks: updating the public site, changing the public version to the latest public release, updating software on the public site, and restarting the server. These involve operations such as copying files to a tmp folder, logging in to an on-prem server, going to the folder, unzipping the file, etc.
I have created the Jenkinsfile as follows:
pipeline {
    options {
        skipDefaultCheckout()
        timestamps()
    }
    parameters {
        string(name: 'filename', defaultValue: 'abc', description: 'Enter the file name that needs to be copied')
        string(name: 'database', defaultValue: 'abc', description: 'Enter the database that needs to be created')
        choice(name: 'Run', choices: '', description: 'Data migration')
    }
    agent {
        node { label 'aws && build && linux && ubuntu' }
    }
    triggers {
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Clean & Clone') {
            steps {
                cleanWs()
                checkout scm
            }
        }
        stage('Updating the public site') {
            steps {
                sh "scp ./$(unknown).zip <user>@<server name>:/tmp"
                sh "ssh <user>@<server name>"
                sh "cp ./tmp/$(unknown).zip ./projects/xyz/xyz-site/"
                sh "cd ./projects/xyz/xyz-site/"
                sh "unzip ./$(unknown).zip"
                sh "cp -R ./$(unknown)/* ./"
            }
        }
        stage('Changing public version to latest public release') {
            steps {
                sh "scp ./$(unknown).sql.gz <user>@<server name>:/tmp"
                sh "ssh <user>@<server name>"
                sh "mysql -u root -p<PASSWORD>"
                sh "show databases;"
                sh "create database ${params.database};"
                sh "GRANT ALL PRIVILEGES ON <newdb>.* TO 'ixxyz'@'localhost' WITH GRANT OPTION;"
                sh "exit;"
                sh "zcat tmp/$(unknown).sql.gz | mysql -u root -p<PASSWORD> <newdb>"
                sh 'db.default.url="jdbc:mysql://localhost:3306/<newdb>"'
                sh 'ps aux | grep monitor.sh | awk \'{print "kill "$2}\' | bash'
            }
        }
        stage('Updating Software on public site') {
            steps {
                sh "scp <user>@<server>:/tmp/abc<version>_empty_h2.zip"
                sh "ssh <user>@<server name>"
                sh "su <user>"
                sh "mv tmp/<version>_empty_h2.zip ./xyz/projects/xyz"
                sh "cd xyz/projects/xyz"
                sh "cp latest/conf/local.conf <version>_empty_h2/conf/"
            }
        }
        stage('Restarting Server') {
            steps {
                sh "rm latest/RUNNING_PID"
                sh "bash reload.sh"
                sh "nohup bash monitor.sh &"
            }
        }
    }
}
Is there a way I can dynamically obtain the zip filename in the root folder? I used $(unknown).zip, but it doesn't seem to work.
Also, is there a better way to perform these operations using Jenkins? Any help is much appreciated.
You could write all the steps for each stage in one shell script and execute it in that stage.
Regarding filename.zip: either take the name as a parameter and pass that value to your stages, or use the find command (as a shell command or in a shell script) to locate .zip files in the current directory: find <dir> -iname '*.zip', e.g. find . -iname '*.zip'.
Example:
pipeline {
    options {
        skipDefaultCheckout()
        timestamps()
    }
    parameters {
        string(name: 'filename', defaultValue: 'abc', description: 'Enter the file name that needs to be copied')
        choice(name: 'Run', choices: '', description: 'Data migration')
    }
    stages {
        stage('Updating the public site') {
            steps {
                sh "scp ./${params.filename}.zip <user>@<server name>:/tmp"
                ...
            }
        }
    }
}
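To feed the found name back into the pipeline, the find call can be captured with sh(returnStdout: true). A hedged shell sketch; `first_zip` is an illustrative helper name, not a Jenkins step:

```shell
# first_zip prints the name of the first .zip (case-insensitive) found
# directly in the given directory (default: current directory).
first_zip() {
    find "${1:-.}" -maxdepth 1 -iname '*.zip' -printf '%f\n' | sort | head -n 1
}

# In a Jenkinsfile this could feed an env var, e.g.:
#   env.ZIP_NAME = sh(script: "find . -maxdepth 1 -iname '*.zip' | head -n 1", returnStdout: true).trim()
```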
For executing a script at a certain location, as asked, you could use dir with the path where your scripts are placed,
or give the path directly: sh label: 'execute script', script: "C:\\Data\\build.sh"
stage('Your stage name') {
    steps {
        script {
            // Give the path where your scripts are placed
            dir("C:\\Data") {
                sh label: 'execute script', script: "build.sh <Your Arguments>"
                ...
            }
        }
    }
}
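A related caveat with the original Jenkinsfile: every sh step starts a fresh shell, so sh "ssh ..." followed by more sh steps runs those later steps locally, not on the remote host. One hedged way around it is to push all remote commands through a single ssh invocation; `run_remote` and the paths below are illustrative:

```shell
# run_remote is a hypothetical helper: it ships a fixed command list to the
# host given as $1 through one ssh session, via a heredoc on stdin.
run_remote() {
    ssh "$1" 'bash -s' <<'EOF'
cd ./projects/xyz/xyz-site/
unzip -o /tmp/site.zip
cp -R ./site/* ./
EOF
}

# e.g. from a Jenkinsfile step:  sh 'run_remote <user>@<server name>'
```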

Jenkins custom pipeline and how to add property to be set in jenkinsfile

I'm trying to create a custom pipeline with Groovy, but I can't find anywhere on the web a discussion of how to add a property that can be set in the Jenkinsfile. I want to add a curl command, but the URL needs to be set in the Jenkinsfile because it will differ for each build.
Can anyone explain how that should be done or links where it has been discussed?
Example Jenkinsfile:
msBuildPipelinePlugin {
    curl_url = "http://webhook.url.com"
}
Custom pipeline Groovy code:
def response = sh(script: "curl -i -X POST -H 'Content-Type: application/json' -d '{\"text\": \"Jenkins Info.\\nThis is more text\"}' ${curl_url}", returnStdout: true)
Thanks
If you want to specify the URL as a string during every build, you can do either of the following:
Declarative Pipeline
Use the parameters {} directive:
pipeline {
    agent {
        label 'rhel-7'
    }
    parameters {
        string(
            name: 'CURL_URL',
            defaultValue: 'http://www.google.com',
            description: 'Enter the URL for file download'
        )
    }
    stages {
        stage('download-file') {
            steps {
                echo "The URL is ${params.CURL_URL}"
            }
        }
    }
}
Scripted Pipeline
Use the properties([parameters([...])]) step:
properties([
    parameters([
        string(
            name: 'CURL_URL',
            defaultValue: 'http://www.google.com',
            description: 'Enter the URL for file download'
        )
    ])
])
node('rhel-7') {
    stage('download-file') {
        echo "The URL is ${params.CURL_URL}"
    }
}
You can choose to leave the values of defaultValue and description empty.
Job GUI
Either of the above syntaxes will be rendered in the job GUI as a string parameter named CURL_URL.
I got it to work using
//response is just the output of the curl statement
def response = ["curl", "-i", "-v", "-X", "POST", "--data-urlencode", "payload={\"text\":\"message body\"}", "curl url goes here"].execute().text
Thanks
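For reference, a hedged sketch of the same call through the sh step, with the URL taken from a Groovy variable; the URL value and message text are placeholders:

```groovy
// Assumed values: curlUrl would come from the Jenkinsfile property.
def curlUrl = 'http://webhook.url.com'
def payload = '{"text": "Jenkins Info.\\nThis is more text"}'
def response = sh(
    script: "curl -i -X POST -H 'Content-Type: application/json' -d '${payload}' '${curlUrl}'",
    returnStdout: true
)
```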

Unable to print credentials set in Jenkins Pipeline

Credentials are configured in Jenkins, but there's an error suggesting they are not.
I've followed the documentation provided on the Jenkins website.
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    environment {
        AWS_ACCESS_KEY_ID = credentials('jenkins-aws-secret-key-id')
        AWS_SECRET_ACCESS_KEY = credentials('jenkins-aws-secret-access-key')
    }
    stages {
        stage('checkout') {
            steps {
                git(url: 'git@bitbucket.org:user/bitbucketdemo.git', branch: 'master', credentialsId: 'jenkins')
                echo 'hello'
            }
        }
        stage('packer') {
            steps {
                echo $AWS_ACCESS_KEY_ID
            }
        }
    }
}
It should print out the value of the environment variable
I used the CloudBees AWS Credentials plugin. Once installed, I was able to add my AWS credentials (an additional option in the credentials Kind pull-down menu). Then I used the following snippet in my Jenkinsfile:
withCredentials(
[[
$class: 'AmazonWebServicesCredentialsBinding',
accessKeyVariable: 'AWS_ACCESS_KEY_ID',
credentialsId: 'AWS',
secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
]]) {
sh 'packer build -var aws_access_key=${AWS_ACCESS_KEY_ID} -var aws_secret_key=${AWS_SECRET_ACCESS_KEY} example4.json'
}

Jenkins pipeline credentials for all stages

I have a Jenkins scripted pipeline with multiple stages; all of them require the same password for interaction with a third-party API.
node {
    stage ('stage1') {
        sh 'curl --user login:password http://third-party-api'
    }
    stage ('stage2') {
        sh 'curl --user login:password http://third-party-api'
    }
}
For obvious reasons I want to keep this password safe, e.g. in Jenkins credentials.
The only secure way I've found is to add a withCredentials section, but it must be added to each pipeline stage, e.g.:
node {
    stage ('stage1') {
        withCredentials([string(credentialsId: '02647301-e655-4858-a7fb-26b106a81458', variable: 'mypwd')]) {
            sh 'curl --user login:$mypwd http://third-party-api'
        }
    }
    stage ('stage2') {
        withCredentials([string(credentialsId: '02647301-e655-4858-a7fb-26b106a81458', variable: 'mypwd')]) {
            sh 'curl --user login:$mypwd http://third-party-api'
        }
    }
}
This approach is not OK because the real pipeline is really complicated.
Any alternatives?
According to this other Stack Overflow question and this tutorial, you should be able to specify the needed credentials in a declarative pipeline like so:
environment {
    AUTH = credentials('02647301-e655-4858-a7fb-26b106a81458')
}
stages {
    stage('stage1') {
        steps {
            sh 'curl --user $AUTH_USR:$AUTH_PSW http://third-party-api'
        }
    }
    stage('stage2') {
        steps {
            sh 'curl --user $AUTH_USR:$AUTH_PSW http://third-party-api'
        }
    }
}
With a scripted pipeline, you're pretty much relegated to using withCredentials around the things that need access to them. Have you tried surrounding the stages with it, as in:
node {
    withCredentials([string(credentialsId: '02647301-e655-4858-a7fb-26b106a81458', variable: 'mypwd')]) {
        stage ('stage1') {
            sh 'curl --user login:$mypwd http://third-party-api'
        }
        stage ('stage2') {
            sh 'curl --user login:$mypwd http://third-party-api'
        }
    }
}
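If the API needs both a login and a password, a usernamePassword binding can wrap all stages the same way; the credential ID is reused from the question, and the variable names are illustrative:

```groovy
node {
    // binds both halves of a username/password credential for every stage inside
    withCredentials([usernamePassword(credentialsId: '02647301-e655-4858-a7fb-26b106a81458',
                                      usernameVariable: 'API_USR',
                                      passwordVariable: 'API_PSW')]) {
        stage ('stage1') {
            sh 'curl --user $API_USR:$API_PSW http://third-party-api'
        }
        stage ('stage2') {
            sh 'curl --user $API_USR:$API_PSW http://third-party-api'
        }
    }
}
```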