RPM compression not working with spring-boot gradle

I've made a Spring Boot project with Gradle and have made an RPM build for it. The jar file works fine before I run my RPM, but after it's compressed I get this error:
Exception in thread "main" java.lang.IllegalStateException: Failed to get nested archive for entry BOOT-INF/lib/castor-core-1.3.3.jar
at org.springframework.boot.loader.archive.JarFileArchive.getNestedArchive(JarFileArchive.java:109)
at org.springframework.boot.loader.archive.JarFileArchive.getNestedArchives(JarFileArchive.java:87)
at org.springframework.boot.loader.ExecutableArchiveLauncher.getClassPathArchives(ExecutableArchiveLauncher.java:70)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:49)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
Caused by: java.io.IOException: Unable to open nested jar file 'BOOT-INF/lib/castor-core-1.3.3.jar'
at org.springframework.boot.loader.jar.JarFile.getNestedJarFile(JarFile.java:253)
at org.springframework.boot.loader.jar.JarFile.getNestedJarFile(JarFile.java:238)
at org.springframework.boot.loader.archive.JarFileArchive.getNestedArchive(JarFileArchive.java:104)
... 4 more
Caused by: java.lang.IllegalStateException: Unable to open nested entry 'BOOT-INF/lib/castor-core-1.3.3.jar'. It has been compressed and nested jar files must be stored without compression. Please check the mechanism used to create your executable jar file
at org.springframework.boot.loader.jar.JarFile.createJarFileFromFileEntry(JarFile.java:281)
at org.springframework.boot.loader.jar.JarFile.createJarFileFromEntry(JarFile.java:261)
at org.springframework.boot.loader.jar.JarFile.getNestedJarFile(JarFile.java:249)
... 6 more
I have tried disabling RPM's compression with %global __os_install_post %{nil} at the top of my .spec file, and then it works. So I've narrowed it down to the jar being compressed when I build the RPM. However, I'm not really interested in leaving it uncompressed, as the file may eventually get rather large. So how do I actually fix this? Do I have to add something to Gradle telling it to decompress before running, or something?
The RPM was built with this .sh file:
#!/bin/bash
# This script expects the Jenkins job to have moved the securityservice-files into the rpmbuild/SOURCES dir
echo ""
echo ""
echo ""
echo ""
echo ""
echo ""
echo ""
source common.sh
# Remove old RPMS, just for good measure
rm -rf rpmbuild/RPMS/*
# Move build files into SOURCES
cp ../sf_securityservice/build/libs/securityservice-0.0.1.jar rpmbuild/SOURCES
cp ../sf_securityservice/dist-resources/log4j2.xml rpmbuild/SOURCES
cp ../sf_securityservice/dist-resources/config.xml rpmbuild/SOURCES
#Define paths
BIN_DIR=$C3A_BIN_DIR/securityservice
ETC_DIR=$C3A_ETC_DIR/securityservice
LOG_DIR=$C3A_LOG_DIR/securityservice
VAR_DIR=$C3A_VAR_DIR/securityservice
JDK_VERSION_STRING=$(select_java_version "Frankfurt")
RELEASE=$(package_version)
VERSION="1.0"
#Set default args
cat <<DEFAULT_ARGS > rpmbuild/SOURCES/default_args
#
# Changes in this file WILL be replaced on update.
# Persistent changes go in ${ETC_DIR}/custom_args.
#
JVM_ARGS="-Dlog4j.configurationFile=/opt/c3a/etc/securityservice/log4j2.xml"
CMD_ARGS="${ETC_DIR}/config.xml"
DEFAULT_ARGS
#Make empty file for custom args
touch rpmbuild/SOURCES/custom_args
cat <<RUNSH > rpmbuild/SOURCES/run.sh
#!/bin/bash
source $BIN_DIR/default_args
source $ETC_DIR/custom_args
/usr/bin/java \$JVM_ARGS -XX:OnOutOfMemoryError='kill -9 %p' -jar $BIN_DIR/securityservice-0.0.1.jar \$CMD_ARGS
RUNSH
cat <<EOF > rpmbuild/SOURCES/c3a-securityservice.service
[Unit]
Description=Cetrea SecurityService
After=syslog.target network.target
[Service]
Type=simple
WorkingDirectory=$VAR_DIR
User=securityservice
Group=c3a
Restart=always
StartLimitInterval=900
StartLimitBurst=3
RestartSec=10
ExecStart=$BIN_DIR/run.sh
SyslogIdentifier=cas
LimitNOFILE=25000
[Install]
WantedBy=multi-user.target
EOF
cd rpmbuild
rm -rf ./BUILDROOT/*
rpmbuild --target noarch \
-D "VERSION $VERSION" \
-D "RELEASE $RELEASE" \
-D "BIN_DIR $BIN_DIR" \
-D "ETC_DIR $ETC_DIR" \
-D "LOG_DIR $LOG_DIR" \
-D "VAR_DIR $VAR_DIR" \
-D "JDK_VERSION_STRING $JDK_VERSION_STRING" \
-D "_topdir ${PWD}" \
-bb SPECS/securityservice.spec
RPM=$DIR/rpmbuild/RPMS/noarch/securityservice-${VERSION}-${RELEASE}.noarch.rpm
echo $RPM
and this .spec file:
%description
This package contains Cetrea SecurityService.
%prep
#Unpack the release here using %setup
%build
#nothing here
%install
#Make target folders
mkdir -p $RPM_BUILD_ROOT%{LOG_DIR}
mkdir -p $RPM_BUILD_ROOT%{BIN_DIR}
mkdir -p $RPM_BUILD_ROOT%{ETC_DIR}
mkdir -p $RPM_BUILD_ROOT%{VAR_DIR}
mkdir -p $RPM_BUILD_ROOT/usr/lib/systemd/system/
cp $RPM_SOURCE_DIR/c3a-securityservice.service $RPM_BUILD_ROOT/usr/lib/systemd/system/
# Copy things into said folders
cp $RPM_SOURCE_DIR/default_args $RPM_BUILD_ROOT%{BIN_DIR}
cp $RPM_SOURCE_DIR/run.sh $RPM_BUILD_ROOT%{BIN_DIR}
cp $RPM_SOURCE_DIR/custom_args $RPM_BUILD_ROOT%{ETC_DIR}
cp $RPM_SOURCE_DIR/log4j2.xml $RPM_BUILD_ROOT%{ETC_DIR}
cp $RPM_SOURCE_DIR/config.xml $RPM_BUILD_ROOT%{ETC_DIR}
cp $RPM_SOURCE_DIR/securityservice.log $RPM_BUILD_ROOT%{VAR_DIR}
install -m 644 $RPM_SOURCE_DIR/securityservice-0.0.1.jar $RPM_BUILD_ROOT%{BIN_DIR}/
%pre
groupadd c3a || true
useradd -d %{VAR_DIR} -g c3a securityservice || true
%post
# Start the SecurityService once the install has completed
systemctl daemon-reload
if [ $1 -gt 1 ] ; then
    echo "restart SecurityService"
    systemctl restart c3a-securityservice || true
else
    echo "start SecurityService"
    systemctl enable c3a-securityservice || true
    systemctl start c3a-securityservice || true
fi
%preun
# the preun section is where you can run commands before the rpm is removed
if [ $1 == 0 ] ; then
    echo "stop SecurityService"
    systemctl stop c3a-securityservice || true
    systemctl disable c3a-securityservice || true
    systemctl daemon-reload
    systemctl reset-failed c3a-securityservice || true
fi
%clean
#rm -rf $RPM_BUILD_ROOT
#rm -rf %{_tmppath}/%{name}
#rm -rf %{_topdir}/BUILD/%{name}
%files
# list files owned by the package here
%attr(-, securityservice, c3a) %dir %{BIN_DIR}
%attr(-, securityservice, c3a) %dir %{LOG_DIR}
%attr(-, securityservice, c3a) %dir %{VAR_DIR}
%attr(644, securityservice, c3a) %{BIN_DIR}/securityservice-0.0.1.jar
%attr(-, securityservice, c3a) %{BIN_DIR}/default_args
%attr(750, securityservice, c3a) %{BIN_DIR}/run.sh
%attr(-, securityservice, c3a) %{VAR_DIR}/securityservice.log
%attr(664, securityservice, c3a) %config(noreplace) %{ETC_DIR}/*
/usr/lib/systemd/system/c3a-securityservice.service
%changelog
The Gradle file:
buildscript {
    ext {
        springBootVersion = '2.0.0.RC2'
    }
    repositories {
        mavenCentral()
        maven { url "https://repo.spring.io/snapshot" }
        maven { url "https://repo.spring.io/milestone" }
    }
    dependencies {
        classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
    }
}
apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'org.springframework.boot'
apply plugin: 'io.spring.dependency-management'
apply plugin: 'jacoco'
group = 'com.cetrea'
version = '0.0.1'
sourceCompatibility = 1.8
repositories {
    mavenCentral()
    maven { url "https://repo.spring.io/snapshot" }
    maven { url "https://repo.spring.io/milestone" }
}
// Exclude Spring Boot's own logging framework as we want log4j2
configurations {
    all*.exclude group: 'org.springframework.boot', module: 'spring-boot-starter-logging'
}
dependencies {
    compile('org.springframework.boot:spring-boot-starter')
    compile('org.springframework.boot:spring-boot-starter-web')
    compile('org.springframework.boot:spring-boot-starter-security')
    compile('org.springframework.boot:spring-boot-starter-log4j2')
    compile('org.springframework:spring-oxm')
    compile('org.codehaus.castor:castor-xml:1.3.3')
    compile('org.apache.commons:commons-collections4:4.1')
    testCompile('org.springframework.boot:spring-boot-starter-test')
    testCompile('org.springframework.security:spring-security-test')
}
jacoco {
    toolVersion = "0.7.9"
}
jacocoTestReport {
    group = "reporting"
    description = "generate Jacoco coverage reports after running tests."
    additionalSourceDirs = files(sourceSets.main.allJava.srcDirs)
}
jar {
    baseName = 'securityservice'
    from { configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } }
    manifest {
        attributes 'Main-Class': 'com.cetrea.securityservice.SecurityService'
    }
}
Is this a Gradle problem or an RPM problem?
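A note on the likely culprit: on RHEL/CentOS, the %__os_install_post post-processing runs brp-java-repack-jars, which unpacks and repacks every jar in the buildroot and thereby deflates the BOOT-INF/lib entries that Spring Boot's launcher requires to be stored uncompressed. So this would be an RPM packaging issue rather than a Gradle one. Assuming that script is what recompresses the jar here, a narrower fix than disabling all of __os_install_post is to disable only the jar repack step near the top of the .spec:
# Skip only the jar repack step; all other post-processing (and the RPM's
# own payload compression) stays enabled
%define __jar_repack %{nil}
The RPM payload is compressed independently of the jar's internal entries, so the package itself stays small either way; only the rewriting of the jar's contents is skipped.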

Related

Running sh command in Jenkinsfile to find all extension files in the folder

I am running a shell script within the Jenkins pipeline, and I want to run a find command to get all extension files and copy them to the scan folder inside the "AppName" folder.
Here's the code:
stage("SCA Check"){
node("default"){
checkout scm
conf.findAll { key, value -> key.contains("token.") }.each { key, value ->
tokens << string(credentialsId: value, variable: key.replace('token.',''))
}
withEnv(vars) {
withCredentials(tokens){
dir(dirpath) {
dir(AppName) {
git url: gitURL,
credentialsId: 'bitbucket-https-url-rdonly',
branch: branchName
}
sh """#!/bin/bash
set
echo '${scaInfo}'
python --version
cp Action.py '${AppName}' && cd '${AppName}' && mkdir scan
"find . -regex '.*\.\(sql\|conf\|py\|csv\|coveragerc\|css\|eot\|etlconf\|hql\|html\|idx\|ini\|js\|json\|log\|map\|md\|pack\|pdf\|sample\|sh\|svg\|ttf\|txt\|woff\|woff2\)$' -exec cp {} scan/ \;"
echo find
cd scan && zip scan.zip * && mv scan.zip .. && cd ..
python Action.py '${scaInfo}'
"""
}
}
}
}}
}
But this gives an error while running the pipeline:
Branch event
Obtained jenkinsfile from 1ace31daa88df82dd21cb4a04251065b78562fdf
Running in Durability level: PERFORMANCE_OPTIMIZED
[Bitbucket] Notifying commit build result
[Bitbucket] Build result notified
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 108: unexpected char: '\' @ line 108, column 22.
find . -regex '.*\.\(sql\|conf\|py\|csv\|coveragerc\|css\|eot\|etlconf\|hql\|html\|idx\|ini\|js\|json\|log\|map\|md\|pack\|pdf\|sample\|sh\|svg\|ttf\|txt\|woff\|woff2\)$' -exec cp {} scan/ \;
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
How would I resolve the issue so that I get all the extension files within the dir and sub-dirs and copy them to the "scan" folder, so that I can go ahead and create a zip file of the folder and ship it to a different location?
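The error comes from the Groovy compiler, not the shell: inside a triple-double-quoted GString, sequences like \. and \( are illegal escapes, so the script fails to parse before the sh step ever runs. Doubling every backslash (and dropping the stray double quotes wrapped around the find line, which would make bash treat the whole line as one command name) lets Groovy hand literal backslashes through to bash. A minimal sketch of the sh step, with the extension list trimmed for brevity:
sh """#!/bin/bash
cp Action.py '${AppName}' && cd '${AppName}' && mkdir scan
# Backslashes are doubled so Groovy passes single backslashes to bash
find . -regex '.*\\.\\(sql\\|conf\\|py\\|csv\\|sh\\|txt\\)\$' -exec cp {} scan/ \\;
cd scan && zip scan.zip * && mv scan.zip .. && cd ..
python Action.py '${scaInfo}'
"""
Alternatively, a triple-single-quoted string ('''...''') does no escape processing at all, at the cost of losing the ${AppName} interpolation.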

AWS CodeBuild buildspec bash syntax error: bad substitution with if statement

Background:
I'm using an AWS CodeBuild buildspec.yml to iterate through directories from a GitHub repo. Before looping through the directory path $TF_ROOT_DIR, I'm using a bash if statement to check if the GitHub branch name $BRANCH_NAME is within an env variable $LIVE_BRANCHES. As you can see in the error screenshot below, the bash if statement outputs the error: syntax error: bad substitution. When I reproduce the if statement within a local bash script, the if statement works as it's supposed to.
Here are the env variables defined in the CodeBuild project (they also appear in the project JSON below):
Here's a relevant snippet from the buildspec.yml:
version: 0.2
env:
  shell: bash
phases:
  build:
    commands:
      - |
        if [[ " ${LIVE_BRANCHES[*]} " == *"$BRANCH_NAME"* ]]; then
          # Iterate only through BRANCH_NAME directory
          TF_ROOT_DIR=${TF_ROOT_DIR}/*/${BRANCH_NAME}/
        else
          # Iterate through both dev and prod directories
          TF_ROOT_DIR=${TF_ROOT_DIR}/*/
        fi
      - echo $TF_ROOT_DIR
Here's the build log that shows the syntax error:
Here's the AWS CodeBuild project JSON to reproduce the CodeBuild project:
{
  "projects": [
    {
      "name": "terraform_validate_plan",
      "arn": "arn:aws:codebuild:us-west-2:xxxxx:project/terraform_validate_plan",
      "description": "Perform terraform plan and terraform validator",
      "source": {
        "type": "GITHUB",
        "location": "https://github.com/marshall7m/sparkify_end_to_end.git",
        "gitCloneDepth": 1,
        "gitSubmodulesConfig": {
          "fetchSubmodules": false
        },
        "buildspec": "deployment/CI/dev/cfg/buildspec_terraform_validate_plan.yml",
        "reportBuildStatus": false,
        "insecureSsl": false
      },
      "secondarySources": [],
      "secondarySourceVersions": [],
      "artifacts": {
        "type": "NO_ARTIFACTS",
        "overrideArtifactName": false
      },
      "cache": {
        "type": "NO_CACHE"
      },
      "environment": {
        "type": "LINUX_CONTAINER",
        "image": "hashicorp/terraform:0.12.28",
        "computeType": "BUILD_GENERAL1_SMALL",
        "environmentVariables": [
          {
            "name": "TF_ROOT_DIR",
            "value": "deployment",
            "type": "PLAINTEXT"
          },
          {
            "name": "LIVE_BRANCHES",
            "value": "(dev, prod)",
            "type": "PLAINTEXT"
Here's the associated buildspec file content: (buildspec_terraform_validate_plan.yml)
version: 0.2
env:
  shell: bash
  parameter-store:
    AWS_ACCESS_KEY_ID_PARAM: TF_AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY_PARAM: TF_AWS_SECRET_ACCESS_KEY_ID
phases:
  install:
    commands:
      # install/incorporate terraform validator?
  pre_build:
    commands:
      # CodeBuild environment variables:
      #   BRANCH_NAME   -- GitHub branch that triggered the CodeBuild project
      #   TF_ROOT_DIR   -- Directory within branch ($BRANCH_NAME) that will be iterated through for terraform planning and testing
      #   LIVE_BRANCHES -- Branches that represent a live cloud environment
      - export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID_PARAM
      - export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY_PARAM
      - bash -version || echo "${BASH_VERSION}" || bash --version
      - |
        if [[ -z "${BRANCH_NAME}" ]]; then
          # extract branch from github webhook
          BRANCH_NAME=$(echo $CODEBUILD_WEBHOOK_HEAD_REF | cut -d'/' -f 3)
        fi
      - "echo Triggered Branch: $BRANCH_NAME"
      - |
        if [[ " ${LIVE_BRANCHES[*]} " == *"$BRANCH_NAME"* ]]; then
          # Iterate only through BRANCH_NAME directory
          TF_ROOT_DIR=${TF_ROOT_DIR}/*/${BRANCH_NAME}/
        else
          # Iterate through both dev and prod directories
          TF_ROOT_DIR=${TF_ROOT_DIR}/*/
        fi
      - "echo Terraform root directory: $TF_ROOT_DIR"
  build:
    commands:
      - |
        for dir in $TF_ROOT_DIR; do
          # get list of non-hidden directories within $dir/
          service_dir_list=$(find "${dir}" -type d | grep -v '/\.')
          for sub_dir in $service_dir_list; do
            # if $sub_dir contains .tf or .tfvars files
            if (ls ${sub_dir}/*.tf) > /dev/null 2>&1 || (ls ${sub_dir}/*.tfvars) > /dev/null 2>&1; then
              cd $sub_dir
              echo ""
              echo "*************** terraform init ******************"
              echo "******* At directory: ${sub_dir} ********"
              echo "*************************************************"
              terraform init
              echo ""
              echo "*************** terraform plan ******************"
              echo "******* At directory: ${sub_dir} ********"
              echo "*************************************************"
              terraform plan
              cd - > /dev/null
            fi
          done
        done
Given this is just a side project, all files that could be relevant to this problem are within a public repo here.
UPDATES
Tried adding a #!/bin/bash shebang line, but it resulted in this CodeBuild error:
Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: #!/bin/bash
version: 0.2
env:
  shell: bash
phases:
  build:
    commands:
      - |
        #!/bin/bash
        if [[ " ${LIVE_BRANCHES[*]} " == *"$BRANCH_NAME"* ]]; then
          # Iterate only through BRANCH_NAME directory
          TF_ROOT_DIR=${TF_ROOT_DIR}/*/${BRANCH_NAME}/
        else
          # Iterate through both dev and prod directories
          TF_ROOT_DIR=${TF_ROOT_DIR}/*/
        fi
      - echo $TF_ROOT_DIR
Solution
As mentioned by @Marcin, I used an AWS managed image within CodeBuild (aws/codebuild/standard:4.0) and downloaded Terraform within the install phase.
phases:
  install:
    commands:
      - wget https://releases.hashicorp.com/terraform/${TERRAFORM_VERSION}/terraform_${TERRAFORM_VERSION}_linux_amd64.zip -q
      - unzip terraform_${TERRAFORM_VERSION}_linux_amd64.zip && mv terraform /usr/local/bin/
I tried to reproduce your issue, but it all works fine for me.
The only thing I've noticed is that you are using $BRANCH_NAME, but it's not defined anywhere. But even with $BRANCH_NAME missing, the buildspec.yml you've posted runs fine.
Update: using hashicorp/terraform:0.12.28 image
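That image is very likely the key to the "bad substitution" error: the hashicorp/terraform images are Alpine-based and ship only BusyBox /bin/sh, not bash, and the env: shell: bash setting in the buildspec cannot help when bash isn't installed. ${LIVE_BRANCHES[*]} is a bash-only array expansion, which is exactly what a POSIX sh rejects as a bad substitution. A quick way to see this locally, assuming Docker is available:
docker run --rm --entrypoint /bin/sh hashicorp/terraform:0.12.28 -c 'echo "${LIVE_BRANCHES[*]}"'
# sh: syntax error: bad substitution
Switching to an image that has bash, like aws/codebuild/standard:4.0 above, makes the expansion legal. (As an aside, LIVE_BRANCHES holds the plain string "(dev, prod)" rather than a real array, but bash treats ${VAR[*]} on a scalar as the scalar itself, so the substring test still works there.)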

Gradle: find and build all projects in folder

I have a folder with a lot of projects inside it (too many to manually write build files for them).
The projects are mostly in a flat layout:
root
-project 1
-project 2
-project 3
-project 4
-project 5
( -project 5.1)
But can be nested as shown above, and I need to account for this.
Ideally the following should happen:
I can run user@user:/root gradle build and every project in the directory should be built, as long as it contains a Gradle build file.
If a build fails, just continue with the next one.
How can I make this possible?
How about this one-liner (not tested):
find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && gradle build || true" \;
Or, more verbose:
dirs=($(find . -type d))
for dir in "${dirs[@]}"; do
  # run each build in a subshell so the cd doesn't affect the next iteration
  ( cd "$dir" && gradle build ) || true
done
I came up with a working solution:
def flist = []
// change to your workspace name
new File('./Workspace').eachDir {
    // blacklist any folders you want
    if (it.name != '.gradle' && it.name != 'master' && it.name != 'Build-All') {
        flist << it.name
    }
}
// build task objects
flist.each { folder ->
    task "${folder}"(type: GradleBuild) {
        buildFile = "./Workspace/" + folder + "/build.gradle"
        dir = './' + folder
        tasks = ['build']
    }
}
// create super task
task (all, dependsOn: flist) {
}
You need to invoke it as such in the root directory: gradle :all --continue. This has the benefit that any failing project builds will not halt the other builds.
Another bonus is that Gradle gives a neat report about all failing builds.
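An alternative sketch, assuming a conventional layout where every buildable folder contains a build.gradle: let settings.gradle discover the projects (including nested ones such as project 5.1), then a single gradle build --continue at the root builds everything as subprojects and keeps going past failures.
// settings.gradle -- register every folder that contains a build.gradle as a subproject
rootDir.eachDirRecurse { dir ->
    // don't register hidden folders or build output dirs themselves
    if (dir.name.startsWith('.') || dir.name == 'build') return
    if (new File(dir, 'build.gradle').exists()) {
        // turn root/project5/project5.1 into :project5:project5.1
        def path = dir.absolutePath - rootDir.absolutePath
        include path.replace(File.separator, ':')
    }
}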

How to make a Gradle project versioneye-friendly?

I have a Gradle project that I want to import to Versioneye to check if my dependencies are up to date, but it's a complex config file (with external variables etc.) and Versioneye does not manage to handle the dependencies properly.
I don't want to install the Versioneye gradle plugin.
How can I export the dependencies from my repo to Versioneye?
You can list all the dependencies with gradle app:dependencies.
With a bit of string manipulation, you can export a "clean" dependencies file and manually upload it to Versioneye.
#!/bin/bash
OUT_DIR='versioneye'
OUT_FILE="${OUT_DIR}/build.gradle"
mkdir -p "${OUT_DIR}"
touch "${OUT_FILE}"
# copy your maven repositories closure below from build.gradle
tee "${OUT_FILE}" <<EOF >/dev/null
allprojects {
repositories {
maven {
url 'https://maven.google.com/maven-google-remote'
}
maven {
url "https://jitpack.io"
}
}
}
EOF
echo 'dependencies {' >> "${OUT_FILE}"
./gradlew app:dependencies | grep '^+---' | sed 's|+--- |compile "|' | sed 's| (\*)||g' | sed 's|$|"|' | sort -u >> "${OUT_FILE}"
echo '}' >> "${OUT_FILE}"
cat "${OUT_FILE}"
cd "${OUT_DIR}"
start .
cd -
echo 'Now, open versioneye.com and manually upload the generated build.gradle file.'
This will generate a file that looks like this:
allprojects {
repositories {
maven {
url 'https://maven.google.com/maven-google-remote'
}
maven {
url "https://jitpack.io"
}
...
}
}
dependencies {
compile "com.android.support.test.espresso:espresso-contrib:2.2.2"
compile "com.android.support.test.espresso:espresso-core:2.2.2"
compile "com.android.support.test.espresso:espresso-intents:2.2.2"
compile "com.facebook.android:facebook-android-sdk:4.17.0"
compile "com.facebook.fresco:fresco:1.5.0"
compile "com.facebook.fresco:imagepipeline-okhttp3:1.5.0"
...
}
This file can be imported to Versioneye with a file upload and will be processed correctly.

Fortify plugin for Gradle

I have been running Fortify scans for some Java components. Below are the general steps followed:
For a Java project:
mvn com.fortify.ps.maven.plugin:sca-maven-plugin:4.30:clean
mvn install -DskipTests -DSTABILITY_ID=1 -DRELEASE_NUMBER=0 -DBUID_ID=1
mvn -Dfortify.sca.debug=true -Dfortify.sca.Xmx=1800M -Dfortify.sca.Xss=5M -DSTABILITY_ID=2 -DRELEASE_NUMBER=2 package com.fortify.ps.maven.plugin:sca-maven-plugin:4.30:translate
sourceanalyzer -b build_id -Xmx1800M -Xss4M -scan -f build_id_results.fpr -logfile scan.log -clobber-log -debug-verbose
After this, the .fpr file gets generated and is uploaded to the server.
Now I have to do the same for a component using Gradle.
What would be the commands that I will have to use to generate the .fpr files?
I still have to remove duplication, improve it a little, and probably create a plugin, but basically, try the following snippet.
/*
 * Performs the Fortify security scan.
 *
 * 1) Runs source code translation.
 * 2) Creates the export session file.
 * 3) Submits the export session file for processing through scp.
 *
 * Credentials and url for the scp are obtained from the gradle.properties file
 * (or can be passed from the command line through the -P switch).
 * <ul>
 * <li>fortifyUploadUsername</li>
 * <li>fortifyUploadPassword</li>
 * <li>fortifyUploadUrl</li>
 * </ul>
 */
task fortify(group: 'fortify', description: 'Security analysis by HP Fortify') << {
    def fortifyBuildId = 'myProjectId'
    logger.debug "Running command: sourceanalyzer -b $fortifyBuildId -clean"
    exec {
        commandLine 'sourceanalyzer', '-b', fortifyBuildId, '-clean'
    }
    def classpath = configurations.runtime.asPath
    logger.debug "Running command: sourceanalyzer -b ${fortifyBuildId} -source ${sourceCompatibility} -cp $classpath src/**/*.java"
    exec {
        commandLine 'sourceanalyzer', '-b', fortifyBuildId, '-source', sourceCompatibility, '-cp', classpath, 'src/**/*.java'
    }
    def fortifyBuildFolder = 'build/fortify'
    new File(fortifyBuildFolder).mkdirs()
    def fortifyArtifactFileName = "$fortifyBuildId@${project.version}.mbs"
    def fortifyArtifact = "$fortifyBuildFolder/$fortifyArtifactFileName"
    logger.debug "Running command: sourceanalyzer -b ${fortifyBuildId} -build-label ${project.version} -export-build-session $fortifyArtifact"
    exec {
        commandLine 'sourceanalyzer', '-b', fortifyBuildId, '-build-label', project.version, '-export-build-session', "$fortifyArtifact"
    }
    logger.debug "Running command: sshpass -p <password> scp $fortifyArtifact <user>@$fortifyUploadUrl:$fortifyArtifactFileName"
    exec {
        commandLine 'sshpass', '-p', fortifyUploadPassword, 'scp', "$fortifyArtifact", "$fortifyUploadUsername@$fortifyUploadUrl:$fortifyArtifactFileName"
    }
}
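One thing the snippet doesn't do is run the analysis itself; it exports an .mbs build session for scanning elsewhere. If you also want the .fpr produced locally, mirroring the sourceanalyzer -scan step from the Maven flow above, a hypothetical extra exec block at the end of the same task could look like this (the output file name is illustrative):
exec {
    // assumes sourceanalyzer is on the PATH, as in the steps above
    commandLine 'sourceanalyzer', '-b', fortifyBuildId, '-scan', '-f', "$fortifyBuildFolder/${fortifyBuildId}-results.fpr"
}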
