I have a set of zip files in an Artifactory repository and I need to get the latest one. The structure of the artifacts is as listed below:
Homeloan
-> test-application-dev-local_1.zip
-> test-application-dev-local_2.zip
-> test-application-dev-local_3.zip
-> test-application-dev-local_4.zip
-> test-application-dev-local_5.zip
-> test-application-dev-local_6.zip
-> test-application-dev-local_7.zip
All these artifacts are output of MSBuild. Every time a user checks in their code it gets built in TeamCity and the artifacts are uploaded to JFrog Artifactory.
Now I have another TeamCity build, triggered on an ad hoc basis, which needs to get the latest artifact; in this case I need "test-application-dev-local_7.zip".
I'm using the TeamCity Artifactory plugin to get artifacts, and below is the spec I tried:
{
  "files": [{
    "aql": {
      "items.find": {
        "#build.name": "test-application-dev-local_*.zip"
      }
    },
    "target": "somepath"
  }]
}
With the above spec I get all 7 zip files. I tried adding a limit to the spec (I'm not sure if this is the correct way), but I'm getting an error:
{
  "files": [{
    "aql": {
      "items.find": {
        "#build.name": "test-application-dev-local_*.zip"
      }
    },
    "limit": 1,
    "target": "somepath"
  }]
}
Error occurred while resolving dependencies from the spec: Unrecognized field "limit" (class org.jfrog.build.extractor.clientConfiguration.util.spec.Aql), not marked as ignorable (one known property: "items.find"])
I'm not sure how to retrieve the artifact that was uploaded most recently.
I was stuck on this too, as the Artifactory documentation says:
"Currently sortBy, sortOrder, limit and offset are not supported in
TeamCity."
So I had to look around for another way to implement this. What worked for me:
I was able to do it with a JFrog CLI command. I had to recreate the spec file on the command line and, for sorting and limiting, used CLI flags instead of adding sortBy and limit to the spec file.
echo ">>> Downloading the archive from artifactory ..."
echo "{" > downloadSpec.json
echo " \"files\": [" >>downloadSpec.json
echo " {" >> downloadSpec.json
echo " \"pattern\": \"artifactory-repo-path/artifact-name*.tar.gz\"," >> downloadSpec.json
echo " \"target\": \"target-path\-to-download"" >> downloadSpec.json
echo " }" >> downloadSpec.json
echo " ]" >> downloadSpec.json
echo "}" >> downloadSpec.json
jfrog rt dl --spec=downloadSpec.json \
  --url="your-artifactory-server-url" --user="your-artifactory-user-name" \
  --password="your-artifactory-password" --sort-by=updated --sort-order=desc \
  --limit=1
where jfrog rt dl is the command to download.
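For reference, after those echo commands the generated downloadSpec.json should look roughly like this (the repository path and target are placeholders):
{
  "files": [
    {
      "pattern": "artifactory-repo-path/artifact-name*.tar.gz",
      "target": "target-path-to-download"
    }
  ]
}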
I am struggling to:
1. get the latest successful commit id of a GitHub project (this won't help: echo GIT_PREVIOUS_SUCCESSFUL_COMMIT %GIT_PREVIOUS_SUCCESSFUL_COMMIT%)
2. get this id in one job pipeline and pass it to another Jenkins job pipeline
The issue is actually at step 2.
I have tried to save the id in a temp file using
rm logfile.txt
echo $commitID | tee logfile.txt
but Jenkins gives an error:
groovy.lang.MissingPropertyException: No such property: logfile for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
It won't allow me to create the temp file.
I have also checked https://plugins.jenkins.io/parameterized-trigger but was unable to use it in a multibranch or Maven project (I can't understand how to use it).
I also tried injecting the commit id I want to store as env.commitID, but I am still unable to save this id anywhere, either in a temp location or as an environment variable.
After saving it, I want to pass it to another pipeline.
pipeline {
agent any
options {
timeout(time: 1, unit: 'HOURS')
}
environment {
def myVariable = "foo"
py2Ana="DEFAULT"
SOURCE_CODE_URL = 'https://github.com/himanshukgit/hardwareInventory.git'
RELEASE_BRANCH = 'master'
}
stages {
stage('Git') {
agent any
steps {
sleep(5)
// Clean dir
deleteDir()
}
}
stage('transfer'){
steps{
bat """
#echo off
echo GIT_COMMIT %GIT_COMMIT%
echo GIT_BRANCH %GIT_BRANCH%
echo GIT_LOCAL_BRANCH %GIT_LOCAL_BRANCH%
echo GIT_PREVIOUS_COMMIT %GIT_PREVIOUS_COMMIT%
echo GIT_PREVIOUS_SUCCESSFUL_COMMIT %GIT_PREVIOUS_SUCCESSFUL_COMMIT%
echo GIT_URL %GIT_URL%
echo GIT_URL_N - %GIT_URL_N%
echo GIT_AUTHOR_NAME %GIT_AUTHOR_NAME%
echo GIT_COMMITTER_EMAIL %GIT_COMMITTER_EMAIL%
"""
// Checkout branch
script{
git branch: "$RELEASE_BRANCH", url: "$SOURCE_CODE_URL"
echo "============================================================"
echo GIT_PREVIOUS_SUCCESSFUL_COMMIT %GIT_PREVIOUS_SUCCESSFUL_COMMIT%
echo "============================================================"
echo "remove temp logfile"
sh rm logfile.txt
echo "adding GIT_PREVIOUS_SUCCESSFUL_COMMIT into the file"
echo "============================================================"
echo GIT_PREVIOUS_SUCCESSFUL_COMMIT %GIT_PREVIOUS_SUCCESSFUL_COMMIT% | tee logFile.txt
echo "============================================================"
}
}
}
}
}
For the first part, the git (or checkout) step returns the relevant parameters as a map, so you can keep the return value of the git step and access the parameters you need through it. Something like:
node {
def gitVars = git branch: "$RELEASE_BRANCH", url: "$SOURCE_CODE_URL"
// gitVars will contain the following keys: GIT_BRANCH, GIT_COMMIT, GIT_LOCAL_BRANCH, GIT_PREVIOUS_COMMIT, GIT_PREVIOUS_SUCCESSFUL_COMMIT, GIT_URL
println gitVars
println "Previous successful commit is : ${gitVars.GIT_PREVIOUS_SUCCESSFUL_COMMIT}"
}
For the second part you can use the built-in build step to trigger another job (of any type); you have multiple options for controlling the execution.
Here is a simple example:
node {
...
build job: 'My_Downstream_Job', wait: true, propagate: true, parameters: [string(name: 'CommitSha1', value: gitVars.GIT_PREVIOUS_SUCCESSFUL_COMMIT)]
}
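On the downstream side, the value arrives as a regular job parameter. A minimal sketch, assuming 'My_Downstream_Job' is a declarative pipeline that declares a matching string parameter named CommitSha1:
pipeline {
    agent any
    parameters {
        // must match the parameter name passed by the upstream build step
        string(name: 'CommitSha1', defaultValue: '', description: 'Commit sha passed from the upstream job')
    }
    stages {
        stage('Use commit') {
            steps {
                echo "Upstream passed commit: ${params.CommitSha1}"
            }
        }
    }
}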
In my scenario I am trying to build a tag from the branch name plus a date-based version number using a Groovy script. If we print the branch name and date we can see them in the console, but when we try to use both as the tag name we get an error. Please find the script below for reference.
pipeline {
environment {
dockerImage = ''
imageName = 'gcr.io/projectName/web-ui'
tag = VersionNumber(versionNumberString: '${BUILD_DATE_FORMATTED,"yyyyMMdd"}-${BUILDS_TODAY}');
local = ''
}
stages {
stage('Dockerize'){
steps {
script {
local = "${env.GIT_BRANCH}".replace("feature/", "").replace("/", "-")
echo "${local}"
dockerTag = "${local}-${tag}"
echo "${dockerTag}"
sh 'docker build -t ${imageName}:${dockerTag} .'
sh 'docker push ${imageName}:${dockerTag}'
}
}
}
}
}
Below is the error message I am getting.
docker build -t gcr.io/projectname/web-ui: .
invalid argument "gcr.io/projectname/web-ui:" for "-t, --tag" flag: invalid reference format
The console messages for the echo statements are:
test
[Pipeline] echo
test-20210416-11
When I run this command I get the following:
$ docker build -t grc.io/projectName/web-ui:test-20210416-11 .
invalid argument "grc.io/projectName/web-ui:test-20210416-11" for "-t, --tag"
flag: invalid reference format: repository name must be lowercase
See 'docker build --help'.
I don't know if projectName is your actual repository name or if this is just obfuscated, but you should review the naming restrictions on docker repositories. Namely:
The repository name needs to be unique in that namespace, can be two to 255 characters, and can only contain lowercase letters, numbers, hyphens (-), and underscores (_).
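For example, one way to satisfy that rule is to lowercase the whole image reference in the script block before calling docker build. This is only a sketch reusing the variables from the question (note the double quotes so Groovy interpolates the values into the shell command):
script {
    def branchPart = "${env.GIT_BRANCH}".replace("feature/", "").replace("/", "-")
    // Docker repository names must be lowercase, so lowercase the full reference
    def imageRef = "${env.imageName}:${branchPart}-${env.tag}".toLowerCase()
    sh "docker build -t ${imageRef} ."
    sh "docker push ${imageRef}"
}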
I have a Jenkins scripted pipeline set up where I execute a number of Maven builds. I want to treat one of them as non-fatal if the root cause is a known one.
I have tried to achieve that by inspecting the Exception's message, e.g.
try {
sh "mvn -U clean verify sonar:sonar ${sonarcloudParams}"
} catch ( Exception e ) {
if ( e.getMessage().contains("not authorized to run analysis")) {
echo "Marking build unstable due to missing SonarCloud onboarding. See https://cwiki.apache.org/confluence/display/SLING/SonarCloud+analysis for steps to fix."
currentBuild.result = 'UNSTABLE'
}
}
The problem is that the exception's message is not the one from Maven, but instead "script returned exit code 1".
There is no further information in e.getCause().
How can I access the cause of the Maven build failure inside my scripted pipeline?
You can capture the command output, then parse it for the specific message.
def output = sh(
script: "mvn -U clean verify sonar:sonar ${sonarcloudParams}",
returnStdout: true
).trim()
echo "mvn cmd output: ${output}"
if(output.contains('not authorized to run analysis')) {
currentBuild.result = 'UNSTABLE'
}
// parse jenkins job build log
def logUrl = env.BUILD_URL + 'consoleText'
def cmd = "curl -u \${JENKINS_AUTH} -k ${logUrl} | tail -n 50"
def output = sh(returnStdout: true, script: cmd).trim()
echo "job build log: ${output}"
if(output.contains('not authorized to run analysis')) {
currentBuild.result = 'UNSTABLE'
}
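Here JENKINS_AUTH is assumed to be a user:token pair available in the shell environment. A sketch of one way to provide it from a Jenkins credential, using the Credentials Binding plugin and a hypothetical credential id 'jenkins-auth':
withCredentials([usernameColonPassword(credentialsId: 'jenkins-auth', variable: 'JENKINS_AUTH')]) {
    def logUrl = env.BUILD_URL + 'consoleText'
    // \${JENKINS_AUTH} is expanded by the shell, so the secret stays out of the Groovy string
    def output = sh(returnStdout: true, script: "curl -u \${JENKINS_AUTH} -k ${logUrl} | tail -n 50").trim()
    echo "job build log: ${output}"
    if (output.contains('not authorized to run analysis')) {
        currentBuild.result = 'UNSTABLE'
    }
}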
One option is to inspect the last log lines using
def sonarCloudNotEnabled = currentBuild.rawBuild.getLog(50).find {
line -> line.contains("not authorized to run analysis")
}
However, this does not work by default. On the Jenkins instance I'm using it errors out with
Scripts not permitted to use method org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper getRawBuild. Administrators can decide whether to approve or reject this signature.
Generally, to get the artifact of the latest successful build, I do a wget on the below URL:
http://jenkins.com/job/job_name/lastSuccessfulBuild/artifact/artifact1/jenkins.txt
Is there a way I can do a wget on lastSuccessfulBuild and get a build_id, like below?
build_id=`wget http://jenkins.p2pcredit.local/job/job_name/lastSuccessfulBuild`
Yes, there is a way and it is pretty straightforward:
$ build_id=`wget -qO- jenkins_url/job/job_name/lastSuccessfulBuild/buildNumber`
$ echo $build_id
131 # that's my build number
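With that number in hand you can, for example, plug it into the artifact URL pattern from the question (job name and artifact path are the question's placeholders):
$ build_id=`wget -qO- jenkins_url/job/job_name/lastSuccessfulBuild/buildNumber`
$ wget "jenkins_url/job/job_name/${build_id}/artifact/artifact1/jenkins.txt"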
I think the best solution is using groovy with zero dependencies.
node {
    script {
        def lastSuccessfulBuildID = 0
        def build = currentBuild.previousBuild
        while (build != null) {
            if (build.result == "SUCCESS") {
                lastSuccessfulBuildID = build.id as Integer
                break
            }
            build = build.previousBuild
        }
        println lastSuccessfulBuildID
    }
}
You do not need to specify jenkins_url or job_name etc. to get the last successful build id.
You can then use it easily in any Jenkinsfile in your repositories without extra configuration.
Tested on Jenkins v2.164.2
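If you then want to turn that id into, say, an artifact URL inside the same script block, a small sketch using the JOB_URL environment variable Jenkins sets automatically (the artifact path is a placeholder):
def artifactUrl = "${env.JOB_URL}${lastSuccessfulBuildID}/artifact/path/to/artifact.zip"
echo "Last successful artifact: ${artifactUrl}"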
I find it very useful to query the permalinks file from the Jenkins workspace.
This allows you to get not only the last successful build, but also the other builds Jenkins considers relevant.
You can see its content by adding this line in the Build section, in the Execute Shell panel:
cat ../../jobs/$JOB_NAME/builds/permalinks
For example, in my case:
+ cat ../../jobs/$JOB_NAME/builds/permalinks
lastCompletedBuild 56
lastFailedBuild 56
lastStableBuild 51
lastSuccessfulBuild 51
lastUnstableBuild -1
lastUnsuccessfulBuild 56
From there, you would want to parse out the number of the last successful build, or any other build provided by permalinks. You can do this by running:
lastSuccesfulBuildId=$(cat ../../jobs/$JOB_NAME/builds/permalinks | grep lastSuccessfulBuild | sed 's/lastSuccessfulBuild //')
If you want the displayName of the last successful build and not the build number:
curl --user <username>:<tokenOrPassword> https://<url>/job/<job-name>/lastSuccessfulBuild/api/json | jq -r '.displayName'
Or in groovy
def buildName = Jenkins.instance.getItem('jobName').lastSuccessfulBuild.displayName
Pipeline script solution:
import groovy.json.JsonSlurper
def jResponse = httpRequest "https:/<yourjenkinsjoburlpath>/lastSuccessfulBuild/buildNumber"
def json = new JsonSlurper().parseText(jResponse.content)
echo "Status: ${json}"
Jenkins console output:
HttpMethod: GET
URL: https://***/lastSuccessfulBuild/buildNumber
Sending request to url: https://***/lastSuccessfulBuild/buildNumber
Response Code: HTTP/1.1 200 OK
Success code from [100‥399]
[Pipeline] echo
Status: 20
To get the last successful build number using curl:
curl --user userName:password https://url/job/jobName/api/xml?xpath=/*/lastStableBuild/number
To get the job build number, simply do:
def build_Number = Jenkins.instance.getItem('JobName').lastSuccessfulBuild.number
I'm having issues trying to set up Go to run the current file from Sublime Text 2.
Here's what I have in my go.sublime-build file:
{
"cmd": [ "go", "run", "${file}" ]
}
When I try to run a build on a Go source file, I get the error:
[Error 6] The handle is invalid
[cmd: [u'go run', u'C:\\Users\\gprasant\\Documents\\GitHub\\programming_pearls\\src\\go\\quicksort.go']]
[dir: C:\Users\gprasant\Documents\GitHub\programming_pearls\src\go]
Is there any way to get this fixed? Or is there another plugin in Sublime Text for Go development?
Installing GoSublime should get this working for you. After installing and restarting ST2: do ctrl-B, type "run" and hit enter.
I got by with
{
"cmd": "go run $file",
"shell" : true
}
In ST3, it has changed to:
{
"shell_cmd": "go run ${file}"
}
On my Mac, I needed the following in /Users/your_user_name/Library/Application Support/Sublime Text 2/Packages/User/go.sublime-build:
{
"cmd": ["go run '${file}'"],
"selector": "source.go",
"path": "/usr/local/go/bin",
"shell": true
}
"cmd" line quoting is to correctly handle file paths with spaces.
"shell" line is needed since commenting it out breaks it.
"path" line is needed because the basic shell, doesn't have access to my .zshrc file include the export GOPATH statement defining the go path.
After that any .go file should build and run with command+B, leaving the stdout message in a console built into sublime text 2.
What about:
{
    "cmd": ["go", "run", "${file}"],
    "path": "/usr/local/go/bin"
}
I like GoSublime, I just hate having to type run each time I hit Command+B.
A Sublime Text 2 build system for Go, making F4/Shift-F4 work (next error/prev error).
1st, create a file: ~/gosublime_build.sh
GOPATH=~/go
export GOPATH
echo "GOPATH:$GOPATH"

# RUN mode: execute the previously installed binary instead of building
if [ "$3." = "RUN." ]
then
    EXENAME=${1##*/}
    EXENAME=$GOPATH/bin/$EXENAME
    echo "$EXENAME"
    "$EXENAME"
    echo "code: $?"
    exit
fi

# Build mode: compile the current file, then install the project on success
echo "go build $2"
cd /usr/local/go/bin
./go build -o ~/temp.go.compiled "$2"
if [ $? -eq 0 ]
then
    cd "$1"
    echo "Project: " "$1"
    /usr/local/go/bin/go install
    echo "go install exit code: $?"
else
    echo "go build exit code: $?"
fi
2nd:
chmod 777 ~/gosublime_build.sh
3rd: create a new sublime2 build-system for "go" (Tools/Build System/New)
{
"cmd": ["~/gosublime_build.sh $file_path $file"]
,"shell": true
,"selector": "source.go"
,"file_regex": "([\\w/_-]+[.]{1}[\\w./_-]+?):([0-9]+):?([0-9]+)?(.*)?"
}
4th: select your new build-system (Tools/Build System)
5th: build with Ctrl-B, F4/Shift-F4: next/prev error
If anybody knows how to instruct the Go compiler to report the FULL PATH of the file and line for each error, this process could be simplified.