I'm trying to upload an artifact from Bamboo CI to JFrog Artifactory using the JFrog CLI.
I need to upload .p2 plugins and I have two options:
Upload the .zip and deploy it as a Bundle Artifact
Upload the uncompressed folder with all its subfolders and data.
I'm trying to upload the uncompressed folder with all the subfolders and data using this command:
jfrog rt upload --include-dirs=true ${bamboo.build.working.directory}/unzip/${bamboo.public.name-update-site}/* p2-release-local/${bamboo.public.name-update-site}/
But the problem is that the subfolders end up empty.
I also tried this command:
jfrog rt upload --flat=false ${bamboo.build.working.directory}/unzip/${bamboo.public.name-update-site}/* p2-release-local/${bamboo.public.name-update-site}/
This command uploads all the subfolders with their data, but the resulting path is wrong:
/name-update-site/datos/agents-home/xml-data/build-dir/PREDEL-RELPLU-JOB1/unzip/name-update-site
The content of the variable ${bamboo.build.working.directory} is
/datos/agents-home/xml-data/build-dir/PREDEL-RELPLU-JOB1/
EDITED: Log info:
INFO: Listing Bamboo directory
prueba-update-site.zip
unzip
INFO: Listing files from unzip folder
prueba-update-site
INFO: Listing files from custom folder
artifacts.jar
content.jar
features
plugins
site.xml
uninstall_fortify_plugins.cmd
Any help?
Thanks.
Solved!
The solution is:
Unzip the artifact .zip archive
Upload with this command:
jfrog rt upload --flat=false "${bamboo.public.name-update-site}/*" p2-release-local/
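For readers hitting the same thing, the fix can be sketched as a small Bamboo script step. Everything below is an assumption-laden sketch: plain environment variables with demo defaults stand in for the ${bamboo.*} variables, and the upload command is only printed, not executed:

```shell
#!/bin/sh
# Sketch of the accepted fix, assuming a Bamboo script task.
# WORK_DIR / SITE stand in for ${bamboo.build.working.directory} and
# ${bamboo.public.name-update-site}, with demo defaults so this runs standalone.
set -eu

WORK_DIR="${WORK_DIR:-/tmp/demo-build}"
SITE="${SITE:-name-update-site}"

mkdir -p "$WORK_DIR/unzip/$SITE"

# 1. Unzip the artifact .zip into the unzip folder (done earlier in the job):
#    unzip "$WORK_DIR/$SITE.zip" -d "$WORK_DIR/unzip"

# 2. cd into the unzip folder and upload with a RELATIVE glob, so that
#    --flat=false reproduces only "$SITE/..." under the target repository
#    instead of the whole absolute build path:
cd "$WORK_DIR/unzip"
CMD="jfrog rt upload --flat=false \"$SITE/*\" p2-release-local/"
echo "$CMD"    # shown instead of executed; run it with: eval "$CMD"
```

The key point is running the CLI from inside the unzip folder: --flat=false preserves the source path exactly as given, so an absolute glob drags the whole build directory path into the repository.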
Thanks.
Copy files from artifacts to a folder
I am using an Azure DevOps deployment group job to copy some files to the server.
The two steps I am following are: 1. Stop a Windows service using PowerShell, which looks OK.
2. Then use the CopyFiles task to copy the files from the artifacts location.
The artifact is a zip file and the location is mapped correctly.
I have to either unzip the contents of the zip file and move them to the target folder,
or move the zip file to the target folder and unzip its contents after it is copied.
But the CopyFiles task is not copying any files (I tried various combinations in the Contents property to include the zip extension, etc., but no luck).
Am I missing a step, or am I doing something wrong? The screenshot of the pipeline is below.
Based on your requirement, you need to unzip the contents of the zip file and copy files.
I suggest you use the Extract files task to unzip the zip file.
In the Copy files task, you don't need to specify a specific zip file in the source folder; just specify the folder path where the extracted files are located.
Here is an example: $(system.defaultworkingdirectory)/_Data Cloud Service Live/service
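If you'd rather do it in a single script step instead of the two tasks, the same extract-then-copy flow can be sketched in shell. All names below are placeholders, and python3 -m zipfile merely stands in for the Extract files task so the sketch is self-contained:

```shell
#!/bin/sh
# Shell equivalent of the Extract files + Copy files steps, as a sketch.
# All paths here are placeholders, not the exact pipeline values.
set -eu

ZIP="artifact.zip"        # the downloaded build artifact (placeholder name)
STAGE="extracted"         # where the Extract files task would unzip to
TARGET="target-folder"    # destination folder on the server (placeholder)

# demo scaffolding: build a sample artifact.zip if none exists
if [ ! -f "$ZIP" ]; then
  mkdir -p demo-src
  echo "sample" > demo-src/web.config
  python3 -m zipfile -c "$ZIP" demo-src
fi

mkdir -p "$STAGE" "$TARGET"
# python3 -m zipfile stands in for the Extract files task / unzip here
python3 -m zipfile -e "$ZIP" "$STAGE"
# Copy the extracted *contents*, not the zip itself, to the target
cp -R "$STAGE"/. "$TARGET"/
```

Copying `"$STAGE"/.` rather than `"$STAGE"` avoids nesting an extra `extracted/` folder under the target.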
Background: I am renaming a certain set of env-specific files in my source code via a Jenkins pipeline before publishing the artifact to Nexus using the mvn deploy goal.
I'm using the mv command to rename the file:
mv <env>config.properties config.properties
This does rename the files successfully, and when I download the zip from the Jenkins output console, the zip contains the renamed files (config.properties).
But the very next time I download it from Nexus, or from the output console again, I get a zip with the old file name. It is somehow being renamed back to the original.
Has anyone faced this before? Any inputs would help
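One thing worth checking, sketched below under assumptions: if mvn deploy re-packages the zip from the original sources after the mv has run (or the mv runs in a different workspace copy), the artifact uploaded to Nexus keeps the old name. The rename therefore has to happen before the packaging step. The directory layout and env prefix here are hypothetical:

```shell
#!/bin/sh
# Sketch: run the rename BEFORE mvn packages/deploys the zip; otherwise
# the archive rebuilt from the original sources still holds the old name.
# SRC_DIR and ENV_NAME are assumptions for illustration only.
set -eu

ENV_NAME="dev"                     # hypothetical value of <env>
SRC_DIR="src/main/resources"       # assumed location of the properties files

# demo scaffolding so the sketch runs standalone
mkdir -p "$SRC_DIR"
: > "$SRC_DIR/${ENV_NAME}config.properties"

# 1. rename first...
mv "$SRC_DIR/${ENV_NAME}config.properties" "$SRC_DIR/config.properties"
# 2. ...then package and deploy, so the zip is built from the renamed file:
#    mvn deploy
```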
I need to download an artifact-level folder from Artifactory in a shell script. I have found some similar topics, but no solution. Please give me a solution.
A tip for non-PRO users:
Through the GUI, go to the repository path (e.g. http://artifactory.mycompany.com:8081/artifactory/list/libs-release-local/path/to/my/folder/)
Use wget to recursively download the folder's contents (e.g. wget -r --no-parent -nH --cut-dirs=4 --proxy=off http://artifactory.mycompany.com:8081/artifactory/list/libs-release-local/path/to/my/folder/)
When using the PRO version of Artifactory, you can use the REST API to download a complete folder.
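For the PRO case, the endpoint is the Retrieve Folder or Repository Archive REST API. A hedged sketch using the example host and path from the tip above; the credentials are placeholders, and the command is only printed, not executed:

```shell
#!/bin/sh
# Sketch of the PRO-only "Retrieve Folder or Repository Archive" REST call.
# Host, repo, and folder reuse the example values above; user:password is
# a placeholder. The curl command is printed rather than run.
set -eu

BASE="http://artifactory.mycompany.com:8081/artifactory"
REPO="libs-release-local"
FOLDER="path/to/my/folder"

CMD="curl -fL -u user:password -o folder.zip \"$BASE/api/archive/download/$REPO/$FOLDER?archiveType=zip\""
echo "$CMD"
```

The server streams the whole folder back as a single zip, which is usually simpler than the recursive wget approach.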
I'm having trouble uploading a zip file to Nexus via Jenkins using the Nexus Artifact Uploader plugin.
The weirdest part is that I'm uploading two files to Nexus: the first zip uploads with no problems at all, but the second zip causes the problem (the only difference between them is their size; the second zip is bigger than 1 GB).
My job runs "gulp deploy", compresses the result to a zip file, and uploads it to Nexus; after that, I compile a big (1.6 GB) MSBuild project and compress it (to 1.1 GB) too, but it won't upload.
I'm using a Windows slave to do the MSBuild and the upload. When I try to upload the same file from the local CentOS machine to the external IP, it works.
My bad!
I was out of disk storage, so sorry!
The only thing that still bothers me is how the Linux machine could upload the same file.
Maybe the Linux zip command is more efficient than the PowerShell command?
Dunno... anyway, thanks.
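For anyone else who ends up here: a cheap guard before the upload step avoids this failure mode entirely. A minimal sketch, with an illustrative threshold (not taken from the original job):

```shell
#!/bin/sh
# Sketch: fail fast before a large upload when the workspace disk is low.
# The threshold is illustrative; raise it to at least your artifact size.
set -eu

UPLOAD_DIR="."
MIN_FREE_KB=10240   # ~10 MB here; for a >1 GB zip use something like 2097152

# df -P gives POSIX-stable columns; field 4 is available space in KB
free_kb=$(df -Pk "$UPLOAD_DIR" | awk 'NR==2 {print $4}')
if [ "$free_kb" -lt "$MIN_FREE_KB" ]; then
  echo "not enough free space: ${free_kb} KB available" >&2
  exit 1
fi
echo "disk check passed: ${free_kb} KB free"
```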
My team has code being built and tested in Jenkins, and when the build process is done, Jenkins produces a SNAPSHOT.jar file. I need to unpack the SNAPSHOT.jar file and send the extracted files and folders to a network drive. What is the best way to do that?
I've tried a few Jenkins plugins, most recently artifactDeployer, but when the plugins deploy the artifacts as a post-build action, they don't unpack the jar files. I would have to execute a Windows batch command after they are deployed to unpack them, but I can't, because the plugin runs as a post-build action and batch commands run before post-build actions. Is there a way to deploy the artifacts and unpack them without using a plugin? Or is there a plugin that will do both? What is the best way to achieve this?
The way I accomplished this was by using 7-Zip in a Windows batch command as a post-step in the Jenkins project configuration.
The command is:
`7z x %WORKSPACE%\target\*.jar -oX:\"mapped network drive location" -y`
This extracts the artifacts out of the SNAPSHOT.jar file and places them on the network drive. I needed the files contained in the SNAPSHOT.jar to be sent to the network drive when the build completed. I am new to Jenkins, and the plugins I tried were post-build actions that only copied the SNAPSHOT.jar to a given location; they did not extract the artifacts out of the jar file. That is why I chose this route.
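A side note for readers without 7-Zip on the agent: a .jar is an ordinary zip archive, so other extractors work as well. A minimal sketch with placeholder names, where python3 -m zipfile stands in for whatever unzip tool the agent has (`jar xf` or unzip would do the same):

```shell
#!/bin/sh
# A .jar is an ordinary zip archive, so any unzip tool can extract it,
# not only 7-Zip. python3 -m zipfile is used here as a portable extractor
# so the sketch runs standalone; JAR and DEST are placeholders.
set -eu

JAR="app-SNAPSHOT.jar"    # placeholder artifact name
DEST="unpacked"           # stand-in for the mapped network drive path

# demo scaffolding: build a tiny jar-like zip if none exists
if [ ! -f "$JAR" ]; then
  mkdir -p demo
  echo "Main-Class: App" > demo/MANIFEST.MF
  python3 -m zipfile -c "$JAR" demo
fi

mkdir -p "$DEST"
python3 -m zipfile -e "$JAR" "$DEST"
```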